WO2012144039A1 - Image processing apparatus and method - Google Patents
Image processing apparatus and method
- Publication number: WO2012144039A1 (application PCT/JP2011/059759)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- viewing zone
- viewer
- display
- image processing
- Prior art date
Classifications
- G09G3/001; G09G3/003—Control arrangements or circuits for visual indicators using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. projection systems; G09G3/003 in particular to produce spatial visual effects
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/305—Autostereoscopic image reproducers using lenticular lenses, e.g. arrangements of cylindrical lenses
- H04N13/366—Image reproducers using viewer tracking
- H04N13/398—Synchronisation thereof; Control thereof
- G09G2300/023—Display panel composed of stacked panels
- G09G2320/068—Adjustment of display parameters for control of viewing angle adjustment
- G09G2354/00—Aspects of interface with display user
Definitions
- Embodiments described herein relate generally to an image processing apparatus and method.
- the viewer can observe the stereoscopic image with the naked eye without using special glasses.
- a stereoscopic image display device displays a plurality of images with different viewpoints and controls the light beams from these images by, for example, a parallax barrier, a lenticular lens, or the like.
- the controlled light beams are guided to the viewer's eyes; if the viewer's observation position is appropriate, the viewer can recognize the stereoscopic image.
- An area in which the viewer can observe a stereoscopic image is called a viewing area.
- Patent Literature 1 and Patent Literature 2 are disclosed as techniques for setting a viewing zone according to the position of a viewer.
- Patent Document 1 discloses that the position of the viewer is detected by a sensor, and viewing zone setting is realized by switching the right-eye image and the left-eye image in accordance with the viewer's position.
- Patent Document 2 discloses that a signal emitted from a remote control device is detected and the display device is rotated in the direction in which the signal is emitted.
- the actual position of the viewer who views the stereoscopic image may deviate from the set viewing area, and it may be difficult for the viewer to observe the stereoscopic image.
- the problem to be solved by the present invention is to provide an image processing apparatus and method that enable a viewer to easily observe a good stereoscopic image.
- the image processing apparatus includes a display unit, a reception unit, a calculation unit, and a control unit.
- the display unit can display a stereoscopic image.
- the receiving unit receives a start signal for starting the setting of a viewing area in which a viewer can observe the stereoscopic image.
- the calculation unit calculates viewing zone information indicating the viewing zone based on the position information of the viewer.
- the control unit controls the display unit so that the viewing zone corresponding to the viewing zone information is set.
- FIG. 1 is a diagram of an image processing apparatus according to the first embodiment.
- FIG. 2 illustrates an example of a display unit in the first embodiment.
- FIG. 3 is an example of a viewing zone according to the first embodiment.
- FIGS. 4 to 7 are diagrams of viewing zone control according to the first embodiment.
- FIG. 8 is a flowchart of display control processing according to the first embodiment.
- FIG. 9 is a diagram of an image processing apparatus according to the second embodiment, and FIG. 10 is a flowchart of display control processing according to the second embodiment.
- FIG. 11 is a diagram of an image processing apparatus according to the third embodiment, and FIG. 12 is a flowchart of display control processing according to the third embodiment.
- FIG. 13 is a diagram of an image processing apparatus according to the fourth embodiment, and FIG. 14 is a flowchart of display control processing according to the fourth embodiment.
- the image processing apparatus 10 is suitable for a TV, a PC (Personal Computer), or the like that allows a viewer to observe a stereoscopic image with the naked eye.
- a stereoscopic image is an image including a plurality of parallax images having parallax with each other.
- the image described in the embodiment may be either a still image or a moving image.
- FIG. 1 is a block diagram showing a functional configuration of the image processing apparatus 10.
- the image processing apparatus 10 can display a stereoscopic image. As illustrated in FIG. 1, the image processing apparatus 10 includes a reception unit 12, a calculation unit 14, a control unit 16, and a display unit 18.
- the receiving unit 12 receives a start signal for starting setting of a viewing zone in which one or a plurality of viewers can observe the stereoscopic image.
- the receiving unit 12 may receive a start signal from an external device (not shown) connected to the receiving unit 12 by wire or wirelessly. Examples of the external device include a known remote control device and an information terminal.
- the receiving unit 12 supplies the received start signal to the calculating unit 14.
- the viewing area indicates a range in which the viewer can observe the stereoscopic image displayed on the display unit 18.
- This observable range is a range (region) in real space.
- This viewing zone is determined by a combination of display parameters (details will be described later) of the display unit 18. For this reason, the viewing zone can be set by setting the display parameters of the display unit 18.
- the display unit 18 is a display device that displays a stereoscopic image. As shown in FIG. 2, the display unit 18 includes a display element 20 and an opening control unit 26. The viewer 33 observes the stereoscopic image displayed on the display unit 18 by observing the display element 20 through the opening control unit 26.
- the display element 20 displays a parallax image used for displaying a stereoscopic image.
- Examples of the display element 20 include a direct-view type two-dimensional display such as an organic EL (Organic Electro Luminescence), an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), and a projection display.
- the display element 20 may have a known configuration in which, for example, RGB sub-pixels are arranged in a matrix with RGB as one pixel.
- the RGB sub-pixels arranged in the first direction constitute one pixel
- the first direction is, for example, the column direction (vertical direction)
- the second direction is, for example, the row direction (horizontal direction).
- the arrangement of the subpixels of the display element 20 may be another known arrangement.
- the subpixels are not limited to the three colors RGB. For example, four colors may be used.
- the opening control unit 26 directs the light emitted from the display element 20 toward the front thereof in predetermined directions through its openings.
- Examples of the opening control unit 26 include a lenticular lens and a parallax barrier.
- the opening of the opening control unit 26 is arranged so as to correspond to each element image 30 of the display element 20.
- a parallax image group (multi-parallax image) corresponding to a plurality of parallax directions is displayed on the display element 20.
- Light rays from the multi-parallax image are transmitted through the openings of the opening control unit 26.
- the viewer 33 located in the viewing zone observes different pixels included in the element image 30 with the left eye 33A and the right eye 33B, respectively.
- the viewer 33 can observe a stereoscopic image by displaying images with different parallaxes on the left eye 33A and the right eye 33B of the viewer 33, respectively.
- FIG. 3 is a schematic diagram illustrating an example of a viewing zone when a display parameter has a certain combination.
- the viewing area P is an area where the viewer 33 can observe the image displayed on the display unit 18.
- the plurality of white rectangular areas are the viewing zones 32.
- the shaded area is a reverse viewing area 34, which is a range outside the viewing zone. In the reverse viewing area 34, it is difficult to observe a stereoscopic image satisfactorily due to the occurrence of reverse viewing or crosstalk.
- the viewing zone 32 is determined by a combination of display parameters of the display unit 18. Returning to FIG. 2, the display parameters include the relative position between the display element 20 and the opening control unit 26, the distance between the display element 20 and the opening control unit 26, the angle of the display unit 18, the deformation of the display unit 18, the pixel pitch of the display element 20, and the like.
- the relative position between the display element 20 and the opening control unit 26 indicates the position of the corresponding element image 30 with respect to the center of the opening of the opening control unit 26.
- the distance between the display element 20 and the opening control unit 26 indicates the shortest distance between the opening of the opening control unit 26 and the corresponding element image 30.
- the angle of the display unit 18 indicates a rotation angle with respect to a predetermined reference position when the display unit 18 is rotated about the vertical direction as a rotation axis.
- the deformation of the display unit 18 indicates deforming the display unit 18 body itself.
- the pixel pitch in the display element 20 indicates a pixel interval of each element image 30 of the display element 20. The combination of these display parameters uniquely determines the region where the viewing zone 32 is set in the real space.
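As a concrete illustration (not part of the disclosure), the combination of display parameters can be modeled as a hashable record and used as a lookup key for precomputed viewing zone information; all names and values below are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DisplayParameters:
    """One combination of the display parameters listed above (names illustrative)."""
    relative_position_mm: float  # element image position vs. the aperture center
    gap_mm: float                # distance between display element and opening control unit
    rotation_deg: float          # rotation of the display unit about the vertical axis
    pixel_pitch_mm: float        # pixel interval of each element image

# Because a parameter combination uniquely determines the viewing zone,
# it can serve directly as a key for precomputed viewing zone information.
viewing_zone_table = {
    DisplayParameters(0.0, 2.0, 0.0, 0.1): "viewing zone A",
    DisplayParameters(0.5, 2.0, 0.0, 0.1): "viewing zone B",
}
```

Freezing the dataclass makes it hashable, which is what allows the parameter combination to act as a dictionary key.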
- FIGS. 4 to 7 are diagrams for explaining the control of the setting position and the setting range of the viewing zone 32 by adjusting each display parameter of the display unit 18.
- FIGS. 4 to 7 show the relationship between the display element 20 and the aperture control unit 26 in the display unit 18 and the viewing zone 32.
- FIGS. 4 to 7 show enlarged views of each element image 30 as appropriate.
- first, a case will be described in which the position where the viewing zone 32 is set is controlled by adjusting the distance between the display element 20 and the opening control unit 26 and the relative position between the display element 20 and the opening control unit 26.
- FIG. 4A shows a basic positional relationship between the display unit 18 and its viewing area 32 (viewing area 32A).
- FIG. 4B shows a case where the distance between the display element 20 and the opening control unit 26 is shorter than that in FIG. 4A.
- the viewing zone 32 can be set at a position closer to the display unit 18 as the distance between the display element 20 and the opening control unit 26 is shortened (see viewing zone 32A in FIG. 4A and viewing zone 32B in FIG. 4B). Conversely, as the distance between the display element 20 and the opening control unit 26 is increased, the viewing zone 32 can be set at a position farther away from the display unit 18. Note that the light ray density decreases as the viewing zone 32 is set closer to the display unit 18.
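The gap-versus-distance behavior above matches standard parallax-barrier geometry; the following sketch uses the textbook similar-triangle relation, which is an assumption for illustration and not a formula taken from this disclosure:

```python
def optimal_viewing_distance(gap_mm: float, eye_separation_mm: float,
                             subpixel_pitch_mm: float) -> float:
    """Approximate distance at which adjacent parallax images separate by one eye span.

    Standard parallax-barrier similar-triangle relation (an assumption here):
        distance / eye_separation == gap / subpixel_pitch
    """
    return gap_mm * eye_separation_mm / subpixel_pitch_mm

# Shortening the element-to-aperture gap pulls the viewing zone toward the display:
near = optimal_viewing_distance(gap_mm=1.0, eye_separation_mm=65.0, subpixel_pitch_mm=0.1)
far = optimal_viewing_distance(gap_mm=2.0, eye_separation_mm=65.0, subpixel_pitch_mm=0.1)
assert near < far  # roughly 650 mm vs. 1300 mm
```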
- FIG. 4C shows a case where the relative position of the display element 20 with respect to the opening control unit 26 is moved to the right (see the arrow R direction in FIG. 4C) compared to FIG. 4A.
- in this case, the viewing zone 32 moves to the left (see FIG. 4C).
- conversely, when the relative position is moved to the left, the viewing zone 32 moves to the right (not shown).
- FIG. 5 is an enlarged view of each pixel of the display element 20 and the opening control unit 26 in the display unit 18.
- FIG. 6A shows a basic positional relationship between the display unit 18 and its viewing area 32 (viewing area 32A).
- when the positions of each pixel of the display element 20 and the opening control unit 26 are relatively shifted by a larger amount toward the ends of the screen of the display element 20 (the right end (the end in the arrow R direction in FIG. 5) and the left end (the end in the arrow L direction in FIG. 5)), the viewing zone 32 moves to a position closer to the display unit 18, and the width of the viewing zone 32 becomes narrower (see viewing zone 32D in FIG. 6B).
- the width of the viewing area 32 indicates the maximum horizontal length in each viewing area 32.
- the width of the viewing zone 32 may be referred to as a viewing zone setting distance.
- conversely, when the amount by which the positions of each pixel of the display element 20 and the opening control unit 26 are relatively shifted is reduced toward the edge of the screen of the display element 20, the viewing zone 32 moves to a position farther from the display unit 18, and the width of the viewing zone 32 becomes wider (see viewing zone 32E in FIG. 6C).
- FIG. 7A shows the basic positional relationship between the display unit 18 and its viewing zone 32 (viewing zone 32A).
- FIG. 7B shows a state where the display unit 18 is rotated (in the direction of arrow P in FIG. 7). As shown in FIGS. 7A and 7B, when the display unit 18 is rotated and the angle of the display unit 18 is adjusted, the position of the viewing zone 32 moves from the viewing zone 32A to the viewing zone 32F.
- FIG. 7C illustrates a state in which the position and direction of the display element 20 with respect to the opening control unit 26 are adjusted. As shown in FIG. 7C, when the position and direction of the display element 20 with respect to the aperture control unit 26 are changed, the viewing zone 32 moves from the viewing zone 32A to the viewing zone 32G.
- FIG. 7D shows a state in which the entire display unit 18 is deformed. As shown in FIGS. 7A and 7D, the viewing zone 32 changes from the viewing zone 32A to the viewing zone 32H by deforming the display unit 18.
- the region (position, size, etc.) where the viewing zone 32 is set in the real space is uniquely determined by the combination of the display parameters of the display unit 18.
- when the calculation unit 14 receives a start signal from the reception unit 12, the calculation unit 14 calculates, based on position information indicating the position of the viewer 33, viewing zone information indicating a viewing zone in which the viewer 33 can view the stereoscopic image at that position.
- the position information indicating the position of the viewer 33 is indicated by position coordinates in the real space.
- for example, the center of the display surface of the display unit 18 is set as the origin, with the X axis in the horizontal direction, the Y axis in the vertical direction, and the Z axis in the normal direction of the display surface of the display unit 18.
- however, the method of setting coordinates in real space is not limited to this.
- the position information of the position of the viewer 33 shown in FIG. 3 is indicated by (X1, Y1, Z1).
- position information indicating the position of the viewer 33 is stored in advance in a storage medium such as a memory (not shown). That is, the calculation unit 14 acquires position information from the memory.
- the viewer position information stored in the memory may be, for example, information indicating the position of a representative viewer 33 when the image processing apparatus 10 is used, a position registered in advance by the viewer 33, the position of the viewer 33 at the end of the previous use of the image processing apparatus 10, a position preset at the manufacturing stage, or the like.
- the position information is not limited to these, and may be a combination of these pieces of information.
- This position information is preferably position information indicating a position in the viewing area P (see FIG. 3).
- the viewing area P is determined by the configuration of each display unit 18.
- Information indicating the viewing area P is also stored in advance in a storage medium such as a memory (not shown).
- when the calculation unit 14 receives a start signal from the reception unit 12, the calculation unit 14 calculates viewing zone information indicating a viewing zone in which a stereoscopic image can be observed at the position of the viewer 33 indicated by the position information. For this calculation, for example, viewing zone information indicating the viewing zone 32 corresponding to each combination of the display parameters described above is stored in advance in a memory (not shown). Then, the calculation unit 14 obtains the viewing zone information by searching the memory for viewing zone information whose viewing zone 32 includes the position indicated by the position information of the viewer 33.
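A minimal sketch of this memory search, assuming the stored viewing zones are simplified to axis-aligned boxes in the display coordinate system (the function and data names are illustrative, not the patent's):

```python
# Each stored entry pairs viewing zone information with the region the zone
# covers, simplified here to a box (xmin, xmax, zmin, zmax) in millimetres
# in the display coordinate system. All values are illustrative.
stored_zones = [
    {"info": "parameter set 1", "region": (-200.0, 200.0, 500.0, 900.0)},
    {"info": "parameter set 2", "region": (-200.0, 200.0, 900.0, 1400.0)},
]

def find_viewing_zone(x: float, z: float):
    """Return the stored viewing zone information whose region contains the viewer."""
    for zone in stored_zones:
        xmin, xmax, zmin, zmax = zone["region"]
        if xmin <= x <= xmax and zmin <= z <= zmax:
            return zone["info"]
    return None  # no stored zone covers this viewer position
```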
- the calculation unit 14 may calculate viewing zone information by calculation.
- in this case, the calculation unit 14 stores in advance, in a memory (not shown), a calculation formula for calculating from the position information viewing zone information such that the position of the viewer 33 is included in the viewing zone 32.
- the calculation unit 14 calculates viewing zone information using the position information and the arithmetic expression.
- when there are a plurality of viewers 33, the calculation unit 14 preferably calculates the viewing zone information so that more viewers 33 are included in the viewing zone 32.
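When multiple candidate viewing zones are available, the preference above can be sketched as choosing the candidate that covers the most viewers; the representation of zones as boxes and all names are assumptions for illustration:

```python
def count_covered(region, viewers):
    """Count how many viewer positions (x, z) fall inside a zone region."""
    xmin, xmax, zmin, zmax = region
    return sum(xmin <= x <= xmax and zmin <= z <= zmax for x, z in viewers)

def choose_viewing_zone(candidates, viewers):
    """Pick the candidate viewing zone that covers the most viewers."""
    return max(candidates, key=lambda c: count_covered(c["region"], viewers))
```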
- the control unit 16 controls the display unit 18 to set the viewing zone 32 according to the viewing zone information calculated by the calculation unit 14. That is, the control unit 16 adjusts the display parameter of the display unit 18 and sets the viewing zone 32. Specifically, the display unit 18 is provided with a drive unit (not shown) for adjusting each display parameter described above. In addition, the control unit 16 stores the viewing zone information indicating the viewing zone 32 corresponding to the combination of the display parameters described above in advance in a memory (not shown). Then, the control unit 16 reads a combination of display parameters corresponding to the viewing zone information calculated by the calculation unit 14 from the memory, and controls a driving unit corresponding to each read display parameter.
- thereby, the display unit 18 displays a stereoscopic image in the viewing zone 32 corresponding to the viewing zone information calculated by the calculation unit 14.
- first, it is determined whether the receiving unit 12 has received a start signal (step S100).
- when it is determined that the start signal has not been received, the present routine is terminated (step S100: No).
- when it is determined that the start signal has been received (step S100: Yes), the calculation unit 14 calculates viewing zone information from the position information of the viewer 33 (step S102).
- the control unit 16 controls the display unit 18 so that the viewing zone 32 corresponding to the viewing zone information calculated by the calculation unit 14 is set (step S104). Then, this routine ends.
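The flow of steps S100 to S104 can be sketched as follows, with the calculation and control steps passed in as callables (an illustrative skeleton, not the actual implementation):

```python
def display_control(start_signal_received: bool, viewer_position,
                    calculate_zone, set_zone):
    """First-embodiment flow: gate on the start signal (S100), calculate viewing
    zone information from the viewer position (S102), then set the zone (S104)."""
    if not start_signal_received:                 # step S100: No
        return None                               # routine ends; zone untouched
    zone_info = calculate_zone(viewer_position)   # step S102
    set_zone(zone_info)                           # step S104
    return zone_info
```

Note that when no start signal arrives, the sketch returns without calling `set_zone` at all, mirroring the point that the viewing zone is never changed outside the start-signal path.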
- as described above, in the image processing apparatus 10 of the present embodiment, the calculation unit 14 calculates, from the position information of the viewer 33, viewing zone information indicating the viewing zone 32 in which the viewer 33 can observe the stereoscopic image at that position.
- the control unit 16 controls the display unit 18 so that the viewing zone 32 corresponding to the calculated viewing zone information is set.
- in the image processing apparatus 10 of the present embodiment, the viewing zone 32 is not set (or changed) at arbitrary times; rather, the viewing zone 32 is set when the receiving unit 12 receives the start signal. This reduces the possibility that, outside the period in which the start signal is received, the viewing zone 32 is changed due to a malfunction or the like while the viewer is viewing the stereoscopic image and the viewer 33 perceives a reverse viewing state. Further, in the image processing apparatus 10 of the present embodiment, the calculation unit 14 calculates, from the position information of the viewer 33, viewing zone information indicating a viewing zone in which the viewer 33 can observe a stereoscopic image at that position. This suppresses the viewing zone 32 from being set at a position deviating from the position of the viewer 33.
- the viewer 33 can easily observe a good stereoscopic image.
- in the second embodiment, the position of the viewer 33 is detected by a detection unit, and a determination unit that determines whether or not to change the viewing zone is provided.
- FIG. 9 is a block diagram illustrating a functional configuration of the image processing apparatus 10B according to the second embodiment.
- the image processing apparatus 10B according to the present embodiment includes a reception unit 12B, a calculation unit 14B, a control unit 16B, a display unit 18, a detection unit 40, and a determination unit 42.
- the display unit 18 is the same as that in the first embodiment. Similarly to the reception unit 12 described in the first embodiment, the reception unit 12B receives a start signal from an external device (not shown) connected to the reception unit 12B by wire or wirelessly. In the present embodiment, the reception unit 12B supplies a signal indicating the received start signal to the detection unit 40.
- the detecting unit 40 detects the position of the viewer 33 located in the real space within the viewing area P (see FIG. 2). In the present embodiment, the detection unit 40 detects the position of the viewer 33 when the reception unit 12B receives a start signal.
- the detection unit 40 may be a device that can detect the position of the viewer 33 located in the real space within the viewing area P.
- the detection unit 40 may be an imaging device such as a visible camera or an infrared camera, or a device such as a radar or a sensor. In these devices, the position of the viewer 33 is detected from the obtained information (a captured image in the case of a camera) using a known technique.
- when an imaging device is used as the detection unit 40, the detection unit 40 detects the viewer 33 and calculates the position of the viewer 33 by analyzing the captured image. Thereby, the detection unit 40 detects the position of the viewer 33.
- when a radar is used as the detection unit 40, the detection unit 40 detects the viewer 33 and calculates the position of the viewer 33 by performing signal processing on the obtained radar signal. Thereby, the detection unit 40 detects the position of the viewer 33.
- the detection unit 40 may detect any target part that can be determined to be a person, such as the face, head, whole person, marker, etc. of the viewer 33 when the position of the viewer 33 is detected. Such a method for detecting an arbitrary target region may be performed by a known method.
- the detection unit 40 supplies a signal indicating the detection result including the position information of the viewer 33 to the calculation unit 14B and the determination unit 42.
- the detection unit 40 may output a signal indicating a detection result including feature information indicating the characteristics of the viewer 33 to the calculation unit 14B in addition to the position information of the viewer 33.
- examples of this feature information include information set in advance as an extraction target, such as feature points of the face of the viewer 33.
- the calculation unit 14B calculates, from the position information indicating the position of the viewer 33 included in the signal indicating the detection result received from the detection unit 40, viewing zone information that allows the viewer 33 to observe the stereoscopic image at that position.
- the calculation method of the viewing zone information is the same as the calculation by the calculation unit 14 in the first embodiment.
- the calculation unit 14B executes the calculation of the viewing zone information when a signal indicating the detection result is received from the detection unit 40.
- the calculation unit 14B may calculate the viewing zone information so that at least a specific viewer 33 set in advance is included in the viewing zone 32.
- the specific viewer 33 is a viewer distinguished from the other viewers 33, such as a viewer 33 registered in advance or a viewer 33 having a specific external device that transmits the start signal.
- the calculation unit 14B stores the characteristic information of the specific one or more viewers 33 in a memory (not illustrated) in advance. Then, the calculation unit 14B reads the feature information that matches the feature information stored in advance in the memory among the feature information included in the signal indicating the detection result received from the detection unit 40. Then, the calculation unit 14B extracts the position information of the viewer 33 corresponding to the read feature information from the detection result, and based on the extracted position information, the viewing area information that allows the stereoscopic image to be observed at the position of the position information. calculate.
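A minimal sketch of this feature matching, assuming features are plain numeric tuples compared within a tolerance; the registered data and matching rule are hypothetical stand-ins for the known face-feature techniques the text refers to:

```python
# Feature vectors registered in advance for specific viewers (values illustrative).
registered_features = {"specific viewer": (0.12, 0.80, 0.33)}

def positions_of_specific_viewers(detections, tol=1e-3):
    """Extract positions of detected viewers whose features match a registration."""
    positions = []
    for det in detections:
        for feats in registered_features.values():
            if all(abs(a - b) <= tol for a, b in zip(det["features"], feats)):
                positions.append(det["position"])
                break  # one match is enough for this detection
    return positions
```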
- the determination unit 42 determines whether or not to set the viewing zone 32 (change from the current viewing zone 32) based on the position information of the viewer 33 detected by the detection unit 40.
- the current viewing zone 32 indicates the viewing zone 32 realized (set) by the current combination of display parameters of the display unit 18. Here, "current" refers to the time when the reception unit 12B receives the signal indicating the start signal.
- the determination unit 42 performs this determination as follows. Specifically, assume that the position indicated by the position information of the viewer 33 is within the range of the viewing zone 32 currently set on the display unit 18. When changing the current viewing zone 32 would place the position of the viewer 33 outside the range of the viewing zone 32, the determination unit 42 determines that the viewing zone is not to be set (changed). Whether the position of the viewer 33 falls outside the range of the viewing zone 32 when the current viewing zone 32 is changed may be determined, for example, as follows: the determination unit 42 calculates viewing zone information from the position information included in the detection result received from the detection unit 40, in the same manner as the calculation unit 14C described later, and then determines whether or not the position indicated by the position information is included in the viewing zone 32 of the calculated viewing zone information.
- also, when the viewer 33 exists outside the viewing area P in which the display unit 18 can be viewed, the determination unit 42 determines that the viewing zone is not to be set (changed).
- for the determination of whether the viewer is outside the viewing area P, information indicating the viewing area P (for example, a set of position coordinates) is stored in advance in a memory (not shown), and it is determined whether or not the position information included in the signal indicating the detection result received from the detection unit 40 is outside the viewing area P.
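A minimal sketch of this check, assuming the stored viewing area P is simplified to a single axis-aligned box of position coordinates (all values illustrative):

```python
# Viewing area P stored in advance, simplified to one box (xmin, xmax, zmin, zmax)
# in the display coordinate system (values illustrative).
VIEWING_AREA_P = (-1000.0, 1000.0, 300.0, 3000.0)

def outside_viewing_area(x: float, z: float) -> bool:
    """True when a detected viewer position lies outside the viewing area P."""
    xmin, xmax, zmin, zmax = VIEWING_AREA_P
    return not (xmin <= x <= xmax and zmin <= z <= zmax)
```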
- the determination unit 42 supplies a signal indicating the determination result to the control unit 16B.
- the signal indicating the determination result is information indicating whether or not the viewing zone is to be changed.
- when the determination unit 42 determines that the viewing zone is to be changed, the control unit 16B controls the display unit 18 so that the viewing zone 32 corresponding to the viewing zone information calculated by the calculation unit 14B is set. As in the first embodiment, the control unit 16B adjusts the display parameters of the display unit 18 so that the viewing zone 32 is set. Thereby, the display unit 18 displays a stereoscopic image in the viewing zone 32 corresponding to the viewing zone information calculated by the calculation unit 14B.
- when the determination unit 42 determines that the viewing zone is not to be changed, the control unit 16B maintains the viewing zone 32 that has already been set.
- alternatively, the control unit 16B may control the display unit 18 so that the viewing zone 32 is set to a reference state.
- the reference state may be a state based on recommended parameters set in the manufacturing stage.
- that is, when the determination unit 42 determines that the viewing zone is to be changed, the control unit 16B controls the display unit 18 to change the current viewing zone 32; when the determination unit 42 determines that the viewing zone is not to be changed, the control unit 16B controls the display unit 18 so as to maintain the previously set viewing zone 32 or to set the reference state.
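The three control behaviors above (change, maintain, or fall back to the reference state) can be sketched as one decision function; all names are illustrative, not the patent's:

```python
def apply_determination(change_zone: bool, new_zone, current_zone,
                        reference_zone=None):
    """Apply the determination result: change the viewing zone when allowed,
    otherwise keep the current zone or fall back to the reference state."""
    if change_zone:
        return new_zone          # set the newly calculated viewing zone
    if reference_zone is not None:
        return reference_zone    # optional fall back to the manufacturing-stage state
    return current_zone          # maintain the previously set viewing zone
```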
- first, it is determined whether or not the receiving unit 12B has received a start signal (step S200). When the receiving unit 12B determines that the start signal has not been received, this routine is terminated (step S200: No). When it is determined that the reception unit 12B has received the start signal (step S200: Yes), the detection unit 40 detects the position of the viewer 33 (step S202). Then, the detection unit 40 supplies a signal indicating the detection result to the calculation unit 14B.
- the calculation unit 14B calculates viewing zone information from the position information of the viewer 33 included in the signal indicating the detection result (step S204).
- the calculation unit 14B supplies the calculated viewing zone information to the determination unit 42 and the control unit 16B.
- the determination unit 42 determines whether or not to set the viewing zone 32 (change from the current viewing zone 32) (step S206). The determination unit 42 supplies the determination result to the control unit 16B.
- When the determination unit 42 determines that the viewing zone is to be changed (step S206: Yes), the control unit 16B outputs the determination result (step S208). Specifically, the control unit 16B displays information indicating that there is a viewing zone change on the display unit 18 as the determination result. In the present embodiment, the control unit 16B displays information indicating the determination result of the determination unit 42 on the display unit 18 in step S208 and in step S212 described later. However, the output destination of the determination result is not limited to the display unit 18. For example, the control unit 16B may output the determination result to a display device other than the display unit 18 or to a known audio output device. Further, the control unit 16B may output the determination result to an external device connected to the control unit 16B wirelessly or by wire.
- the control unit 16B controls the display unit 18 so that the viewing zone 32 corresponding to the viewing zone information calculated by the calculation unit 14B is set (step S210).
- the control of the display unit 18 by the control unit 16B is the same as in the first embodiment. Then, this routine ends.
- When the determination unit 42 determines in step S206 that there is no viewing zone change (step S206: No), the control unit 16B outputs information indicating that there is no viewing zone change as the determination result (step S212). Then, this routine ends.
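The flow of steps S200 through S212 above can be sketched end to end. This is an editorial illustration, not part of the patent disclosure; the callables and the detection result format are hypothetical placeholders:

```python
# Hypothetical sketch of the second embodiment's routine:
# receive start signal (S200), detect the viewer position (S202),
# calculate viewing zone information (S204), determine whether to change
# (S206), then either reconfigure the display (S208/S210) or report
# that there is no viewing zone change (S212).

def run_routine(start_received, detect_viewer, calc_zone, should_change, display):
    if not start_received:                  # S200: No -> terminate the routine
        return None
    position = detect_viewer()              # S202: detection unit 40
    zone_info = calc_zone(position)         # S204: calculation unit 14B
    if should_change(zone_info):            # S206: determination unit 42
        display["message"] = "viewing zone changed"    # S208: output result
        display["zone"] = zone_info                    # S210: control display unit 18
    else:
        display["message"] = "no viewing zone change"  # S212
    return display

display = {"zone": None, "message": ""}
result = run_routine(
    start_received=True,
    detect_viewer=lambda: (0.3, 1.2),            # assumed (x, z) viewer position
    calc_zone=lambda pos: {"center_x": pos[0]},  # assumed zone representation
    should_change=lambda zone: True,
    display=display,
)
print(result["message"])  # viewing zone changed
```

Passing the units in as callables mirrors the description's separation of detection, calculation, determination, and control into distinct functional blocks.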
- Note that, when the image processing apparatus 10B is used for the first time, the determination unit 42 may be designed in advance to make a YES determination in step S201.
- the detection unit 40 detects the position of the viewer 33, and the calculation unit 14B calculates the viewing zone information based on the detected position information. For this reason, the position of the viewer 33 can be obtained more accurately.
- the determination unit 42 determines whether or not to change the current viewing zone 32. When the determination unit 42 determines that the viewing zone is to be changed, the control unit 16B controls the display unit 18 to change the current viewing zone 32. On the other hand, when the determination unit 42 determines that the viewing zone is not to be changed, the control unit 16B controls the display unit 18 so as to maintain the previously set viewing zone 32 or to set the reference state.
- Since the determination unit 42 performs the above determination, it is possible to suppress unnecessary changes of the viewing zone 32 and settings of the viewing zone 32 that would degrade the viewer 33's observation of the stereoscopic image.
- FIG. 11 is a block diagram illustrating a functional configuration of the image processing apparatus 10C according to the third embodiment.
- the image processing apparatus 10C according to the present embodiment includes a reception unit 12B, a calculation unit 14C, a control unit 16C, a display unit 18, a detection unit 40C, and a determination unit 42C.
- The receiving unit 12B, the calculation unit 14C, the control unit 16C, the display unit 18, the detection unit 40C, and the determination unit 42C are substantially the same as the receiving unit 12B, the calculation unit 14B, the control unit 16B, the display unit 18, the detection unit 40, and the determination unit 42 of the second embodiment, respectively, except for the following points.
- the detection unit 40C supplies a signal indicating the detection result of the position of the viewer 33 to the determination unit 42C.
- the determination unit 42C determines whether to set the viewing zone 32 (change from the current viewing zone 32). Then, the determination unit 42C supplies a signal indicating the determination result to the calculation unit 14C.
- the calculation unit 14C calculates the viewing zone information when the signal indicating the determination result received from the determination unit 42C indicates that the viewing zone is to be changed. The control unit 16C then controls the display unit 18 when it receives a signal indicating the calculation result of the viewing zone information from the calculation unit 14C.
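The distinguishing feature of this third embodiment is the reordering: determination runs before calculation, so the viewing zone information is computed only when a change is actually needed. A hypothetical sketch (not from the patent; the change criterion and names are illustrative assumptions) of that reordering:

```python
# Hypothetical sketch of the third embodiment's pipeline: the
# determination unit 42C decides first, and the calculation unit 14C
# runs only on the "change" branch, so no viewing zone calculation is
# performed when there is no viewing zone change.

calls = []  # records which stages actually ran

def determine(position):
    calls.append("determine")
    return position is not None  # assumed criterion: change only when a viewer is detected

def calculate(position):
    calls.append("calculate")
    return {"center_x": position[0]}

def third_embodiment_routine(position):
    if determine(position):         # S206 runs first
        return calculate(position)  # S209 only on the Yes branch
    return None                     # S212 path: no calculation performed

third_embodiment_routine(None)      # no viewer detected: calculation is skipped
print(calls)  # ['determine']
```

Compared with the second embodiment, where the calculation always precedes the determination, this ordering avoids computing viewing zone information that would be discarded.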
- the detection unit 40C detects the position of the viewer 33 (steps S200, S200: Yes, S202).
- Next, the determination unit 42C determines whether or not to set the viewing zone 32 (change from the current viewing zone 32). When the determination unit 42C determines that the viewing zone is to be changed (step S206: Yes), the control unit 16C outputs information indicating that there is a viewing zone change as the determination result (step S208).
- the calculation unit 14C calculates viewing zone information from the position information of the viewer 33 included in the detection result by the detection unit 40C (step S209).
- the calculation unit 14C supplies the calculated viewing zone information to the control unit 16C.
- the control unit 16C controls the display unit 18 so that the viewing zone 32 corresponding to the viewing zone information calculated by the calculation unit 14C is set (step S210). Then, this routine ends.
- When the determination unit 42C determines in step S206 that there is no viewing zone change (step S206: No), the control unit 16C outputs information indicating that there is no viewing zone change as the determination result (step S212). Then, this routine ends.
- As described above, in the present embodiment, the determination unit 42C determines whether or not to change the current viewing zone 32, and the calculation unit 14C calculates the viewing zone information only when a change is determined.
- With the image processing apparatus 10C of the present embodiment as well, it is possible to suppress unnecessary changes of the viewing zone 32 and changes of the viewing zone 32 that would degrade the viewer 33's observation of the stereoscopic image.
- FIG. 13 is a block diagram illustrating a functional configuration of the image processing apparatus 10D according to the fourth embodiment.
- the image processing apparatus 10D according to the present embodiment includes a reception unit 12D, a calculation unit 14D, a control unit 16B, a display unit 18, a detection unit 40, and a determination unit 42D.
- The reception unit 12D, the calculation unit 14D, the control unit 16B, the display unit 18, the detection unit 40, and the determination unit 42D are substantially the same as the receiving unit 12B, the calculation unit 14B, the control unit 16B, the display unit 18, the detection unit 40, and the determination unit 42 of the second embodiment, respectively, except for the following points.
- the reception unit 12D supplies the received start signal to the calculation unit 14D, the detection unit 40, and the determination unit 42D.
- When the calculation unit 14D receives the start signal from the reception unit 12D and receives a signal indicating the detection result from the detection unit 40, the calculation unit 14D calculates the viewing zone information in the same manner as in the second embodiment.
- When the determination unit 42D receives the start signal from the reception unit 12D and receives a signal indicating the detection result from the detection unit 40, the determination unit 42D performs the determination in the same manner as in the second embodiment.
- It is first determined whether or not the reception unit 12D has received a start signal (step S2000). When it is determined that the start signal has not been received (step S2000: No), this routine is terminated. When it is determined that the start signal has been received (step S2000: Yes), the reception unit 12D supplies the start signal to the calculation unit 14D, the determination unit 42D, and the detection unit 40.
- the detection unit 40 detects the position of the viewer 33 (step S2020). Then, the detection unit 40 supplies the detection result to the calculation unit 14D and the determination unit 42D.
- Next, the calculation unit 14D calculates the viewing zone information from the position information of the viewer 33 included in the detection result (step S2040).
- The calculation unit 14D supplies the calculated viewing zone information to the determination unit 42D and the control unit 16B.
- Next, the determination unit 42D determines whether or not to set the viewing zone 32 (change from the current viewing zone 32) (step S2060).
- the determination unit 42D supplies a signal indicating the determination result to the control unit 16B.
- When the determination unit 42D determines that there is a viewing zone change (step S2060: Yes), the control unit 16B outputs information indicating that there is a viewing zone change (step S2080).
- The process in step S2080 is the same as that in step S208 of the second embodiment.
- control unit 16B controls the display unit 18 so that the viewing zone 32 corresponding to the viewing zone information calculated by the calculation unit 14D is set (step S2100).
- the control of the display unit 18 by the control unit 16B is the same as in the second embodiment. Then, this routine ends.
- When the determination unit 42D determines in step S2060 that there is no viewing zone change (step S2060: No), the control unit 16B outputs information indicating that there is no viewing zone change as the determination result (step S2120). Then, this routine ends.
- As described above, in the present embodiment, when the reception unit 12D receives the start signal, the detection unit 40 detects the position of the viewer 33, the calculation unit 14D calculates the viewing zone information, and the determination unit 42D performs its determination. The viewing zone 32 can therefore be changed whenever the reception unit 12D receives the start signal.
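The fourth embodiment's wiring, where the start signal fans out from the reception unit 12D and the detection result is supplied to both the calculation unit and the determination unit, can be sketched as follows. This is an editorial illustration only; the class structure and data are hypothetical, not taken from the patent:

```python
# Hypothetical sketch of the fourth embodiment's signal wiring: the
# reception unit 12D distributes the start signal to the calculation,
# determination, and detection units, and the detection result is then
# supplied to both the calculation unit 14D and the determination unit 42D.

class ReceptionUnit:
    def __init__(self, subscribers):
        self.subscribers = subscribers

    def receive(self, start_signal):
        if start_signal:                   # S2000: Yes
            for unit in self.subscribers:  # fan the start signal out
                unit.on_start()

class Unit:
    """Minimal stand-in for the calculation / determination / detection units."""
    def __init__(self, name):
        self.name = name
        self.started = False
        self.detection = None
    def on_start(self):
        self.started = True
    def on_detection(self, result):
        self.detection = result

calc, det_unit, detector = Unit("14D"), Unit("42D"), Unit("40")
ReceptionUnit([calc, det_unit, detector]).receive(True)

viewer_position = (0.1, 1.5)           # assumed detection result from unit 40
for unit in (calc, det_unit):          # detection result goes to both units
    unit.on_detection(viewer_position)

print(all(u.started for u in (calc, det_unit, detector)))  # True
```

The broadcast structure is what lets the calculation and the determination proceed from the same start signal and the same detection result, as the description states.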
- an image processing program for executing display control processing executed by the image processing apparatuses 10, 10B, 10C, and 10D according to the first to fourth embodiments is provided by being incorporated in advance in a ROM or the like.
- the image processing program executed by the image processing apparatuses 10, 10B, 10C, and 10D according to the first to fourth embodiments may be recorded and provided, as a file in an installable or executable format, on a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, or a DVD (Digital Versatile Disk).
- the image processing program executed by the image processing apparatuses 10, 10B, 10C, and 10D according to the first to fourth embodiments may be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network. Further, the image processing program executed by the image processing apparatuses 10, 10B, 10C, and 10D according to the first to fourth embodiments may be provided or distributed via a network such as the Internet.
- the image processing program executed by the image processing apparatuses 10, 10B, 10C, and 10D has a module configuration including the above-described units (the reception unit, the calculation unit, the control unit, the detection unit, the determination unit, and the display unit). As actual hardware, a CPU (processor) reads the image processing program from the ROM and executes it, whereby the above-described units are loaded onto a main storage device, and the reception unit, the calculation unit, the control unit, the display unit, the detection unit, and the determination unit are generated on the main storage device.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Devices For Indicating Variable Information By Combining Individual Elements (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
Abstract
Description
The image processing apparatus 10 according to the first embodiment is suitable for a TV, a PC (Personal Computer), or the like on which a viewer can observe a stereoscopic image with the naked eye. A stereoscopic image is an image including a plurality of parallax images that have parallax with respect to one another.
In the second embodiment, a detection unit detects the position of the viewer 33. The second embodiment also includes a determination unit that determines whether or not to change the viewing zone.
Note that, when the image processing apparatus 10B is used for the first time, the determination unit 42 may be designed in advance to make a YES determination in step S201.
FIG. 11 is a block diagram showing the functional configuration of the image processing apparatus 10C according to the third embodiment. As shown in FIG. 11, the image processing apparatus 10C of the present embodiment includes a receiving unit 12B, a calculation unit 14C, a control unit 16C, a display unit 18, a detection unit 40C, and a determination unit 42C.
FIG. 13 is a block diagram showing the functional configuration of the image processing apparatus 10D according to the fourth embodiment. As shown in FIG. 13, the image processing apparatus 10D of the present embodiment includes a reception unit 12D, a calculation unit 14D, a control unit 16B, a display unit 18, a detection unit 40, and a determination unit 42D.
12, 12B, 12D Receiving unit
14, 14B, 14C, 14D Calculation unit
16, 16B, 16C Control unit
18 Display unit
40, 40C Detection unit
42, 42C, 42D Determination unit
Claims (6)
- An image processing apparatus comprising:
a display unit capable of displaying a stereoscopic image;
a receiving unit configured to receive a start signal for starting setting of a viewing zone in which a viewer can observe the stereoscopic image;
a calculation unit configured to calculate, when the start signal is received, viewing zone information indicating the viewing zone based on position information of the viewer; and
a control unit configured to control the display unit so that the viewing zone corresponding to the viewing zone information is set.
- The image processing apparatus according to claim 1, further comprising a detection unit configured to detect the position of the viewer,
wherein the calculation unit acquires the position information from the detection unit.
- The image processing apparatus according to claim 1 or claim 2, further comprising a determination unit configured to determine, based on the position information, whether or not to set the viewing zone,
wherein, when it is determined that the viewing zone is to be set, the control unit controls the display unit so that the viewing zone is set.
- The image processing apparatus according to claim 1 or claim 2, further comprising a determination unit configured to determine, based on the position information, whether or not to calculate the viewing zone,
wherein, when it is determined that the viewing zone is to be set, the calculation unit calculates the viewing zone information.
- The image processing apparatus according to claim 1, further comprising a storage unit configured to store the position information of the viewer,
wherein the calculation unit acquires the position information from the storage unit.
- An image processing method comprising:
receiving a start signal for starting setting of a viewing zone in which a viewer can observe a stereoscopic image displayed on a display unit;
calculating, when the start signal is received, viewing zone information indicating the viewing zone based on position information of the viewer; and
controlling the display unit so that the viewing zone corresponding to the viewing zone information is set.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/059759 WO2012144039A1 (ja) | 2011-04-20 | 2011-04-20 | 画像処理装置、および方法 |
CN2011800030088A CN102860018A (zh) | 2011-04-20 | 2011-04-20 | 图像处理装置及方法 |
JP2011544729A JP5143291B2 (ja) | 2011-04-20 | 2011-04-20 | 画像処理装置、方法、および立体画像表示装置 |
TW100133439A TWI412267B (zh) | 2011-04-20 | 2011-09-16 | Image processing apparatus and method |
US13/360,080 US20120268455A1 (en) | 2011-04-20 | 2012-01-27 | Image processing apparatus and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/059759 WO2012144039A1 (ja) | 2011-04-20 | 2011-04-20 | 画像処理装置、および方法 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/360,080 Continuation US20120268455A1 (en) | 2011-04-20 | 2012-01-27 | Image processing apparatus and method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012144039A1 true WO2012144039A1 (ja) | 2012-10-26 |
Family
ID=47020959
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/059759 WO2012144039A1 (ja) | 2011-04-20 | 2011-04-20 | 画像処理装置、および方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20120268455A1 (ja) |
JP (1) | JP5143291B2 (ja) |
CN (1) | CN102860018A (ja) |
TW (1) | TWI412267B (ja) |
WO (1) | WO2012144039A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103096109B (zh) * | 2013-01-18 | 2015-05-06 | 昆山龙腾光电有限公司 | 多视图自动立体显示器及显示方法 |
JP2014241473A (ja) * | 2013-06-11 | 2014-12-25 | 株式会社東芝 | 画像処理装置、方法、及びプログラム、並びに、立体画像表示装置 |
CN104683786B (zh) * | 2015-02-28 | 2017-06-16 | 上海玮舟微电子科技有限公司 | 裸眼3d设备的人眼跟踪方法及装置 |
US10269279B2 (en) * | 2017-03-24 | 2019-04-23 | Misapplied Sciences, Inc. | Display system and method for delivering multi-view content |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11331876A (ja) * | 1998-05-11 | 1999-11-30 | Ricoh Co Ltd | マルチ画像表示装置 |
JP2008180860A (ja) * | 2007-01-24 | 2008-08-07 | Funai Electric Co Ltd | 表示システム |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10174127A (ja) * | 1996-12-13 | 1998-06-26 | Sanyo Electric Co Ltd | 立体表示方法および立体表示装置 |
US6593957B1 (en) * | 1998-09-02 | 2003-07-15 | Massachusetts Institute Of Technology | Multiple-viewer auto-stereoscopic display systems |
US6351280B1 (en) * | 1998-11-20 | 2002-02-26 | Massachusetts Institute Of Technology | Autostereoscopic display system |
US6288704B1 (en) * | 1999-06-08 | 2001-09-11 | Vega, Vista, Inc. | Motion detection and tracking system to control navigation and display of object viewers |
JP2001195582A (ja) * | 2000-01-12 | 2001-07-19 | Mixed Reality Systems Laboratory Inc | 画像検出装置、画像検出方法、立体表示装置、表示コントローラ、立体表示システムおよびプログラム記憶媒体 |
JP3450801B2 (ja) * | 2000-05-31 | 2003-09-29 | キヤノン株式会社 | 瞳孔位置検出装置及び方法、視点位置検出装置及び方法、並びに立体画像表示システム |
JP2001356298A (ja) * | 2000-06-12 | 2001-12-26 | Denso Corp | 立体映像表示装置 |
US6931596B2 (en) * | 2001-03-05 | 2005-08-16 | Koninklijke Philips Electronics N.V. | Automatic positioning of display depending upon the viewer's location |
JP3469884B2 (ja) * | 2001-03-29 | 2003-11-25 | 三洋電機株式会社 | 立体映像表示装置 |
JP2003107392A (ja) * | 2001-09-28 | 2003-04-09 | Sanyo Electric Co Ltd | 頭部位置追従型立体映像表示装置 |
CN1607502A (zh) * | 2003-10-15 | 2005-04-20 | 胡家璋 | 可利用肢体来控制光标的光标仿真器及其仿真方法 |
JP4508740B2 (ja) * | 2004-06-22 | 2010-07-21 | キヤノン株式会社 | 画像処理装置 |
KR100652157B1 (ko) * | 2004-11-26 | 2006-11-30 | 가부시키가이샤 엔.티.티.도코모 | 화상 표시 장치, 입체 화상 표시 장치 및 입체 화상 표시시스템 |
JP2009238117A (ja) * | 2008-03-28 | 2009-10-15 | Toshiba Corp | 多視差画像生成装置および方法 |
CN101750746B (zh) * | 2008-12-05 | 2014-05-07 | 财团法人工业技术研究院 | 立体影像显示器 |
JP4691697B2 (ja) * | 2009-01-27 | 2011-06-01 | Necカシオモバイルコミュニケーションズ株式会社 | 電子機器、および、プログラム |
US20110298795A1 (en) * | 2009-02-18 | 2011-12-08 | Koninklijke Philips Electronics N.V. | Transferring of 3d viewer metadata |
US20100225734A1 (en) * | 2009-03-03 | 2010-09-09 | Horizon Semiconductors Ltd. | Stereoscopic three-dimensional interactive system and method |
JP2010217996A (ja) * | 2009-03-13 | 2010-09-30 | Omron Corp | 文字認識装置、文字認識プログラム、および文字認識方法 |
JP2011030182A (ja) * | 2009-06-29 | 2011-02-10 | Sony Corp | 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法 |
JP5521486B2 (ja) * | 2009-06-29 | 2014-06-11 | ソニー株式会社 | 立体画像データ送信装置および立体画像データ送信方法 |
JP5306275B2 (ja) * | 2010-03-31 | 2013-10-02 | 株式会社東芝 | 表示装置及び立体画像の表示方法 |
KR20120007289A (ko) * | 2010-07-14 | 2012-01-20 | 삼성전자주식회사 | 디스플레이 장치 및 그 입체감 설정 방법 |
-
2011
- 2011-04-20 JP JP2011544729A patent/JP5143291B2/ja not_active Expired - Fee Related
- 2011-04-20 CN CN2011800030088A patent/CN102860018A/zh active Pending
- 2011-04-20 WO PCT/JP2011/059759 patent/WO2012144039A1/ja active Application Filing
- 2011-09-16 TW TW100133439A patent/TWI412267B/zh not_active IP Right Cessation
-
2012
- 2012-01-27 US US13/360,080 patent/US20120268455A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11331876A (ja) * | 1998-05-11 | 1999-11-30 | Ricoh Co Ltd | マルチ画像表示装置 |
JP2008180860A (ja) * | 2007-01-24 | 2008-08-07 | Funai Electric Co Ltd | 表示システム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2012144039A1 (ja) | 2014-07-28 |
TWI412267B (zh) | 2013-10-11 |
TW201244461A (en) | 2012-11-01 |
JP5143291B2 (ja) | 2013-02-13 |
CN102860018A (zh) | 2013-01-02 |
US20120268455A1 (en) | 2012-10-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5881732B2 (ja) | 画像処理装置、立体画像表示装置、画像処理方法および画像処理プログラム | |
KR102030830B1 (ko) | 곡면형 다시점 영상 디스플레이 장치 및 그 제어 방법 | |
JP5050120B1 (ja) | 立体画像表示装置 | |
US9438893B2 (en) | Method for setting stereoscopic image data at a stereoscopic image display system by shifting data to a vertical direction | |
JP6142985B2 (ja) | 自動立体ディスプレイおよびその製造方法 | |
US8477181B2 (en) | Video processing apparatus and video processing method | |
US11095872B2 (en) | Autostereoscopic 3-dimensional display | |
US20170070728A1 (en) | Multiview image display apparatus and control method thereof | |
US9930322B2 (en) | Three-dimensional image display device | |
JP2013527932A5 (ja) | ||
CN104836998A (zh) | 显示设备及其控制方法 | |
KR20160058327A (ko) | 입체 영상 표시 장치 | |
JP5143291B2 (ja) | 画像処理装置、方法、および立体画像表示装置 | |
TWI500314B (zh) | A portrait processing device, a three-dimensional portrait display device, and a portrait processing method | |
JP5711104B2 (ja) | 画像表示装置、方法、プログラム、及び画像処理装置 | |
JP2014135590A (ja) | 画像処理装置、方法、及びプログラム、並びに、立体画像表示装置 | |
JP2004320781A (ja) | 立体映像表示装置 | |
US20140362197A1 (en) | Image processing device, image processing method, and stereoscopic image display device | |
KR20050076946A (ko) | 입체영상 표시장치 및 방법 | |
US9185399B2 (en) | Stereoscopic video receiver | |
US9151952B2 (en) | Method of driving a display panel, a display panel driving apparatus for performing the method and a display apparatus having the display panel driving apparatus | |
KR20120072178A (ko) | 무안경 멀티뷰 또는 수퍼멀티뷰 영상 구현 시스템 | |
JP2015144457A (ja) | 画像表示装置、方法、及びプログラム | |
JP5343157B2 (ja) | 立体画像表示装置、表示方法、およびテストパターン | |
KR20160081715A (ko) | 2d 및 3d 겸용 영상표시장치 및 이의 구동방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180003008.8 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2011544729 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11863734 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11863734 Country of ref document: EP Kind code of ref document: A1 |