USRE45785E1 - Virtual reality camera - Google Patents
- Publication number
- USRE45785E1 (application number US 13/846,801)
- Authority
- US
- United States
- Prior art keywords
- camera
- images
- image
- view
- fields
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
- H04N5/23238
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
- H04N23/50—Constructional details
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/189—Recording image signals; Reproducing recorded image signals
- H04N13/30—Image reproducers
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0088—Synthesising a monoscopic image signal from stereoscopic images, e.g. synthesising a panoramic or high resolution monoscopic image
Definitions
- the present invention relates to the field of photography, and more particularly to a camera that combines images based on a spatial relationship between the images.
- a panoramic image of a scene has traditionally been created by rotating a vertical slit camera about an optical center. Using this technique, film at the optical center is continuously exposed to create a wide field of view (e.g., a 360° field of view). Because of their specialized design, however, vertical slit cameras are relatively expensive. Further, because the panoramic image is captured in a continuous rotation of the camera, it is difficult to adjust the camera to account for changes in the scene, such as lighting or focal depth, as the camera is rotated.
- in a more modern technique for creating panoramic images, called “image stitching”, a scene is photographed from different camera orientations to obtain a set of discrete images. The discrete images of the scene are then transferred to a computer which executes application software to blend the discrete images into a panoramic image.
- application software may be executed to render user-specified portions of the panoramic image onto a display.
- the effect is to create a virtual environment that can be navigated by a user. Using a mouse, keyboard, headset or other input device, the user can pan about the virtual environment and zoom in or out to view objects of interest.
- One disadvantage of existing image stitching techniques is that photographed images must be transferred from the camera to the computer before they can be stitched together to create a navigable panoramic image.
- in the case of a film camera, the film must be exposed, developed, printed and digitized (e.g., using a digital scanner) to obtain a set of images that can be stitched into a panoramic image.
- Another disadvantage of existing image stitching techniques is that the orientation of the camera used to photograph each discrete image is typically unknown. This makes it more difficult to stitch the discrete images into a panoramic image because the spatial relationship between the constituent images of the panoramic image is determined, at least partly, based on the respective orientations of the camera at which they were captured.
- in order to determine the spatial relationship between a set of images that are to be stitched into a panoramic image, application software must be executed to prompt the user for assistance, hunt for common features in the images, or both.
- a method and apparatus for creating and rendering multiple-view images are disclosed. Images are received on the image sensor of a camera and digitized by sampling logic in the camera. The digitized images are combined by a programmed processor in the camera based upon a spatial relationship between the images.
- FIG. 1 is a block diagram of a virtual reality (VR) camera.
- FIG. 2 illustrates the use of a VR camera to generate a panoramic image.
- FIG. 3 illustrates the use of a VR camera to generate a composite image of a surface.
- FIG. 4 illustrates the use of a VR camera to generate an object image.
- FIG. 5 illustrates control inputs on a VR camera according to one embodiment of the present invention.
- FIG. 6 illustrates the use of a VR camera to overlay a video feed over a previously recorded scene.
- FIG. 7 is a block diagram of a stereo VR camera.
- FIG. 8 is a diagram of a method according to one embodiment of the present invention.
- FIG. 9 is a diagram of a method according to an alternate embodiment of the present invention.
- a virtual reality (VR) camera is provided to create and render panoramic images and other multiple-view images.
- the VR camera includes a sensor to detect the camera orientation at which images in a scene are captured.
- a computer within the VR camera combines the images of the scene into a panoramic image based, at least partly, on the respective camera orientations at which the images were captured.
- a display in the VR camera is used to view the panoramic image.
- the orientation of the VR camera is used to select which portion of the panoramic image is displayed so that a user can effectively pan about the panoramic image by changing the orientation of the camera.
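- As a rough sketch of this orientation-driven panning (illustrative only; the function name, window size and degrees-to-pixels mapping are assumptions, not taken from the patent), the displayed portion of a panoramic pixel map can be selected from the sensed yaw and pitch:

```python
import numpy as np

def select_view(panorama, yaw_deg, pitch_deg, view_w=320, view_h=240,
                deg_span_x=360.0, deg_span_y=90.0):
    """Return the portion of a panoramic pixel map centred on the sensed
    yaw/pitch. `panorama` is an H x W (or H x W x 3) array covering
    deg_span_x degrees horizontally and deg_span_y degrees vertically."""
    h, w = panorama.shape[:2]
    # Map the sensed angles to the pixel coordinates of the window centre.
    cx = int((yaw_deg % deg_span_x) / deg_span_x * w)
    cy = int(np.clip((pitch_deg + deg_span_y / 2) / deg_span_y, 0, 1) * (h - 1))
    # Wrap horizontally for a 360-degree panorama; clamp vertically.
    xs = np.arange(cx - view_w // 2, cx + view_w // 2) % w
    ys = np.clip(np.arange(cy - view_h // 2, cy + view_h // 2), 0, h - 1)
    return panorama[np.ix_(ys, xs)]

# Panning: a 30-degree change in yaw reported by the orientation sensor
# simply shifts the window that is copied to the display.
pano = np.zeros((600, 4800, 3), dtype=np.uint8)   # placeholder panorama
view = select_view(pano, yaw_deg=30.0, pitch_deg=0.0)
```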
- FIG. 1 is a block diagram of a VR camera 12 according to one embodiment of the present invention.
- VR camera 12 may be either a video camera or a still-image camera and includes an optic 15 , an image acquisition unit (IAU) 17 , an orientation/position sensor (O/P sensor) 21 , one or more user input panels 23 , a processor 19 , a non-volatile program code storage 24 , a memory 25 , a non-volatile data storage 26 and a display 27 .
- the optic 15 generally includes an automatically or manually focused lens and an aperture having a diameter that is adjustable to allow more or less light to pass.
- the lens projects a focused image through the aperture and onto an image sensor in the IAU 17 .
- the image sensor is typically a charge-coupled device (CCD) that is sampled by sampling logic in the IAU 17 to develop a digitized version of the image.
- the digitized image may then be read directly by the processor 19 or transferred from the IAU 17 to the memory 25 for later access by the processor 19 .
- the processor 19 fetches and executes program code stored in the code storage 24 to implement a logic unit capable of obtaining the image from the IAU 17 (which may include sampling the image sensor), receiving orientation and position information from the O/P sensor 21 , receiving input from the one or more user input panels 23 and outputting image data to the display 27 .
- the memory 25 is provided for temporary storage of program variables and image data
- the non-volatile image storage 26 is provided for more permanent storage of image data.
- the non-volatile storage 26 may include a removable storage element, such as a magnetic disk or tape, to allow panoramic and other multiple-view images created using the VR camera 12 to be stored indefinitely.
- the O/P sensor 21 is used to detect the orientation and position of the VR camera 12 .
- the orientation of the VR camera 12 (i.e., pitch, yaw and roll) may be determined relative to an arbitrary starting orientation or relative to a fixed reference (e.g., earth's gravitational and magnetic fields).
- an electronic level of the type commonly used in virtual reality headsets can be used to detect camera pitch and roll (rotation about horizontal axes), and an electronic compass can be used to detect camera yaw (rotation about a vertical axis).
- the VR camera 12 can automatically determine the spatial relationship between the discrete images and combine the images into a panoramic image, planar composite image, object image or any other type of multiple-view image.
- when a panoramic image (or other multiple-view image) is displayed on display 27, changes in camera orientation are detected via the O/P sensor 21 and interpreted by the processor 19 as requests to pan about the panoramic image.
- the VR camera's display 27 becomes, in effect, a window into a virtual environment that has been created in the VR camera 12 .
- the position of the VR camera 12 in a three-dimensional (3D) space is determined relative to an arbitrary or absolute reference. This is accomplished, for example, by including in the O/P sensor 21 accelerometers or other devices to detect translation of the VR camera 12 relative to an arbitrary starting point.
- the absolute position of the VR camera 12 may be determined by including in the O/P sensor 21 a sensor that communicates with a global positioning system (GPS). GPS is well known to those of ordinary skill in the positioning and tracking arts. As discussed below, the ability to detect translation of the VR camera 12 between image capture positions is useful for combining discrete images to produce a composite image of a surface.
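- As a side note, translation between capture positions could in principle be estimated from accelerometer samples by double integration; the patent does not spell out such a computation, so the following is only a toy sketch (the sample rate is assumed and gravity is assumed to be already removed):

```python
import numpy as np

def integrate_translation(accel_samples, dt):
    """Estimate net camera translation from 3-axis accelerometer samples
    (m/s^2) taken dt seconds apart, by double integration."""
    accel = np.asarray(accel_samples, dtype=float)
    velocity = np.cumsum(accel * dt, axis=0)      # first integration
    position = np.cumsum(velocity * dt, axis=0)   # second integration
    return position[-1]                           # net displacement (x, y, z)

# e.g. 100 samples at 100 Hz of a brief sideways push
samples = np.zeros((100, 3))
samples[:10, 0] = 0.5
print(integrate_translation(samples, dt=0.01))
```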
- the O/P sensor 21 need not include both an orientation sensor and a position sensor, depending on the application of the VR camera 12 .
- the O/P sensor 21 is an orientation sensor only.
- Other combinations of sensors may be used without departing from the scope of the present invention.
- the one or more user input panels 23 may be used to provide user control over such conventional camera functions as focus and zoom (and, at least in the case of a still camera, aperture size, shutter speed, etc.). As discussed below, the input panels 23 may also be used to receive user requests to pan about or zoom in and out on a panoramic image or other multiple-view image. Further, the input panels 23 may be used to receive user requests to set certain image capture parameters, including parameters that indicate the type of composite image to be produced, whether certain features are enabled, and so forth. It will be appreciated that focus and other camera settings may be adjusted using a traditional lens dial instead of an input panel 23 . Similarly, other types of user input devices and techniques, including, but not limited to, user rotation and translation of the VR camera 12 , may be used to receive requests to pan about or zoom in or out on an image.
- the display 27 is typically a liquid crystal display (LCD) but may be any type of display that can be included in the VR camera 12 , including a cathode-ray tube display. Further, as discussed below, the display 27 may be a stereo display designed to present left and right stereo images to the left and right eyes, respectively, of the user.
- FIG. 2 illustrates use of the VR camera 12 of FIG. 1 to generate a panoramic image 41 .
- a panoramic image is an image that represents a wide-angle view of a scene and is one of a class of images referred to herein as multiple-view images.
- a multiple-view image is an image or collection of images that is displayed in user-selected portions.
- a set of discrete images 35 is first obtained by capturing images of an environment 31 at different camera orientations.
- in the case of a still camera, capturing an image means taking a photograph.
- in the case of a video camera, capturing an image refers to generating one or more video frames of each of the discrete images.
- the environment 31 is depicted in FIG. 2 as being an enclosed space but this is not necessary.
- the camera is oriented such that each captured image overlaps the preceding captured image. This is indicated by the overlapped regions 33 .
- the orientation of the VR camera is detected via the O/P sensor (e.g., element 21 of FIG. 1 ) and recorded for each of the discrete images 35 .
- the orientation sensor is monitored by the processor (e.g., element 19 of FIG. 1 ) to determine when the next photograph should be snapped. That is, the VR camera assists the photographer in determining the camera orientation at which each new discrete image 35 is to be snapped by signaling the photographer (e.g., by turning on a beeper or a light) when the region of overlap 33 is within a target size.
- the VR camera may be programmed to determine when the region of overlap 33 is within a target size not only for camera yaw, but also for camera pitch or roll.
- the VR camera may be user-configured (e.g., via a control panel 23 input) to automatically snap a photograph whenever it detects sufficient change in orientation.
- the difference between camera orientations at which successive photographs are acquired may be input by the user or automatically determined by the VR camera based upon the camera's angle of view and the distance between the camera and subject.
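- One simple way such an orientation step could be derived is from the camera's horizontal angle of view and a desired overlap fraction; this is an illustrative rule of thumb, not a formula given in the patent:

```python
def yaw_step_degrees(horizontal_fov_deg, overlap_fraction=0.25):
    """Yaw change between successive shots that leaves overlap_fraction
    of each frame overlapping its neighbour."""
    return horizontal_fov_deg * (1.0 - overlap_fraction)

# A 50-degree lens with 25% overlap -> signal the photographer (or snap
# automatically) every 37.5 degrees of yaw.
print(yaw_step_degrees(50.0))   # 37.5
```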
- the orientation sensor may be used to control the rate at which video frames are generated so that frames are generated only when the O/P sensor indicates sufficient change in orientation (much like the automatic image acquisition mode of the still camera discussed above), or video frames may be generated at standard rates with redundant frames being combined or discarded during the stitching process.
- the overlapping discrete images 35 can be combined based on their spatial relationship to form a panoramic image 41 .
- although the discrete images 35 are shown as being a single row of images (indicating that the images were all captured at approximately the same pitch angle), additional rows of images at higher or lower pitch angles could also have been obtained.
- because the VR camera will typically be hand held (although a tripod may be used), a certain amount of angular error is incurred when the scene is recorded. This angular error is indicated in FIG. 2 by the slightly different pitch and roll orientations of the discrete images 35 relative to one another, and must be accounted for when the images are combined to form the panoramic image 41.
- program code is executed in the VR camera to combine the discrete images 35 into the panoramic image 41 . This is accomplished by determining a spatial relationship between the discrete images 35 based on the camera orientation information recorded for each image 35 , or based on common features in the overlapping regions of the images 35 , or based on a combination of the two techniques.
- One technique for determining a spatial relationship between images based on common features in the images is to “cross-correlate” the images.
- the images can be cross-correlated by “sliding” one image over the other image one step (e.g., one pixel) at a time and generating a cross-correlation value at each sliding step.
- Each cross-correlation value is generated by performing a combination of arithmetic operations on the pixel values within the overlapping regions of the two images.
- the offset corresponding to the sliding step that yields the highest cross-correlation value is taken to be the offset between the two images.
- Cross-correlation can be applied to finding offsets in more than one direction or to determine other unknown transformational parameters, such as rotation or scaling. Techniques other than cross-correlation, such as pattern matching, can also be used to find unknown image offsets and other transformational parameters.
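- A minimal sketch of the sliding cross-correlation described above, assuming two grayscale images whose overlap lies along a shared vertical edge (the normalisation and the synthetic test data are illustrative, not from the patent):

```python
import numpy as np

def best_horizontal_offset(left, right, max_shift=64):
    """Slide `right` across `left` one pixel at a time and return the
    overlap width whose pixels correlate most strongly."""
    best_offset, best_score = 0, -np.inf
    for shift in range(1, max_shift + 1):
        a = left[:, -shift:].astype(float)    # right edge of the first image
        b = right[:, :shift].astype(float)    # left edge of the second image
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        score = np.mean(a * b)                # normalised cross-correlation
        if score > best_score:
            best_offset, best_score = shift, score
    return best_offset

# Synthetic check: the second image repeats the last 40 columns of the first.
img1 = np.random.rand(100, 200)
img2 = np.hstack([img1[:, -40:], np.random.rand(100, 160)])
print(best_horizontal_offset(img1, img2))     # expected: 40
```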
- the images 35 are mapped onto respective regions of a smooth surface such as a sphere or cylinder.
- the regions of overlap 33 are blended in the surface mapping.
- pixels in the discrete images 35 must be repositioned relative to one another in order to produce a two-dimensional pixel-map of the panoramic image 41 .
- stitching the discrete images 35 together to generate a panoramic image 41 typically involves mathematical transformation of pixels to produce a panoramic image 41 that can be rendered without distortion.
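- As an example of such a transformation, a cylindrical mapping repositions each pixel according to the camera's focal length so that images taken at different yaw angles line up after a simple horizontal shift; the sketch below shows the standard forward mapping (the focal length and principal point values are assumed):

```python
import numpy as np

def to_cylindrical(x, y, focal_px, cx, cy):
    """Map a pixel (x, y) of a pinhole image to cylindrical coordinates
    (theta, h): the angle around the cylinder and the height on it."""
    theta = np.arctan2(x - cx, focal_px)
    h = (y - cy) / np.hypot(x - cx, focal_px)
    return theta, h

# A pixel 100 px right of centre, with a 500 px focal length:
print(to_cylindrical(600, 240, focal_px=500, cx=500, cy=240))
```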
- FIG. 3 illustrates the use of the VR camera 12 to generate a composite image of a surface 55 that is too detailed to be adequately represented in a single photograph.
- examples of such surfaces include a white-board having notes on it, a painting, an inscribed monument (e.g., the Viet Nam War Memorial), and so forth.
- multiple discrete images 57 of the surface 55 are obtained by translating the VR camera 12 between a series of positions and capturing a portion of the surface 55 at each position.
- the position of the VR camera 12 is obtained from the position sensing portion of the O/P sensor (element 21 of FIG. 1 ) and recorded for each discrete image 57 . This allows the spatial relationship between the discrete images 57 to be determined no matter the order in which the images 57 are obtained. Consequently, the VR camera is able to generate an accurate composite image 59 of the complete surface 55 regardless of the order in which the discrete images 57 are captured.
- the position sensor can be used to signal the user when the VR camera 12 has been sufficiently translated to take a new photograph.
- the VR camera may be user-configured to automatically snap photographs as the VR camera 12 is swept across the surface 55 .
- the position sensor can be used to control when each new video frame is generated, or video frames may be generated at the standard rate and then blended or discarded based on position information associated with each.
- program code can be executed to combine the images into a composite image 59 based on the position information recorded for each discrete image 57 , or based on common features in overlapping regions of the discrete images 57 , or both.
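- A simplified sketch of position-based composition is shown below; it merely pastes tiles at offsets derived from the recorded camera positions and skips the blending of overlapping regions (the scale factor and tile sizes are assumptions):

```python
import numpy as np

def compose_planar(tiles, positions_m, metres_per_pixel):
    """Paste equally sized image tiles into one planar composite using the
    (x, y) camera translation, in metres, recorded for each tile."""
    xs = [int(round(x / metres_per_pixel)) for x, _ in positions_m]
    ys = [int(round(y / metres_per_pixel)) for _, y in positions_m]
    xs = [x - min(xs) for x in xs]            # shift so offsets start at zero
    ys = [y - min(ys) for y in ys]
    th, tw = tiles[0].shape[:2]
    canvas = np.zeros((max(ys) + th, max(xs) + tw) + tiles[0].shape[2:],
                      dtype=tiles[0].dtype)
    for tile, x, y in zip(tiles, xs, ys):
        canvas[y:y + th, x:x + tw] = tile
    return canvas

# Two 100x100 tiles captured 0.05 m apart horizontally, at 1 mm per pixel:
a, b = np.ones((100, 100)), np.full((100, 100), 2.0)
print(compose_planar([a, b], [(0.0, 0.0), (0.05, 0.0)], 0.001).shape)  # (100, 150)
```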
- the user may view different portions of the composite image 59 on the VR camera's display by changing the orientation of the VR camera 12 or by using controls on a user input panel. By zooming in on a selected portion of the image, text on a white-board, artwork detail, inscriptions on a monument, etc. may be easily viewed.
- the VR camera 12 provides a simple and powerful way to digitize and render high resolution surfaces with a lower resolution camera. Composite images of such surfaces are referred to herein as “planar composite images”, to distinguish them from panoramic images.
- FIG. 4 illustrates yet another application of the VR camera.
- the VR camera is used to combine images into an object image 67 .
- An object image is a set of discrete images that are spatially related to one another, but which have not been stitched together to form a composite image.
- the combination of images into an object image is accomplished by providing information indicating the location of the discrete images relative to one another and not by creating a separate composite image.
- images of an object 61 are captured from surrounding points of view 63 .
- the VR camera may also be moved over or under the object 61 , or may be raised or tilted to capture images of the object 61 at different heights.
- the first floor of a multiple-story building could be captured in one sequence of video frames (or photographs), the second floor in a second sequence of video frames, and so forth. If the VR camera is maintained at an approximately fixed distance from the object 61 , the orientation of the VR camera alone may be recorded to establish the spatial relationship between the discrete images 65 .
- if the object is filmed (or photographed) from positions that are not equidistant to the object 61, it may be necessary to record both the position and orientation of the VR camera for each discrete image 65 in order to produce a coherent object image 67.
- two or more discrete images 65 of object 61 can be combined based upon the spatial relationship between them to form an object image 67 .
- combining the discrete images 65 to form an object image 67 typically does not involve stitching the discrete images 65 and is instead accomplished by associating with each of the discrete images 65 information that indicates the image's spatial location in the object image 67 relative to other images in the object image 67 . This can be accomplished, for example, by generating a data structure having one member for each discrete image 65 and which indicates neighboring images and their angular or positional proximity.
- once the object image 67 is created, the user can pan through the images 65 by changing the orientation of the camera. Incremental changes in orientation can be used to select an image in the object image 67 that neighbors a previously displayed image. To the user, rendering of the object image 67 in this manner provides a sense of moving around, over and under the object of interest.
- the relative spatial location of each image in the object image 67 is provided by creating a data structure containing the camera orientation information recorded for each discrete image 65.
- to view a given image, the user orients the VR camera in the direction that was used to capture that image.
- the VR camera's processor detects the orientation via the orientation sensor, and then searches the data structure to identify the discrete image 65 having a recorded orientation most nearly matching the input orientation.
- the identified image 65 is then displayed on the VR camera's display.
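- A hypothetical version of such a data structure and orientation lookup might look like the following; the field names, angles and file names are invented for illustration, since the patent does not define the structure's layout:

```python
# One entry per discrete image, recording the yaw/pitch at which it was captured.
object_image = [
    {"frame": f"view_{i:02d}.jpg", "yaw": float(yaw), "pitch": 0.0}
    for i, yaw in enumerate(range(0, 360, 30))
]

def nearest_view(entries, yaw, pitch):
    """Return the entry whose recorded orientation most nearly matches the
    camera's current orientation."""
    def distance(entry):
        dyaw = (entry["yaw"] - yaw + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
        return dyaw ** 2 + (entry["pitch"] - pitch) ** 2
    return min(entries, key=distance)

# Camera currently pointed at yaw 97 degrees -> the closest stored view is 90.
print(nearest_view(object_image, yaw=97.0, pitch=2.0)["frame"])   # view_03.jpg
```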
- FIG. 5 depicts a VR camera 12 that is equipped with a number of control buttons that are included in user input panels 23 a and 23 b.
- the buttons provided in user-input panel 23 a vary depending on whether VR camera 12 is a video camera or a still-image camera.
- in the case of a still-image camera, panel 23 a may include shutter speed and aperture control buttons, among others, to manage the quality of the photographed image.
- in the case of a video camera, user input panel 23 a may include, for example, zoom and focus controls.
- User input panel 23 a may also include mode control buttons to allow a user to select certain modes and options associated with creating and rendering virtual reality images.
- mode control buttons may be used to select a panoramic image capture mode, planar composite image capture mode or object image capture mode.
- any feature of the VR camera that can be selected, enabled or disabled may be controlled using the mode control buttons.
- view control buttons Right/Left, Up/Down and Zoom are provided in user input panel 23 b to allow the user to select which portion of a panoramic image, planar composite image, object image or other multiple-view image is presented on display 27 .
- when the user presses the Right button, for example, view control logic in the camera detects the input and causes the displayed view of a composite image or object image to pan right.
- when the user presses the Zoom button, the view control logic causes the displayed image to be magnified.
- the view control logic may be implemented by a programmed processor (e.g., element 19 of FIG. 1 ), or by dedicated hardware. In one embodiment of the present invention, the view control logic will respond either to user input via panel 23 b or to changes in camera orientation.
- the camera may be configured such that in one mode, view control is achieved by changing the VR camera orientation, and in another mode, view control is achieved via the user input panel 23 b. In both cases, the user is provided with alternate ways to select a view of a multiple-view image.
- FIG. 6 illustrates yet another application of the VR camera 12 of the present invention.
- a video signal captured via the IAU (element 17 of FIG. 1 ) is superimposed on a previously recorded scene using a chroma-key color replacement technique.
- an individual 83 standing in front of a blue background 82 may be recorded using the VR camera 12 to generate a live video signal.
- Program code in the VR camera 12 may then be executed to implement an overlay function that replaces pixels in a displayed scene with non-blue pixels from the live video. The effect is to place the subject 83 of the live video in the previously generated scene.
- the user may pan about a panoramic image on display 27 to locate a portion of the image into which the live video is to be inserted, then snap the overlaid subject of the video image into the scene.
- the later received image is made part of the earlier recorded panoramic image (or other multiple-view image) and the combined images can be permanently stored as a single recorded video or still image.
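- A bare-bones sketch of this colour-replacement overlay is given below; the key colour, RGB channel ordering and tolerance are assumptions for the example:

```python
import numpy as np

def chroma_key_overlay(background, live, key=(0, 0, 255), tolerance=60):
    """Replace pixels of `background` with the non-key (non-blue) pixels of
    `live`; both are H x W x 3 uint8 arrays of the same size."""
    diff = live.astype(int) - np.array(key, dtype=int)
    is_key = np.abs(diff).sum(axis=2) < tolerance    # close to the key colour
    out = background.copy()
    out[~is_key] = live[~is_key]                     # keep only the subject
    return out

# The subject recorded against the blue background ends up in the scene.
scene = np.zeros((480, 640, 3), dtype=np.uint8)              # recorded scene
live = np.full((480, 640, 3), (0, 0, 255), dtype=np.uint8)   # all key colour
live[200:280, 300:340] = (180, 140, 120)                     # the "subject"
composite = chroma_key_overlay(scene, live)
```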
- FIG. 7 is a block diagram of a VR camera 112 that is used to receive and process stereo images.
- the optic 115 includes both left and right channels ( 108 , 107 ) for receiving respective left and right images.
- the left and right images are of the same subject but from spatially differentiated viewpoints; in this way a 3D view of the subject is captured.
- the left and right images 108 and 107 are projected onto opposing halves of an image sensor in the IAU 117 where they are sampled by the processor 19 and stored in memory 25 .
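- Assuming a simple side-by-side layout on the sensor (the actual optical arrangement may differ), splitting a sampled frame into its left and right images is straightforward:

```python
import numpy as np

def split_stereo_frame(sensor_frame):
    """Split one sampled sensor frame into the left and right images that
    were projected onto its opposing halves."""
    h, w = sensor_frame.shape[:2]
    return sensor_frame[:, : w // 2], sensor_frame[:, w // 2 :]

frame = np.zeros((480, 1280, 3), dtype=np.uint8)      # one sampled sensor frame
left_image, right_image = split_stereo_frame(frame)   # each 480 x 640
```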
- multiple image sensors and associated sampling circuitry may be provided in the IAU 117 .
- the left and right images are associated with orientation/position information obtained from the O/P sensor 21 in the manner described above, and stored in the memory 25 .
- the processor may execute program code in the non-volatile code storage 24 to combine the left images into a left composite image and the right images into a right composite image.
- the processor combines the right and left images into respective right and left object images.
- a stereo display 127 is provided to allow a 3D view of a scene to be displayed.
- a polarized LCD display that relies on the different viewing angles of the left and right eyes of an observer may be used.
- the different viewing angles of the observer's left and right eyes cause different images to be perceived by the left and right eyes. Consequently, based on an orientation/position of the camera, or a view select input from the user, a selected portion of the left composite image (or object image) is presented to the left eye and a selected portion of the right composite image (or object image) is presented to the right eye.
- live stereo video received in the IAU 117 of the stereo VR camera 112 may be overlaid on a previously generated composite image or object image.
- the left and right video components of the live stereo video may be superimposed over the left and right composite or object images, respectively. Consequently, the user may view live video subjects in 3D as though they were present in the previously recorded 3D scene.
- a stereo photograph may also be overlaid on an earlier recorded composite image or object image.
- FIG. 8 is a diagram of a method according to one embodiment of the present invention.
- a set of discrete images are received in the camera.
- the images are digitized at step 143 .
- the digitized images are combined to produce a multiple-view image at step 145.
- at step 147, at least a portion of the multiple-view image is displayed on a display of the camera.
- the steps of receiving ( 141 ), digitizing ( 143 ) and combining ( 145 ) may be performed on an image by image basis so that each image is received, digitized and combined with one or more previously received and digitized images before a next image is received and digitized.
- a method of generating a multiple-view image on a discrete-image-by-discrete-image basis is shown in FIG. 9.
- at step 151, a discrete image i is received, where i ranges from 0 to N.
- image i is digitized, and i is incremented at step 157 . If i is determined to be less than or equal to one at step 159 , execution loops back to step 151 to receive the next discrete image i . If i is greater than one, then at step 161 digitized image i is combined with one or more previously digitized images based on a spatial relationship between the digitized image i and the one or more previously digitized images to produce a multiple-view image.
- if the final image has been received, the method is exited. It will be appreciated that the determination as to whether a final image has been received may be made in a number of ways, including: detecting that a predetermined number of images have been received, digitized and combined; or receiving a signal from the user or an internally generated signal indicating that a desired or threshold number of images have been received, digitized and combined into the multiple-view image. Also, according to one embodiment of the present invention, the user may select a portion of the multiple-view image for viewing at any time after an initial combining step 161 has been performed.
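- The image-by-image flow of FIG. 9 can be summarised in a small skeleton, with placeholder callables standing in for the camera hardware, the combining step and the stop condition (none of these names come from the patent):

```python
def build_multiple_view_image(acquire_image, combine, is_final):
    """Receive, digitize and combine one discrete image at a time, folding
    each new image into the multiple-view image before the next is acquired."""
    multiple_view = None
    i = 0
    while True:
        digitized = acquire_image(i)          # receive + digitize image i
        if multiple_view is None:
            multiple_view = digitized         # first image: nothing to combine yet
        else:
            multiple_view = combine(multiple_view, digitized)
        i += 1
        if is_final(i):
            return multiple_view

# e.g. concatenating ten dummy "images" represented as lists:
result = build_multiple_view_image(lambda i: [i], lambda a, b: a + b,
                                   lambda i: i >= 10)
print(result)   # [0, 1, ..., 9]
```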
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/846,801 USRE45785E1 (en) | 1997-09-26 | 2013-03-18 | Virtual reality camera |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/938,366 US6552744B2 (en) | 1997-09-26 | 1997-09-26 | Virtual reality camera |
US11/113,455 USRE43700E1 (en) | 1997-09-26 | 2005-04-22 | Virtual reality camera |
US201213562220A | 2012-07-30 | 2012-07-30 | |
US13/846,801 USRE45785E1 (en) | 1997-09-26 | 2013-03-18 | Virtual reality camera |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/938,366 Reissue US6552744B2 (en) | 1997-09-26 | 1997-09-26 | Virtual reality camera |
Publications (1)
Publication Number | Publication Date |
---|---|
USRE45785E1 true USRE45785E1 (en) | 2015-10-27 |
Family
ID=25471311
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/938,366 Ceased US6552744B2 (en) | 1997-09-26 | 1997-09-26 | Virtual reality camera |
US11/113,455 Expired - Lifetime USRE43700E1 (en) | 1997-09-26 | 2005-04-22 | Virtual reality camera |
US13/846,801 Expired - Lifetime USRE45785E1 (en) | 1997-09-26 | 2013-03-18 | Virtual reality camera |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/938,366 Ceased US6552744B2 (en) | 1997-09-26 | 1997-09-26 | Virtual reality camera |
US11/113,455 Expired - Lifetime USRE43700E1 (en) | 1997-09-26 | 2005-04-22 | Virtual reality camera |
Country Status (6)
Country | Link |
---|---|
US (3) | US6552744B2 (en) |
EP (1) | EP1023803A1 (en) |
JP (1) | JP2002503893A (en) |
AU (1) | AU8174898A (en) |
CA (1) | CA2296062A1 (en) |
WO (1) | WO1999017543A1 (en) |
Families Citing this family (168)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4332231B2 (en) * | 1997-04-21 | 2009-09-16 | ソニー株式会社 | Imaging device controller and imaging system |
US6466701B1 (en) * | 1997-09-10 | 2002-10-15 | Ricoh Company, Ltd. | System and method for displaying an image indicating a positional relation between partially overlapping images |
US6552744B2 (en) | 1997-09-26 | 2003-04-22 | Roxio, Inc. | Virtual reality camera |
JPH11232440A (en) * | 1998-02-18 | 1999-08-27 | Casio Comput Co Ltd | Image processing device |
US6486908B1 (en) * | 1998-05-27 | 2002-11-26 | Industrial Technology Research Institute | Image-based method and system for building spherical panoramas |
JP3485543B2 (en) * | 1998-06-22 | 2004-01-13 | 富士写真フイルム株式会社 | Imaging device and method |
AU759891B2 (en) * | 1999-01-06 | 2003-05-01 | Optware Corporation | Three-dimensional image sensing device and method, three-dimensional image displaying device and method, and three-dimensional image position changing device and method |
US20030133009A1 (en) * | 1999-04-09 | 2003-07-17 | Carl S Brown | System and method for detecting with high resolution a large, high content field |
CA2309459A1 (en) * | 1999-06-10 | 2000-12-10 | International Business Machines Corporation | System for personalized field of view in a broadcast environment |
JP4169462B2 (en) * | 1999-08-26 | 2008-10-22 | 株式会社リコー | Image processing method and apparatus, digital camera, image processing system, and recording medium recording image processing program |
US7477284B2 (en) * | 1999-09-16 | 2009-01-13 | Yissum Research Development Company Of The Hebrew University Of Jerusalem | System and method for capturing and viewing stereoscopic panoramic images |
EP1102211A3 (en) * | 1999-11-19 | 2006-09-13 | Matsushita Electric Industrial Co., Ltd. | Image processor, method of providing image processing services and order processing method |
US7064783B2 (en) * | 1999-12-31 | 2006-06-20 | Stmicroelectronics, Inc. | Still picture format for subsequent picture stitching for forming a panoramic image |
US7187412B1 (en) * | 2000-01-18 | 2007-03-06 | Hewlett-Packard Development Company, L.P. | Pointing device for digital camera display |
JP3849385B2 (en) * | 2000-02-03 | 2006-11-22 | コニカミノルタホールディングス株式会社 | Image processing apparatus, image processing method, and computer-readable recording medium recording image processing program |
US20010026684A1 (en) * | 2000-02-03 | 2001-10-04 | Alst Technical Excellence Center | Aid for panoramic image creation |
US6978051B2 (en) * | 2000-03-06 | 2005-12-20 | Sony Corporation | System and method for capturing adjacent images by utilizing a panorama mode |
US7133068B2 (en) * | 2000-03-06 | 2006-11-07 | Sony Corporation | System and method for creating still images by utilizing a video camera device |
US6930703B1 (en) * | 2000-04-29 | 2005-08-16 | Hewlett-Packard Development Company, L.P. | Method and apparatus for automatically capturing a plurality of images during a pan |
US7092014B1 (en) * | 2000-06-28 | 2006-08-15 | Microsoft Corporation | Scene capturing and view rendering based on a longitudinally aligned camera array |
US7313289B2 (en) * | 2000-08-30 | 2007-12-25 | Ricoh Company, Ltd. | Image processing method and apparatus and computer-readable storage medium using improved distortion correction |
US6895126B2 (en) * | 2000-10-06 | 2005-05-17 | Enrico Di Bernardo | System and method for creating, storing, and utilizing composite images of a geographic location |
US6937272B1 (en) * | 2000-11-08 | 2005-08-30 | Xerox Corporation | Display device for a camera |
IL139995A (en) | 2000-11-29 | 2007-07-24 | Rvc Llc | System and method for spherical stereoscopic photographing |
US6975352B2 (en) * | 2000-12-18 | 2005-12-13 | Xerox Corporation | Apparatus and method for capturing a composite digital image with regions of varied focus and magnification |
US7101045B2 (en) * | 2001-03-23 | 2006-09-05 | Panavision Inc. | Automatic pan and tilt compensation system for a camera support structure |
US6820980B1 (en) * | 2001-03-23 | 2004-11-23 | Panavision, Inc. | Automatic pan and tilt compensation system for a camera support structure |
JP3531003B2 (en) * | 2001-03-30 | 2004-05-24 | ミノルタ株式会社 | Image processing apparatus, recording medium on which image processing program is recorded, and image reproducing apparatus |
US20040201748A1 (en) * | 2001-09-12 | 2004-10-14 | Tim Goldstein | Extended image digital photography |
US20030076408A1 (en) * | 2001-10-18 | 2003-04-24 | Nokia Corporation | Method and handheld device for obtaining an image of an object by combining a plurality of images |
US7064779B1 (en) * | 2001-10-23 | 2006-06-20 | Ess Technology, Inc. | Imaging system combining multiple still images for higher resolution image output |
US7224387B2 (en) * | 2002-01-09 | 2007-05-29 | Hewlett-Packard Development Company, L.P. | Method and apparatus for correcting camera tilt distortion in panoramic images |
US6759979B2 (en) * | 2002-01-22 | 2004-07-06 | E-Businesscontrols Corp. | GPS-enhanced system and method for automatically capturing and co-registering virtual models of a site |
EP1497791B1 (en) * | 2002-04-23 | 2010-09-01 | Palm, Inc. | Production of a complete image by scanning partial areas of a pattern |
KR100965947B1 (en) * | 2002-04-25 | 2010-06-24 | 소니 주식회사 | Image processing apparatus and image processing method and image processing program recording medium |
US20040001145A1 (en) * | 2002-06-27 | 2004-01-01 | Abbate Jeffrey A. | Method and apparatus for multifield image generation and processing |
JP3744002B2 (en) * | 2002-10-04 | 2006-02-08 | ソニー株式会社 | Display device, imaging device, and imaging / display system |
JP4048907B2 (en) * | 2002-10-15 | 2008-02-20 | セイコーエプソン株式会社 | Panorama composition of multiple image data |
EP1553521A4 (en) * | 2002-10-15 | 2006-08-02 | Seiko Epson Corp | PANORAMA SYNTHESIS PROCESSING OF A PLURALITY OF IMAGE DATA |
JP4178987B2 (en) * | 2003-02-14 | 2008-11-12 | 株式会社ニコン | Electronic camera |
JP4474106B2 (en) * | 2003-02-27 | 2010-06-02 | キヤノン株式会社 | Image processing apparatus, image processing method, recording medium, and program |
US7495694B2 (en) * | 2004-07-28 | 2009-02-24 | Microsoft Corp. | Omni-directional camera with calibration and up look angle improvements |
US7236199B2 (en) * | 2003-07-07 | 2007-06-26 | Toshikazu Hori | Multi-tap camera |
US8294712B2 (en) * | 2003-09-19 | 2012-10-23 | The Boeing Company | Scalable method for rapidly detecting potential ground vehicle under cover using visualization of total occlusion footprint in point cloud population |
US20050063608A1 (en) * | 2003-09-24 | 2005-03-24 | Ian Clarke | System and method for creating a panorama image from a plurality of source images |
US20050083417A1 (en) * | 2003-10-21 | 2005-04-21 | Battles Amy E. | System and method for providing image orientation information of a video clip |
US7746375B2 (en) * | 2003-10-28 | 2010-06-29 | Koninklijke Philips Electronics N.V. | Digital camera with panorama or mosaic functionality |
CN100440944C (en) * | 2003-11-11 | 2008-12-03 | 精工爱普生株式会社 | Image processing device and image processing method |
JP4614653B2 (en) * | 2003-12-12 | 2011-01-19 | ソニー株式会社 | Monitoring device |
KR20070007059A (en) | 2003-12-26 | 2007-01-12 | 미코이 코포레이션 | Multi-Dimensional Imaging Devices, Systems, and Methods |
US20050168589A1 (en) * | 2004-01-30 | 2005-08-04 | D. Amnon Silverstein | Method and system for processing an image with an image-capturing device |
DE102004004806B4 (en) * | 2004-01-30 | 2012-04-19 | Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg | Electronic motion picture camera |
GB2410639A (en) * | 2004-01-30 | 2005-08-03 | Hewlett Packard Development Co | Viewfinder alteration for panoramic imaging |
US7436438B2 (en) * | 2004-03-16 | 2008-10-14 | Creative Technology Ltd. | Digital still camera and method of forming a panoramic image |
US10721405B2 (en) | 2004-03-25 | 2020-07-21 | Clear Imaging Research, Llc | Method and apparatus for implementing a digital graduated filter for an imaging apparatus |
US8331723B2 (en) | 2004-03-25 | 2012-12-11 | Ozluturk Fatih M | Method and apparatus to correct digital image blur due to motion of subject or imaging device |
US9826159B2 (en) | 2004-03-25 | 2017-11-21 | Clear Imaging Research, Llc | Method and apparatus for implementing a digital graduated filter for an imaging apparatus |
US20050237631A1 (en) * | 2004-04-16 | 2005-10-27 | Hiroyuki Shioya | Image pickup apparatus and image pickup method |
JP2005311789A (en) * | 2004-04-22 | 2005-11-04 | Fuji Photo Film Co Ltd | Digital camera |
JP4293053B2 (en) * | 2004-05-19 | 2009-07-08 | ソニー株式会社 | Imaging apparatus and method |
US20070182812A1 (en) * | 2004-05-19 | 2007-08-09 | Ritchey Kurtis J | Panoramic image-based virtual reality/telepresence audio-visual system and method |
JP3971783B2 (en) * | 2004-07-28 | 2007-09-05 | 松下電器産業株式会社 | Panorama image composition method and object detection method, panorama image composition device, imaging device, object detection device, and panorama image composition program |
KR100683850B1 (en) * | 2004-08-20 | 2007-02-16 | 삼성전자주식회사 | Recording device and shooting method that supports panorama shooting function |
US7646400B2 (en) * | 2005-02-11 | 2010-01-12 | Creative Technology Ltd | Method and apparatus for forming a panoramic image |
US7860301B2 (en) * | 2005-02-11 | 2010-12-28 | Macdonald Dettwiler And Associates Inc. | 3D imaging system |
US20060197851A1 (en) * | 2005-03-07 | 2006-09-07 | Paul Vlahos | Positioning a subject with respect to a background scene in a digital camera |
JP2006309626A (en) * | 2005-04-28 | 2006-11-09 | Ntt Docomo Inc | Arbitrary viewpoint image generation device |
US7872665B2 (en) | 2005-05-13 | 2011-01-18 | Micoy Corporation | Image capture and processing |
US20070081081A1 (en) * | 2005-10-07 | 2007-04-12 | Cheng Brett A | Automated multi-frame image capture for panorama stitching using motion sensor |
US20080007617A1 (en) * | 2006-05-11 | 2008-01-10 | Ritchey Kurtis J | Volumetric panoramic sensor systems |
US10614513B2 (en) * | 2006-07-07 | 2020-04-07 | Joseph R. Dollens | Method and system for managing and displaying product images with progressive resolution display |
SE532236C2 (en) * | 2006-07-19 | 2009-11-17 | Scalado Ab | Method in connection with taking digital pictures |
US20080030429A1 (en) * | 2006-08-07 | 2008-02-07 | International Business Machines Corporation | System and method of enhanced virtual reality |
US20080077597A1 (en) * | 2006-08-24 | 2008-03-27 | Lance Butler | Systems and methods for photograph mapping |
JP4725526B2 (en) * | 2006-08-28 | 2011-07-13 | ソニー株式会社 | Information processing apparatus, imaging apparatus, information processing system, apparatus control method, and program |
US20080170222A1 (en) * | 2007-01-16 | 2008-07-17 | Strege Timothy A | Methods and systems for determining vehicle wheel alignment |
US20090086021A1 (en) * | 2007-09-27 | 2009-04-02 | Rockwell Automation Technologies, Inc. | Dynamically generating real-time visualizations in industrial automation environment as a function of contect and state information |
JP4851412B2 (en) | 2007-09-27 | 2012-01-11 | 富士フイルム株式会社 | Image display apparatus, image display method, and image display program |
US20090094188A1 (en) * | 2007-10-03 | 2009-04-09 | Edward Covannon | Facilitating identification of an object recorded in digital content records |
US8582805B2 (en) * | 2007-11-05 | 2013-11-12 | California Institute Of Technology | Synthetic foveal imaging technology |
US7961224B2 (en) * | 2008-01-25 | 2011-06-14 | Peter N. Cheimets | Photon counting imaging system |
US8174561B2 (en) * | 2008-03-14 | 2012-05-08 | Sony Ericsson Mobile Communications Ab | Device, method and program for creating and displaying composite images generated from images related by capture position |
US8497905B2 (en) * | 2008-04-11 | 2013-07-30 | nearmap australia pty ltd. | Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features |
US8675068B2 (en) * | 2008-04-11 | 2014-03-18 | Nearmap Australia Pty Ltd | Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features |
US8326022B2 (en) * | 2008-05-22 | 2012-12-04 | Matrix Electronic Measuring Properties, Llc | Stereoscopic measurement system and method |
US9449378B2 (en) | 2008-05-22 | 2016-09-20 | Matrix Electronic Measuring Properties, Llc | System and method for processing stereoscopic vehicle information |
US8249332B2 (en) | 2008-05-22 | 2012-08-21 | Matrix Electronic Measuring Properties Llc | Stereoscopic measurement system and method |
RU2452992C1 (en) * | 2008-05-22 | 2012-06-10 | МАТРИКС ЭЛЕКТРОНИК МЕЖЕРИНГ ПРОПЕРТИЗ, ЭлЭлСи | Stereoscopic measuring system and method |
US8345953B2 (en) * | 2008-05-22 | 2013-01-01 | Matrix Electronic Measuring Properties, Llc | Stereoscopic measurement system and method |
US20090309853A1 (en) * | 2008-06-13 | 2009-12-17 | Polyvision Corporation | Electronic whiteboard system and assembly with optical detection elements |
JP4770924B2 (en) * | 2008-12-17 | 2011-09-14 | ソニー株式会社 | Imaging apparatus, imaging method, and program |
JP5347716B2 (en) * | 2009-05-27 | 2013-11-20 | ソニー株式会社 | Image processing apparatus, information processing method, and program |
JP5338498B2 (en) * | 2009-06-09 | 2013-11-13 | ソニー株式会社 | Control device, camera system and program used in surveillance camera system |
CN102714690A (en) * | 2009-09-04 | 2012-10-03 | 布瑞特布里克有限公司 | Mobile wide-angle video recording system |
WO2011040864A1 (en) | 2009-10-01 | 2011-04-07 | Scalado Ab | Method relating to digital images |
CN102035987A (en) * | 2009-10-08 | 2011-04-27 | 鸿富锦精密工业(深圳)有限公司 | Photograph synthesizing method and system |
KR101631912B1 (en) * | 2009-11-03 | 2016-06-20 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
KR20110052124A (en) * | 2009-11-12 | 2011-05-18 | 삼성전자주식회사 | Panorama image generation and inquiry method and mobile terminal using the same |
JP5446794B2 (en) * | 2009-12-04 | 2014-03-19 | ソニー株式会社 | Imaging apparatus, data processing method, and program |
US9766089B2 (en) * | 2009-12-14 | 2017-09-19 | Nokia Technologies Oy | Method and apparatus for correlating and navigating between a live image and a prerecorded panoramic image |
KR101630290B1 (en) * | 2009-12-24 | 2016-06-16 | 한화테크윈 주식회사 | Photographing Method For Producing Image Of Traced Moving Path And Apparatus Using The Same |
WO2011093031A1 (en) * | 2010-02-01 | 2011-08-04 | 日本電気株式会社 | Portable terminal, action history depiction method, and action history depiction system |
US20110234750A1 (en) * | 2010-03-24 | 2011-09-29 | Jimmy Kwok Lap Lai | Capturing Two or More Images to Form a Panoramic Image |
JP5663934B2 (en) * | 2010-04-09 | 2015-02-04 | ソニー株式会社 | Image processing apparatus, imaging apparatus, image processing method, and program |
US9635251B2 (en) * | 2010-05-21 | 2017-04-25 | Qualcomm Incorporated | Visual tracking using panoramas on mobile devices |
US8933986B2 (en) * | 2010-05-28 | 2015-01-13 | Qualcomm Incorporated | North centered orientation tracking in uninformed environments |
JP5506589B2 (en) * | 2010-08-02 | 2014-05-28 | キヤノン株式会社 | Imaging apparatus, control method therefor, program, and recording medium |
WO2012039669A1 (en) | 2010-09-20 | 2012-03-29 | Scalado Ab | Method for forming images |
US8866890B2 (en) * | 2010-11-05 | 2014-10-21 | Teledyne Dalsa, Inc. | Multi-camera |
US8810598B2 (en) | 2011-04-08 | 2014-08-19 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
KR20140053885A (en) * | 2011-04-18 | 2014-05-08 | 아이시360, 인코포레이티드 | Apparatus and method for panoramic video imaging with mobile computing devices |
US20120300020A1 (en) * | 2011-05-27 | 2012-11-29 | Qualcomm Incorporated | Real-time self-localization from panoramic images |
JP2013013065A (en) * | 2011-06-01 | 2013-01-17 | Ricoh Co Ltd | Image processor, image processing method and image processing program |
US8666145B2 (en) * | 2011-09-07 | 2014-03-04 | Superfish Ltd. | System and method for identifying a region of interest in a digital image |
US8553942B2 (en) | 2011-10-21 | 2013-10-08 | Navteq B.V. | Reimaging based on depthmap information |
US9047688B2 (en) | 2011-10-21 | 2015-06-02 | Here Global B.V. | Depth cursor and depth measurement in images |
US9116011B2 (en) | 2011-10-21 | 2015-08-25 | Here Global B.V. | Three dimensional routing |
JP6056127B2 (en) * | 2011-10-31 | 2017-01-11 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
US9729788B2 (en) | 2011-11-07 | 2017-08-08 | Sony Corporation | Image generation apparatus and image generation method |
US9894272B2 (en) * | 2011-11-07 | 2018-02-13 | Sony Interactive Entertainment Inc. | Image generation apparatus and image generation method |
WO2013069049A1 (en) | 2011-11-07 | 2013-05-16 | 株式会社ソニー・コンピュータエンタテインメント | Image generation device, and image generation method |
US9171384B2 (en) * | 2011-11-08 | 2015-10-27 | Qualcomm Incorporated | Hands-free augmented reality for wireless communication devices |
EP2783340A4 (en) | 2011-11-21 | 2015-03-25 | Nant Holdings Ip Llc | Subscription bill service, systems and methods |
TW201324023A (en) * | 2011-12-13 | 2013-06-16 | Fih Hong Kong Ltd | System and method for taking panoramic photos |
US9047692B1 (en) * | 2011-12-20 | 2015-06-02 | Google Inc. | Scene scan |
US8941750B2 (en) * | 2011-12-27 | 2015-01-27 | Casio Computer Co., Ltd. | Image processing device for generating reconstruction image, image generating method, and storage medium |
US9404764B2 (en) | 2011-12-30 | 2016-08-02 | Here Global B.V. | Path side imagery |
US9024970B2 (en) | 2011-12-30 | 2015-05-05 | Here Global B.V. | Path side image on map overlay |
CA2763649A1 (en) * | 2012-01-06 | 2013-07-06 | 9237-7167 Quebec Inc. | Panoramic camera |
GB2515684B (en) * | 2012-04-02 | 2019-05-01 | Panasonic Corp | Image generation device, camera device, image display device, and image generation method |
US9277013B2 (en) | 2012-05-10 | 2016-03-01 | Qualcomm Incorporated | Storing local session data at a user equipment and selectively transmitting group session data to group session targets based on dynamic playback relevance information |
US9444564B2 (en) | 2012-05-10 | 2016-09-13 | Qualcomm Incorporated | Selectively directing media feeds to a set of target user equipments |
US20130300821A1 (en) * | 2012-05-10 | 2013-11-14 | Qualcomm Incorporated | Selectively combining a plurality of video feeds for a group communication session |
US9256983B2 (en) * | 2012-06-28 | 2016-02-09 | Here Global B.V. | On demand image overlay |
US9256961B2 (en) | 2012-06-28 | 2016-02-09 | Here Global B.V. | Alternate viewpoint image enhancement |
US9488489B2 (en) | 2012-09-28 | 2016-11-08 | Google Inc. | Personalized mapping with photo tours |
US9071756B2 (en) | 2012-12-11 | 2015-06-30 | Facebook, Inc. | Systems and methods for digital video stabilization via constraint-based rotation smoothing |
GB201305402D0 (en) * | 2013-03-25 | 2013-05-08 | Sony Comp Entertainment Europe | Head mountable display |
US9350916B2 (en) * | 2013-05-28 | 2016-05-24 | Apple Inc. | Interleaving image processing and image capture operations |
US9384552B2 (en) | 2013-06-06 | 2016-07-05 | Apple Inc. | Image registration methods for still image stabilization |
US9491360B2 (en) | 2013-06-06 | 2016-11-08 | Apple Inc. | Reference frame selection for still image stabilization |
US9262684B2 (en) | 2013-06-06 | 2016-02-16 | Apple Inc. | Methods of image fusion for image stabilization |
US20150071547A1 (en) | 2013-09-09 | 2015-03-12 | Apple Inc. | Automated Selection Of Keeper Images From A Burst Photo Captured Set |
US9244940B1 (en) | 2013-09-27 | 2016-01-26 | Google Inc. | Navigation paths for panorama |
US9582516B2 (en) | 2013-10-17 | 2017-02-28 | Nant Holdings Ip, Llc | Wide area augmented reality location-based services |
US9784836B2 (en) * | 2013-11-08 | 2017-10-10 | Sharper Shape Oy | System for monitoring power lines |
US9508172B1 (en) * | 2013-12-05 | 2016-11-29 | Google Inc. | Methods and devices for outputting a zoom sequence |
US9436278B2 (en) * | 2014-04-01 | 2016-09-06 | Moju Labs, Inc. | Motion-based content navigation |
US9189839B1 (en) | 2014-04-24 | 2015-11-17 | Google Inc. | Automatically generating panorama tours |
US9002647B1 (en) | 2014-06-27 | 2015-04-07 | Google Inc. | Generating turn-by-turn direction previews |
US9418472B2 (en) | 2014-07-17 | 2016-08-16 | Google Inc. | Blending between street view and earth view |
US9350924B2 (en) | 2014-08-25 | 2016-05-24 | John G. Posa | Portable electronic devices with integrated image/video compositing |
US11205305B2 (en) | 2014-09-22 | 2021-12-21 | Samsung Electronics Company, Ltd. | Presentation of three-dimensional video |
US10257494B2 (en) | 2014-09-22 | 2019-04-09 | Samsung Electronics Co., Ltd. | Reconstruction of three-dimensional video |
US9851299B2 (en) | 2014-10-25 | 2017-12-26 | Isle Management Co. | Method of analyzing air quality |
PT3326358T (en) * | 2015-02-27 | 2022-11-25 | Leia Inc | Multiview camera |
WO2017026193A1 (en) * | 2015-08-12 | 2017-02-16 | ソニー株式会社 | Image processing device, image processing method, program, and image processing system |
US9843725B2 (en) * | 2015-12-29 | 2017-12-12 | VideoStitch Inc. | Omnidirectional camera with multiple processors and/or multiple sensors connected to each processor |
US9787896B2 (en) | 2015-12-29 | 2017-10-10 | VideoStitch Inc. | System for processing data from an omnidirectional camera with multiple processors and/or multiple sensors connected to each processor |
US10761303B2 (en) | 2016-07-19 | 2020-09-01 | Barry Henthorn | Simultaneous spherical panorama image and video capturing system |
KR102560780B1 (en) * | 2016-10-05 | 2023-07-28 | 삼성전자주식회사 | Image processing system including plurality of image sensors and electronic device including thereof |
US10594995B2 (en) * | 2016-12-13 | 2020-03-17 | Buf Canada Inc. | Image capture and display on a dome for chroma keying |
US10973391B1 (en) * | 2017-05-22 | 2021-04-13 | James X. Liu | Mixed reality viewing of a surgical procedure |
WO2019070317A1 (en) | 2017-10-02 | 2019-04-11 | Leia Inc. | Multiview camera array, multiview system, and method having camera sub-arrays with a shared camera |
US10666863B2 (en) * | 2018-05-25 | 2020-05-26 | Microsoft Technology Licensing, Llc | Adaptive panoramic video streaming using overlapping partitioned sections |
US10764494B2 (en) | 2018-05-25 | 2020-09-01 | Microsoft Technology Licensing, Llc | Adaptive panoramic video streaming using composite pictures |
CN112689088A (en) * | 2020-12-21 | 2021-04-20 | Vivo Mobile Communication (Hangzhou) Co., Ltd. | Image display method and device and electronic equipment |
US20240406564A1 (en) * | 2023-06-02 | 2024-12-05 | SkyScope Corporation | Methods and systems for creating panoramic images for digitally exploring locations |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0537021A2 (en) | 1991-10-11 | 1993-04-14 | Gec-Marconi Limited | Camera-calibration in a computer vision system |
JPH05227457A (en) | 1992-02-15 | 1993-09-03 | Kanegafuchi Chem Ind Co Ltd | Generation method for panorama electronic photograph |
US5262867A (en) * | 1990-06-20 | 1993-11-16 | Sony Corporation | Electronic camera and device for panoramic imaging and object searching |
US5424773A (en) | 1993-01-29 | 1995-06-13 | Kawai Musical Inst. Mfg. Co., Ltd. | Apparatus and method for generating a pseudo camera position image from a plurality of video images from different camera positions using a neural network |
US5444478A (en) * | 1992-12-29 | 1995-08-22 | U.S. Philips Corporation | Image processing method and device for constructing an image from adjacent images |
US5528290A (en) * | 1994-09-09 | 1996-06-18 | Xerox Corporation | Device for transcribing images on a board using a camera based board scanner |
JPH08163433A (en) | 1994-11-29 | 1996-06-21 | Sony Corp | Panoramic still image generating device |
JPH08223481A (en) | 1995-02-16 | 1996-08-30 | Sony Corp | Panorama still image generating device |
US5625409A (en) * | 1992-10-14 | 1997-04-29 | Matra Cap Systemes | High resolution long-range camera for an airborne platform |
US5646679A (en) | 1994-06-30 | 1997-07-08 | Canon Kabushiki Kaisha | Image combining method and apparatus |
US5650814A (en) | 1993-10-20 | 1997-07-22 | U.S. Philips Corporation | Image processing system comprising fixed cameras and a system simulating a mobile camera |
WO1999017543A1 (en) | 1997-09-26 | 1999-04-08 | Live Picture, Inc. | Virtual reality camera |
US5907353A (en) * | 1995-03-28 | 1999-05-25 | Canon Kabushiki Kaisha | Determining a dividing number of areas into which an object image is to be divided based on information associated with the object |
US6009190A (en) * | 1997-08-01 | 1999-12-28 | Microsoft Corporation | Texture map construction method and apparatus for displaying panoramic image mosaics |
US6011558A (en) * | 1997-09-23 | 2000-01-04 | Industrial Technology Research Institute | Intelligent stitcher for panoramic image-based virtual worlds |
US6078701A (en) * | 1997-08-01 | 2000-06-20 | Sarnoff Corporation | Method and apparatus for performing local to global multiframe alignment to construct mosaic images |
- 1997
  - 1997-09-26 US US08/938,366 patent/US6552744B2/en not_active Ceased
- 1998
  - 1998-06-25 JP JP2000514469A patent/JP2002503893A/en active Pending
  - 1998-06-25 AU AU81748/98A patent/AU8174898A/en not_active Abandoned
  - 1998-06-25 EP EP98931698A patent/EP1023803A1/en not_active Withdrawn
  - 1998-06-25 CA CA002296062A patent/CA2296062A1/en not_active Abandoned
  - 1998-06-25 WO PCT/US1998/013465 patent/WO1999017543A1/en not_active Application Discontinuation
- 2005
  - 2005-04-22 US US11/113,455 patent/USRE43700E1/en not_active Expired - Lifetime
- 2013
  - 2013-03-18 US US13/846,801 patent/USRE45785E1/en not_active Expired - Lifetime
Patent Citations (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5262867A (en) * | 1990-06-20 | 1993-11-16 | Sony Corporation | Electronic camera and device for panoramic imaging and object searching |
EP0537021A2 (en) | 1991-10-11 | 1993-04-14 | Gec-Marconi Limited | Camera-calibration in a computer vision system |
JPH05227457A (en) | 1992-02-15 | 1993-09-03 | Kanegafuchi Chem Ind Co Ltd | Generation method for panorama electronic photograph |
US5625409A (en) * | 1992-10-14 | 1997-04-29 | Matra Cap Systemes | High resolution long-range camera for an airborne platform |
US5444478A (en) * | 1992-12-29 | 1995-08-22 | U.S. Philips Corporation | Image processing method and device for constructing an image from adjacent images |
US5424773A (en) | 1993-01-29 | 1995-06-13 | Kawai Musical Inst. Mfg. Co., Ltd. | Apparatus and method for generating a pseudo camera position image from a plurality of video images from different camera positions using a neural network |
US5650814A (en) | 1993-10-20 | 1997-07-22 | U.S. Philips Corporation | Image processing system comprising fixed cameras and a system simulating a mobile camera |
US5646679A (en) | 1994-06-30 | 1997-07-08 | Canon Kabushiki Kaisha | Image combining method and apparatus |
US5528290A (en) * | 1994-09-09 | 1996-06-18 | Xerox Corporation | Device for transcribing images on a board using a camera based board scanner |
JPH08163433A (en) | 1994-11-29 | 1996-06-21 | Sony Corp | Panoramic still image generating device |
JPH08223481A (en) | 1995-02-16 | 1996-08-30 | Sony Corp | Panorama still image generating device |
US5907353A (en) * | 1995-03-28 | 1999-05-25 | Canon Kabushiki Kaisha | Determining a dividing number of areas into which an object image is to be divided based on information associated with the object |
US6009190A (en) * | 1997-08-01 | 1999-12-28 | Microsoft Corporation | Texture map construction method and apparatus for displaying panoramic image mosaics |
US6078701A (en) * | 1997-08-01 | 2000-06-20 | Sarnoff Corporation | Method and apparatus for performing local to global multiframe alignment to construct mosaic images |
US6011558A (en) * | 1997-09-23 | 2000-01-04 | Industrial Technology Research Institute | Intelligent stitcher for panoramic image-based virtual worlds |
CA2296062A1 (en) | 1997-09-26 | 1999-04-08 | Live Picture, Inc. | Electronic camera for virtual reality imaging |
AU8174898A (en) | 1997-09-26 | 1999-04-23 | Live Picture, Inc. | Virtual reality camera |
WO1999017543A1 (en) | 1997-09-26 | 1999-04-08 | Live Picture, Inc. | Virtual reality camera |
EP1023803A1 (en) | 1997-09-26 | 2000-08-02 | Live Picture, Inc. | Virtual reality camera |
JP2002503893A (en) | 1997-09-26 | 2002-02-05 | エムジイアイ ソフトウエア コーポレイション | Virtual reality camera |
US6552744B2 (en) | 1997-09-26 | 2003-04-22 | Roxio, Inc. | Virtual reality camera |
USRE43700E1 (en) | 1997-09-26 | 2012-10-02 | Intellectual Ventures I Llc | Virtual reality camera |
Non-Patent Citations (71)
Title |
---|
"Casio QV700 Digital Camera & DP-8000 Digital Photo Printer", Joe Farace, 3 pp., Nov. 6, 1997. |
"Round Shot Model Super 35", Seitz, , 1997, 4 pp. |
"Round Shot Model Super 35", Seitz, <http://www.roundshot.com/rssup35.htm>, 1997, 4 pp. |
"Round Shot Model Super 35", Seitz, 4 pp., 1997. |
"Round Shot", 4 pp. |
"Round Shot," , accessed Jan. 27, 1998, 4 pp. |
"Round Shot," <www.spectraweb.ch/˜mltrad/rs35mm.htm>, accessed Jan. 27, 1998, 4 pp. |
Amended Answer of Defendants to Plaintiffs' Amended Complaint, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Dec. 19, 2013), at 624-33. |
Amended Complaint, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Feb. 10, 2012) at 198-207 with Exhibits A-E, 111 p. |
Answer of Defendants to Plaintiffs' Amended Complaint, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Apr. 15, 2013), at 503-12. |
Answer of Defendants to Plaintiffs' Second Amended Complaint, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Mar. 3, 2014), at 982-94. |
Answering Brief in Opposition, Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Jan. 8, 2015, 19 pages. |
Complaint, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Oct. 26, 2011), at 1-9 with Exhibits A-E, 111 p. |
Declaration (including Exhibit 1), Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 3, 2014, 96 pages. |
Declaration (including Exhibit A), Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 3, 2014, 10 pages. |
Declaration (including Exhibit A-D), Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 18, 2014, 77 pages. |
Declaration (including Exhibits 8 and 9), Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 3, 2014, 12 pages. |
Declaration (including Exhibits M-Q), Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 19, 2014, 45 pages. |
Declaration (including Exhibits N25-N30), Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 19, 2014, 93 pages. |
Declaration (including Exhibits P4-P7), Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 3, 2014, 23 pages. |
English translation of Japanese Publication No. 08-163433 published on Jun. 21, 1996; Inventors: Hamada Toshimichi, Iijima Koji; Applicant: Sony Corp.; 15 pages. |
Exhibit H to Intellectual Ventures' Initial Infringement Charts to Nikon, Mar. 28, 2014, 11 pages. |
FUJIFILM Corporation, English translation of Fujifilm's Non-Infringement and Invalidity Arguments against Intellectual Ventures (Intellectual Ventures Patents) [PowerPoint slides], Jun. 5, 2014, pp. 1 and 36-38. |
FUJIFILM Corporation, English translation of Fujifilm's Non-Infringement and Invalidity Arguments against Intellectual Ventures (Intellectual Ventures Patents) [PowerPoint slides], Sep. 22, 2014, pp. 1 and 55-57. |
FUJIFILM Corporation, Fujifilm's Non-Infringement and Invalidity Arguments against Intellectual Ventures (Intellectual Ventures Patents) [PowerPoint slides], Jun. 5, 2014, pp. 1 and 36-38. |
FUJIFILM Corporation, Fujifilm's Non-Infringement and Invalidity Arguments against Intellectual Ventures (Intellectual Ventures Patents) [PowerPoint slides], Sep. 22, 2014, pp. 1 and 55-57. |
Halfhill, Tom R. "See You Around," Byte, May 1995, pp. 85-90. |
Initial Invalidity Contentions for New Patents (non-redacted), Intellectual Ventures v. Nikon, C.A. No. 11-1025-SLR, Apr. 25, 2014, 35 pages. |
Initial Invalidity Contentions, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Apr. 25, 2014), 125 p. |
International Search Report, Application PCT/US1998/013465, mailed Oct. 19, 1998, 5 pp. |
Karney, J. "Casio QV-200, QV-700," PC Magazine, Feb. 10, 1998, 2 pp. |
Memorandum Opinion, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Apr. 1, 2013), at 488-501. |
Memorandum Order, Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Mar. 10, 2015, 14 pages. |
Nikon's Second Supplemental Invalidity Contentions, Exhibit E, Intellectual Ventures v. Nikon, C.A. No. 11-1025-SLR, Sep. 2, 2014, 113 pages. |
Nikon's Second Supplemental Invalidity Contentions, Intellectual Ventures v. Nikon, C.A. No. 11-1025-SLR, Sep. 2, 2014, 51 pages. |
Opening Brief in Support of Motion, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Mar. 5, 2012), at 337-52. |
Opening Brief in Support of Motion, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Jan. 17, 2012), at 163-76. |
Opening Brief in Support, Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 18, 2014, 15 pages. |
PCT Search Report, PCT/US98/13465, mailed Oct. 19, 1998. |
Plaintiffs' Opposition, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Mar. 26, 2012), at 357-70. |
Redacted Declaration (including Exhibit B), Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Jan. 21, 2015, 15 pages. |
Redacted Declaration (including Exhibits A-D and F-K), Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Jan. 15, 2015, 96 pages. |
Reply Brief in Support, Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Jan. 15, 2015, 15 pages. |
Reply Claim Construction Brief, Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 3, 2014, 19 pages. |
Reply Claim Construction Brief, Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 3, 2014, 16 pages. |
Report on the Filing, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Jan. 2, 2014), at 973-74. |
Ryer, K. "Casio Adds New Camera To Its Lineup," MacWeek, Oct. 2, 1997, vol. 11, Issue 38, 1 page. |
Second Amended Complaint, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed Jan. 2, 2014), at 814-28 with Exhibits A-H, 159 p. |
Sinclair, M. and B. Erickson. "Round Shot Super 35," May 13, 1996, 1 page. |
Stipulation of Dismissal, Intellectual Ventures v. Nikon, No. 11-CV-01025-SLR, (D. Del. filed May 15, 2013), at 540-41. |
Surreply Claim Construction Brief, Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 19, 2014, 17 pages. |
Surreply Claim Construction Brief, Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Dec. 19, 2014, 19 pages. |
Transcript, Intellectual Ventures I LLC et al. v. Nikon Corporation et al., Case No. 11-cv-01025-SLR, U.S. District Court for the District of Delaware, filed Jan. 19, 2015, 144 pages. |
U.S. Appl. No. 11/468,828, filed Aug. 31, 2006, Chen. |
U.S. Appl. No. 13/562,220, filed Jul. 20, 2012, Chen. |
United States Patent and Trademark Office, Advisory Action, U.S. Appl. No. 08/938,366, mailed Nov. 29, 2001, 3 pages. |
United States Patent and Trademark Office, Advisory Action, U.S. Appl. No. 11/113,455, mailed Mar. 16, 2010, 3 pages. |
United States Patent and Trademark Office, Final Office Action, U.S. Appl. No. 08/938,366, mailed Aug. 16, 2000, 8 pages. |
United States Patent and Trademark Office, Final Office Action, U.S. Appl. No. 08/938,366, mailed Aug. 27, 1999, 7 pages. |
United States Patent and Trademark Office, Final Office Action, U.S. Appl. No. 08/938,366, mailed Sep. 21, 2001, 6 pages. |
United States Patent and Trademark Office, Final Office Action, U.S. Appl. No. 11/113,455, mailed Dec. 11, 2009, 5 pages. |
United States Patent and Trademark Office, Non-Final Office Action, U.S. Appl. No. 08/938,366, mailed Feb. 22, 2000, 6 pages. |
United States Patent and Trademark Office, Non-Final Office Action, U.S. Appl. No. 08/938,366, mailed Jan. 31, 2001, 8 pages. |
United States Patent and Trademark Office, Non-Final Office Action, U.S. Appl. No. 08/938,366, mailed Mar. 11, 2002, 5 pages. |
United States Patent and Trademark Office, Non-Final Office Action, U.S. Appl. No. 08/938,366, mailed Oct. 2, 1998, 9 pages. |
United States Patent and Trademark Office, Non-Final Office Action, U.S. Appl. No. 11/113,455, mailed Apr. 3, 2009, 9 pages. |
United States Patent and Trademark Office, Non-Final Office Action, U.S. Appl. No. 11/113,455, mailed May 2, 2011, 4 pages. |
United States Patent and Trademark Office, Non-Final Office Action, U.S. Appl. No. 11/113,455, mailed Nov. 26, 2010, 5 pages. |
United States Patent and Trademark Office, Notice of Allowance, U.S. Appl. No. 08/938,366, mailed Aug. 26, 2002, 5 pages. |
United States Patent and Trademark Office, Notice of Allowance, U.S. Appl. No. 11/113,455, mailed Dec. 22, 2011, 6 pages. |
United States Patent and Trademark Office, Notice of Allowance, U.S. Appl. No. 11/113,455, mailed May 14, 2012, 7 pages. |
Also Published As
Publication number | Publication date |
---|---|
CA2296062A1 (en) | 1999-04-08 |
US20010010546A1 (en) | 2001-08-02 |
AU8174898A (en) | 1999-04-23 |
JP2002503893A (en) | 2002-02-05 |
USRE43700E1 (en) | 2012-10-02 |
US6552744B2 (en) | 2003-04-22 |
EP1023803A1 (en) | 2000-08-02 |
WO1999017543A1 (en) | 1999-04-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
USRE45785E1 (en) | | Virtual reality camera |
US7187412B1 (en) | | Pointing device for digital camera display |
US6023588A (en) | | Method and apparatus for capturing panoramic images with range data |
JP5740884B2 (en) | | AR navigation for repeated shooting and system, method and program for difference extraction |
US7583858B2 (en) | | Image processing based on direction of gravity |
JP5659305B2 (en) | | Image generating apparatus and image generating method |
JP5865388B2 (en) | | Image generating apparatus and image generating method |
WO2013069047A1 (en) | | Image generation device, and image generation method |
EP1074943A2 (en) | | Image processing method and apparatus |
WO2014023231A1 (en) | | Wide-view-field ultrahigh-resolution optical imaging system and method |
JP2000283720A (en) | | Method and device for inputting three-dimensional data |
US20230328400A1 (en) | | Auxiliary focusing method, apparatus, and system |
CN111818270B (en) | | Automatic control method and system for multi-camera shooting |
US20090059018A1 (en) | | Navigation assisted mosaic photography |
JP3307075B2 (en) | | Imaging equipment |
CN110796690B (en) | | Image matching method and image matching device |
JP4046973B2 (en) | | Information processing method and image mixing apparatus |
US6169858B1 (en) | | Panoramic image capture aid |
US10708493B2 (en) | | Panoramic video |
JPH1023465A (en) | | Image pickup method and its device |
CN109698903A (en) | | Image acquiring method and image acquiring device |
JP4776983B2 (en) | | Image composition apparatus and image composition method |
JP2005175852A (en) | | Photographing apparatus and method of controlling photographing apparatus |
Kudinov et al. | | The algorithm for a video panorama construction and its software implementation using CUDA technology |
JP2004128588A (en) | | Imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ROXIO, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MGI SOFTWARE CORPORATION;REEL/FRAME:030041/0886 Effective date: 20020703
Owner name: SONIC SOLUTIONS, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROXIO, INC.;REEL/FRAME:030042/0047 Effective date: 20041217
Owner name: INTELLECTUAL VENTURES I LLC, DELAWARE Free format text: MERGER;ASSIGNOR:KWOK, CHU & SHINDLER LLC;REEL/FRAME:030042/0474 Effective date: 20110718
Owner name: LIVE PICTURE, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CHEN, SHENCHANG ERIC;REEL/FRAME:030041/0590 Effective date: 19970926
Owner name: KWOK, CHU & SHINDLER LLC, NEVADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONIC SOLUTIONS;REEL/FRAME:030042/0250 Effective date: 20050421
Owner name: MGI SOFTWARE CORPORATION, CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LIVE PICTURE, INC.;REEL/FRAME:030041/0744 Effective date: 20001117 |
AS | Assignment |
Owner name: HANGER SOLUTIONS, LLC, GEORGIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTELLECTUAL VENTURES ASSETS 161 LLC;REEL/FRAME:052159/0509 Effective date: 20191206 |
AS | Assignment |
Owner name: INTELLECTUAL VENTURES ASSETS 161 LLC, DELAWARE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTELLECTUAL VENTURES I LLC;REEL/FRAME:051945/0001 Effective date: 20191126 |
AS | Assignment |
Owner name: TUMBLEWEED HOLDINGS LLC, NEW JERSEY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HANGER SOLUTIONS, LLC;REEL/FRAME:059620/0066 Effective date: 20210303 |