EP3008693A1 - System for tracking the position of the shooting camera for shooting video films - Google Patents
System for tracking the position of the shooting camera for shooting video films
Info
- Publication number
- EP3008693A1 (application EP14736893.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- camera
- data
- sensor
- optical
- module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
Definitions
- the present invention relates to systems for tracking the position of the filming camera used for shooting video films.
- devices comprising a system for locating the camera in space during shooting via optical sensors are known.
- a system marketed under the name Lightcraft is an example.
- in that system, a sensor acquires, with great precision, targets placed on the ceiling.
- it is sought to determine the position of the filming camera with greater flexibility.
- the subject of the invention is a video film shooting system in a real space defined in a real reference frame, comprising: a filming camera adapted to record a real image for a plurality of distinct time frames; a sensor system comprising a first optical sensor system, comprising at least one optical sensor and adapted to record data in an optical mode, and a second sensor system, comprising at least one sensor adapted to record data in a second mode; a computerized registration module adapted to integrate the data from at least one sensor of the first optical sensor system and to determine location data of the filming camera in real space from these data, the computerized registration module also being adapted to integrate the data from at least one sensor of the second sensor system and to determine location data of the filming camera in real space from these data; and a combination computer module adapted to repeatedly determine location data of the camera in the real reference frame from both the location data determined in the optical mode and the location data determined in the second mode.
- the computerized combination module comprises a computer adapted to determine a difference between the location data obtained in the optical mode and in the second mode, thereby generating a result function.
- the combination computer module also comprises a comparator adapted to compare this function to a threshold value, thus generating a comparison function that takes a value in a list of values.
- the combination computer module also comprises a selector taking as input the comparison function and outputting a mode selection signal from a list comprising at least the optical mode and the second mode, respectively corresponding to values of the comparison function, the weighting respectively taking the value 0 or 1.
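The comparator/selector chain described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function names, the Euclidean distance measure, and the threshold value are all hypothetical.

```python
def result_function(optical_pos, second_pos):
    """Difference between the location data of the two modes (Euclidean distance)."""
    return sum((a - b) ** 2 for a, b in zip(optical_pos, second_pos)) ** 0.5

def comparator(result, threshold):
    """Comparison function: one of two values, each assigned to a mode."""
    return "optical" if result <= threshold else "second"

def selector(comparison):
    """Mode selection signal: weighting 1 for the optical mode, 0 for the second mode."""
    return 1.0 if comparison == "optical" else 0.0

optical = (1.00, 2.00, 0.50)   # hypothetical location data, optical mode
second = (1.02, 1.98, 0.51)    # hypothetical location data, second mode
diff = result_function(optical, second)
mode = comparator(diff, threshold=0.1)
weight = selector(mode)  # the two estimates agree, so the optical mode is kept
```

The binary weighting (0 or 1) implements the pure mode-switching case; the weighted combination described further below generalizes it to intermediate values.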
- the system includes a button adapted to mechanically select a mode from the list.
- the first optical sensor system comprises an evaluator adapted to evaluate the number of detectable points of natural topographic information detected by the optical sensor, and a reliability module adapted to integrate the data of the evaluator and to output a reliability coefficient for the data recorded in optical mode, used for determining the weighting between the location data from the optical sensor and from the other sensor.
- the selector is adapted to also receive the reliability coefficient as an input signal.
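One plausible way to realize the evaluator and reliability module just described is to count the detected feature points and map the count to a coefficient in [0, 1]. The mapping below (minimum and saturation counts) is an assumption for illustration only; the patent does not specify it.

```python
def evaluator(detected_points):
    """Number of detectable points of natural topographic information."""
    return len(detected_points)

def reliability_coefficient(n_points, n_min=4, n_good=50):
    """0 below n_min points, 1 above n_good points, linear in between."""
    if n_points <= n_min:
        return 0.0
    if n_points >= n_good:
        return 1.0
    return (n_points - n_min) / (n_good - n_min)

points = [(12, 40), (300, 55), (210, 480)]  # e.g. detected corner pixels
coeff = reliability_coefficient(evaluator(points))  # few points: low reliability
```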
- the second sensor system comprises at least one optical field change sensor, adapted to detect a mechanical movement leading to a change of the optical field of the filming camera, and adapted to record optical field change data in a mechanical mode.
- the first optical sensor system of the filming camera comprises at least one optical sensor, whose location data with respect to the filming camera are known for each time frame, and which is adapted to transmit to the computerized registration module the natural topographic information it detects.
- the computerized registration module compares the natural topographic information detected by the optical sensor with a predetermined three-dimensional model of the real space.
- the tracking system comprises a computerized generation module adapted to generate a predefined three-dimensional model of the real space, and in which the optical sensor is adapted to transmit topographic information detected by said optical sensor to the computerized generation module.
- the optical sensor is adapted to simultaneously transmit to the computerized registration module and the computerized generation module natural topographic information detected by said optical sensor, and wherein the computerized generation module is adapted to enrich said predetermined three-dimensional model of the real space from the natural topographic information detected by the optical sensor.
- the filming camera and the optical sensor are fixedly attached to each other.
- the optical field change sensor is an inertial sensor integral with the filming camera and adapted to record the data concerning the evolution of the position of the filming camera.
- the inertial sensor comprises a gyroscope or an inertial cube.
- the shooting camera is carried by a mobile support on a base and the optical field change sensor comprises a mechanical encoder fixed on the support of the filming camera and adapted to record the data concerning the evolution of the position of the support of the filming camera.
- the system includes an external mechanical encoder of the internal parameters of the camera adapted to record the data on the evolution of the internal optical acquisition parameters of the camera such as zoom, iris, focal length.
- the data on the evolution of the internal optical acquisition parameters of the camera are integrated into the input data of the computerized registration module.
- the computerized filming module is adapted to integrate the data from the camera signal and the internal optical acquisition parameters of the filming camera.
- the system comprises a device adapted to correct any deformation of the optical field, this device being adapted to integrate the camera data and to output the camera data to the computerized filming module.
- Figure 1 is a view of real space
- FIG. 2 is a view of the filming system
- FIG. 3 is a view of the two sensor systems.
- Figure 4 is an operating diagram of the filming system
- FIG. 5 is a view of the operation of the optical sensor system
- FIG. 6 is an operating diagram of the combination computer module
- FIG. 7 is an operating diagram of the filming system.
- the same references designate identical or similar elements.
- the real space 1 comprises a certain number of natural topographic information 2.
- This information is for example related to geometrical objects of the real space 1, such as points, lines, surfaces and / or volumes.
- as a line, one can for example consider the edges of a structure, and as a point the intersection of two of these edges.
- as a surface, one may for example consider solid surfaces, such as a car hood, or others.
- as a volume, one may for example refer to objects, such as a car, or another object present in real space 1.
- the real reference frame is a coordinate system for locating real space 1.
- a video film shooting system is described in a filming configuration.
- a video film is a sequence of images broadcast at a fast rate (several frames per second, for example 24 (cinema), 25 (PAL) or 30 (NTSC) images per second) to a viewer.
- This series of images is for example projected or broadcast as part of a cinematographic film, a TV movie, an informative message, a video game, or other.
- this broadcast or projection can be delayed in time with respect to the shooting.
- This sequence of images relates an event taking place in a real space 1.
- a filming camera 3 of any type suitable for filming such a scene is used.
- a digital camera that can acquire several images per second, for example 24 images per second, is used.
- the camera 3 comprises optics capable of acquiring images in an optical field 4, and is connected to a computerized filming module 40.
- This connection is for example made by a suitable cable, or without a cable, for example by radio or other transmission.
- the filming camera 3 is of any suitable known type, but the invention is particularly suitable when it is possible to vary the optical field 4 during shooting. In particular, the optical field 4 can be varied by moving the filming camera 3 in the real space 1.
- the filming camera 3 is movable, in a guided manner, in the real space 1, for example being mounted on a rail 50 or a crane 52 having an arm 4'' articulated with respect to a support 4''' according to one, two or three degrees of freedom, and defining a set of possible locations for the filming camera 3.
- alternatively, a filming camera 3 compact enough to be moved in the real space 1 simply by being carried by an operator can be used.
- the filming camera 3 comprises a monitor mounted on the housing of the camera 3 and having a control screen 6 visible by the film operator, on which the optical field 4 acquired by the camera is displayed (shown closed in Figure 2).
- the filming system also comprises a sensor system 7 for the filming camera 3 in the real space 1, shown in FIG. 3.
- the sensor system 7 comprises two sensor systems 9, 10.
- the first optical sensor system 9 comprises an optical sensor 11, which is for example an optical camera, as shown in FIG.
- the optical sensor 11 has the particularity of presenting a known location at any time with respect to the filming camera 3.
- "location" here means that the position and the orientation of the optical sensor 11 relative to the filming camera 3 are known at all times. These are in particular the relative positions and orientations of the acquisition systems of the optical sensor 11 and of the camera 3 (the CCD matrix for the latter). This can for example be achieved simply by fixing the optical sensor 11 rigidly to the filming camera 3, for example by means of a flange or any other suitable mechanical system.
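Because the transform between the optical sensor 11 and the filming camera 3 is rigid and measured once, the camera pose follows from the sensor pose by composing the two transforms. The sketch below illustrates this with poses represented as a (rotation matrix, translation) pair; the representation and all numeric values are assumptions for illustration.

```python
def mat_vec(R, v):
    """Apply a 3x3 rotation matrix to a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def mat_mat(A, B):
    """Product of two 3x3 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def compose(world_pose_sensor, sensor_pose_camera):
    """world_T_camera = world_T_sensor * sensor_T_camera (rigid transforms)."""
    R1, t1 = world_pose_sensor
    R2, t2 = sensor_pose_camera
    R = mat_mat(R1, R2)
    t = [a + b for a, b in zip(mat_vec(R1, t2), t1)]
    return R, t

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
world_pose_sensor = (identity, [1.0, 0.0, 1.5])    # from the tracking step
sensor_pose_camera = (identity, [0.0, -0.1, 0.0])  # fixed, measured at rigging time
rotation, position = compose(world_pose_sensor, sensor_pose_camera)
# position of the filming camera in the real frame: [1.0, -0.1, 1.5]
```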
- the optical sensor 11 is characterized in particular by an acquisition field 13.
- the optical sensor 11 may be placed so that no part of the filming camera 3 blocks part of the acquisition field 13, and no part of the optical sensor 11 obstructs part of the optical field 4.
- an optical sensor 11 specifically dedicated to the tracking task, having acquisition characteristics distinct from those of the filming camera 3, is used.
- the filming camera 3 can thus be dedicated to its own task, which is to film, and the optical sensor 11 to its own task, which is to locate.
- the optical sensor 11 can be provided as an optical camera of small dimensions, in particular with an overall size at least two times smaller than that of the filming camera 3. Thus, the inconvenience for the operator will be minimal.
- an optical camera specifically dedicated to obtaining the position of the filming camera in real space 1 can be used, having an acquisition frequency at least twice that of the filming camera 3, for example on the order of 100 images per second, thereby smoothing by calculation the position of the filming camera 3 in the real space 1 for each time frame.
- an optical camera having an optical field (solid angle of view) 20 times greater than the optical field 4 of the filming camera can be used, in order to maximize the information acquired from the real space that can be used for calculating the position of the filming camera.
- a wide-angle lens ("fisheye") having an acquisition angle greater than 160 degrees can be used.
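The smoothing-by-calculation mentioned above can be illustrated as follows: with tracking samples at ~100 Hz and film frames at 24 fps, each film time frame can take a position interpolated between the two nearest tracking samples. Linear interpolation, and all timestamps and positions, are assumptions for illustration.

```python
def interpolate_position(samples, t):
    """Linearly interpolate (timestamp, position) samples at time t."""
    samples = sorted(samples)
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return tuple(a + w * (b - a) for a, b in zip(p0, p1))
    raise ValueError("t outside the sampled interval")

# 100 Hz tracking samples around one 24 fps film frame at t = 1/24 s
samples = [(0.00, (0.0, 0.0, 0.0)),
           (0.04, (0.4, 0.0, 0.0)),
           (0.05, (0.5, 0.0, 0.0))]
pos = interpolate_position(samples, 1 / 24)  # position for the film time frame
```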
- the optical sensor 11 is adapted to acquire information relating to the real space 1, so as to be able to determine the position of the optical sensor 11 in the real space 1.
- the first optical sensor system 9 can comprise several optical sensors used successively or simultaneously.
- the filming system also comprises a computerized registration module 8.
- the computerized registration module 8 is adapted to determine location data in the real reference frame 1' of the filming camera 3 from the location data coming from the various sensors of the sensor system 7, as shown in Figure 4.
- the computerized registration module 8 receives as input the signal coming from a sensor and generates data concerning the position of the filming camera 3 at the output.
- the computerized registration module 8 is connected to the sensor by a cable or wirelessly. Alternatively, it can receive data from several sensors at a time.
- the computerized registration module 8 receives location data 11 'originating from an optical sensor 11 of the first optical sensor system 9.
- the computerized registration module 8 can receive location data from several optical sensors successively or simultaneously.
- the computerized registration module 8 can determine, for an acquisition made by the optical sensor 11, using a predefined three-dimensional model 14 of the real space 1, the position of the optical sensor 11 in the real space 1 (see Figure 5).
- the computerized registration module 8 determines the most probable location of the optical sensor 11 in the real space, the one that makes the data acquired by the optical sensor 11 match the predefined three-dimensional model of the real space 1, as shown in Figure 5.
- the computerized registration module 8 can thus determine the location data of the filming camera 3 in the real reference frame 1'.
- the position of the filming camera 3 is directly determined without an explicit determination of the location of the optical sensor 11.
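The matching step described above, finding the most probable sensor location that aligns the detected points with the model, can be illustrated in a deliberately simplified form. Here the rotation is assumed already known (identity), so the best translation is simply the mean offset between model points and detected points; a real system would solve the full 6-degree-of-freedom problem (e.g. PnP or ICP). All point coordinates are hypothetical.

```python
def estimate_translation(model_points, detected_points):
    """Least-squares translation mapping detected points onto the model points."""
    n = len(model_points)
    return tuple(
        sum(m[i] - d[i] for m, d in zip(model_points, detected_points)) / n
        for i in range(3)
    )

model = [(2.0, 0.0, 1.0), (3.0, 1.0, 1.0), (2.5, 0.5, 2.0)]       # 3D model 14
detected = [(1.0, -1.0, 1.0), (2.0, 0.0, 1.0), (1.5, -0.5, 2.0)]  # sensor frame
sensor_position = estimate_translation(model, detected)  # → (1.0, 1.0, 0.0)
```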
- the predefined three-dimensional model of the real space 1 comprises, for example, natural topographical information 2 of the real space 1.
- This model is for example available by any appropriate means.
- the three-dimensional model 14 is generated by the computerized generation module 33 during a learning phase, as shown in FIG. 5. This step is for example carried out shortly before the shooting, so that the real space 1, at the time of shooting, corresponds to the pre-established model.
- the three-dimensional model 14 thus generated is imported into the computerized registration module 8, which compares the natural topographic information 2 detected by the optical sensor 11 with the predetermined three-dimensional model 14 of the real space 1 in order to identify at any time, in a filming configuration, the actual position in the real space 1 of the filming camera 3, as represented in FIG.
- the optical sensor 11 transmits topographic information 2 detected by said optical sensor 11 to the computerized generation module 33.
- a particular embodiment has just been described, making it possible to determine the location of the filming camera 3 from a dedicated optical sensor 11. This sensor can be oriented towards the real space 1 filmed by the filming camera 3. It is also possible to use several optical sensors 11 having various orientations.
- the optical sensor 11 may alternatively be the same as the filming camera 3. In such a case, the filming camera 3 itself is used to determine its position from natural topographic data 2.
- the computerized registration module 8 stores in memory the identity and the shape of each marker, and its position in the real space 1. The computerized registration module 8 determines the position of the filming camera 3 from the image acquired from the marker, the memory, and the respective positions of the optical sensor 11 and the filming camera 3.
- the second sensor system 10 comprises an optical field change sensor 12 as represented in FIG.
- This optical field change sensor 12 makes it possible to determine a movement of the filming camera 3.
- the optical field change sensor 12 may be for example an inertial sensor 15 such as an inertial cube or a gyroscope.
- the inertial sensor 15 is fixed to the filming camera 3 as represented in FIG. 2.
- the second sensor system 10 may comprise a plurality of optical field change sensors 12 used successively or simultaneously.
- the filming camera 3 is carried by a support mounted on a crane 52 carrying several joints and sliding on a rail 50 as shown in FIG. 2.
- the computerized registration module 8 can calculate the position of the filming camera from the information given by the mechanical encoders 16 for each degree of freedom, and from the configuration of the system (for example the length of the articulated arm, or the distance between the pivot point of the crane and the reference point of the filming camera).
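As an illustration of this encoder-based calculation, the sketch below converts encoder readings into a camera position using forward kinematics. A planar two-link arm sliding on a rail is assumed; the actual crane geometry, arm lengths, and encoder conventions are not specified in the text.

```python
import math

def camera_position(rail_x, shoulder_deg, elbow_deg, l1=2.0, l2=1.5):
    """Camera position from encoder readings and known arm lengths
    (planar two-link arm mounted on a rail along x)."""
    a1 = math.radians(shoulder_deg)       # shoulder joint angle
    a2 = a1 + math.radians(elbow_deg)     # absolute angle of the second link
    x = rail_x + l1 * math.cos(a1) + l2 * math.cos(a2)
    z = l1 * math.sin(a1) + l2 * math.sin(a2)
    return x, z

# first link straight up, second link horizontal
pos = camera_position(rail_x=3.0, shoulder_deg=90.0, elbow_deg=-90.0)  # ≈ (4.5, 2.0)
```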
- the data of the optical field change sensor 12', relating to a physical movement of the camera 3, are directly integrated into the input data of the computerized registration module 8 for locating the position of the filming camera 3.
- the computerized registration module 8 can receive location data from several optical field change sensors successively or simultaneously.
- the advantage of also working with a mechanical sensor is that, in a space without topographic markers such as a desert, the optical sensor has low efficiency.
- this information is sent to the input of the computerized registration module 8 and integrated into the procedure for identifying the position of the filming camera 3, as shown in Figure 4.
- These two sensor systems 9, 10 are therefore designed to be used alternately.
- the computerized registration module 8 may have several options to determine at any time the position in the real space 1 of the filming camera 3. For example, in the case where the computerized registration module 8 cannot locate sufficient topographic information 2 to determine with certainty the position in the real space 1 of the filming camera 3, it can by default consider that the filming camera 3 is stationary during this time. In reality, if the optical sensor 11 is not capable of detecting the topographic information 2, it is probably because the optical field 4 of the filming camera 3 is obstructed by a very close real object. At the next time frame in which the optical sensor 11 can detect sufficient topographic information 2, the position in the three-dimensional space of the filming camera 3 can again be determined.
- the optical field change sensor 12 can take over from the optical sensor 11 and provide information on the position of the filming camera 3.
- the video film shooting system comprises a computerized combination module 21 which makes it possible to switch from the first optical sensor system 9 to the second sensor system 10 or to combine the two registration systems simultaneously as shown in FIG. 4.
- the location data obtained with the first optical sensor system 9 in the optical mode, and the location data obtained with the second sensor system 10 in the mechanical mode, both via the computerized registration module 8, are integrated by the computerized combination module 21.
- the computerized combination module 21 comprises a computer 19. This computer receives as input, from the computerized registration module 8, the location data obtained in these two modes and determines their difference in the form of a result function 20.
- This result function 20 is compared, via the comparator 22 integrated in the computerized combination module 21, to a threshold value 23.
- the comparison function 24, which evaluates the difference between the data indicated by the optical and mechanical sensors, is generated by the comparator 22 and takes a value in a list of two values, each value being assigned to a respective sensor.
- the selector 25, also integrated in the computerized combination module 21, takes as input the comparison function 24 and outputs the selection signal 26 of the mode chosen from the optical mode and the mechanical mode. For example, if the location data from the two modes are very close, the optical mode may be preferred if it is known that, when both modes operate optimally, it gives better accuracy. If the location data from the two modes are very different, the mechanical mode may be preferred if it is known that it has a lower probability of giving a false result.
- the user can manually switch from the first optical sensor system 9 to the second sensor system 10, and vice versa.
- the first optical sensor system 9 comprises an evaluator 42 (shown in FIG. 5) adapted to evaluate the number of detectable points of the detected natural topographic information 2, and a reliability module 44 adapted to integrate the data of the evaluator 42 and to output a reliability coefficient 46 for the data recorded in optical mode.
- the computerized combination module 21 is adapted to select a combination of the optical mode and the mechanical mode and comprises a weighting device 48, as represented in FIG. 4, adapted to weight the location data coming from the optical sensor 11 and from the optical field change sensor 12 in the process of locating the filming camera 3.
- the weighting "a" can be determined either by choice of the user, or by processing the image obtained by the optical sensor 11, or by the difference between the two position data obtained (see the examples described above), or otherwise.
- the weighting "a" may be modified over time, either at will, or for each time frame, or for each shot, for example.
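The weighted combination performed by the weighting device 48 can be sketched as a per-axis linear blend of the two location estimates, with a = 1 giving pure optical mode and a = 0 pure mechanical mode. The linear blend and all numeric values are assumptions for illustration.

```python
def blend_locations(optical_pos, mechanical_pos, a):
    """Weighted combination of the optical and mechanical location estimates,
    with weighting a in [0, 1] applied to the optical data."""
    if not 0.0 <= a <= 1.0:
        raise ValueError("weighting 'a' must be in [0, 1]")
    return tuple(a * o + (1.0 - a) * m
                 for o, m in zip(optical_pos, mechanical_pos))

pos = blend_locations((1.0, 2.0, 0.5), (1.2, 2.0, 0.7), a=0.75)
# mostly trusts the optical estimate: approximately (1.05, 2.0, 0.55)
```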
- the computerized registration module 8, which receives and processes the sensor data, provides information on the location of the filming camera 3 to the computerized filming module 40, as shown in FIG. 7, to enable the position of the filming camera 3 to be followed throughout the shooting.
- the computerized registration module 8 communicates with the computerized filming module 40 with a cable or wirelessly.
- the system may also include an external mechanical encoder 17, as shown in FIG. 4, which records the data on the evolution of the internal optical acquisition parameters 18 of the camera 3, such as zoom, iris or focus.
- the computerized filming module 40 thus receives as input the data recorded by the filming camera 3 and by the computerized registration module 8.
- the computerized filming module 40 can also integrate the internal optical acquisition parameters 18. These internal optical acquisition parameters characterize the filming camera 3 as an optical sensor. They are available for a given optical configuration of the filming camera 3. They are for example provided in the form of metadata multiplexed with the video stream coming from the filming camera 3.
- the filming system also comprises a device 30 adapted to correct any deformation of the optical field, this device being adapted to integrate the camera data 3' and to output the camera data 3' to the computerized filming module 40.
- the computerized filming module 40 also comprises a computer animation module 27.
- This animation module may comprise an animation database 28 comprising one or more virtual animations 29.
- Each animation comprises, for example, for each of a set of time frames corresponding to all or part of the duration of the video film to be shot, characteristics of three-dimensional objects (point, line, surface, volume, texture, ...) expressed in a virtual reference frame.
- each animation represents an augmented virtual reality event.
- the computerized turning module 40 comprises a composition module 30.
- the composition module 30 imports an animation 29 of the animation module 27 along a link 30.
- the computerized composition module generates, for the time frame in question, a composite image 31 from the real image acquired by the filming camera 3 and a projection of a virtual image 32 corresponding to the virtual object for this same time frame, the projection being generated according to the location data of the filming camera 3 in the real reference frame 1'.
- the composite image 31 comprises the superposition of the real image and the virtual image 32, as if this virtual image 32 were the image of an object present in the real space 1, acquired for this time frame by the filming camera 3.
- the composite image 31 is then displayed on the control screen.
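The projection step used to build the composite image can be illustrated with a minimal pinhole model: a virtual 3D point from the animation is projected into image coordinates using the camera's location data. The rotation is omitted (camera assumed axis-aligned, looking along +Z) and all intrinsic parameters (focal length in pixels, principal point) are hypothetical.

```python
def project_point(point_world, camera_pos, focal_px, cx, cy):
    """Project a world point through an axis-aligned pinhole camera at
    camera_pos looking along +Z (rotation omitted to keep the sketch short)."""
    x = point_world[0] - camera_pos[0]
    y = point_world[1] - camera_pos[1]
    z = point_world[2] - camera_pos[2]
    if z <= 0:
        raise ValueError("point behind the camera")
    u = cx + focal_px * x / z
    v = cy + focal_px * y / z
    return u, v

# virtual object 2 m in front of the camera, slightly right of the optical axis
pixel = project_point((0.2, 0.0, 3.0), camera_pos=(0.0, 0.0, 1.0),
                      focal_px=1000.0, cx=960.0, cy=540.0)
```

The resulting pixel coordinates tell the composition module where to draw the virtual image 32 over the real image for that time frame.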
- the film operator can, for each time frame, visualize on the control screen 6 the position and orientation of the virtual object in real space 1, according to his own angle of view, as if this virtual object were present in front of him. He can thus adapt, if necessary, the position of the filming camera 3 with respect to the objects.
- missing sequences are reconstructed according to the sequences filmed just before and just after the instant of the missing sequence, and the exact position of the filming camera 3.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Electromagnetism (AREA)
- Remote Sensing (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1355510A FR3007175B1 (en) | 2013-06-13 | 2013-06-13 | TURNING CAMERA POSITIONING SYSTEMS FOR TURNING VIDEO FILMS |
PCT/FR2014/051423 WO2014199085A1 (en) | 2013-06-13 | 2014-06-12 | System for tracking the position of the shooting camera for shooting video films |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3008693A1 true EP3008693A1 (en) | 2016-04-20 |
Family
ID=49876721
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14736893.0A Withdrawn EP3008693A1 (en) | 2013-06-13 | 2014-06-12 | System for tracking the position of the shooting camera for shooting video films |
Country Status (8)
Country | Link |
---|---|
US (1) | US20160127617A1 (en) |
EP (1) | EP3008693A1 (en) |
KR (1) | KR20160031464A (en) |
CN (1) | CN105637558A (en) |
AU (1) | AU2014279956A1 (en) |
CA (1) | CA2914360A1 (en) |
FR (1) | FR3007175B1 (en) |
WO (1) | WO2014199085A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR3007175B1 (en) * | 2013-06-13 | 2016-12-09 | Solidanim | TURNING CAMERA POSITIONING SYSTEMS FOR TURNING VIDEO FILMS |
US10432915B2 (en) * | 2016-03-22 | 2019-10-01 | The Sanborn Map Company, Inc. | Systems, methods, and devices for generating three-dimensional models |
EP3529982B1 (en) * | 2017-01-31 | 2023-10-11 | Hewlett-Packard Development Company, L.P. | Video zoom controls based on received information |
CN108171749A (en) * | 2018-02-12 | 2018-06-15 | 中南大学湘雅二医院 | A kind of mechanical arm heat source tracking auxiliary system and its method based on gyroscope |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014199085A1 (en) * | 2013-06-13 | 2014-12-18 | Solidanim | System for tracking the position of the shooting camera for shooting video films |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6922632B2 (en) * | 2002-08-09 | 2005-07-26 | Intersense, Inc. | Tracking, auto-calibration, and map-building system |
US7231063B2 (en) * | 2002-08-09 | 2007-06-12 | Intersense, Inc. | Fiducial detection system |
US20100045701A1 (en) * | 2008-08-22 | 2010-02-25 | Cybernet Systems Corporation | Automatic mapping of augmented reality fiducials |
-
2013
- 2013-06-13 FR FR1355510A patent/FR3007175B1/en active Active
-
2014
- 2014-06-12 CA CA2914360A patent/CA2914360A1/en not_active Abandoned
- 2014-06-12 CN CN201480044654.2A patent/CN105637558A/en active Pending
- 2014-06-12 WO PCT/FR2014/051423 patent/WO2014199085A1/en active Application Filing
- 2014-06-12 KR KR1020157036849A patent/KR20160031464A/en not_active Withdrawn
- 2014-06-12 AU AU2014279956A patent/AU2014279956A1/en not_active Abandoned
- 2014-06-12 US US14/897,806 patent/US20160127617A1/en not_active Abandoned
- 2014-06-12 EP EP14736893.0A patent/EP3008693A1/en not_active Withdrawn
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014199085A1 (en) * | 2013-06-13 | 2014-12-18 | Solidanim | System for tracking the position of the shooting camera for shooting video films |
Non-Patent Citations (4)
Title |
---|
CHANDARIA J ET AL: "REAL-TIME CAMERA TRACKING IN THE MATRIS PROJECT", INTERNATIONAL BROADCASTING CONFERENCE 2006; 7-9-2006 - 11-9-2006; AMSTERDAM,, 7 September 2006 (2006-09-07), XP030081513 * |
J. CHANDARIA ET AL: "Realtime Camera Tracking in the MATRIS Project", SMPTE MOTION IMAGING JOURNAL, vol. 116, no. 7-8, 1 July 2007 (2007-07-01), US, pages 266 - 271, XP055495012, ISSN: 1545-0279, DOI: 10.5594/J11426 * |
JIGNA CHANDARIA ET AL: "The MATRIS project: real-time markerless camera tracking for Augmented Reality and broadcast applications", JOURNAL OF REAL-TIME IMAGE PROCESSING, vol. 2, no. 2-3, 18 October 2007 (2007-10-18), DE, pages 69 - 79, XP055344364, ISSN: 1861-8200, DOI: 10.1007/s11554-007-0043-z * |
See also references of WO2014199085A1 * |
Also Published As
Publication number | Publication date |
---|---|
KR20160031464A (en) | 2016-03-22 |
CN105637558A (en) | 2016-06-01 |
AU2014279956A1 (en) | 2015-12-24 |
FR3007175A1 (en) | 2014-12-19 |
US20160127617A1 (en) | 2016-05-05 |
WO2014199085A1 (en) | 2014-12-18 |
FR3007175B1 (en) | 2016-12-09 |
CA2914360A1 (en) | 2014-12-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP0661672B1 (en) | Picture processing process and device for building a target image from a source image with perspective change | |
WO2013088076A1 (en) | System for filming a video movie | |
FR2675977A1 (en) | METHOD FOR MODELING A VIEWING SYSTEM AND METHOD AND SYSTEM FOR REALIZING COMBINATIONS OF REAL IMAGES AND SYNTHESIS IMAGES. | |
EP3005296B1 (en) | Merging of a plurality of video flows | |
EP2385405B1 (en) | Panomaric projection device and method implemented by said device | |
WO2008099092A2 (en) | Device and method for watching real-time augmented reality | |
FR2836215A1 (en) | SYSTEM AND METHOD FOR THREE-DIMENSIONAL MODELING AND RENDERING OF AN OBJECT | |
FR2821167A1 (en) | PHOTOGRAPHIC SUPPORT DEVICE | |
FR3004565A1 (en) | FUSION OF SEVERAL VIDEO STREAMS | |
CN105165000A (en) | Panoramic-imaging digital camera, and panoramic imaging system | |
JP2013027021A (en) | Omnidirectional imaging device and omnidirectional imaging method | |
EP3008693A1 (en) | System for tracking the position of the shooting camera for shooting video films | |
FR2858692A1 (en) | SYSTEM FOR VISUALIZATION OF IMAGES IN THREE DIMENSIONS WITH A RENDER IN RELIEF OVER 36O DEGREES | |
JP5248951B2 (en) | CAMERA DEVICE, IMAGE SHOOTING SUPPORT DEVICE, IMAGE SHOOTING SUPPORT METHOD, AND IMAGE SHOOTING SUPPORT PROGRAM | |
FR3027144A1 (en) | METHOD AND DEVICE FOR DETERMINING MOVEMENT BETWEEN SUCCESSIVE VIDEO IMAGES | |
FR2821156A1 (en) | METHOD AND DEVICE FOR OBTAINING A DIGITAL PANORAMIC IMAGE WITH CONSTANT TINT | |
EP3473000B1 (en) | Image capture method and system using a virtual sensor | |
FR2910648A1 (en) | Object's e.g. building, geometrical data capturing method for e.g. online sale application, involves capturing images representing object, from two different view points, and measuring distance between one of view points and point of object | |
JP2011244184A (en) | Image input apparatus, image input method and image input program | |
KR20190112407A (en) | Operating method for Holoportation contents | |
JP2008152374A (en) | Image system, photographing direction specifying device, photographing direction specifying method and program | |
JP7518695B2 (en) | Shooting metadata recording device and program | |
US20240251157A1 (en) | Imaging apparatus | |
FR3057430A1 (en) | DEVICE FOR IMMERSION IN A REPRESENTATION OF AN ENVIRONMENT RESULTING FROM A SET OF IMAGES | |
FR3048786A1 (en) | DYNAMIC ADJUSTMENT OF THE SHARPNESS OF AT LEAST ONE IMAGE PROJECTED ON AN OBJECT |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
TPAC | Observations filed by third parties |
Free format text: ORIGINAL CODE: EPIDOSNTIPA |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20151209 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20171006 |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: SZLAPKA, JEAN-FRANCOIS Inventor name: LINOT, ROBERT-EMMANUEL Inventor name: PARTOUCHE, ISAAC |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20181211 |