WO2013088076A1 - System for filming a video movie - Google Patents
System for filming a video movie (Système de tournage de film vidéo)
- Publication number
- WO2013088076A1 (PCT/FR2012/052916)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B31/00—Arrangements for the associated working of recording or reproducing apparatus with related apparatus
- G11B31/006—Arrangements for the associated working of recording or reproducing apparatus with related apparatus with video camera or receiver
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/87—Regeneration of colour television signals
- H04N9/8715—Regeneration of colour television signals involving the mixing of the reproduced video signal with a non-recorded signal, e.g. a text signal
Definitions
- the present invention relates to video film shooting systems.
- a classic method for building a video film with such an augmented reality sequence is to start by shooting actors in a calibrated neutral environment, for example in a studio with a monochrome background. A few weeks or months later, in post-production, three-dimensional animations are added, which give the illusion of interacting with the filmed actor.
- the present invention is intended to overcome these disadvantages.
- a system for shooting a video film in a real space defined in a real reference frame, comprising:
- a filming camera adapted to record a real image for each of a plurality of distinct time frames
- a location tracking system in space comprising:
- at least one sensor whose location with respect to the filming camera is known for each time frame, and which is adapted to transmit the natural topographic information it detects to a computerized tracking module
- a computerized tracking module adapted to determine, for each time frame, location data of the filming camera in the real reference frame, from the sensor location data and from a comparison between the natural topographic information and a predetermined three-dimensional model of the real space
- a computerized composition module adapted to generate on the control screen, for each time frame, a composite image of the real image and of a projection of a virtual image, taken from a database of virtual animations, the projection being generated according to the location data of the filming camera in the real reference frame.
- the display on the control screen is generated almost instantly, for example within one second, taking into account the processing time and latency introduced by the various components of the system.
- the tracking system comprises a computerized generation module adapted to generate said predetermined three-dimensional model of the real space, and said sensor is adapted to transmit the topographic information it detects to the computerized generation module;
- the sensor is adapted to transmit the natural topographic information it detects simultaneously to the computerized tracking module and to the computerized generation module, and the computerized generation module is adapted to enrich said predetermined three-dimensional model of the real space from the natural topographic information detected by the sensor;
- the topographic information includes information relating to real space geometric objects selected from points, lines, areas and volumes;
- the filming camera and the sensor are fixedly attached to each other;
- the system further comprises a location system comprising a target pattern adapted to be detected simultaneously by the filming camera and the sensor in a location configuration, and a computerized location module adapted to determine the respective location data of the sensor and of the filming camera from the simultaneous detection of the target pattern;
- the system further comprises an optical calibration system comprising an optical calibration pattern adapted to be detected by the filming camera in an optical calibration configuration, and the computerized tracking module is adapted to determine, for each time frame, the location data of the filming camera in the real reference frame further using optical calibration data of the filming camera determined by the optical calibration system;
- the system further comprises at least one of the following:
- an inertial sensor fixed on the filming camera, adapted to determine a movement of the filming camera,
- the computerized tracking module being adapted to determine the location data of the filming camera in the real reference frame further using data provided by the inertial sensor;
- the computerized composition module being adapted to scale a virtual image to the real space on the basis of an image of a scale reference acquired by the filming camera;
- the computerized composition module being adapted to generate the composite image taking said parameter into account;
- the system further comprises a computerized animation module comprising a database of virtual animations, each animation comprising, for each of a set of time frames, a three-dimensional image expressed in a virtual reference frame, the computerized animation module being adapted to transmit said three-dimensional images to the composition module;
- the computerized tracking module is adapted to transmit the predetermined three-dimensional model of the real space to the computerized animation module;
- the computerized composition module is adapted to generate on the control screen, for each time frame, a shadow of the virtual image, the shadow being generated according to the location data of the filming camera in the real reference frame and to location data of the lighting in the real reference frame;
- the computerized tracking module is adapted to determine location data of the filming camera in the real reference frame also from its location data for a previous time frame;
- the computerized tracking module comprises a selection module adapted to select, from among the geometric patterns of the three-dimensional model, those used to find the position of the filming camera in three-dimensional space;
- the selection module compares geometric patterns of a subsequent image with geometric patterns of a previous image, associates the geometric patterns present in both images and immobile in the real space, and does not retain the other geometric patterns for comparison with the three-dimensional model;
- the tracking system further comprises a second sensor having at least one characteristic different from the first sensor, chosen from among position, orientation, solid angle of view, acquisition frequency, optical axis and optical field.
- FIG. 1 is a schematic view of a real space
- FIG. 2 is a schematic view of a filming system according to one embodiment of the invention.
- FIG. 3 is a schematic view showing a use of the system of FIG. 2 in a learning configuration
- FIG. 4 is a perspective view of a three-dimensional model of real space
- FIG. 5 is a view similar to FIG. 2 in a location configuration
- FIG. 5a is a view similar to FIG. 5 for scaling
- FIG. 6a is a schematic view of the system in a filming configuration, at a first instant
- FIG. 6b is a representative diagram of an acquisition made by the filming camera at the instant shown in FIG. 6a,
- FIG. 6c is a schematic view of a composite image produced on the control screen at the same time
- FIGS. 7a, 7b and 7c respectively correspond to FIGS. 6a, 6b and 6c for a second instant
- FIG. 8 is a schematic view of a screen of a programmable machine comprising an animation computer module
- FIG. 9 is a flowchart of a method for producing a video film using the objects described above.
- Figure 10 is a schematic view of an acquisition system according to an alternative embodiment.
- Figure 1 schematically shows a part 1 of the real space.
- Figure 1 provides a very specific example of a real space 1.
- the present invention could be applied in a very large number of different real spaces.
- A real reference frame 2 is attached to the real space 1 and comprises, for example, an origin O and three orthonormal axes X, Y and Z. Thus, each point of the real space 1 has a unique set of coordinates in the real reference frame 2.
- the real space 1 is an outdoor exterior space, comprising a horizontal road 3 extending substantially along the Y axis and a building 4 located further back.
- Building 4 may include various windows 5a, 5b, 5c and doors 6a, 6b, 6c and the like.
- a sidewalk 7 extends, for example, between the road 3 and the building 4. A parked car 8 may, for example, be noted.
- Real space 1 has a number of items of natural topographic information. This information is, for example, related to geometric objects of the real space, such as points, lines, surfaces and/or volumes. As lines, one can for example consider the edges of a structure, and as points the intersections of two such edges. As surfaces, one may for example consider solid surfaces, such as a car hood. As volumes, one may for example refer to objects, such as a car or another object present in the real space.
- the natural topographic information is distinguished from added calibration markers by the fact that it belongs to the real space itself and does not have to be installed there for the shoot.
- a video film shooting system is shown in a filming configuration.
- the video film is a series of images broadcast at a high rate (several images per second, for example 24 (cinema), 25 (PAL) or 30 (NTSC) images per second) to a viewer.
- This series of images is for example projected or broadcast as part of a film, a TV movie, an informative message, a video game, or other.
- this broadcast or projection can be delayed in time with respect to the shooting.
- This sequence of images relates an event taking place in real space 1.
- a filming camera 9 of any type suitable for filming such a scene is used.
- a digital camera that can acquire several images per second, for example 24 images per second, is used.
- the camera 9 comprises optics 10 that can acquire images in an optical field 11, and is connected to a computer system 12. This connection is made, for example, by a suitable cable 13 or wirelessly, for example by radio transmission or otherwise.
- the filming camera 9 is of any suitable known type, but the invention is particularly relevant when the optical field 11 can be varied during shooting.
- the optical field 11 can be varied by moving the filming camera 9 in the real space 1. This is particularly the case if the filming camera 9 is movable, in a guided manner, in the real space 1, for example by being mounted on a rail or on a crane having an articulated arm (not shown) defining a locus of possible positions for the filming camera 9.
- a filming camera 9 is used which is sufficiently compact to be movable in the real space 1 simply by an operator (not shown).
- the filming camera 9 comprises a monitor 14 mounted on the camera housing and having a control screen 15 visible to the camera operator, on which the optical field 11 acquired by the camera is displayed.
- the filming system also comprises a location tracking system comprising, on the one hand, a sensor 16 and, on the other hand, a computerized tracking module 17 of the computer system 12, connected to the sensor 16 by a cable 18 or wirelessly, as indicated previously.
- the sensor 16 has the particularity of having a known location at any time with respect to the filming camera 9.
- location here means that the position and the orientation of the sensor 16 with respect to the filming camera 9 are known at any moment. What matters in particular are the relative positions and orientations of the acquisition systems of the sensor 16 and of the camera 9 (the CCD matrix for the latter). This can, for example, be achieved simply by fixing the sensor 16 rigidly to the filming camera 9, for example by means of a flange 19 or any other suitable mechanical system.
- the sensor 16 is characterized in particular by an acquisition field 20. It is possible, for example, to place the sensor 16 so that no part of the filming camera 9 blocks any portion of the acquisition field 20, and no part of the sensor 16 obstructs any part of the optical field 11, as illustrated.
- the sensor 16 is adapted to acquire information relating to the real space 1, so as to be able to determine the position of the sensor 16 in the real space, using the computerized registration module 17.
- in a filming configuration, location data in the real space 1 is acquired with the sensor 16, and the computerized tracking module 17 can determine, for an acquisition made by the sensor 16 and with the aid of a predefined three-dimensional model 21 of the real space, the position of the sensor 16 in the real space.
- the tracking module 17 determines the most probable location of the sensor 16 in the real space, namely the one that best matches the data acquired by the sensor 16 with the predefined three-dimensional model 21 of the real space.
- the tracking module 17 can thus determine the location data of the filming camera in the real reference frame.
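As a rough illustration of this matching step, the following sketch estimates the sensor pose from correspondences between 2D detections in the sensor image and 3D points of the predefined model 21. It assumes an OpenCV-based implementation; the function and variable names are hypothetical, and the patent does not prescribe this particular algorithm.

```python
import numpy as np
import cv2

def locate_sensor(model_points_3d, detected_points_2d, camera_matrix, dist_coeffs):
    """Estimate the sensor pose in the real reference frame from 2D-3D
    correspondences (a classical perspective-n-point problem)."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(model_points_3d, dtype=np.float64),     # points of model 21 (X, Y, Z)
        np.asarray(detected_points_2d, dtype=np.float64),  # their detections in the sensor image
        camera_matrix, dist_coeffs)
    if not ok:
        return None  # not enough information; see the fallback behaviour described later
    R, _ = cv2.Rodrigues(rvec)        # rotation mapping real-frame points to the sensor frame
    position = (-R.T @ tvec).ravel()  # sensor origin expressed in the real reference frame
    return position, R
```

The known rigid offset between the sensor 16 and the filming camera 9 (for example via the flange 19) would then be applied to this pose to obtain the location data of the filming camera itself.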
- the filming camera 9 can thus be dedicated to its own task, which is to film, and the sensor 16 to its own task, which is to locate.
- the sensor 16 is an optical sensor. If it is intended to fix the sensor 16 to the filming camera 9, the sensor 16 may in particular be an optical camera of small size, in particular with a bulk at least two times smaller than that of the filming camera 9. Thus, the inconvenience for the operator will be minimal.
- an optical camera specifically dedicated to obtaining the position of the filming camera 9 in real space.
- an optical camera having an acquisition frequency that is at least an integer multiple of that of the filming camera 9, for example on the order of 100 images per second, thus making it possible to smooth by calculation the position of the filming camera 9 in the real space for each time frame (see the sketch after this list).
- an optical camera having an optical field (solid angle of view) greater than the optical field 11 of the filming camera 9, in order to maximize the information acquired from the real space 1 that can be used to calculate the positioning of the filming camera.
- a wide-angle ("fisheye") lens having an acquisition angle greater than 160 degrees.
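Purely to illustrate that smoothing, the sketch below assumes the witness camera runs at an integer multiple of the film rate and that its positions are already expressed in the real reference frame; names are hypothetical.

```python
import numpy as np

def smooth_positions(sensor_positions, ratio=4):
    """Average each group of `ratio` high-rate sensor positions (e.g. ~100 Hz)
    down to one smoothed position per film time frame (e.g. 25 Hz).
    Orientations would need a proper rotation average (e.g. on quaternions)."""
    p = np.asarray(sensor_positions, dtype=np.float64)
    n_frames = len(p) // ratio
    return p[:n_frames * ratio].reshape(n_frames, ratio, 3).mean(axis=1)
```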
- the predefined three-dimensional model of the real space comprises, for example, natural topographical information of the real space 1.
- This model is, for example, obtained by any appropriate means. However, as shown in FIGS. 3 and 4, certain elements of the system that has just been described can, for example, be used to generate the pre-established three-dimensional model of the real space.
- the three-dimensional model 21 is established beforehand. This step is, for example, performed shortly before the shooting, so that the real space, at the time of shooting, corresponds to the pre-established model.
- a learning sensor 22 is moved into the real space 1.
- the learning sensor 22 transmits to the computer system 12, by any appropriate means, information acquired by the learning sensor 22.
- the computer system 12 comprises a computerized generation module 23 which, receiving information from the learning sensor 22 according to different angles of view, is capable of determining the three-dimensional model 21 (up to a scale factor).
- the generation module 23 is able to determine the three-dimensional position of a set of geometric objects of the real space, as shown in FIG. 4.
- the three-dimensional model 21, here displayed projected in another perspective on a computer screen, consists of a set of geometric patterns (here points). These points can be represented in any orientation, as in FIG. 4, a perspective view of the real space.
- the three-dimensional model 21 could also consist of a set of other geometric objects, such as straight or curved lines, planar or non-planar surfaces, or volumes, which are determined either by the generation module 23 itself, or with the assistance of an operator of the generation module, the operator indicating to the generation module that a set of geometric objects belongs to the same line/surface/volume.
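One plausible way for such a generation module to obtain 3D positions (up to a scale factor) is classical two-view triangulation of features matched between learning-sensor images. The sketch below assumes OpenCV, known intrinsics K, known 3x4 poses [R|t] for the two viewpoints, and already-matched Nx2 float point arrays; all names are hypothetical.

```python
import numpy as np
import cv2

def triangulate_model_points(K, pose_a, pose_b, pts_a, pts_b):
    """Triangulate 2D points matched between two learning-sensor views into
    3D points of model 21 (defined up to a global scale factor)."""
    P_a = K @ pose_a  # 3x4 projection matrix of the first viewpoint
    P_b = K @ pose_b  # 3x4 projection matrix of the second viewpoint
    pts4d = cv2.triangulatePoints(P_a, P_b, pts_a.T, pts_b.T)  # homogeneous, 4xN
    return (pts4d[:3] / pts4d[3]).T  # Euclidean Nx3 point coordinates
```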
- the same sensor 16 as in the filming configuration can be used as the learning sensor.
- the same algorithm can be used both to determine the three-dimensional position of a geometric object of the real space in the learning configuration, and to determine the position in the real space of the tracking camera 16 from the positions in the real space of geometric objects determined with this same camera.
- the learning mode continues during filming.
- the predetermined three-dimensional model 21 can, if necessary, be determined up to a scale factor.
- the system can be used in a location configuration in order to determine, before shooting, the respective location data of the filming camera 9 and of the tracking sensor 16.
- the sensor 16 is rigidly attached to the filming camera 9.
- a computerized location module 26 is adapted to determine their relative positions from images of the same target pattern 27 acquired by the two devices.
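One plausible realisation of such a location module is standard extrinsic calibration from several simultaneous views of a checkerboard-type target. The sketch below assumes OpenCV, intrinsics calibrated beforehand, and corner detections already extracted for both devices; names are hypothetical.

```python
import cv2

def relative_pose(object_pts, film_pts, sensor_pts,
                  K_film, d_film, K_sensor, d_sensor, image_size):
    """Estimate the fixed rotation R and translation T taking points from the
    filming-camera frame to the sensor frame, from simultaneous views of
    the target pattern 27."""
    _, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
        object_pts, film_pts, sensor_pts,
        K_film, d_film, K_sensor, d_sensor, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)  # keep the pre-calibrated intrinsics fixed
    return R, T
```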
- the computer system 12 also includes a computer animation module 28.
- This animation module 28 may for example include an animation database 29 comprising one or more virtual animations.
- Each animation comprises, for example, for each of a set of time frames corresponding to all or part of the duration of the video film to be shot, characteristics of three-dimensional objects (points, lines, surfaces, volumes, textures, etc.) expressed in a virtual reference frame U, V, W 30.
- Each animation represents for example an augmented virtual reality event.
- FIG. 2 shows, for a given time frame, a virtual object 31, characterized by data expressed in the virtual space, identified by the virtual reference frame U, V, W.
- as a very simple illustrative example, a vertical column with a square base, fixed in time, is used here; in practice, the animation could be, for example, a walking lion or the like.
- the computer system 12 includes a composition module 32.
- the composition module 32 imports an animation from the animation module 28 over a link 33. If the animation is not already expressed in the real reference frame 2, the composition module 32 mathematically links the virtual reference frame U, V, W and the real reference frame X, Y, Z by a suitable transformation matrix (an example is described later).
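Such a transformation matrix can, for instance, be a 4x4 homogeneous matrix combining a rotation, a translation and, where needed, the scale factor. The numpy sketch below is purely illustrative; the patent does not specify this exact parameterization.

```python
import numpy as np

def virtual_to_real(R, t, scale=1.0):
    """Build the 4x4 matrix mapping homogeneous virtual coordinates
    (u, v, w, 1) to real coordinates (x, y, z, 1)."""
    M = np.eye(4)
    M[:3, :3] = scale * np.asarray(R, dtype=np.float64)  # orientation (and scale) of the virtual axes
    M[:3, 3] = t                                         # virtual origin expressed in the real frame
    return M

# e.g. place virtual object 31 two metres along X, without rotation:
M = virtual_to_real(np.eye(3), [2.0, 0.0, 0.0])
vertex_real = M @ np.array([0.5, 0.0, 1.0, 1.0])  # one vertex of object 31
```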
- the computerized composition module 32 generates, for the time frame in question, a composite image of the real image acquired by the filming camera 9 and of a projection of a virtual image corresponding to the virtual object 31 for this same time frame, the projection being generated according to the location data of the filming camera 9 in the real reference frame.
- the composite image comprises the superimposition of the real image and of the virtual image, as if this virtual image were the image of an object present in the real space, acquired for this time frame by the filming camera 9.
- the composite image is then displayed on the control screen 15.
- the operator, while filming, can thus, for each time frame, visualize on his control screen the position and the orientation of the virtual object in the real space, according to his own angle of view, as if this virtual object were present in front of him. He can thus adapt, if necessary, the position of the filming camera with respect to the objects.
- the computer system 12 also comprises a control screen 15' of a monitor 14' allowing the director, or anyone interested, to view the composite image in real time from the viewing angle of the filming camera.
- FIGS. 6a to 6c correspond to a first instant, in which an operator, not shown, films a portion 34 of the real space corresponding to the lower rear part of the car 8.
- the image 35 acquired by the filming camera 9 for this moment can be seen in Figure 6b.
- the position of the filming camera 9 for this time frame is determined by the tracking system.
- the composite image 36 generated on the control screens 15, 15' comprises the superposition of the real image and of the virtual object 31 seen according to the acquisition angle of the camera 9. To do this, as explained above, knowing the positions in the real space of the filming camera 9 and of the virtual object 31 at this given instant, a projection of this object into the image 35 can be calculated.
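That projection step could look like the following sketch, which assumes the vertices of the virtual object are already expressed in the real reference frame and that the camera pose (rvec, tvec) and intrinsics K for the time frame come from the tracking module; OpenCV-style, hypothetical names.

```python
import numpy as np
import cv2

def project_virtual_object(vertices_real, rvec, tvec, K, dist_coeffs):
    """Project the 3D vertices of virtual object 31 into real image 35,
    using the filming-camera pose determined for this time frame."""
    pts2d, _ = cv2.projectPoints(
        np.asarray(vertices_real, dtype=np.float64), rvec, tvec, K, dist_coeffs)
    return pts2d.reshape(-1, 2)  # pixel coordinates to superimpose on image 35
```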
- FIGS. 7a to 7c show a subsequent time frame (directly subsequent), and are explained with reference to Figs. 6a to 6c.
- the events shown in FIGS. 7a to 7c occur approximately 1/24 of a second after those of the preceding figures.
- the viewing angle of the filming camera 9 has changed, so that the filming camera 9 now points more towards the top of the car 8.
- the imaged portion 34' is also shown in FIG. 7a.
- the real image acquired by the filming camera 9 is represented by reference 35' in FIG. 7b.
- FIG. 7c represents the composite image 36' corresponding to the superposition of the real image 35' and of the virtual object 31, expressed as a function of the location of the filming camera 9 for this time frame.
- the virtual object 31 may be identical in both time frames; its projected representation for the two time frames differs because of the difference in viewing angle. However, in the case of an animation, the virtual object 31 may also be slightly different for the two time frames.
- the above steps can be repeated in real time for each time frame of the shoot and, where appropriate, for several filming cameras.
- the tracking image 37 acquired by the sensor 16 may correspond to a larger volume of the real space
- the computerized tracking module is adapted to extract natural topographic information from this tracking image 37 and to determine the position of the filming camera 9 in the real reference frame 2, as explained above, from this detected natural topographic information and the three-dimensional model 21.
- optical markers fixed in the real space 1 need not be used, for great simplicity of use. Only natural topographic information is then used, which avoids cluttering the filming space with artificial markers.
- the system described here is also compatible with artificial markers.
- the computerized tracking module may have several options to determine, at any time, the position in the real space of the filming camera 9. For example, in the case where the computerized tracking module fails to locate enough topographic information to determine with certainty the position of the filming camera 9 in the real space, it can by default consider that the filming camera 9 is stationary during this time. In reality, when the two devices 9 and 16 are very close to each other, as in the embodiment presented, if the sensor 16 is not able to determine the topographic information, it is probably because the optical field of the filming camera 9 is blocked by a very close real object. At the next time frame at which the sensor 16 can determine enough information to determine the position of the filming camera 9 in three-dimensional space, a composite image can again be generated for that position.
- the computerized tracking module comprises a selection module adapted to select the geometric patterns of the three-dimensional model that can be used to find the position of the filming camera in three-dimensional space.
- the geometric patterns that may be in the field of the sensor 16 are selected, for example by means of an approximate knowledge of the position of the sensor from a previous time frame.
- if the set of identified geometric patterns is too different from the three-dimensional model, these patterns are not taken into account for determining the position of the filming camera.
- the geometric patterns present on the two images and immobile in the real space are associated in pairs.
- Other geometric patterns are considered moving in real space and are not preserved for comparison with the three-dimensional model.
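A plausible sketch of this selection between consecutive tracking images uses descriptor matching plus RANSAC rejection of features that do not move like the static scene; the OpenCV calls below are one possible implementation, not necessarily the patent's, and the names are hypothetical.

```python
import numpy as np
import cv2

def keep_static_patterns(img_prev, img_next):
    """Match features between two consecutive tracking images and keep only
    those consistent with a single rigid camera motion; the others are
    treated as moving objects and discarded."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_next, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    p1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    p2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    # RANSAC on the epipolar geometry: inliers behave like the immobile scene
    _, inlier_mask = cv2.findFundamentalMat(p1, p2, cv2.FM_RANSAC, 3.0, 0.99)
    keep = inlier_mask.ravel().astype(bool)
    return p2[keep]  # static patterns, retained for comparison with model 21
```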
- an inertial sensor 38 may be provided, adapted to provide the computerized tracking module with additional information on the position of the filming camera 9.
- the inertial sensor 38 is attached to the filming camera 9, or to the sensor 16 if it is attached to the filming camera 9.
- a transformation matrix between the filming camera and the sensor is associated with each magnification level. In the filming configuration, the information from the encoder is used to select the appropriate transformation matrix.
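For illustration, the per-magnification calibration could be stored as a simple lookup table keyed by the encoder reading; the identity matrices below are placeholders for the actual calibrated transforms, and all names are hypothetical.

```python
import numpy as np

# Hypothetical calibration table: encoder reading -> 4x4 camera-to-sensor
# transform, each entry obtained by repeating the location step of FIG. 5
# at that magnification (identity placeholders here).
CALIBRATED = {0: np.eye(4), 512: np.eye(4), 1023: np.eye(4)}

def passage_matrix(encoder_value):
    """Select the transformation matrix calibrated for the zoom level
    closest to the current reading of encoder 43."""
    nearest = min(CALIBRATED, key=lambda e: abs(e - encoder_value))
    return CALIBRATED[nearest]
```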
- the composition module 32 can also be adapted to generate a projected shadow of the virtual object 31 in the real space 1.
- FIG. 6a shows artificial (as shown) or natural lighting 39 whose position in the real reference frame 2 is known.
- the real image 35 comprises, in addition to an image of the object 8, an image 40 of its real shadow.
- the predetermined three-dimensional model may comprise surface information onto which the shadow 41 of the virtual object 31 will be projected, viewed from the shooting angle of the filming camera 9.
- the shadows of the virtual objects are calculated by taking into account the position in the real space of the virtual object 31, the position in the real space of a surface onto which the shadow of the virtual object 31 is projected, the position of the filming camera, and the position of the lights. Real and virtual shadows are also visible in FIG. 7c.
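For a point light and a horizontal receiving surface, this amounts to intersecting each light-to-vertex ray with the surface plane and then projecting the result into the image exactly like the object itself. A minimal sketch under those assumptions (Z taken as the vertical axis; names hypothetical):

```python
import numpy as np

def shadow_on_ground(vertices_real, light_pos, ground_z=0.0):
    """Project the vertices of virtual object 31 onto the plane z = ground_z
    along rays cast from the light 39 (modelled as a point light)."""
    L = np.asarray(light_pos, dtype=np.float64)
    shadow_pts = []
    for P in np.asarray(vertices_real, dtype=np.float64):
        t = (ground_z - L[2]) / (P[2] - L[2])  # ray parameter where the ray meets the plane
        shadow_pts.append(L + t * (P - L))     # shadow point on the receiving surface
    return np.array(shadow_pts)  # then projected into the image like the object itself
```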
- the system that has just been described is of particular interest when the animation moves with respect to the optical field of the filming camera 9.
- for example, a fixed shot of a motionless real space will be filmed, on which an animation whose form changes over time will be generated.
- Another example is to move the filming camera 9 in the real space 1, incorporating a moving animation or, if necessary, a still one, to verify that it is framed as desired during acquisition.
- it is also possible for the system to include a means of taking into account a change in the focal length of the optics 10 of the filming camera 9.
- the zoom 42 carried by the camera 9 can include an encoder 43 for detecting the degree of rotation of the magnification ring 42, so that the computerized tracking module 17 takes into account the magnification level determined from the data transmitted by the encoder 43. This can be done, for example, by repeating the locating step of FIG. 5 for a plurality of different magnifications of the optics 10 of the filming camera 9.
- the virtual object 31 is expressed directly in the real reference frame 2 so as to be directly viewable from the angle of view of the filming camera 9.
- it can be provided to couple the three-dimensional model generation module with the animation module 28.
- the link 33, which is described in connection with FIG. 2 for exporting animations from the animation module 28 to the composition module 32, can also be used in the other direction, to transmit to the animation module 28 the three-dimensional model 21 established for the real space.
- as shown in FIG. 8, the superposition of the virtual object 31 obtained from the animation database and of the three-dimensional model 21 can be represented on the screen 44 of the computerized animation module 28.
- the animation module 28 may include an application comprising a set of tools, represented on the screen 44 by icons 45, for pre-defining the animation.
- FIG. 8 shows thick arrows representing orders of movement or resizing of the virtual object 31 in the virtual space U, V, W.
- the system that has just been described makes it possible, if necessary, to retouch the animation directly at the time of shooting, in the real space, after acquisition by the system in a learning configuration, which further increases the interaction between the real world and the virtual world.
- the filming system may, in one embodiment, be used as follows.
- in a first step 101, the system is used in an optical calibration configuration of the filming camera 9, to determine the possible optical aberrations of the filming camera.
- This preliminary step is, for example, carried out using a test pattern, and the collected information can be used thereafter by the computer system 12 to computationally correct the acquisitions of the filming camera 9.
- in a step 102, the system is used in a learning configuration, in which a learning sensor is moved through the real space to generate a three-dimensional model of the real space. This three-dimensional model is also scaled.
- an animation is provided from a database of virtual animations, the animation being intended to cooperate with the real space to be filmed.
- the system is then used in the filming configuration, and a composite image of the real image obtained by the filming camera 9 and of a projection of a virtual image is generated on a control screen available at the filming location, the projection being generated, for the same time frame, on the real image according to the location data of the filming camera 9 in the real reference frame.
- in a step 106, if the director considers, taking into account the composite images generated, that the take is satisfactory (arrow O), the shooting of the video film ends (step 107).
- If the determination step 106 shows that the take is not satisfactory (arrow N), advantage can be taken of having all the actors and operators on site to film the scene again (return to step 105). If necessary, the animation may be modified during this step, as described above in relation to FIG. 8.
- the computerized systems described above can be made up of one or a plurality of programmable machines communicating with each other via networks, allowing, if necessary, the remote importation of the animations from a remote animation database 29.
- Computer objects such as keyboard, screen, mouse, processor, cables, etc. may be of a classically known type.
- the animation taken from the animation database corresponds to a simplified animation intended to be present in the final video film. Then, a few weeks later, in a post-production stage, the final animation can be prepared from the initial animation used during the shooting and from the acquired film.
- The simplified animation comprises a smaller amount of data (for example at least two times less) than the final animation.
- in the embodiments described above, the sensor 16 and the filming camera 9 have overlapping optical fields and/or acquisition axes relatively close to parallel.
- the sensor 16 is also called the witness camera
- the tracking system comprises a second sensor 16' having at least one characteristic different from the first sensor 16, chosen for example from among position, orientation, solid angle of view, acquisition frequency, optical axis and optical field.
- a second sensor 16' can be oriented towards the ceiling, and a third sensor 16'' can be oriented laterally.
- Each sensor 16, 16' and 16'' transmits to the computerized tracking module the natural topographic information that it detects.
- the computerized tracking module 17 determines the location data of the filming camera 9 in the real reference frame from the location data of the sensors 16, 16', 16'' (taken together or separately), and from a comparison between the natural topographic information and the predetermined three-dimensional model of the real space.
Priority Applications (13)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MX2014007075A MX341801B (es) | 2011-12-13 | 2012-12-13 | Sistema para la filmacion de una pelicula de video. |
US14/364,844 US9648271B2 (en) | 2011-12-13 | 2012-12-13 | System for filming a video movie |
NZ624929A NZ624929B2 (en) | 2011-12-13 | 2012-12-13 | System for filming a video movie |
JP2014546616A JP2015506030A (ja) | 2011-12-13 | 2012-12-13 | ビデオムービーを撮影するためのシステム |
EP12813918.5A EP2791914A1 (fr) | 2011-12-13 | 2012-12-13 | Système de tournage de film vidéo |
BR112014014173A BR112014014173A2 (pt) | 2011-12-13 | 2012-12-13 | sistema de filmagem de filme de vídeo |
AU2012351392A AU2012351392A1 (en) | 2011-12-13 | 2012-12-13 | System for filming a video movie |
CN201280062324.7A CN104094318A (zh) | 2011-12-13 | 2012-12-13 | 适用于拍摄视频电影的系统 |
KR1020147016197A KR20140100525A (ko) | 2011-12-13 | 2012-12-13 | 비디오 영화를 촬영하기 위한 시스템 |
CA2856464A CA2856464A1 (fr) | 2011-12-13 | 2012-12-13 | Systeme de tournage de film video |
IL232766A IL232766A0 (en) | 2011-12-13 | 2014-05-22 | Video recording system |
HK15101977.8A HK1201625A1 (en) | 2011-12-13 | 2015-02-27 | System for filming a video movie |
US14/790,099 US9756277B2 (en) | 2011-12-13 | 2015-07-02 | System for filming a video movie |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1161535A FR2984057B1 (fr) | 2011-12-13 | 2011-12-13 | Systeme de tournage de film video |
FR1161535 | 2011-12-13 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/364,844 A-371-Of-International US9648271B2 (en) | 2011-12-13 | 2012-12-13 | System for filming a video movie |
US14/790,099 Continuation US9756277B2 (en) | 2011-12-13 | 2015-07-02 | System for filming a video movie |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013088076A1 true WO2013088076A1 (fr) | 2013-06-20 |
Family
ID=47557364
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FR2012/052916 WO2013088076A1 (fr) | 2011-12-13 | 2012-12-13 | Système de tournage de film vidéo |
Country Status (13)
Country | Link |
---|---|
US (2) | US9648271B2 (fr) |
EP (1) | EP2791914A1 (fr) |
JP (1) | JP2015506030A (fr) |
KR (1) | KR20140100525A (fr) |
CN (1) | CN104094318A (fr) |
AU (1) | AU2012351392A1 (fr) |
BR (1) | BR112014014173A2 (fr) |
CA (1) | CA2856464A1 (fr) |
FR (1) | FR2984057B1 (fr) |
HK (1) | HK1201625A1 (fr) |
IL (1) | IL232766A0 (fr) |
MX (1) | MX341801B (fr) |
WO (1) | WO2013088076A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013167901A1 (fr) * | 2012-05-09 | 2013-11-14 | Ncam Technologies Limited | Système de mélange ou de composition en temps réel, objets tridimensionnels (3d) générés par ordinateur et source vidéo provenant d'une caméra cinématographique |
WO2022045897A1 (fr) * | 2020-08-28 | 2022-03-03 | Weta Digital Limited | Étalonnage de capture de mouvement utilisant des drones multi-caméras |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR3034940B1 (fr) * | 2015-04-10 | 2018-04-13 | Solidanim | Systeme et procede de tournage de film video, et environnement utilise |
GB2569267A (en) * | 2017-10-13 | 2019-06-19 | Mo Sys Engineering Ltd | Lighting integration |
WO2019080047A1 (fr) | 2017-10-26 | 2019-05-02 | 腾讯科技(深圳)有限公司 | Procédé de mise en œuvre d'image de réalité augmentée, dispositif, terminal et support de stockage |
DE102018122435A1 (de) * | 2018-09-13 | 2020-03-19 | Hendrik Fehlis | Virtuelle dreidimensionale Objekte in einem Livevideo |
WO2020082286A1 (fr) * | 2018-10-25 | 2020-04-30 | 郑卜元 | Système de capture et de surveillance d'image en temps réel de réalité virtuelle, et procédé de commande |
FR3093215B1 (fr) * | 2019-02-22 | 2021-08-27 | Fogale Nanotech | Procédé et dispositif de surveillance de l’environnement d’un robot |
US10890918B2 (en) | 2019-04-24 | 2021-01-12 | Innovation First, Inc. | Performance arena for robots with position location system |
CN110446020A (zh) * | 2019-08-03 | 2019-11-12 | 魏越 | 沉浸式堪景方法、装置、存储介质及设备 |
KR102770795B1 (ko) * | 2019-09-09 | 2025-02-21 | 삼성전자주식회사 | 3d 렌더링 방법 및 장치 |
JP2021149671A (ja) * | 2020-03-19 | 2021-09-27 | 富士フイルム株式会社 | 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラム |
WO2022019692A1 (fr) * | 2020-07-22 | 2022-01-27 | (주) 애니펜 | Procédé, système et support d'enregistrement lisible par ordinateur non transitoire pour créer une animation |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070248283A1 (en) * | 2006-04-21 | 2007-10-25 | Mack Newton E | Method and apparatus for a wide area virtual scene preview system |
GB2465791A (en) * | 2008-11-28 | 2010-06-02 | Sony Corp | Rendering shadows in augmented reality scenes |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5991085A (en) | 1995-04-21 | 1999-11-23 | I-O Display Systems Llc | Head-mounted personal visual display apparatus with image generator and holder |
JP3558104B2 (ja) | 1996-08-05 | 2004-08-25 | ソニー株式会社 | 3次元仮想物体表示装置および方法 |
JP4878083B2 (ja) | 2001-03-13 | 2012-02-15 | キヤノン株式会社 | 画像合成装置及び方法、プログラム |
JP4136420B2 (ja) | 2002-03-29 | 2008-08-20 | キヤノン株式会社 | 情報処理方法および装置 |
US7138963B2 (en) | 2002-07-18 | 2006-11-21 | Metamersion, Llc | Method for automatically tracking objects in augmented reality |
AU2003264048A1 (en) | 2002-08-09 | 2004-02-25 | Intersense, Inc. | Motion tracking system and method |
US7231063B2 (en) | 2002-08-09 | 2007-06-12 | Intersense, Inc. | Fiducial detection system |
JP4235522B2 (ja) | 2003-09-25 | 2009-03-11 | キヤノン株式会社 | 画像処理装置、画像処理方法、およびプログラム |
DE10347738B4 (de) | 2003-10-14 | 2012-01-26 | Siemens Ag | Motorisch verstellbares Röntgengerät und Verfahren zu dessen Verstellung |
GB2411532B (en) | 2004-02-11 | 2010-04-28 | British Broadcasting Corp | Position determination |
DE102004027270A1 (de) | 2004-06-03 | 2005-12-29 | Siemens Ag | System und Verfahren zur Bestimmung einer Position, insbesondere für Augmented-Reality Anwendungen |
JP4227561B2 (ja) | 2004-06-03 | 2009-02-18 | キヤノン株式会社 | 画像処理方法、画像処理装置 |
JP4810295B2 (ja) | 2006-05-02 | 2011-11-09 | キヤノン株式会社 | 情報処理装置及びその制御方法、画像処理装置、プログラム、記憶媒体 |
JP4863790B2 (ja) | 2006-07-03 | 2012-01-25 | 三菱プレシジョン株式会社 | 3次元コンピュータグラフィックス合成方法及び装置 |
US8023726B2 (en) | 2006-11-10 | 2011-09-20 | University Of Maryland | Method and system for markerless motion capture using multiple cameras |
JP4689639B2 (ja) | 2007-04-25 | 2011-05-25 | キヤノン株式会社 | 画像処理システム |
GB2452546B (en) | 2007-09-07 | 2012-03-21 | Sony Corp | Video processing system and method |
US7999862B2 (en) | 2007-10-24 | 2011-08-16 | Lightcraft Technology, Llc | Method and apparatus for an automated background lighting compensation system |
US20100045701A1 (en) | 2008-08-22 | 2010-02-25 | Cybernet Systems Corporation | Automatic mapping of augmented reality fiducials |
GB2466714B (en) | 2008-12-31 | 2015-02-11 | Lucasfilm Entertainment Co Ltd | Visual and physical motion sensing for three-dimentional motion capture |
WO2011014340A2 (fr) | 2009-07-31 | 2011-02-03 | Lightcraft Technology, Llc | Procédés et systèmes d'étalonnage dun objectif réglable |
US20110234631A1 (en) | 2010-03-25 | 2011-09-29 | Bizmodeline Co., Ltd. | Augmented reality systems |
KR101335391B1 (ko) | 2010-04-12 | 2013-12-03 | 한국전자통신연구원 | 영상 합성 장치 및 그 방법 |
US9699438B2 (en) | 2010-07-02 | 2017-07-04 | Disney Enterprises, Inc. | 3D graphic insertion for live action stereoscopic video |
US9529426B2 (en) | 2012-02-08 | 2016-12-27 | Microsoft Technology Licensing, Llc | Head pose tracking using a depth camera |
GB201208088D0 (en) | 2012-05-09 | 2012-06-20 | Ncam Sollutions Ltd | Ncam |
2011
- 2011-12-13 FR FR1161535A patent/FR2984057B1/fr active Active

2012
- 2012-12-13 MX MX2014007075A patent/MX341801B/es active IP Right Grant
- 2012-12-13 WO PCT/FR2012/052916 patent/WO2013088076A1/fr active Application Filing
- 2012-12-13 EP EP12813918.5A patent/EP2791914A1/fr not_active Ceased
- 2012-12-13 KR KR1020147016197A patent/KR20140100525A/ko not_active Withdrawn
- 2012-12-13 BR BR112014014173A patent/BR112014014173A2/pt not_active IP Right Cessation
- 2012-12-13 CA CA2856464A patent/CA2856464A1/fr not_active Abandoned
- 2012-12-13 AU AU2012351392A patent/AU2012351392A1/en not_active Abandoned
- 2012-12-13 US US14/364,844 patent/US9648271B2/en active Active
- 2012-12-13 CN CN201280062324.7A patent/CN104094318A/zh active Pending
- 2012-12-13 JP JP2014546616A patent/JP2015506030A/ja active Pending

2014
- 2014-05-22 IL IL232766A patent/IL232766A0/en unknown

2015
- 2015-02-27 HK HK15101977.8A patent/HK1201625A1/xx unknown
- 2015-07-02 US US14/790,099 patent/US9756277B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070248283A1 (en) * | 2006-04-21 | 2007-10-25 | Mack Newton E | Method and apparatus for a wide area virtual scene preview system |
GB2465791A (en) * | 2008-11-28 | 2010-06-02 | Sony Corp | Rendering shadows in augmented reality scenes |
Non-Patent Citations (5)
Title |
---|
CREATIVE COW: "Global GPS Tracking added to Previzion VFX System", 25 July 2011 (2011-07-25), pages 1 - 2, XP002677630, Retrieved from the Internet <URL:http://news.creativecow.net/story/866741> [retrieved on 20120612] * |
GALVIN, BOB: "Feature: Going Hollywood", February 2011 (2011-02-01), pages 1 - 4, XP002677644, Retrieved from the Internet <URL:http://www.profsurv.com/magazine/article.aspx?i=70875> [retrieved on 20120613] * |
LIGHTCRAFT TECHNOLOGY: "PREVIZION User Manual", 2 February 2011 (2011-02-02), pages 1 - 210, XP002677631, Retrieved from the Internet <URL:http://ebookbrowse.com/gdoc.php?id=245009125&url=a50fd273f10032925ea1d37a3191e5b7> [retrieved on 20110612] * |
See also references of EP2791914A1 * |
THE AMERICAN SOCIETY OF CINEMATOGRAPHERS: "New Products and Services: Lightcraft offers Free Photogrammetry Tools", August 2011 (2011-08-01), pages 1 - 12, XP002677629, Retrieved from the Internet <URL:http://www.theasc.com/new_products/August2011/index.php#prod1423> [retrieved on 20120612] * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013167901A1 (fr) * | 2012-05-09 | 2013-11-14 | Ncam Technologies Limited | Système de mélange ou de composition en temps réel, objets tridimensionnels (3d) générés par ordinateur et source vidéo provenant d'une caméra cinématographique |
EP2847991A1 (fr) * | 2012-05-09 | 2015-03-18 | Ncam Technologies Limited | Système de mélange ou de composition en temps réel, objets tridimensionnels (3d) générés par ordinateur et source vidéo provenant d'une caméra cinématographique |
EP2847991B1 (fr) * | 2012-05-09 | 2023-05-03 | Ncam Technologies Limited | Système de mélange ou de composition en temps réel, objets tridimensionnels (3d) générés par ordinateur et source vidéo provenant d'une caméra cinématographique |
US11721076B2 (en) | 2012-05-09 | 2023-08-08 | Ncam Technologies Limited | System for mixing or compositing in real-time, computer generated 3D objects and a video feed from a film camera |
US12217375B2 (en) | 2012-05-09 | 2025-02-04 | Ncam Technologies Limited | System for mixing or compositing in real-time, computer generated 3D objects and a video feed from a film camera |
WO2022045897A1 (fr) * | 2020-08-28 | 2022-03-03 | Weta Digital Limited | Étalonnage de capture de mouvement utilisant des drones multi-caméras |
Also Published As
Publication number | Publication date |
---|---|
US9648271B2 (en) | 2017-05-09 |
MX341801B (es) | 2016-09-01 |
FR2984057B1 (fr) | 2014-01-03 |
US9756277B2 (en) | 2017-09-05 |
EP2791914A1 (fr) | 2014-10-22 |
IL232766A0 (en) | 2014-07-31 |
JP2015506030A (ja) | 2015-02-26 |
CN104094318A (zh) | 2014-10-08 |
BR112014014173A2 (pt) | 2017-06-13 |
US20140369661A1 (en) | 2014-12-18 |
KR20140100525A (ko) | 2014-08-14 |
MX2014007075A (es) | 2015-03-06 |
FR2984057A1 (fr) | 2013-06-14 |
NZ624929A (en) | 2016-01-29 |
CA2856464A1 (fr) | 2013-06-20 |
HK1201625A1 (en) | 2015-09-04 |
AU2012351392A1 (en) | 2014-05-29 |
US20150358508A1 (en) | 2015-12-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2791914A1 (fr) | Système de tournage de film vidéo | |
EP0511101B1 (fr) | Procédé de modélisation d'un système de prise de vues et procédé et système de réalisation de combinaisons d'images réelles et d'images de synthèse | |
EP2715662B1 (fr) | Procede de localisation d'une camera et de reconstruction 3d dans un environnement partiellement connu | |
EP2385405B9 (fr) | Dispositif de projection panoramique, et procédé mis en oeuvre dans ce dispositif | |
JP2014014076A (ja) | 3d深度情報に基づいたイメージぼかし | |
WO2008099092A2 (fr) | Dispositif et procede d'observation de realite augmentee temps reel | |
WO2002059692A1 (fr) | Camera combinant les parties les mieux focalisees de differentes expositions d'une image | |
EP0940979A1 (fr) | Procédé et dispositif de remplacement de panneaux cibles dans une séquence vidéo | |
FR2911708A1 (fr) | Procede et dispositif de creation d'au moins deux images cles correspondant a un objet tridimensionnel. | |
FR2801123A1 (fr) | Procede de creation automatique de maquette numerique a partir de couples d'images stereoscopiques | |
CA2914360A1 (fr) | Systemes de reperage de la position de la camera de tournage pour le tournage de films video | |
EP1702472B1 (fr) | Procede et systeme de determination du deplacement d un pixe l, et support d enregistrement pour la mise en oeuvre du pro cede | |
FR3034940A1 (fr) | Systeme et procede de tournage de film video, et environnement utilise | |
WO2018229358A1 (fr) | Procédé et dispositif de construction d'une image tridimensionnelle | |
CA3057337A1 (fr) | Procede de texturation d'un modele 3d | |
FR3052287B1 (fr) | Construction d'une image tridimensionnelle | |
FR3057430A1 (fr) | Dispositif d'immersion dans une representation d'un environnement resultant d'un ensemble d'images | |
Nilsson et al. | The ARC 3D webservice | |
FR2931611A1 (fr) | Procede de modelisation 3d de scenes reelles et dynamiques | |
Bigley | Matching changing live action lights An Experiment Into the use of Video Based Lighting | |
FR2964203A1 (fr) | Acquisition d'image par synthese progressive | |
WO2012093209A1 (fr) | Procédé et dispositif d'aide à la prise de vue d'une photo numérique au moyen d'un objectif grand angle | |
FR2962871A3 (fr) | Acquisition d'image par synthese progressive |
Legal Events
Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12813918; Country of ref document: EP; Kind code of ref document: A1
ENP | Entry into the national phase | Ref document number: 2856464; Country of ref document: CA
WWE | Wipo information: entry into national phase | Ref document number: 232766; Country of ref document: IL
ENP | Entry into the national phase | Ref document number: 2012351392; Country of ref document: AU; Date of ref document: 20121213; Kind code of ref document: A
ENP | Entry into the national phase | Ref document number: 2014546616; Country of ref document: JP; Kind code of ref document: A
WWE | Wipo information: entry into national phase | Ref document number: MX/A/2014/007075; Country of ref document: MX
ENP | Entry into the national phase | Ref document number: 20147016197; Country of ref document: KR; Kind code of ref document: A
NENP | Non-entry into the national phase | Ref country code: DE
REEP | Request for entry into the european phase | Ref document number: 2012813918; Country of ref document: EP
WWE | Wipo information: entry into national phase | Ref document number: 2012813918; Country of ref document: EP
ENP | Entry into the national phase | Ref document number: 2014128584; Country of ref document: RU; Kind code of ref document: A
WWE | Wipo information: entry into national phase | Ref document number: 14364844; Country of ref document: US
REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112014014173
ENP | Entry into the national phase | Ref document number: 112014014173; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20140611