CN119325704A - Calibration system and method for dynamic projection mapping - Google Patents
Calibration system and method for dynamic projection mapping
- Publication number
- CN119325704A (application CN202380045516.5A)
- Authority
- CN
- China
- Prior art keywords
- emitters
- light
- calibration tool
- projector
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/315—Modulator illumination systems
- H04N9/3152—Modulator illumination systems for shaping the light beam
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
Abstract
A calibration tool for a dynamic projection mapping system includes a rigid body, at least three light emitters arranged in a row on the rigid body, and an additional light emitter arranged on the rigid body and offset from the at least three light emitters in the row. The calibration tool also includes a sensor arranged on the rigid body and configured to detect the projected light.
Description
Cross reference to related applications
The present application claims priority to and the benefit of U.S. Provisional Application No. 63/350,301, entitled "CALIBRATION SYSTEMS AND METHODS FOR DYNAMIC PROJECTION MAPPING," filed June 8, 2022, which is incorporated herein by reference in its entirety for all purposes.
Background
This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present technology, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
Amusement parks and other entertainment venues contain, among many other attractions, animated figures that provide entertainment to guests. Some animated figures may be animated through projection mapping, which traditionally involves projecting a predetermined appearance onto the animated figure. For example, the animated figure may be visually supplemented with a set of pre-made or fixed images that coincide with preprogrammed movements of the animated figure. The animated figure may have an internally positioned projector that projects images through a translucent projection surface of the animated figure; however, an internally positioned projector may generate an unrealistic backlight or glow on the translucent projection surface of the animated figure. It is presently recognized that it is desirable to make animated figures appear more lifelike and to enable the animated figures to blend with their surroundings in a realistic, convincing manner.
Disclosure of Invention
The following summarizes certain embodiments commensurate in scope with the originally claimed subject matter. These embodiments are not intended to limit the scope of the present disclosure, but rather, they are intended to provide a brief summary of certain disclosed embodiments. Indeed, the present disclosure may encompass a variety of forms that may be similar to or different from the embodiments described below.
In one embodiment, a calibration tool for a dynamic projection mapping system includes a rigid body, a row of at least three light emitters disposed on the rigid body, and an additional light emitter disposed on the rigid body and offset from the row of at least three light emitters. The calibration tool also includes a sensor disposed on the rigid body and configured to detect the projected light.
In one embodiment, a dynamic projection mapping system includes a projector configured to project visible light. The dynamic projection mapping system also includes a calibration tool having a plurality of emitters configured to emit infrared light and a sensor configured to detect visible light projected by the projector. The dynamic projection mapping system further includes a plurality of tracking cameras configured to generate image data indicative of infrared light emitted by the plurality of emitters. The dynamic projection mapping system further includes processing circuitry configured to establish a common origin for the projector and the plurality of tracking cameras based on the sensor data received from the sensor and the image data received from the plurality of tracking cameras.
In one embodiment, a method of operating a projection system and an optical tracking system for dynamic projection mapping includes instructing, via a processing circuit, a set of emitters of a calibration tool to emit light in an environment. The method also includes receiving, from a plurality of tracking cameras and at the processing circuit, image data indicative of a respective location of each of the set of emitters in the environment. The method further includes instructing, via the processing circuit, a projector to project visible light into the environment. The method further includes receiving, from a sensor of the calibration tool and at the processing circuit, sensor data indicative of the visible light detected by the sensor. The method further includes establishing, via the processing circuit, a common origin in the environment for the plurality of tracking cameras and the projector based on the image data and the sensor data.
Drawings
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
FIG. 1 is a schematic diagram of an embodiment of a media system including a motion tracking system and a projection system according to an embodiment of the present disclosure;
FIG. 2 is a block diagram of an embodiment of the media system of FIG. 1, according to an embodiment of the present disclosure;
FIG. 3 is a front view of human-like facial features projected onto a head portion of an animated figure using the media system of FIG. 1, in accordance with an embodiment of the present disclosure;
FIG. 4 is a front view of animal-like facial features projected onto a head portion of an animated figure using the media system of FIG. 1, in accordance with an embodiment of the present disclosure;
FIG. 5 is a perspective view of a performance scene of an attraction that may utilize the media system of FIG. 1, with a calibration tool moving through the performance scene to calibrate tracking cameras of the motion tracking system to one another, in accordance with an embodiment of the present disclosure;
FIG. 6 is a perspective view of the performance scene of the attraction of FIG. 5, with the calibration tool positioned within the performance scene to establish an origin for calibrating the tracking cameras of the motion tracking system and a projector of the projection system to the performance scene, in accordance with an embodiment of the present disclosure;
FIG. 7 is a front view of a calibration tool that may be used in the performance scene of the attraction of FIG. 5; and
FIG. 8 is a flowchart of a method of operating the media system of FIG. 1 to complete a calibration process, in accordance with an embodiment of the present disclosure.
Detailed Description
One or more specific embodiments of the present disclosure will be described below. In an effort to provide a concise description of these embodiments, all features of an actual implementation may not be described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
When introducing elements of various embodiments of the present disclosure, the articles "a," "an," and "the" are intended to mean that there are one or more of the elements. The terms "comprising," "including," and "having" are intended to be inclusive and mean that there may be additional elements other than the listed elements. Furthermore, it should be appreciated that references to "one embodiment" or "an embodiment" of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features.
The present embodiments are directed to media systems for attractions in an entertainment environment (e.g., theme park, amusement park, theater, stadium, concert hall). The media system may include a projector for projecting an image onto an outer surface of a prop, such as an animated character. As discussed herein, the media system utilizes external tracking of the animated character (e.g., via optical performance capture or optical motion capture) to dynamically generate and accurately project images onto the exterior surface of the animated character.
In more detail, to support accurate projection of images onto the animated figure, the animated figure may be equipped with trackers so that tracking cameras of a motion tracking system of the media control system can recognize movements, positions, and orientations of the animated figure in real time. The media control system may operate independently of the animated figure (e.g., independently of position, velocity, and/or acceleration information from sensors or actuators of the animated figure), and the media control system may dynamically generate and map projected images onto the animated figure at a realistic frame rate that simulates a living character, such as by rendering textures, colors, and/or movements that appear to be part of the animated figure itself. As will be appreciated, the media control system may generate and update a skeletal model of the animated figure based on feedback from the tracking cameras. The skeletal model generally represents movable portions of the animated figure and is dynamically updated to represent a current three-dimensional (3D) position (e.g., including x, y, and z coordinates), orientation, and scale of the animated figure or a portion thereof (e.g., a pose of the animated figure). Thus, the media control system utilizes the skeletal model to generate projected images that are precisely tailored to the current position and orientation of the animated figure. As discussed herein, a calibration process may be performed to calibrate the tracking cameras of the motion tracking system and the projector of the projection system to the show scene (e.g., show space), and thus to each other (e.g., to share a common origin and coordinate system).
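For illustration, the rigid pose of the figure (or of one movable portion of it) can be recovered from tracked marker positions with a standard least-squares fit. The following is a minimal sketch of that step, assuming a single rigid set of markers with known positions in the figure's local frame; the disclosure does not prescribe a particular algorithm, and the function and variable names are illustrative only.

```python
# A minimal sketch (not the disclosed implementation) of updating a rigid pose
# from tracked marker positions using the Kabsch algorithm.
import numpy as np

def estimate_pose(model_markers: np.ndarray, tracked_markers: np.ndarray):
    """Return rotation R and translation t mapping model marker positions
    (Nx3, in the figure's local frame) onto tracked positions (Nx3, in the
    show-scene frame established during calibration)."""
    model_centroid = model_markers.mean(axis=0)
    tracked_centroid = tracked_markers.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_markers - model_centroid).T @ (tracked_markers - tracked_centroid)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tracked_centroid - R @ model_centroid
    return R, t  # pose used to place the skeletal model for rendering
```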
While certain examples presented herein mention animated figures for ease of discussion, it should be appreciated that the term is intended to broadly encompass any prop that can be moved in an attraction and/or projected onto via the projection system. In general, it should be appreciated that the techniques disclosed herein are applicable to projection onto any prop (e.g., object; structure; show action equipment [SAE]). For example, the prop may be a fully animated robotic figure. As another example, the prop may be one or more objects (e.g., a building, stick, sword, or other object that is simpler than a fully animated robotic figure) that are moved around via a complex SAE, a fully animated robotic figure, and/or a participant (e.g., a human participant or actor). Indeed, the prop may be a participant (e.g., a human participant or actor). Further, regardless of its structure, a prop may or may not represent a character (e.g., a human-like character, an animal-like character) or an object (e.g., an inanimate object, such as a building, furniture, or water).
With the above in mind, FIG. 1 illustrates a media system 8 (e.g., a dynamic projection mapping system) of an attraction 10 that includes a prop, which may be referred to herein as an animated figure 12, that receives an image 14 (e.g., projected content) from a projector 16 (e.g., an external projector, a lens-bearing optical projector) of a projection system of a media control system 20. As shown, the attraction 10 is a performance scene having a stage ceiling 22, a stage floor 24, and scenery objects 26 disposed between the stage ceiling 22 and the stage floor 24. The performance scene may also include any suitable stage lighting devices 30, such as the illustrated lighting instrument or device. From a guest area 32 of the attraction 10, multiple guests 34 may view the animated figure 12 and/or interact with the animated figure 12. Although shown within a stage-type environment, it should be appreciated that the media system 8 may be used to provide entertainment for the guests 34 in any entertainment environment, such as a dark ride, an outdoor arena, an area near the ride path of a ride vehicle carrying the guests 34, and so forth.
Notably, projector 16 is located outside of animated figure 12, thereby enabling the use of a closed volume within animated figure 12 to accommodate components other than projector 16, such as certain actuation systems. In the illustrated embodiment, the projector 16 is positioned in front of the animated figure 12 and is obscured from view by the guest 34 by the overhang 36 of the stage ceiling 22. Regardless of the position of projector 16, projector 16 directs image 14 onto an outer surface 40 of a body 42 (e.g., structure) of animated figure 12 that may correspond to a head portion 44 of animated figure 12. Thus, the media control system 20 may impart a realistic and engaging texture to the head portion 44, thereby providing an immersive and interactive experience for the guest 34.
The animated figure 12 is part of a motion control system 50 (e.g., prop control system) that is operable independently of the media control system 20. For example, the motion control system 50 may utilize interactive data to dynamically update the animated figure 12. It should be appreciated that the motion control system 50 may instruct actuators to adjust the animated figure 12 and/or adjust the position of any other suitable component of the attraction 10 that the guests 34 may see. For example, the motion control system 50 may control an actuatable motion device 66 (e.g., an actuatable motion base) physically coupled to the animated figure 12. The actuatable motion device 66 may be any suitable motion-generating component that can move (e.g., translate, rotate) the animated figure 12 laterally, longitudinally, and/or vertically. Further, it should be appreciated that the actuatable motion device 66 may be or include a suspension system and/or flight system coupled to the animated figure 12 from above the stage floor 24.
A tracker 60 (e.g., a trackable marker) may be positioned on the animated figure 12. The tracker 60 may be positioned on the rear surface 62 of the animated figure 12 or on any suitable surface. The tracker 60 enables one or more tracking cameras 64 of a motion tracking system of the media control system 20 to sense or resolve the position and orientation of the animated figure 12 within the attraction 10, such as via optical performance capture or optical motion capture techniques. Thus, as will be appreciated, projector 16 may project image 14 onto animated figure 12 in synchronization with the actual current position and orientation (e.g., pose) of animated figure 12, independent of position, velocity, and/or acceleration information from sensors or actuators of animated figure 12. However, it should be appreciated that in some embodiments, media control system 20 may verify the positioning and operation of projector 16 based on sensor-derived information and/or actuator-derived information from animated figure 12.
It should be appreciated that the media system 8 may include any suitable number of projectors 16, trackers 60, and tracking cameras 64. For example, more than one animated figure 12 may be included within a single attraction 10, and the media system 8 may include at least one projector 16 for each animated figure 12. However, it is presently recognized that this particular infrastructure of the media system 8 enables any number of animated figures 12 that move within the optical range of at least one tracking camera 64 and within the projection cone of at least one projector 16 to receive the images 14. In an embodiment, multiple projectors 16 may be provided to deliver content to multiple sides of a single animated figure 12. Additionally, certain embodiments of the animated figure 12 may include at least two trackers 60 to enable the one or more tracking cameras 64 to resolve the relative positioning of the at least two trackers 60 to effectively track the animated figure 12, although it should be appreciated that a change in position of a single tracker 60 may also enable the position of the animated figure 12 to be resolved in a less complex and/or less accurate system.
In an embodiment, the projector 16 and the tracking cameras 64 may be physically coupled to each other. For example, the projector 16 and the tracking cameras 64 may be rigidly mounted to a frame (e.g., a rigid frame) to form a unified system such that the projector 16 and the tracking cameras 64 remain in fixed positions relative to each other. In addition, the frame may be rigidly mounted to the stage floor 24 or another stationary surface of the performance scene. Thus, during operation of the attraction 10, the frame may prevent (e.g., reduce or eliminate) drift between the projector 16 and the tracking cameras 64, as well as drift of the projector 16 and the tracking cameras 64 relative to the show scene.
Regardless of how the projector 16 and the tracking cameras 64 are positioned within the attraction 10, a calibration process is performed to establish a relationship between the projector 16 and the tracking cameras 64 to enable the projector 16 to project the image 14 onto the animated figure 12 tracked via the tracking cameras 64. The calibration process may occur prior to operation of the attraction 10. For example, the calibration process may be performed before the start of each week, before the amusement park opens each day, before each cycle of the attraction 10, or any combination thereof. In an embodiment, the calibration process may occur (e.g., be triggered) after various events, such as in response to detecting an offset between the projected image and the animated figure (e.g., a sensor/imaging device detects an offset and this triggers the media system 8 to recalibrate; an operator visually observes the offset and provides an input to instruct the media system 8 to recalibrate).
FIG. 1 also shows an example of an interactive data source 70 that includes guest sensors 72. The guest sensors 72 may collect guest inputs from any guest 34 within the guest area 32. As recognized herein, guest input is a form of interactive data that may be used to adaptively update the animated figure 12 or the attraction 10. The motion control system 50 may generate a response to be performed by the animated figure 12 based on the interactive data and then instruct the actuators of the animated figure 12 to perform the response.
In an embodiment, the animated figure 12 is covered with trackers 60 (e.g., visible or invisible; active or passive; retroreflective markers or active emitters). These discrete points on the animated figure 12 may be used directly as visual reference points, based on which a two-dimensional (2D) or three-dimensional (3D) pose estimation process is performed. These discrete points can also be identified and fed through machine learning algorithms, compared to known ground-truth surface poses, and pose-matched in real time.
Fig. 2 is a block diagram of a media system 8 having a media control system 20, the media control system 20 being operable to project images from outside onto an animated figure 12 (e.g., without being communicatively coupled to a motion control system 50 or relying solely on the motion control system 50). In embodiments, media control system 20 may not send communication signals directly to motion control system 50 or receive communication signals from motion control system 50. However, as described below, the interactive data source 70 may be communicatively coupled upstream of both the media control system 20 and the motion control system 50 to enable coordination of the media control system 20 and the motion control system 50 without requiring intercommunication between the control systems 20, 50. A network device 90, such as a switch or hub, may be communicatively coupled directly downstream of the interactive data source 70 to facilitate efficient communication between the interactive data source 70 and the control systems 20, 50. However, it should be understood that the network device 90 may be omitted, that a plurality of network devices 90 may be implemented, or that any other suitable data management device may be used to facilitate the transfer of data from the interactive data source 70 to the control system 20, 50.
In the illustrated embodiment, the animated figure 12 includes a character processor 100 and a character memory 104 that may collectively form all or part of a character controller 102 of the motion control system 50. The trackers 60 are disposed on the body 42 of the animated figure 12 to enable the tracking cameras 64 of the motion tracking system of the media control system 20 to sense the position and orientation, or pose, of the animated figure 12. The trackers 60 may be active devices that each emit a separate signal to the tracking cameras 64. For example, the trackers 60 may emit infrared light, electromagnetic energy, or any other suitable signal that can be detected by the tracking cameras 64 (and, at least in some cases, not by the guests 34). Alternatively, the trackers 60 may be passive devices (e.g., reflectors, painted portions) that do not emit a signal but that enable the tracking cameras 64 to accurately distinguish the passive devices from the animated figure 12 and/or other portions of the attraction 10.
In addition, the animated figure 12 is equipped with any suitable actuators 106 that enable the animated figure 12 to move (e.g., walk, translate, rotate, pivot, lip sync) in a realistic and lifelike manner. The interactive data source 70 may include any suitable data source that provides a time-varying data set as interactive data 109. For example, the guest sensors 72 may sense guest interactions and relay interactive data indicative of the guest interactions to the character controller 102. The character controller 102 may then instruct the actuators 106 to dynamically manipulate the animated figure 12 in response to the interactive data 109.
The media control system 20 may include the projector 16, the tracking cameras 64, a camera network device 110, and/or a media controller 112. The media controller 112 is communicatively coupled to the interactive data source 70 (e.g., via the network device 90) so that the media controller 112 can dynamically react to the interactive data 109 and/or other changes in the attraction 10. In an embodiment, the media control system 20 may be communicatively isolated from the motion control system 50. That is, the motion control system 50 may be independent of the media control system 20. This arrangement provides operational freedom for the animated figure 12 to adaptively respond to the interactive data 109 in substantially real time (e.g., within microseconds or milliseconds of an interaction), while the media control system 20 monitors or tracks movement of the animated figure 12 to project images thereon in substantially real time. Thus, while the motion control system 50 executes a figure feedback loop, the media control system 20 concurrently executes a media feedback loop to modify the images projected onto the animated figure 12.
The media control system 20 utilizes the tracking camera 64 to gather information about the current position and orientation of the animated figure 12. The type or configuration of tracking camera 64 may be selected individually to correspond to and detect the type of tracker 60. The positioning of the tracker 60 in combination with the geometric model or skeletal model of the animated figure 12 facilitates coordination of projections onto the animated figure 12 in different orientations.
The tracking camera 64 is communicatively coupled to a camera network device 110, which camera network device 110 relays signals to the media controller 112 indicating the current three-dimensional position (e.g., including x, y, and z coordinates relative to the origin) and orientation of the animated figure 12 or portion thereof (e.g., pose of the animated figure 12). Thus, the camera network device 110 is a network switch or sensor hub that integrates multiple information streams from the tracking camera 64 for efficient processing by the media controller 112. The media controller 112 includes a media processor 114 and a media memory 116 that operate together to determine, generate and/or adjust dynamic images to be projected onto the animated figure 12 in a current position and orientation. Media controller 112 may then instruct projector 16 to project the dynamic image onto animated character 12. The image may be rendered entirely on demand based on the current pose (e.g., position and orientation) of the animated figure 12. In a less complex configuration, the image may be generated by adapting a pre-recorded video stream to the current pose of the animated figure 12. The media controller 112 may be any suitable media generator or game engine having powerful processing power and reduced latency. It should be appreciated that the media controller 112 is therefore capable of generating images to be projected onto the animated figure 12 in substantially real-time based on data received from the tracking camera 64. In practice, the media controller 112 may maintain a skeletal model or algorithm that represents the animated figure 12 and its actuatable portions (e.g., chin, limbs, joints). Based on this data, the media controller 112 may update the skeletal model to represent the actual current position and orientation of the animated figure 12 and then generate an image to be projected onto the animated figure 12 having the current position and orientation.
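As a schematic illustration of the media feedback loop described above (not code from the disclosure), the per-frame flow can be summarized as follows; tracking_stream, skeletal_model, render_for_pose, and projector are hypothetical placeholders.

```python
# A schematic per-frame loop (assumed structure) tying the pieces together:
# read tracked marker positions, update the figure's pose, render the image
# for that pose, and hand the frame to the projector.
def media_feedback_loop(tracking_stream, skeletal_model, projector):
    for marker_frame in tracking_stream:                  # data relayed by the camera hub
        pose = skeletal_model.update_pose(marker_frame)   # current position/orientation
        frame = render_for_pose(pose)                     # image tailored to the current pose
        projector.show(frame)                             # projected onto the animated figure
```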
The projector 16 may include a projector processor 120 and a projector memory 122 to facilitate rendering images onto the animated figure 12. The projector processor 120 generally receives data indicative of the image from the media controller 112 and then directs the light sources within the projector 16 to output the image through the lens. The media controller 112 may determine a current silhouette or shape of the target portion of the animated figure 12 that is to receive the projected image based on the updated skeletal model, and then instruct the projector 16 to provide the image onto the silhouette.
The processors 100, 114, 120 may each be any suitable processor capable of executing instructions for implementing the presently disclosed techniques, such as a general-purpose processor, a system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), a processor of a programmable logic controller (PLC), a processor of an industrial PC (IPC), or some other similar processor configuration. These instructions are encoded in programs or code stored in a tangible, non-transitory, computer-readable medium, such as the memories 104, 116, 122 and/or other storage circuitry or devices. Thus, the character processor 100 is coupled to the character memory 104, the media processor 114 is coupled to the media memory 116, and the projector processor 120 is coupled to the projector memory 122. The present embodiment of the media system 8 also includes a show control system 130 that coordinates additional output devices of the attraction 10. For example, a show controller 132 of the show control system 130 is communicatively coupled between the network device 90 and one or more lighting output devices 134, audio output devices 136, and/or venue-specific special effect output devices 138 (e.g., a smoke generator, a vibration generator, an actuatable portion of the scenery objects 26).
FIG. 3 is a front view of the image 14 disposed on the head portion 44 of the body 42 of the animated figure 12. The image 14 may include features or textures that resemble a face. For example, eyebrows, eyes, a nose, lips, and/or wrinkles may be projected onto the head portion 44. If the animated figure 12 is equipped with clothing elements (e.g., hats, wigs, jewelry), the media controller 112 and/or the projector 16 may identify the outline of the outer surface 40 of the animated figure 12 formed by the clothing elements (e.g., via a projection mask). The projector 16 then directs the image 14 to the target portion or character portion of the outer surface 40 of the animated figure 12. The media control system 20 may monitor movements of the animated figure 12, such as large movements across a stage and/or small movements of an articulating chin, and project appropriate, realistic images onto the head portion 44 of the animated figure 12.
FIG. 4 is a front view of the image 14 disposed on the outer surface 40 of the animated figure 12. As shown, the image 14 provides the animated figure 12 with a non-human or fanciful character appearance, such as the appearance of an owl. The outer surface 40 of the head portion 44 may be textured to complement the image 14. It should also be appreciated that the image 14 may include supernatural, fantastical, or non-human imagery and/or effects, such as flames, smoke, morphing, color changes, and the like.
Aspects related to calibration and alignment of the projector 16 and the tracking cameras 64 may be better understood with reference to FIGS. 5-8. In FIG. 5, the attraction 10 is shown, which includes the tracking cameras 64 and the projector 16. The calibration process may be performed within the performance scene using a calibration tool 150 (e.g., a calibration bar or device). As part of an initial portion of the calibration process (e.g., camera calibration; calibrating the tracking cameras 64 to one another), an operator 152 (e.g., a human operator; an autonomous or remotely controlled robot) may carry the calibration tool 150 in the performance scene. The operator 152 may move the calibration tool 150 around within the performance scene, such as by waving the calibration tool 150 back and forth while traveling through the performance scene (e.g., walking or rolling from one side of the performance scene to the other side of the performance scene). To facilitate the initial portion of the calibration process, the calibration tool 150 includes a plurality of emitters 154, such as at least three emitters 154. The plurality of emitters 154 may be arranged in a single line or row on the calibration tool 150 and at known relative positions. In one embodiment, the plurality of emitters 154 may be light emitters (e.g., light-emitting diodes [LEDs]). For example, the plurality of emitters 154 may be light emitters that emit infrared (IR) light that is detectable by the tracking cameras 64 (and is not visible or detectable to the guests 34).
Each of the tracking cameras 64 captures a plurality of image frames (e.g., tens, hundreds, thousands) as the calibration tool 150 moves through the performance scene. The media controller 112 or any other suitable processing circuitry of the media system 8 may process the plurality of image frames to calibrate the tracking cameras 64 to one another. For example, the media controller 112 or other suitable processing circuitry of the media system 8 may compare the image frames to one another to determine the relative positions of the tracking cameras 64 (e.g., compare image frames from different tracking cameras 64 with the calibration tool 150 at a first position, then compare image frames from different tracking cameras 64 with the calibration tool 150 at a second position, and so on). Advantageously, the initial portion of the calibration process performed in this manner may also account for (e.g., compensate for) characteristics of the tracking cameras 64, such as lens distortion of the tracking cameras 64.
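One conventional way to realize this camera-to-camera step (assumed here for illustration; the disclosure does not specify the math) is to estimate the relative pose of each camera pair from matched wand detections and fix the metric scale with the known emitter spacing, for example with OpenCV. The intrinsic matrices K1 and K2 are assumed to be known, and emitter_a_idx/emitter_b_idx index two detections from the same frame whose physical separation on the wand is known_dist.

```python
# A hedged sketch of pairwise tracking-camera registration from wand detections.
import numpy as np
import cv2

def relative_pose(pts1, pts2, K1, K2, emitter_a_idx, emitter_b_idx, known_dist):
    """pts1, pts2: Nx2 arrays of matching emitter detections in camera 1 and 2."""
    # Normalize points so one essential matrix can be used with differing intrinsics.
    p1 = cv2.undistortPoints(pts1.reshape(-1, 1, 2).astype(np.float64), K1, None)
    p2 = cv2.undistortPoints(pts2.reshape(-1, 1, 2).astype(np.float64), K2, None)
    E, _ = cv2.findEssentialMat(p1, p2, np.eye(3), method=cv2.RANSAC,
                                prob=0.999, threshold=1e-3)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, np.eye(3))   # t is only up to scale
    # Triangulate two emitters seen in the same frame and rescale t so their
    # reconstructed separation equals the known spacing on the wand.
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([R, t])
    X = cv2.triangulatePoints(P1, P2,
                              p1[[emitter_a_idx, emitter_b_idx], 0].T,
                              p2[[emitter_a_idx, emitter_b_idx], 0].T)
    X = (X[:3] / X[3]).T
    scale = known_dist / np.linalg.norm(X[0] - X[1])
    return R, t * scale
```

A production system would typically refine all camera poses jointly (e.g., via bundle adjustment) over many frames; this sketch covers only a single pair.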
As shown, the calibration tool 150 also includes at least one additional emitter 156 (e.g., offset emitter) and at least one sensor 158 (e.g., photodetector). The calibration tool 150 may be used to perform the calibration process periodically (e.g., before the start of each week, before the amusement park opens each day, before each cycle of the attraction 10, or any combination thereof) and/or at other suitable times (e.g., in response to certain events). Furthermore, the calibration tool 150 may be used only during the calibration process in the performance scene, and not during a performance in the performance scene. However, in one embodiment, the calibration tool 150 may encompass or include an associated object (e.g., prop) onto which the projector 16 projects an image during a show. For example, the calibration tool 150 may be a piece of show action equipment (SAE) that is projected onto during a performance and/or that appears only during a portion (e.g., a beginning or initial portion) of the performance.
Fig. 6 shows the attraction 10 including the tracking camera 64 and projector 16, with a calibration tool 150 positioned within the performance scene to establish an origin (e.g., a common origin). The origin may then be used to calibrate the tracking camera 64 and projector 16 for the performance scene, which may also effectively calibrate the tracking camera 64 and projector 16 to each other.
As part of an additional portion of the calibration process (e.g., origin calibration; to establish an origin in the performance scene), the calibration tool 150 may be positioned and maintained at an origin setup position in the performance scene. For example, the calibration tool 150 may be fastened (e.g., bolted) and/or otherwise secured (e.g., via an interference fit) to a structure 160 (e.g., a stationary structure) in the performance scene. In one embodiment, the structure 160 may be mounted (e.g., fastened, such as bolted) to the stage floor 24 or other surface in the performance scene such that the structure 160 remains stationary relative to the performance scene. In one embodiment, the structure 160 may include a bracket 162 configured to support a portion of the calibration tool 150. For example, in fig. 6, the structure 160 includes a bracket 162 having a recess configured to receive a portion of the calibration tool 150. In this way, the calibration tool 150 may be rigidly coupled to the structure 160 and held at an origin set position within the show scene. However, it should be appreciated that the calibration tool 150 may be positioned and held at the origin setting position via any suitable technique.
During the additional portion of the calibration process, at least three emitters on the calibration tool 150 are visible to the tracking cameras 64. In one embodiment, the at least three emitters may include two of the emitters 154 and the additional emitter 156 arranged in a triangle (e.g., as three points that form or outline a triangle across the calibration tool 150). Because the tracking cameras 64 have been previously calibrated to one another in the initial portion of the calibration process, the at least three emitters arranged in the triangle may be tracked in three dimensions. The tracking cameras 64 may capture image frames, and the media controller 112 or any other suitable processing circuitry of the media system 8 uses the respective positions of the at least three emitters (e.g., the two emitters 154 and the additional emitter 156) in the image frames to set the origin of the tracking cameras 64 within the show scene. For example, the origin may be set to coincide with a respective center of the additional emitter 156, a respective center of the sensor 158, or any other suitable location. Further, because the at least three emitters have known relative positions and/or spacing on the calibration tool 150, the media controller 112 or any other suitable processing circuitry of the media system 8 also sets the coordinate system associated with the origin of the tracking cameras 64 within the performance scene using the respective positions of the at least three emitters (e.g., the two emitters 154 and the additional emitter 156) in the image frames. In this way, the media controller 112 or any other suitable processing circuitry of the media system 8 may set the origin (0, 0, 0) and a coordinate system having an x-axis or direction, a y-axis or direction, and a z-axis or direction (e.g., extending perpendicularly from the 90-degree corner of the triangle).
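A minimal sketch of constructing such an origin and coordinate frame from the three tracked triangle points is shown below, assuming the right-angle emitter anchors the origin; the anchor choice, function name, and variable names are illustrative rather than prescribed by the disclosure.

```python
# Build a scene coordinate frame from three tracked emitter positions.
import numpy as np

def scene_frame(corner_emitter, x_emitter, other_emitter):
    """Each argument is the 3D position (in the tracking cameras' arbitrary
    pre-calibration frame) of one tracked emitter of the triangle."""
    origin = np.asarray(corner_emitter, dtype=float)
    x_axis = np.asarray(x_emitter, float) - origin
    x_axis /= np.linalg.norm(x_axis)
    in_plane = np.asarray(other_emitter, float) - origin
    z_axis = np.cross(x_axis, in_plane)            # normal to the triangle's plane
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)              # completes a right-handed frame
    R = np.column_stack([x_axis, y_axis, z_axis])  # scene axes expressed in camera frame
    return origin, R  # a point p maps to scene coordinates via R.T @ (p - origin)
```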
Then, as part of a sensor mode portion of the calibration process, the calibration tool 150 is moved around within the performance scene while the plurality of emitters 154, 156 (e.g., all emitters) are instructed to emit light and while the projector 16 is instructed to emit light (e.g., a structured light scan) into the performance scene toward the calibration tool 150. In one embodiment, the calibration tool 150 is moved to additional locations within the performance scene to carry out the sensor mode portion of the calibration process (e.g., these steps are performed sequentially and the calibration tool 150 is located at different locations than the origin setting location). However, it is contemplated that the additional portion of the calibration process and part of the sensor mode portion of the calibration process may be performed simultaneously (e.g., certain steps are performed at the origin setting location simultaneously or at overlapping times, and then the calibration tool 150 is moved to other locations to complete the sensor mode portion of the calibration process). In any event, during the sensor mode portion of the calibration process, the calibration tool 150 is placed in a plurality of locations, such as at least six locations. For ease of discussion, FIG. 6 includes a plurality of additional structures 164 located at different locations around the performance scene (e.g., at least six additional structures 164 located at at least six different locations around the performance scene). When present, the additional structures 164 may include any features of the structure 160 (e.g., fastened to the performance scene; with a respective bracket). In operation, once the additional portion of the calibration process is completed to set the origin, the calibration tool 150 may be moved from the structure 160 to each of the additional structures 164 in turn (e.g., coupled to each of the additional structures 164 in turn; a first one of the additional structures 164, then a second one of the additional structures 164, and so on). However, it should be appreciated that the disclosed techniques may be implemented without the additional structures 164, and instead the calibration tool 150 may be carried by an operator or positioned in some other manner at different locations around the performance scene.
In any event, at each of the plurality of positions during the sensor mode portion of the calibration process, the plurality of emitters 154, 156 are instructed to illuminate so that the media controller 112 or any other suitable processing circuitry of the media system 8 can track the calibration tool 150 (e.g., the body of the calibration tool 150). In addition, during the sensor mode portion of the calibration process, simultaneously and at each of the plurality of locations, the projector 16 is instructed to emit light (e.g., a structured light scan) into the show scene toward the calibration tool 150. The sensor 158 detects the light emitted by the projector 16, and the sensor 158 provides data (e.g., sensor data; signals) to the media controller 112 or any other suitable processing circuitry of the media system 8. The data is processed to determine which pixels of light from the projector 16 strike the sensor 158 (e.g., the x, y pixel location of the position of the sensor 158). These techniques provide a real-world offset position of the sensor 158 (e.g., a position relative to the origin and the coordinate system established for the tracking cameras 64). Thus, this may be used to set the origin and coordinate system of the projector 16 within the show scene (e.g., to determine the pose of the projector 16 relative to the origin and the coordinate system). In this way, both the tracking cameras 64 and the projector 16 are calibrated to the show scene (e.g., based on a common origin and coordinate system).
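Under the assumption that the projector behaves as a pinhole device with a known intrinsic matrix K_proj, the projector's pose relative to the common origin can be recovered from the six or more (3D sensor position, projector pixel) pairs as a Perspective-n-Point problem. The sketch below illustrates this step and is not the disclosure's prescribed method.

```python
# A hedged sketch of the projector-registration step.
import numpy as np
import cv2

def projector_pose(sensor_positions_3d, projector_pixels_2d, K_proj):
    """sensor_positions_3d: Nx3 (N >= 6) scene-frame sensor locations.
    projector_pixels_2d: Nx2 projector pixel hits reported by the sensor."""
    obj = np.asarray(sensor_positions_3d, dtype=np.float64)
    img = np.asarray(projector_pixels_2d, dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj, img, K_proj, np.zeros(5))
    if not ok:
        raise RuntimeError("PnP failed; collect more wand positions")
    R, _ = cv2.Rodrigues(rvec)
    # Projector position expressed in the scene (common-origin) frame.
    projector_center = (-R.T @ tvec).ravel()
    return rvec, tvec, projector_center
```

In this sketch, projector_center is the projector's location expressed in the same coordinate system the tracking cameras use, which is what allows the two subsystems to share a common origin.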
The origin may establish a coordinate system (e.g., two-dimensional or three-dimensional; a coordinate system relative to the attraction 10) that does not change within a cycle of the attraction 10. The tracking cameras 64 then reference the origin and the coordinate system to track the animated figure 12 within the coordinate system. In addition, as shown in FIG. 6, the projector 16 may also reference the origin and the coordinate system to enable the projector 16 to accurately project the image 14 onto the animated figure 12 during the cycle of the attraction 10 (e.g., at all times and in all poses). In this way, the tracking cameras 64 and the projector 16 are calibrated and aligned with each other. In operation during a cycle of the attraction 10, when the tracking cameras 64 detect that the animated figure 12 (e.g., a particular vertex) is located at a first set of coordinates, the media controller 112 may instruct the projector 16 to project the image onto the animated figure 12 at the first set of coordinates. Because the tracking cameras 64 and the projector 16 have been calibrated and aligned with each other, the image is properly aligned and mapped onto the animated figure 12.
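For completeness, once the shared origin is established, mapping a tracked point on the figure to the projector pixel that should illuminate it is a single projection through the calibrated projector model. This hedged sketch reuses the assumed pose (rvec, tvec) and intrinsics (K_proj) from the previous sketch.

```python
# Illustrative only: map a tracked 3D vertex to the projector pixel that hits it.
import numpy as np
import cv2

def vertex_to_projector_pixel(vertex_xyz, rvec, tvec, K_proj):
    pts, _ = cv2.projectPoints(np.asarray([vertex_xyz], dtype=np.float64),
                               rvec, tvec, K_proj, np.zeros(5))
    return pts[0, 0]  # (u, v) pixel at which to draw this part of the image
```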
It should be appreciated that the media controller 112 may generally operate with a two-dimensional solution (e.g., in an XY coordinate system) such that the animated figure 12 is captured with the tracking cameras 64, features or markers of the animated figure 12 are identified in two-dimensional space with the shared X/Y origin of the tracking cameras 64 and the projector 16, and images are mapped directly onto the animated figure 12 in two-dimensional space. In an embodiment, the media controller 112 may generally operate with a three-dimensional solution (e.g., in an XYZ coordinate system). In this case, machine learning may be used to estimate the pose of the animated figure 12 in three-dimensional space. When the animated figure 12 has a face, this may generally involve face tracking, in which a machine learning model is trained on a large number of tagged and labeled facial images, poses, expressions, proportions, and surface features. The resulting pose estimate may then be used to project the mask or digital costume and effect elements in real time.
FIG. 7 is a front view of an embodiment of the calibration tool 150 that may be used as part of the media system 8. As shown, the calibration tool 150 includes a body 170 (e.g., a rigid body). The body 170 may include a handle portion 172 configured to be grasped by the operator 152. The body 170 may also include or be coupled to a cable 174, which conveys instructions and data between the calibration tool 150 and other devices and/or systems, such as the media controller 112. For example, the cable 174 may transmit instructions from the media controller 112 to cause certain of the emitters 154, 156 (e.g., all of the emitters 154; two of the emitters 154 and the additional emitter 156) to illuminate, and/or may transmit data from the sensor 158. It should also be appreciated that the cable 174 may provide power to the calibration tool 150. Furthermore, in one embodiment, the calibration tool 150 may include a communication device configured to communicate via a wireless protocol (e.g., Wi-Fi; Bluetooth) and/or an internal power source (e.g., a battery, such as a rechargeable and/or replaceable battery). In some such cases, the calibration tool 150 may not include or may not be coupled to the cable 174 (or any cable). Further, while the cable 174 is shown extending from the handle portion 172 for ease of discussion, it should be appreciated that the cable 174 may extend from any suitable portion of the calibration tool 150 (e.g., a portion other than the handle portion 172 or an end portion, such that the handle portion 172 or the end portion may more easily fit into the bracket 162 or otherwise couple to the structure 160).
As shown, the body 170 has a cross-shaped design with a first arm 176 (e.g., a horizontal arm) and a second arm 178 (e.g., a vertical arm) transverse (e.g., orthogonal) to the first arm 176. The first arm 176 and the second arm 178 may be separate structures that are fastened (e.g., bolted) to each other, or the first arm 176 and the second arm 178 may be integrally formed (e.g., molded as a single piece). The emitters 154 may be distributed in a single line or row on the calibration tool 150, such as across the first arm 176. The emitters 154 may be located at known relative positions (e.g., with known spacing) on the calibration tool 150, and the emitters 154 may not be equidistantly spaced from one another. For example, a first distance 180 between a first emitter 154 and a second emitter 154 may be different than a second distance 182 between the second emitter 154 and a third emitter 154. The different spacings may facilitate the calibration process by enabling the orientation of the calibration tool 150 in the performance scene to be detected based on the different spacings reflected in the plurality of image frames captured by the tracking cameras 64. The emitters 154 may form a first set of emitters 184 (e.g., bar emitters; camera alignment emitters).
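As an illustration of how the unequal spacing resolves orientation (an assumed labeling scheme, not taken from the disclosure), three unlabeled detections of the row emitters can be ordered by comparing the measured gap ratio against the known wand geometry.

```python
# A small sketch of labeling three collinear emitter detections by spacing.
import numpy as np

def label_row_emitters(detections_3d, ratio_first_to_second):
    """detections_3d: 3x3 array of unlabeled 3D blob positions from one frame.
    ratio_first_to_second: known distance_180 / distance_182 on the wand."""
    d = np.asarray(detections_3d, float)
    # The two detections farthest apart are the end emitters; the remaining one is the middle.
    pair_dists = [np.linalg.norm(d[i] - d[j]) for i, j in [(0, 1), (0, 2), (1, 2)]]
    ends = [(0, 1), (0, 2), (1, 2)][int(np.argmax(pair_dists))]
    middle = ({0, 1, 2} - set(ends)).pop()
    a, b = ends
    ratio_ab = np.linalg.norm(d[a] - d[middle]) / np.linalg.norm(d[middle] - d[b])
    # Orient the row so the measured gap ratio matches the known wand geometry.
    if abs(ratio_ab - ratio_first_to_second) > abs(1.0 / ratio_ab - ratio_first_to_second):
        a, b = b, a
    return a, middle, b  # indices of the first, second (middle), and third emitters
```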
As shown, the calibration tool 150 also includes the additional emitter 156. The additional emitter 156 may be used with any two of the emitters 154 to establish the origin for the tracking cameras 64. The additional emitter 156 and the two emitters 154 may be arranged to form a triangle (e.g., a right triangle; not an equilateral triangle; each emitter forms a vertex of the triangle). The additional emitter 156 and the emitters 154 may be located at known relative positions (e.g., with known spacing) on the calibration tool 150, and the additional emitter 156 and the emitters 154 may not be equidistantly spaced from one another. For example, the first distance 180 between the first emitter 154 and the second emitter 154 may be different than a third distance 186 between the additional emitter 156 and the second emitter 154. The different spacings may facilitate the calibration process by enabling the orientation of the calibration tool 150 in the performance scene to be detected based on the different spacings reflected in the plurality of image frames captured by the tracking cameras 64. The additional emitter 156 and the two emitters 154 may form a second set of emitters 188 (e.g., camera origin emitters).
As shown, the calibration tool 150 also includes a sensor 158. The sensor 158 may represent one or more sensors. For example, sensor 158 may represent or include a plurality of sensors that detect light corresponding to a plurality of pixels of light from projector 16. As another example, sensor 158 may represent or include one sensor that detects light corresponding to one pixel of light from projector 16 (e.g., a smaller sensor area) or multiple pixels of light from projector 16 (e.g., a large sensor area). As another example, sensor 158 may represent or include one sensor that detects an aligned/corresponding pixel of light from projector 16 in a plurality of structured light scans from projector 16. In any event, sensor 158 may be used to establish the origin of projector 16.
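The disclosure refers only to a "structured light scan"; one common realization, assumed here for illustration, is a Gray-code sweep in which the projector displays a sequence of binary stripe patterns and the sensor reading for each pattern yields one bit of the projector column (and, in a second pass, row) aligned with the sensor.

```python
# A hedged sketch of Gray-code decoding from single-sensor readings.
def decode_gray_code(bit_readings, threshold):
    """bit_readings: sensor lux values, most significant bit first, for one axis."""
    bits = [1 if v > threshold else 0 for v in bit_readings]
    # Gray code -> binary: each binary bit is the XOR of all higher Gray bits.
    value, acc = 0, 0
    for b in bits:
        acc ^= b
        value = (value << 1) | acc
    return value  # projector column (or row) index aligned with the sensor

# Example use (illustrative numbers): decode column and row separately.
# col = decode_gray_code(column_pattern_readings, threshold=2000)
# row = decode_gray_code(row_pattern_readings, threshold=2000)
```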
In fig. 7, the emitter 154 and the additional emitter 156 are positioned on a front side (e.g., surface) of the calibration tool 150, and the sensor 158 is positioned on a rear side (e.g., surface) of the calibration tool 150 opposite the front side of the calibration tool 150. As shown in fig. 1, this may facilitate completion of a calibration process in a performance scene that includes a tracking camera 64 on one side of the performance scene and a projector 16 on the other side (e.g., opposite side) of the performance scene. In general, it may be desirable to position the tracking camera 64 behind the animated figure 12 such that the tracker 60 is placed on the rear surface of the animated figure 12 (e.g., invisible to guests viewing the front surface of the animated figure 12) and to position the projector 16 in front of the animated figure 12 to project light onto the front surface of the animated figure 12 (e.g., visible to guests). However, the emitter 154, the additional emitter 156, and the sensor 158 may be positioned on one side (e.g., the same side or surface) of the calibration tool 150 to facilitate the calibration process in a performance scene that includes the tracking camera 64 and the projector 16 on one side (e.g., the same side) of the performance scene. In practice, the emitter 154, the additional emitter 156, and the sensor 158 may be positioned on any side (e.g., the same side or surface; different sides or surfaces, including opposite sides or surfaces and/or any front/side/rear sides or surfaces) of the calibration tool 150 to facilitate the calibration process in a performance scenario including the tracking camera 64 and projector 16 at various locations of the performance scenario.
The disclosed technology provides a real world offset position of the sensor 158. When sensor 158 detects light emitted by projector 16, the data output by sensor 158 indicates the pixels aligned with sensor 158. Accordingly, media controller 112 or other suitable processing circuitry of media system 8 may align projector 16 with the origin (e.g., establish a common origin for tracking camera 64 and projector 16).
It should be appreciated that variations of the calibration tool 150 are contemplated, including variations in the number and/or arrangement of the emitters 154, the additional emitter 156, and the sensor 158. For example, the emitters 154 may be arranged in a single line or row along the second arm 178. As another example, the additional emitter 156 may be used with more additional emitters 156 (e.g., instead of with the emitters 154 that are also used to calibrate the tracking cameras 64 to one another; the emitters 154 may not be reused in the additional portion of the calibration process). More specifically, the calibration tool 150 may include three or more emitters 154 in the first set of emitters 184 to calibrate the tracking cameras 64 to one another, and may include three or more additional emitters 156 in the second set of emitters 188 to establish the origin for the tracking cameras 64 (e.g., a total of at least six emitters 154, 156 instead of the total of at least three emitters 154, 156 shown in FIG. 7). Further, the sensor 158 may be co-located with one of the emitters 154, 156. For example, the sensor 158 may be positioned on the rear side of the calibration tool 150 directly opposite or behind the additional emitter 156 on the front side of the calibration tool 150. Further, as noted above, the emitters 154, the additional emitter 156, and the sensor 158 may be positioned on one side (e.g., the same side or surface) of the calibration tool 150.
The active light emitters 154, 156 may facilitate detection by the tracking cameras 64. However, it should be appreciated that one or more of the emitters 154, 156 may be replaced by a passive device that does not emit light or any signal detectable by the tracking cameras 64. For example, retroreflective markers may be positioned at the illustrated locations of the emitters 154, 156, and the retroreflective markers reflect light that may be detected by the tracking cameras 64. In this case, the calibration tool 150 may include movable covers (e.g., movable via an electronically controlled actuator and/or manually adjustable) to cover retroreflective markers that are not desired or used during the current portion of the calibration process. For example, during the initial portion of the calibration process, the retroreflective marker positioned at the illustrated location of the additional emitter 156 may be covered, while during the additional portion of the calibration process, at least one of the retroreflective markers positioned at the illustrated locations of the emitters 154 may be covered.
The sensor 158 may be a visible light sensor (e.g., a photodiode) to enable the sensor 158 to detect light from the projector 16 (e.g., light from the projector 16 may be only within the visible spectrum). Further, the emitters 154, 156 may be Infrared (IR) Light Emitting Diodes (LEDs) in order for the tracking camera 64 to detect light from the emitters 154, 156 (e.g., the tracking camera may capture only light of wavelengths associated with IR light). However, the sensor 158 may detect any type of light (e.g., a first type of light), and the emitters 154, 156 may emit any type of light (e.g., a second type of light that is the same or different than the first type of light).
The emitters 154, 156 and the sensor 158 provide different functions. For example, the purpose of the emitters 154, 156 is to provide tracking points or "markers" for the tracking cameras 64. Any of a variety of IR LEDs may be used as the emitters 154, 156, and the emitter light output (beam angle) corresponds to the IR LED specification. In an embodiment, the emitters 154, 156 may emit light having a wavelength of approximately 850 nanometers (nm). The sensor 158 is used to detect visible light from the projector 16. The diameter of the sensor 158 (e.g., about 1 millimeter [mm]) may be sized to correspond to a 1-pixel size at the target pixel pitch (e.g., 0.05 inches or 1.27 mm per pixel); however, the sensor 158 may be larger or smaller than 1 pixel. In another embodiment, the diameter of the sensor 158 may be about 0.5 mm. The sensor 158 may have a peak response in the human-eye visible spectrum. Ideally, the sensor 158 is a high-resolution (16-bit) ambient light sensor and provides a linear response in the range of 0-65,000 lux. In an embodiment, as the light moves closer to the center of the sensor 158, the sensor 158 may have an increased reading value, which may enable sub-pixel accuracy (e.g., accuracy finer than one projector pixel). In an embodiment, the sensor 158 may be a small sensor array (e.g., a phototransistor array) to achieve similar results. In an embodiment, the sensor 158 is not affected by IR light (including light leakage from the emitters 154, 156). In an embodiment, the sensor 158 is not a photoresistor or phototransistor.
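A small illustrative sketch of the sub-pixel idea mentioned above follows; the sweep procedure, function name, and numbers are assumptions rather than details from the disclosure: sweep a one-pixel stripe across the sensor, record the analog reading at each stripe position, and take the intensity-weighted centroid as a sub-pixel estimate of where the sensor center lies.

```python
# Intensity-weighted centroid of stripe-sweep readings (illustrative only).
def subpixel_position(stripe_positions, lux_readings):
    total = sum(lux_readings)
    if total == 0:
        raise ValueError("sensor never saw the stripe")
    return sum(p * v for p, v in zip(stripe_positions, lux_readings)) / total

# e.g. subpixel_position([118, 119, 120, 121], [300, 2200, 2600, 250]) ≈ 119.5
```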
In an embodiment, the emitters 154, 156 are always on (emitting light). Alternatively, the emitters 154, 156 may be controllable, such as via simple negative-positive-negative (NPN) digital I/O bits. In an embodiment, the sensor 158 is configured to convert visible light into an analog signal that is either output directly as an analog output (e.g., 0-5V, 0-10V, 0-15V, 0-20V, 5-10V, 5-15V, 5-20V) or compared against a threshold on a sensor amplifier. Once the adjustable threshold is reached, an NPN digital output is triggered. The sensor bandwidth or scan rate may be at least 50 Hz, and ideally at least 100 Hz (or at least 150 Hz, 200 Hz, or 250 Hz). The supply voltage of the system may be 24 Vdc, 5 Vdc, or any other suitable voltage. Further, the emitters 154, 156 and the sensor 158 may be supported on a printed circuit board (PCB) to facilitate coordination of the light emitted by the emitters 154, 156 and processing and transmission of signals indicative of the light detected via the sensor 158. The PCB may also provide a rigid substrate that maintains a fixed relative position of the emitters 154, 156 and the sensor 158.
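A minimal sketch of the threshold behavior described above, assuming a hypothetical 16-bit raw reading that maps linearly to 0-65,000 lux and an adjustable trigger level (the names, scaling, and values are illustrative assumptions, not taken from the disclosure):

```python
# Illustrative threshold logic for the visible light sensor 158 (assumptions:
# 16-bit raw reading, linear 0-65,000 lux response, adjustable trigger level).
FULL_SCALE_LUX = 65_000
RAW_MAX = 2**16 - 1

def reading_to_lux(raw: int) -> float:
    """Convert a raw 16-bit sensor reading to lux, assuming a linear response."""
    return raw / RAW_MAX * FULL_SCALE_LUX

def digital_output(raw: int, threshold_lux: float) -> bool:
    """Emulate the NPN-style digital output: asserted once the threshold is reached."""
    return reading_to_lux(raw) >= threshold_lux

# Example: a mid-scale reading compared against a 10,000 lux trigger level.
print(digital_output(raw=20_000, threshold_lux=10_000.0))  # True
```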
Fig. 8 is a flow chart of an embodiment of a method 200 of operating the media system 8 of fig. 1. The method 200 includes various steps represented by blocks. It should be noted that the method 200 may be performed by a system (such as the media system 8 of fig. 1) as an automated program. Although the flowchart shows steps in a certain order, it should be understood that the steps may be performed in any suitable order and that certain steps may be performed simultaneously where appropriate. Furthermore, certain steps or portions of the method 200 may be performed by separate systems or devices.
In block 202, the method 200 may begin with moving a calibration tool within an environment (e.g., a performance scene of an attraction) to calibrate a plurality of tracking cameras to one another. As part of an initial portion of the calibration process, an operator (e.g., a human operator; an autonomous or remotely controlled robot) may carry the calibration tool within the environment. The operator may move the calibration tool around within the environment, such as by swinging the calibration tool back and forth while traveling through the environment. To facilitate the initial portion of the calibration process, the calibration tool includes a plurality of emitters, such as at least three emitters. The plurality of emitters may be arranged in a single row or line on the calibration tool and at known relative positions. In one embodiment, the plurality of emitters may be light emitters (e.g., light emitting diodes [LEDs]). For example, the plurality of emitters may be light emitters that emit infrared (IR) light that is detectable by the tracking cameras (and is not visible or detectable to a guest).
Each of the tracking cameras captures a plurality of image frames (e.g., tens, hundreds, or thousands of image frames) as the calibration tool moves through the environment. The media controller or any other suitable processing circuitry may process the plurality of image frames to calibrate the tracking cameras to one another. For example, the media controller or other suitable processing circuitry may compare the image frames to one another to determine the relative positions of the tracking cameras (e.g., compare image frames from different tracking cameras with the calibration tool at a first position, then compare image frames from different tracking cameras with the calibration tool at a second position, and so on).
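One common building block for this camera-to-camera calibration is rigid alignment of corresponding marker positions expressed in two cameras' coordinate frames. The sketch below is a non-limiting illustration using the Kabsch/Procrustes method; it assumes per-camera 3D marker positions of the calibration tool are already available (e.g., from triangulation), which is an assumption rather than a detail of the disclosure.

```python
import numpy as np

def rigid_align(points_a: np.ndarray, points_b: np.ndarray):
    """Estimate rotation R and translation t such that R @ a + t ~= b.

    points_a, points_b: (N, 3) arrays of corresponding marker positions of the
    calibration tool expressed in two cameras' coordinate frames (N >= 3,
    non-collinear). Returns (R, t).
    """
    centroid_a = points_a.mean(axis=0)
    centroid_b = points_b.mean(axis=0)
    H = (points_a - centroid_a).T @ (points_b - centroid_b)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = centroid_b - R @ centroid_a
    return R, t
```

Repeating this alignment across camera pairs, and refining it over the many frames captured as the calibration tool moves, yields a consistent set of relative camera poses.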
In block 204, the method 200 may continue by setting the calibration tool at a location within the environment (e.g., an origin setting location). As part of this additional portion of the calibration process, the calibration tool may be fastened (e.g., bolted) and/or otherwise secured (e.g., via an interference fit) to a structure in the environment (e.g., a stationary structure). In one embodiment, the structure may be mounted (e.g., fastened, such as with bolts) to a stage floor or other surface in the environment such that the structure remains stationary relative to the environment. In one embodiment, the structure may include a bracket configured to support a portion of the calibration tool. In this way, the calibration tool may be rigidly coupled to the structure and remain at that location within the environment. However, it should be appreciated that the calibration tool may be positioned and held at the origin setting position via any suitable technique.
In block 206, the method 200 may continue by detecting or establishing an origin with the tracking cameras. During this additional portion of the calibration process, at least three emitters on the calibration tool may be visible to the tracking cameras. In one embodiment, the at least three emitters may include two of the emitters used in block 202 and an additional emitter in a triangular arrangement (e.g., as three points forming a triangle across the calibration tool). The tracking cameras may capture image frames, and the media controller or any other suitable processing circuitry uses the respective positions of the at least three emitters in the image frames to set the origin of the tracking cameras within the environment. For example, the origin may be set to coincide with the center of the additional emitter, the center of the sensor, or any other suitable location. Further, because the at least three emitters have known relative positions and/or spacing on the calibration tool, the media controller or any other suitable processing circuitry uses the respective positions of the at least three emitters in the image frames to set a coordinate system based on the origin of the tracking cameras within the environment.
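A minimal sketch of constructing an origin and coordinate system from three triangle-forming emitters, assuming their 3D positions have already been triangulated by the calibrated tracking cameras (the point ordering and axis conventions below are illustrative assumptions):

```python
import numpy as np

def frame_from_triangle(p0: np.ndarray, p1: np.ndarray, p2: np.ndarray):
    """Build an origin and orthonormal axes from three non-collinear emitter positions.

    p0 becomes the origin; p1 defines the x-axis direction; p2 fixes the plane so
    that the z-axis is the triangle normal. Returns (origin, 3x3 rotation R).
    """
    x_axis = (p1 - p0) / np.linalg.norm(p1 - p0)
    in_plane = p2 - p0
    z_axis = np.cross(x_axis, in_plane)
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)
    R = np.column_stack([x_axis, y_axis, z_axis])  # columns: tool axes in the tracking frame
    return p0, R
```

Because the emitter spacing on the rigid body is known, the same three points also fix the metric scale of the resulting coordinate system.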
In block 208, the method 200 may continue by moving the calibration tool 150 to a plurality of additional locations in the performance scene (e.g., at least six additional locations in the performance scene). Further, at each of the plurality of additional locations in the performance scene, block 208 of the method 200 may include emitting light from at least one emitter (e.g., a plurality of emitters; all emitters on the calibration tool) and also detecting light (e.g., a structured light scan) output by the projector via the sensor of the calibration tool to calibrate the projector (e.g., determine a pose of the projector relative to the origin and coordinate system; establish a common origin and coordinate system for the tracking camera/motion tracking system and the projection/projector system). As part of this sensor-mode portion of the calibration process, the projector is instructed to emit light (e.g., a structured light scan) into the environment toward the calibration tool. Blocks 202, 204, 206, and 208 may be performed in a coordinated manner via electronic control signals from the media controller (e.g., automatically) and/or via manual input from the operator (e.g., manually).
In any event, the sensor detects the light emitted by the projector, and the sensor provides data (e.g., sensor data; signals) to the media controller or any other suitable processing circuitry. The data is processed to determine which pixels of light from the projector impinge on the sensor. This technique provides a real-world offset position of the sensor (e.g., it relates the real-world position of the sensor to pixels of the projector). Thus, this may be used to set the origin and coordinate system of the projector within the environment (e.g., determine the pose of the projector relative to the origin and coordinate system). In this way, both the tracking cameras and the projector are calibrated for the environment (e.g., based on a common origin and coordinate system). In particular, an algorithm in the media controller equates (e.g., associates) the two with each other such that coordinates (X, Y, Z) referenced to the origin of the tracking cameras in space correspond to a pixel position (X1, Y1) in the projector raster.
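One way to realize this projector calibration step is a direct linear transform (DLT): each placement of the calibration tool yields the sensor's 3D position in the common tracking coordinate system and the projector pixel (X1, Y1) that struck the sensor, and six or more such correspondences determine the projector's 3x4 projection matrix. The sketch below is an assumed implementation for illustration, not the specific algorithm of the disclosure.

```python
import numpy as np

def projector_dlt(world_pts: np.ndarray, pixel_pts: np.ndarray) -> np.ndarray:
    """Estimate a 3x4 projection matrix P so that P @ [X, Y, Z, 1] ~ [x, y, 1].

    world_pts: (N, 3) sensor positions in the common tracking coordinate system.
    pixel_pts: (N, 2) projector pixel coordinates detected at the sensor.
    Requires N >= 6 non-degenerate correspondences.
    """
    rows = []
    for (X, Y, Z), (x, y) in zip(world_pts, pixel_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z, -x])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z, -y])
    A = np.asarray(rows, dtype=float)
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)   # smallest singular vector, reshaped to P
```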
During a cycle of the attraction, neither the origin nor the coordinate system changes. Thus, the tracking cameras reference the origin and the coordinate system to track the animated figure within the coordinate system. In addition, the projector may also reference the origin and the coordinate system to enable the projector to accurately project images onto the animated figure during the cycle of the attraction (e.g., at all times and in all poses). In this way, during the cycle of the attraction, when the tracking cameras detect that the animated figure (e.g., a particular vertex of the animated figure) is located at a first set of coordinates, the media controller may then instruct the projector to project an image onto the animated figure at the first set of coordinates. Because the tracking cameras and the projector have been calibrated and aligned with each other, the images are properly aligned and mapped onto the animated figure.
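At show time, the mapping described above reduces to pushing each tracked 3D coordinate through the calibrated projection matrix to obtain the raster pixel to illuminate; a minimal sketch (reusing the hypothetical matrix P from the previous sketch) follows:

```python
import numpy as np

def project_to_pixel(P: np.ndarray, point_xyz: np.ndarray) -> tuple[float, float]:
    """Map a tracked 3D point (in the common coordinate system) to projector
    pixel coordinates using a calibrated 3x4 projection matrix P."""
    X = np.append(point_xyz, 1.0)          # homogeneous coordinates
    x, y, w = P @ X
    return x / w, y / w                    # (X1, Y1) in the projector raster
```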
Advantageously, the calibration tool disclosed herein includes emitters and a sensor that enable the calibration tool to operate as a calibration stick for the tracking cameras (e.g., via a single row of emitters), as a calibration triangle for the tracking cameras (e.g., via three emitters forming the points of a triangle), and as a calibration target for the projector (e.g., via the sensor). It will be appreciated that any of the features shown or described with reference to figures 1-8 may be combined in any suitable manner. While only certain features of the disclosure have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the disclosure.
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible, or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as "means for [performing] [a function]..." or "step for [performing] [a function]...", it is intended that such elements are to be interpreted under 35 U.S.C. § 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. § 112(f).
Claims (20)
1. A calibration tool for a dynamic projection mapping system, the calibration tool comprising:
a rigid body;
at least three light emitters disposed in a row on the rigid body;
an additional light emitter disposed on the rigid body and offset from the row of at least three light emitters; and
a sensor disposed on the rigid body and configured to detect projected light.
2. The calibration tool of claim 1, wherein the rigid body comprises a first side and a second side opposite the first side, the row of at least three light emitters and the additional light emitter being disposed on the first side, and the sensor being disposed on the second side.
3. The calibration tool of claim 1, wherein two light emitters of the row of at least three light emitters and the additional light emitter are configured to form a triangle of light emitters.
4. The calibration tool of claim 1, wherein the at least three light emitters and the additional light emitter are configured to emit infrared light.
5. The calibration tool of claim 1, wherein the sensor is configured to detect visible projected light.
6. The calibration tool of claim 1, wherein the rigid body comprises a handle portion configured to be gripped by an operator.
7. The calibration tool of claim 1, wherein the rigid body comprises a first arm and a second arm transverse to the first arm.
8. A dynamic projection mapping system, comprising:
a projector configured to project visible light;
a calibration tool, comprising:
a plurality of emitters configured to emit infrared light; and
a sensor configured to detect visible light projected by the projector;
a plurality of tracking cameras configured to generate image data indicative of infrared light emitted by the plurality of emitters; and
processing circuitry configured to establish a common origin for the projector and the plurality of tracking cameras based on sensor data received from the sensor and image data received from the plurality of tracking cameras.
9. The dynamic projection mapping system of claim 8, wherein the calibration tool comprises a rigid body having a first side and a second side opposite the first side, the plurality of emitters positioned on the first side, and the sensor positioned on the second side.
10. The dynamic projection mapping system of claim 8, wherein the plurality of emitters comprises at least three emitters in a row and an additional emitter offset from the at least three emitters in the row.
11. The dynamic projection mapping system of claim 10, wherein two emitters of the row of at least three emitters and the additional emitter are arranged to form a triangle of emitters.
12. The dynamic projection mapping system of claim 11, wherein the processing circuitry is configured to instruct the at least three emitters in the row to emit the infrared light and to calibrate the plurality of tracking cameras to one another based on additional image data generated by the plurality of tracking cameras while the at least three emitters in the row emit the infrared light, and subsequently, the processing circuitry is configured to instruct the triangle of emitters to emit the infrared light and to establish the common origin for the plurality of tracking cameras based on the image data generated by the plurality of tracking cameras while the triangle of emitters emits the infrared light.
13. The dynamic projection mapping system of claim 8, wherein the calibration tool comprises a handle portion configured to be grasped by an operator.
14. The dynamic projection mapping system of claim 8, wherein the plurality of tracking cameras are configured to track a prop based on detection of one or more trackers coupled to the prop, and the projector is configured to project an image onto the prop as the prop moves through the environment.
15. The dynamic projection mapping system of claim 8, comprising:
a prop configured to move through an environment; and
one or more trackers on the prop, wherein the plurality of tracking cameras are configured to generate additional image data representative of the one or more trackers in the environment based on detection of the one or more trackers on the prop by the plurality of tracking cameras;
wherein the processing circuitry is configured to determine a position of the prop relative to the common origin based on the additional image data and instruct the projector to project the visible light onto the prop based on the position of the prop relative to the common origin.
16. A method of operating a projection system and an optical tracking system for dynamic projection mapping, the method comprising:
instructing, via processing circuitry, a set of emitters of a calibration tool to emit light in an environment;
receiving, from a plurality of tracking cameras and at the processing circuitry, image data indicative of a respective location of each emitter of the set of emitters in the environment;
instructing, via the processing circuitry, a projector to project visible light into the environment;
receiving, from a sensor of the calibration tool and at the processing circuitry, sensor data indicative of the visible light detected by the sensor; and
establishing, via the processing circuitry, a common origin in the environment for the plurality of tracking cameras and the projector based on the image data and the sensor data.
17. The method of claim 16, comprising:
instructing, via the processing circuitry, at least three emitters of the calibration tool arranged in a row to emit respective light in the environment;
receiving, from the plurality of tracking cameras and at the processing circuitry, initial image data indicative of a respective location of each of the at least three emitters in the row in the environment; and
calibrating, via the processing circuitry, the plurality of tracking cameras to one another based on the initial image data.
18. The method of claim 17, comprising calibrating the plurality of tracking cameras to one another based on the initial image data before instructing the set of emitters of the calibration tool to emit the light in the environment.
19. The method of claim 16, comprising:
tracking, via the plurality of tracking cameras and the processing circuitry, a prop in the environment relative to the common origin based on detection of one or more trackers coupled to the prop; and
instructing, via the processing circuitry, the projector to project an image onto the prop based on a position of the prop relative to the common origin as the prop moves through the environment.
20. The method of claim 16, wherein the set of emitters comprises three emitters arranged to form points of a triangle.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---
US202263350301P | 2022-06-08 | 2022-06-08 | 
US63/350301 | 2022-06-08 | | 
US18/206,549 US20230403381A1 (en) | 2022-06-08 | 2023-06-06 | Calibration systems and methods for dynamic projection mapping
US18/206549 | 2023-06-06 | | 
PCT/US2023/024730 WO2023239802A1 (en) | 2022-06-08 | 2023-06-07 | Calibration systems and methods for dynamic projection mapping
Publications (1)
Publication Number | Publication Date |
---|---|
CN119325704A true CN119325704A (en) | 2025-01-17 |
Family
ID=87070968
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202380045516.5A Pending CN119325704A (en) | 2022-06-08 | 2023-06-07 | Calibration system and method for dynamic projection mapping |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP4537525A1 (en) |
JP (1) | JP2025522348A (en) |
KR (1) | KR20250022131A (en) |
CN (1) | CN119325704A (en) |
WO (1) | WO2023239802A1 (en) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7352239B2 (en) * | 2019-06-10 | 2023-09-28 | 株式会社アブストラクトエンジン | Projection system, projection control device, projection control program, and projection system control method |
US11758100B2 (en) * | 2019-09-11 | 2023-09-12 | The Johns Hopkins University | Portable projection mapping device and projection mapping system |
WO2022216913A1 (en) * | 2021-04-09 | 2022-10-13 | Universal City Studios Llc | Systems and methods for dynamic projection mapping for animated figures |
-
2023
- 2023-06-07 KR KR1020257000436A patent/KR20250022131A/en active Pending
- 2023-06-07 CN CN202380045516.5A patent/CN119325704A/en active Pending
- 2023-06-07 WO PCT/US2023/024730 patent/WO2023239802A1/en active Application Filing
- 2023-06-07 JP JP2024571877A patent/JP2025522348A/en active Pending
- 2023-06-07 EP EP23736540.8A patent/EP4537525A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2023239802A1 (en) | 2023-12-14 |
EP4537525A1 (en) | 2025-04-16 |
KR20250022131A (en) | 2025-02-14 |
JP2025522348A (en) | 2025-07-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7639004B2 (en) | Systems and methods for optical performance capture animated figures using real-time responsive projected media | |
US20220323874A1 (en) | Systems and methods for dynamic projection mapping for animated figures | |
US7576727B2 (en) | Interactive directed light/sound system | |
JP7482881B2 (en) | Augmented reality systems for recreational vehicles | |
WO2004055776A1 (en) | Interactive directed light/sound system | |
WO2022216913A1 (en) | Systems and methods for dynamic projection mapping for animated figures | |
KR102457608B1 (en) | Method and control device for operating a headset for virtual reality in a vehicle | |
CN113711162A (en) | System and method for robotic interaction in mixed reality applications | |
US20250005837A1 (en) | Systems and methods for animated figure display | |
US20230403381A1 (en) | Calibration systems and methods for dynamic projection mapping | |
CN119325704A (en) | Calibration system and method for dynamic projection mapping | |
KR102124564B1 (en) | Apparatus and Method For Image Processing Based on Position of Moving Light Source in Augmented Reality | |
US20250142029A1 (en) | Systems and methods for projection mapping onto multiple rigid bodies | |
CN114902163A (en) | Information processing apparatus, information processing method, and program | |
US20240350939A1 (en) | Augmented reality display with adjustable parallax | |
US12017137B2 (en) | Device including plurality of markers | |
WO2024226309A1 (en) | Augmented reality display with adjustable parallax | |
HK40079197A (en) | Systems and methods for optical performance captured animated figure with real-time reactive projected media | |
JPH1185385A (en) | Coordinate detection system and light emission body | |
JP2024523410A (en) | System and method for projection mapping of attraction systems | |
Daphalapurkar et al. | Motion capture for human-centered simulation using kinects | |
HK40057794A (en) | Augmented reality system for an amusement ride |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |