GB2640117A - Computer-implemented method and system for controlling real fixtures - Google Patents

Computer-implemented method and system for controlling real fixtures

Info

Publication number
GB2640117A
GB2640117A GB2403411.8A GB202403411A
Authority
GB
United Kingdom
Prior art keywords
fixture
real
virtual
computer
game engine
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2403411.8A
Other versions
GB202403411D0 (en)
Inventor
O'flynn Gaynor
Harvey Carlo
John Rothwell Nicholas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beinghuman Ltd
Original Assignee
Beinghuman Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beinghuman Ltd filed Critical Beinghuman Ltd
Priority to GB2403411.8A priority Critical patent/GB2640117A/en
Publication of GB202403411D0 publication Critical patent/GB202403411D0/en
Priority to PCT/GB2025/050467 priority patent/WO2025186582A1/en
Publication of GB2640117A publication Critical patent/GB2640117A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/165 Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/155 Coordinated control of two or more light sources
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/175 Controlling the light source by remote control
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)
  • Pinball Game Machines (AREA)

Abstract

A computer-implemented method of controlling with a game engine a real fixture, such as a lighting fixture 44, in a real space 30, which method comprises the steps of: (a) providing a virtual fixture 54 within a virtual space 50 in the game engine, said virtual fixture corresponding to said real fixture and which comprises a virtual parameter that is adjustable to change an effect provided by the virtual fixture within the virtual space in the game engine; (b) with the game engine, playing a programmed sequence of events, which comprises at least one adjustment in said virtual parameter from a first value to a second value; (c) determining whether said at least one virtual parameter is adjusted as the sequence of events is executed and, if so, outputting from the game engine an indication comprising an identity of the virtual fixture and said second virtual parameter value; (d) obtaining, using said indication, a real fixture identity corresponding to said virtual fixture identity and an updated real parameter value corresponding to said second virtual parameter value; (e) generating a reformatted indication comprising said real fixture identity and said updated real parameter value; (f) transmitting said reformatted indication towards said real fixture; and said real fixture adjusting said real parameter value to said updated real parameter value according to said reformatted indication; whereby, as said sequence of events is played by the game engine, the effect provided by the real fixture in the real space imitates changes to the effect provided by the virtual fixture in the virtual space.

Description

COMPUTER-IMPLEMENTED METHOD AND SYSTEM FOR CONTROLLING REAL
FIXTURES
TECHNICAL FIELD
This invention relates to a computer-implemented method of controlling with a game engine a real fixture in a real space, to a computer-implemented method of providing effects in a production environment, to various computer programs, computer program products, computer readable data carriers, and to a game engine.
BACKGROUND
Production environments rely on an array of different technologies and technical skills behind the scenes to set up, configure and operate lighting rigs, sound systems, stage sets, displays, digital media assets, projectors, screens, pyrotechnics, smoke machines, etc. to provide the required effects within the production environment during a show. The effects are generally intended to be experienced live by an audience and/or the sound and visuals recorded for viewing later. Examples of such production environments include a film or television studio or set, an indoor or outdoor theatre, a concert venue, a stadium, a cinema, an immersive show or experience, a museum, an art gallery, a building interior or exterior, to name but a few.
Physical devices and objects to be controlled at a venue of the show are often referred to as 'fixtures'. At the venue, there are often many different types of fixtures to be controlled to produce the necessary effects during the show. Often these fixtures are controlled by a variety of different control devices and utilise a variety of different communication applications and protocols, such as DMX, OSC, USB, Art-Net, Ethernet/Wi-Fi, lighting desks, QLab, L-Acoustics, Isadora, etc. Furthermore, for a show that is to be performed at different venues (for example as part of a tour), the number and type of fixtures and their associated control devices often changes from venue to venue. This makes it challenging from the technical perspective to create the same effects for the same show at different venues.
The effects to be provided during a show are almost always designed in advance, long before show time. Many shows are designed at least in part using various bespoke computer packages which allow the show to be pre-visualised on a computer screen.
Owing to the differing size and shape of venues, the wide range of technical skills required to design and implement a show, and the variety of fixtures and control devices available at different venues, planning a show often requires input from a multi-disciplinary technical team. Often people in the technical team come from different organizations and are in different geographical locations.
Even if a show is designed and pre-visualised using a computer package, there remain significant difficulties transferring the show from the virtual space into the real space.
All of this increases the difficulty of the task of designing, pre-visualising and then implementing the show in a venue, whether the venue is well known for putting on shows (such as the Royal Opera House, for example), or is not used on a day-to-day basis for shows but is being used to put on a temporary show (such as a building or landmark).
SUMMARY OF THE INVENTION
According to some embodiments there is provided a computer-implemented method of controlling with a game engine a real fixture in a real space. The real fixture may comprise a real fixture identity and at least one real parameter that is adjustable to change an effect provided by the real fixture within the real space. The method may comprise the step of providing a virtual fixture within a virtual space in the game engine. The virtual fixture may correspond to said real fixture and may comprise a virtual fixture identity and at least one virtual parameter that is adjustable to change an effect provided by the virtual fixture within the virtual space in the game engine. The method may comprise the step of playing, with the game engine, a sequence of events programmed in the game engine. The sequence of events may comprise at least one adjustment in said at least one virtual parameter from a first virtual parameter value to a second virtual parameter value. The method may comprise the step of determining whether said at least one virtual parameter is adjusted as the sequence of events is executed and, if so, outputting from the game engine an indication comprising said virtual fixture identity and said second virtual parameter value. The method may comprise the step of obtaining, using said indication, a real fixture identity corresponding to said virtual fixture identity and an updated real parameter value corresponding to said second virtual parameter value. The method may comprise the step of generating a reformatted indication comprising said real fixture identity and said updated real parameter value. The method may comprise the step of transmitting said reformatted indication towards said real fixture. The method may comprise the step of said real fixture adjusting said real parameter value to said updated real parameter value according to said reformatted indication.
As said sequence of events is played by the game engine, the effect provided by the real fixture in the real space may imitate changes to the effect provided by the virtual fixture in the virtual space.
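The end-to-end flow of the determining, obtaining, reformatting and transmitting steps can be sketched as follows. This is purely illustrative: the names (`VIRTUAL_TO_REAL`, `on_parameter_adjusted`, `transmit`) and the assumption of a 0.0 to 1.0 virtual value range scaled to 0 to 255 are hypothetical choices, not prescribed by the specification.

```python
# Illustrative sketch of the indication pipeline. All names and the
# 0.0-1.0 virtual value range are hypothetical assumptions.

# Lookup used by the "obtaining" step:
# virtual fixture identity -> (real fixture identity, value converter).
VIRTUAL_TO_REAL = {
    "virtual_light_1": ("real_light_44", lambda v: round(v * 255)),
}

def on_parameter_adjusted(virtual_id, second_value):
    """Consume an indication output by the game engine and build the
    reformatted indication for the corresponding real fixture."""
    real_id, convert = VIRTUAL_TO_REAL[virtual_id]                # obtain
    return {"fixture": real_id, "value": convert(second_value)}  # reformat

def transmit(reformatted):
    """Stand-in for transmission of the reformatted indication
    towards the real fixture."""
    return "send -> {fixture}: {value}".format(**reformatted)
```

In this sketch the game engine side only needs to emit `(virtual_id, second_value)` pairs; everything protocol-specific lives behind the mapping, which is the property that later makes the virtual show venue-agnostic.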
In some embodiments, the method may further comprise the step of providing a virtual space object within the virtual space in the game engine. The virtual space object may correspond to said real space. The at least one virtual fixture may be positioned in substantially the same position in the virtual space object as the real fixture in the real space.
In some embodiments, said virtual space object may comprise a digital twin of said real space.
In some embodiments the virtual fixture may comprise a virtual model of said corresponding real fixture.
In some embodiments the method may comprise the step of providing a plurality of virtual fixtures in the virtual space, each of which corresponds to a respective real fixture in the real space. The method may comprise tagging each of said plurality of virtual fixtures. The step of determining may further comprise checking each tagged virtual fixture for adjustment of said at least one virtual parameter as the sequence of programmed events is played by the game engine.
In some embodiments said playing of the sequence of events may comprise generating a series of frames within the game engine and said step of checking may comprise comparing at least two frames of said series to determine whether said adjustment of said at least one virtual parameter has occurred. The at least two frames may be adjacent frames in said series, such as adjoining frames.
In some embodiments the method may further comprise the step of storing, for each tagged virtual fixture, the last known first virtual parameter value for use in said step of determining.
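A minimal sketch of this frame-by-frame change detection might look as follows, assuming (hypothetically) that each frame exposes the tagged fixtures' parameter values as a dictionary; only parameters whose value differs from the stored last known value yield an output indication.

```python
# Hypothetical sketch of per-frame change detection for tagged virtual
# fixtures. The frame_state shape is an illustrative assumption.

last_known = {}  # (fixture_id, parameter) -> last known value

def check_tagged_fixtures(frame_state):
    """frame_state: {fixture_id: {parameter: value}} for the current frame.
    Returns an indication for every parameter adjusted since the last frame."""
    indications = []
    for fixture_id, params in frame_state.items():
        for name, value in params.items():
            key = (fixture_id, name)
            if last_known.get(key) != value:   # adjustment detected
                last_known[key] = value        # store for the next comparison
                indications.append((fixture_id, name, value))
    return indications
```

Because the last known value is stored per tagged fixture, an unchanged parameter produces no traffic at all, which keeps the output to the real space proportional to what actually changes in the show.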
In some embodiments said at least one virtual parameter may comprise a plurality of virtual parameters of said virtual fixture. The step of outputting an indication from the game engine may comprise generating a respective indication for each virtual parameter of said plurality of virtual parameters that is determined to be adjusted.
In some embodiments, said indication may comprise at least one datagram comprising said virtual fixture identity and said second virtual parameter value. The at least one datagram may be constructed in accordance with the Open Sound Control (OSC) protocol.
In some embodiments the method may further comprise the step of parsing said indication to identify said virtual fixture identity and said second virtual parameter value.
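Assuming the indication is carried as an OSC datagram as described above, encoding and parsing might be sketched like this. The `/fixture/<id>` address pattern is a hypothetical convention, and only the minimal OSC framing (NUL-padded address, `,f` type tag, big-endian float32) is shown.

```python
# Illustrative OSC-style encoding/parsing of an indication. The address
# convention is an assumption; only a single float argument is handled.
import struct

def _pad(b):
    # OSC strings are NUL-terminated and padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def encode_indication(virtual_id, value):
    """Pack the virtual fixture identity and second parameter value as a
    minimal OSC message: address pattern + ',f' type tag + float32."""
    address = _pad(f"/fixture/{virtual_id}".encode())
    return address + _pad(b",f") + struct.pack(">f", value)

def parse_indication(datagram):
    """Recover (virtual_id, value) from a datagram built by encode_indication."""
    end = datagram.index(b"\x00")
    address = datagram[:end].decode()
    virtual_id = address.rsplit("/", 1)[-1]
    value = struct.unpack(">f", datagram[-4:])[0]
    return virtual_id, value
```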
In some embodiments the method may further comprise the step of using said virtual fixture identity to lookup a type of reformatting of said indication. The lookup may utilise a mapping between said virtual fixture identity and a real fixture production control protocol indicating the type of reformatting required.
In some embodiments the step of generating a reformatted indication may comprise generating at least one message in a format complying with said real fixture production control protocol. The at least one message may comprise data representing said real fixture identity and said updated real parameter value.
In some embodiments said mapping may further comprise a fixture specification of the real fixture corresponding to said virtual fixture identity.
In some embodiments said real fixture production control protocol may comprise a DMX-type protocol and a DMX base address of the real fixture. The method may further comprise using said DMX base address to convert said virtual fixture identity to a DMX channel and to convert the second virtual parameter value to a DMX parameter value. The method may further comprise including the DMX channel and the DMX parameter value in said at least one message as said data representing said real fixture identity and said updated real parameter value.
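The DMX conversion described above might be sketched as follows; the parameter-to-channel offsets in `FIXTURE_SPEC` are hypothetical, standing in for the fixture specification obtained from the mapping, and a 0.0 to 1.0 virtual value is assumed.

```python
# Hypothetical DMX conversion: the fixture specification maps each
# parameter to a channel offset from the fixture's DMX base address,
# and the virtual value is scaled and clamped to the 0-255 DMX range.

FIXTURE_SPEC = {"dimmer": 0, "pan": 1, "tilt": 2}  # illustrative offsets

def to_dmx(base_address, parameter, virtual_value):
    """Return (DMX channel, DMX parameter value) for one adjustment."""
    channel = base_address + FIXTURE_SPEC[parameter]
    dmx_value = max(0, min(255, round(virtual_value * 255)))
    return channel, dmx_value
```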
In some embodiments the real fixture production control protocol may comprise configuration information about an external API. The method may further comprise using the configuration information about the external API to generate said at least one message in a format complying with the external API, said at least one message comprising data representing said real fixture identity and said updated real parameter value.
In some embodiments the configuration information about said external API may comprise information for a QLab API, L-Acoustics API or Isadora API.
In some embodiments the method may further comprise the step of selecting from a plurality of reformatting options a reformatting option for said indication. The reformatting options may comprise (i) no reformatting, and transmitting said indication substantially in the form output from the game engine. The reformatting options may comprise (ii) reformatting said indication.
In some embodiments the method may further comprise selecting option (i) when said virtual fixture identity does not map to a real fixture production control protocol. The method may comprise selecting option (ii) when said virtual fixture identity does map to a real fixture production control protocol.
In some embodiments the method may further comprise the step of selecting option (i) for indications output from the game engine that comprise media metadata and/or media cues.
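The selection between options (i) and (ii) might be sketched as follows, with `PROTOCOL_MAP` standing in (hypothetically) for the mapping from virtual fixture identity to a real fixture production control protocol; unmapped indications, such as media cues, pass through unchanged.

```python
# Illustrative routing between option (i) (pass-through) and option (ii)
# (reformat). PROTOCOL_MAP is a hypothetical stand-in for the mapping.

PROTOCOL_MAP = {"virtual_light_1": "dmx"}

def route(indication):
    """indication: {'fixture': virtual_id, 'value': second_value}."""
    protocol = PROTOCOL_MAP.get(indication["fixture"])
    if protocol is None:
        return indication                         # option (i): unchanged
    return {"protocol": protocol, **indication}   # option (ii): reformat
```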
In some embodiments the method may comprise the step of, as each indication is received from the game engine, checking a control status for the virtual fixture identity in said indication, which control status indicates whether said real fixture is controlled by the indications output by the game engine or is not controlled by the indications output by the game engine. The method may comprise dropping said indication received from the game engine if said control status indicates that said real fixture is not controlled by the game engine.
In some embodiments said control status may indicate that said real fixture is not controlled by the game engine. The method may further comprise the step of receiving an external parameter value, for example an external parameter value input by a user through a control device, which external parameter value overrides the updated real fixture parameter value from the game engine. The method may further comprise using a representation of said external parameter value in place of said updated real fixture parameter value in said reformatted indication.
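The control-status check and external override might be sketched as follows; the status labels and the `external_values` store (e.g. values punched in by an operator through a control device) are illustrative assumptions.

```python
# Illustrative control-status check: indications from the game engine are
# dropped for externally controlled fixtures, and an external override
# value is used in their place. All names are hypothetical.

control_status = {"real_light_44": "game_engine", "real_light_45": "external"}
external_values = {"real_light_45": 200}  # e.g. punched in by an operator

def resolve_value(real_id, engine_value):
    """Return the value to transmit, or None to drop the indication."""
    if control_status.get(real_id) == "game_engine":
        return engine_value
    # Fixture is under external control: drop the engine indication and
    # substitute the external override where one has been supplied.
    return external_values.get(real_id)
```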
In some embodiments the method may further comprise the step of mapping said external parameter value to said real fixture identity and said at least one real parameter.
In some embodiments the external parameter value may comprise at least one of a MIDI CC number, a MIDI controller value, and a MIDI channel.
In some embodiments the method may further comprise the step of, if said control status indicates that said real fixture is controlled by the game engine, adjusting said updated real fixture parameter value before transmitting it in said reformatted indication.
In some embodiments the control status may indicate that said real fixture is controlled by the game engine. The method may further comprise receiving an adjustment value, for example an adjustment value input by a user through a control device, which adjustment value indicates an adjustment to said at least one real parameter value of the real fixture. The method may comprise using said adjustment value to change said updated real fixture parameter value before transmitting it in said reformatted indication.
In some embodiments said adjustment value may represent a correction to be applied to said at least one parameter value, for example to correct for a misalignment between the spatial orientation of said virtual fixture in said virtual space and the spatial orientation of said real fixture in said real space.
In some embodiments the effect provided by said real fixture, and by said virtual fixture corresponding to said real fixture, may comprise an effect used in a live production environment including, but not limited to, any one or more of the following: light, sound, audio, still images, moving images, smoke, motion, digital display.
In some embodiments the method may further comprise the step of providing a user-adjustable trim function that enables the user to change a maximum and/or minimum value of said real parameter value that will be transmitted to the real fixture.
In some embodiments the method may further comprise the step of dropping any indication comprising said second virtual parameter value that corresponds to a value greater than said maximum value or less than said minimum value of said real parameter value.
In some embodiments the method may further comprise the step of providing a user-adjustable shift and/or scale function that enables the user to apply a shift and/or a scale factor to a fixture range of said real parameter value.
In some embodiments the method may further comprise the step of applying a fixed shift to said second virtual parameter value and/or multiplying said second virtual parameter value by said scale factor during said reformatting step.
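The shift, scale and trim functions described above might be combined as in the following sketch, where a value falling outside the user-set trim limits causes the indication to be dropped (returned as `None`); the default 0.0 to 1.0 range is an assumption.

```python
# Hypothetical combination of the user-adjustable shift/scale and trim
# functions: the value is shifted and scaled during reformatting, then
# dropped if it falls outside the trim limits.

def shift_scale_trim(value, shift=0.0, scale=1.0, trim_min=0.0, trim_max=1.0):
    adjusted = value * scale + shift
    if adjusted < trim_min or adjusted > trim_max:
        return None  # indication dropped by the trim function
    return adjusted
```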
In some embodiments each real fixture may comprise a communication interface allowing that real fixture to receive digital communications. The real fixture may be adapted to act on instructions contained in those communications. The real fixture may comprise a computer-controlled apparatus for implementing those instructions as a real-world effect.
In some embodiments the method may comprise the step of receiving an external indication for controlling a change to an effect provided by a real fixture. The external indication may be received from software, a system or a device external of said game engine. The method may further comprise the step of processing said external indication to generate an adapted indication suitable for controlling a real fixture using a production control protocol, such as DMX. The method may further comprise the step of transmitting said adapted indication toward said real fixture.
According to some embodiments there is provided a computer-implemented method of providing effects in a production environment. The production environment may comprise real fixtures in a real space in which the effects are intended to be experienced by one or more of the human senses and/or recorded by audio-visual recording equipment. The method may comprise any of the steps of the computer-implemented method as set out above or as described or as claimed anywhere herein. In this way the effects in the real space may be controlled by the sequence of events programmed in the game engine.
In some embodiments the production environment may comprise a film or television studio or set, an indoor or outdoor theatre, a concert venue, a stadium, a cinema, an immersive show or experience, a museum, an art gallery, or a building interior or exterior.
According to some embodiments there is provided a computer program comprising instructions which, when the program is executed by a computing device, cause the computing device to perform the step of providing a game engine on said computing device. The computing device may be caused to provide a virtual fixture within a virtual space in the game engine. The virtual fixture may correspond to said real fixture. The virtual fixture may comprise a virtual fixture identity and at least one virtual parameter that is adjustable to change an effect provided by the virtual fixture within the virtual space in the game engine. The game engine may be used to play a sequence of events programmed in the game engine, which sequence of events comprises at least one adjustment in said at least one virtual parameter from a first virtual parameter value to a second virtual parameter value. The computer program may cause the game engine to determine whether said at least one virtual parameter is adjusted as the sequence of events is executed and, if so, output from the game engine an indication comprising said virtual fixture identity and said second virtual parameter value.
In some embodiments the computer program may further comprise instructions which, when the program is executed by the computing device, cause the computing device to perform any of the computer program steps described above, or as described or claimed anywhere herein.
According to some embodiments there is provided a computer program product comprising a computer program as set out above, or as described or claimed anywhere herein.
The computer program product may be in the form of a plugin for a game engine.
According to some embodiments there is provided a computer-readable data carrier having stored thereon a computer program product as described above, or as described or as claimed anywhere herein.
According to some embodiments there is provided a data carrier signal carrying the computer program product set out above, or as described or as claimed anywhere herein.
According to some embodiments there is provided a game engine comprising the instructions of the computer program as set out above, or as described or as claimed anywhere herein.
According to some embodiments there is provided a computer program for facilitating performance of the computer-implemented method as set out above, which computer program comprises instructions which, when the program is executed by a computing device, cause the computing device to perform steps (d), (e) and (f) of claim 1.
In some embodiments said program may further comprise instructions which, when the program is executed by the computing device, cause the computing device to perform the steps of any of claims 10 to 33.
According to some embodiments there is provided a computer program product comprising a computer program as set out above, or as described or as claimed anywhere herein.
In some embodiments the computer program product may be in the form of an app downloadable from an app store.
According to some embodiments there is provided a computer-readable data carrier having stored thereon the computer program product of claim 45 or 46.
According to some embodiments there is provided a data carrier signal carrying the computer program product as set out above, or as described or as claimed anywhere herein.
According to some embodiments there is provided a computer-implemented method and/or computer program. The computer-implemented method and/or computer program may be adapted to facilitate planning and visualisation of a show in a virtual space with virtual fixtures. The virtual space may be provided by a game engine. A sequence of events may be programmed into the virtual space that cause changes to the virtual fixtures with time (for example lights moving and changing colour intensity, and images displayed on a virtual screen). The pre-programmed sequence of events may make up at least part of the show.
The sequence of events may be pre-visualised in the virtual space by 'playing' and watching the sequence of events in the virtual space. The computer-implemented method and/or computer program may be adapted so that, at show time, the show is again 'played' in the virtual space, and the virtual space provides an output that can be used to control corresponding real fixtures in a real space to provide substantially the same show seen in the virtual space. The output provided by the virtual space may be in the form of an indication as each parameter of each virtual fixture changes. These indications may be further processed to reformat them into messages that conform to the necessary communication protocol. The messages may be transmitted directly to the real fixtures, and/or to external control software (e.g. QLab) or devices (e.g. lighting desks), to make a corresponding change in the effect provided by the real fixtures.
Throughout the description and claims of this specification, the words "comprise" and "contain" and variations of the words, for example "comprising" and "comprises", mean "including but not limited to", and do not exclude other components, integers or steps. Moreover the singular encompasses the plural unless the context otherwise requires: in particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless the context requires otherwise.
Preferred features of each aspect of the invention may be as described in connection with any of the other aspects. Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which: Figure 1 is a schematic block diagram of a computer-implemented system according to some embodiments.
Figure 2 is a schematic block diagram illustrating further aspects of the computer-implemented system of Figure 1.
Figure 3 is a screenshot of a scene within a game engine.
Figure 4 is a block diagram of steps of a computer-implemented method according to some embodiments.
Figure 5A is an illustration of a fixture specification library; Figure 5B is a schematic illustration of a project-specific map utilised in at least some embodiments.
Figure 6 is a block diagram of an embodiment of a computer-implemented system for controlling a fixture.
Figures 6A-6G are screenshots illustrating aspects of the computer-implemented system of Fig. 6.
Figures 6H and 6I are views of the computer-implemented system of Fig. 6 during use.
DETAILED DESCRIPTION
In some embodiments, a computer-implemented method and/or computer program is provided in which a show may be planned and visualised in a virtual space with virtual fixtures (such as a game engine). A sequence of events may be programmed into the virtual space that cause changes to the virtual fixtures with time (for example lights moving and changing colour intensity, and images displayed on a virtual screen). This preprogrammed sequence of events makes up at least part of the show and can be pre-visualised in the virtual space by 'playing' and watching the sequence of events in the virtual space. At show time, the show is again 'played' in the virtual space, and the virtual space provides an output that can be used to control corresponding real fixtures in a real space to provide substantially the same show seen in the virtual space. The output provided by the virtual space may be in the form of an indication as each parameter of each virtual fixture changes. These indications may be further processed to reformat them into messages that conform to the necessary communication protocol. The messages may be transmitted directly to the real fixtures, and/or to external control software (e.g. QLab) or devices (e.g. lighting desks), to make a corresponding change in the effect provided by the real fixtures.
In this way, the planning of the show in the virtual space can be substantially agnostic about the communication protocols, external devices, etc. that are required in the real world to implement the show. Furthermore, even if these communication protocols and devices are different at a new venue, the same show (i.e. the pre-programmed sequence of events) can be 'played' in the virtual space and the output indications from the virtual space reformatted into messages suitable for controlling real devices at the new venue.
In some embodiments virtual fixtures in a virtual space may communicate (for example, indirectly) with real fixtures in real space. For each real fixture in the real space there may be a corresponding virtual fixture in the virtual space. The real fixtures may utilise heterogeneous communication protocols, for example different industry-standard communication protocols (such as DMX and Open Sound Control). That is, one or more of the real fixtures may utilise a first communication protocol (for example DMX) and one or more other real fixtures may utilise a second communication protocol (for example Open Sound Control, OSC) that is different to the first communication protocol.
In the virtual space the virtual fixtures do not need to use different communication protocols and are controlled instead by the underlying computer system providing the virtual space. Sequencing of events of each virtual device can be programmed inside the virtual space, for example using a sequencer of a game engine. The sequence of events can be visualised in the virtual space, for example by utilising the rendering capabilities of a game engine during playback. An advantage of some embodiments is that a show designer can focus on the design of the show and the necessary effects and can then pre-visualise the show with a reduced need to check technical requirements of implementing the show in a real space.
In some embodiments, it may be possible to override or 'punch in' different effects to the real world to those occurring during playback of the show in the virtual space which would otherwise be implemented in the real space. This may be achieved by dropping indications output from the virtual space when it is indicated that control is taken over by an external device. In some embodiments, additionally or alternatively, it may be possible to 'trim' indications output from the virtual space. For example, it may be possible to increase or decrease a value by a certain amount, percentage, etc. This trim could be adjusted by the user at the real space, for example to account for differences between the virtual space and the real space. The trimming may be performed within the virtual space, or outside the virtual space (for example by a computer program).
The computer-implemented method and/or computer program may facilitate bi-directional communication between the virtual space and the real space. For example, the computer-implemented method and/or computer program may facilitate control and/or update of devices in the real space based on changes in the virtual space and may facilitate control and/or update of virtual devices in the virtual space based on changes in the real space.
In some embodiments, the virtual space may comprise a model of the real space. This model is sometimes called a digital twin. The virtual devices may themselves be models, or digital twins, of the real devices in the real space. The real space may be a production environment, in which audio and/or visual effects may be required. For example, the production environment could be any space, indoor or outdoor, to be used for a show, film, concert, art exhibition, immersive experience, film set, etc. The production environment could be a live production environment where an audience is present or could be a production environment where a performance is recorded, such as a film set or television studio. The real space may be a home environment, office environment, etc. Further detail of these and other aspects is described in the following.
Fig. 1 is a schematic view of a system according to some embodiments. In Fig. 1, a computing device 10 comprises a memory 40 and a processor 50. The computing device may be a hand-held personal computing device, such as a smartphone, laptop, or tablet. In other aspects the computing device could be a cloud-based computing device or a remote server. The memory 40 stores computer executable instructions that, when executed by the processor 50, facilitate the aforementioned communication between virtual devices in a virtual space 20 and real devices in a real space 30, as will be described in greater detail below. Although Fig. 1 shows a single processor, it will be understood that the invention can be applied in various different computing systems and environments, including those with more than one processor, and computer systems comprising at least one CPU and at least one GPU. Furthermore the functionality of the computer-executable instructions may be distributed amongst various different computing devices, as will be apparent in the discussion below of various embodiments.
In some embodiments, the virtual space 20 may be provided by a game engine installed on the computing device 10. A game engine may be a software framework or platform designed to ease creation and development of video games and interactive 3D applications.
Many game engines are available, but currently popular game engines include Unreal Engine (https://www.unrealengine.com/) by Epic Games, Unity (https://unity.com/), CryEngine (https://www.cryengine.com/) and Godot (https://godotengine.org/). The game engine is generally provided as a downloadable application from a website of the game engine provider, that enables the user to install the game engine on a computing device. Once installed, a user may configure the game engine and install other software components (also known as plugins) to extend its functionality, amongst other things.
Although embodiments and examples of the invention are described in the context of Unreal Engine, it is to be noted that the invention is not limited to the use of Unreal Engine. Any other game engine that would enable the functionality described herein could be used in place of Unreal Engine.
A game engine generally comprises a core set of functionalities, including a graphics rendering engine, a physics engine, an audio engine, and scripting and programming facilities, such as a visual script editor. The virtual space 20 corresponding to the real space 30 may be set up as a new project in the game engine and may use a template provided within the game engine. For example, in Unreal Engine 5.3 there are various templates provided under the category 'Film, Television and Live Events'. A user of the computing device 10 may use such a template to create a new project representing the real space 30, or alternatively may create a new project from scratch. To establish in the game engine a digital twin of the real space where a production is to take place, a user may take a LIDAR scan of the real space and import it into the game engine as a 3D object within a scene (a scene is typically a self-contained portion of a project in the game engine). One way to obtain a LIDAR scan is to use an iPad® or iPhone® equipped with a LIDAR sensor and an appropriate app installed (e.g. Polycam) to capture a scan of the real space and then import it into the game engine. There are other ways of establishing a digital twin of the real space, including building it from scratch within the game engine, importing 3D objects from software such as Blender (https://www.blender.org/), and scanning the real space using a digital built environment scanner such as a Matterport (https://matterport.com/) or FARO (https://www.faro.com/) system.
Referring to Fig. 2, a more detailed example of the system of Fig. 1 is illustrated schematically. In this example, the real space 30 is a theatre 42 in which there are various real fixtures including lighting fixtures 44, speakers 46 and a projector 48. Objects that are positioned within the real space 30 to produce a real-world effect are generally referred to herein as real fixtures. In some embodiments the real-world effect may include audio and/or visual effects. Other real-world effects are envisaged including, without limitation, motion, smoke, fog, haze, pyrotechnics. It is to be noted that the meaning of real fixtures is not limited to the examples given above and shown in the Figures. In general, any real fixture that: (1) comprises a communication interface (for example a network interface) allowing it to receive digital communications; (2) can act on instructions contained in those communications; (3) comprises some computer-controlled apparatus for implementing those instructions as a real-world effect; and (4) can be represented inside the virtual world 20 as a virtual object; can be used in at least some embodiments of the invention. Some examples of other types of real fixtures that can be used in the real space 30 include microphones, fog and haze machines, displays (e.g. LED video walls), camera rigs, camera equipment, rigging (e.g. trusses, motors, hoists), pyrotechnic devices, and motors for moving parts of a stage set.
Some of the real fixtures (in this example the lighting fixtures 44) are controlled in the real space 30 by another device (such as a lighting desk utilising the DMX protocol), or a piece of software (such as QLab, https://qlab.app/) represented as control device 43 in Fig. 2.
For example, QLab may communicate DMX data using the Art-Net communication protocol (https://art-net.org.uk/) over Ethernet cabling (or a wireless network) to a so-called 'Node' 45 that converts Art-Net data to and from DMX data for the lighting fixture 44, and also over USB to a USB-to-DMX converter 47 for another lighting fixture 44. Additionally in this example, QLab communicates directly with another lighting fixture 44 using the DMX communication protocol, over DMX cabling. QLab may also provide audio and video output for the speakers 46 and projector 48. A media source 49 may provide audio and/or video data to the control device 43 for cueing into the live performance (i.e., the show). The media source 49 could be a computer memory storing appropriate files, and this computer memory is accessible to the control device 43. It does not need to be part of the control device 43, and could be remote, such as on a server, or stored in the cloud. The media source may also take the form of a live feed from other devices, such as microphones and video cameras.
A user wishes to prepare a sequence of events (e.g. show control cues) to be used in a live performance at the theatre. The sequence of control cues determines when and how the fixtures in the real space 30 are orchestrated to provide effects for the show, often involving visual and/or audio effects, i.e., in general, to provide real-world effects. To achieve that, the user turns to the virtual space 20 on the computing device 10.
The purpose of the virtual space 20 is to simulate the real space 30 including the real fixtures and their positions. An example of the physical arrangement of these fixtures is shown schematically in Fig. 2, but the actual positions of real fixtures should be the same, or close to the same, in the virtual space 20.
In Fig. 2 the virtual space 20 is shown separately from the computing device 10 to aid understanding. However, as shown in the embodiment of Fig. 1, the virtual space 20 is operated within the computing device 10. The virtual space 20 is created, operated and controlled by a game engine 60 that is shown schematically within the computing device 10.
The virtual space 20 comprises a scene 50, shown schematically. The scene 50 has been created within the game engine 60 to correspond to the real space 30. As described above, the scene 50 could comprise a LIDAR scan of the real space 30 that is imported into the virtual space 20 as a 3D object. As such the scene 50 can be considered a digital twin of the real space 30 that has generally corresponding features. Fig. 3 shows an example scene 50 created in Unreal Engine, from which it is apparent that there is a stage (created as a 3D object from a LIDAR scan), with lighting fixtures and sound fixtures amongst other things positioned relative to the 3D object. Referring again to Fig. 2, virtual fixtures corresponding to the real fixtures have been placed within the scene 50. In particular, a room 52 comprising a stage has been imported as a 3D object into the scene 50. Virtual lighting fixtures 54, virtual speakers 56 and a virtual projector 58 have also been imported into and positioned within the scene 50 corresponding to the position of the real fixtures. A media source 59 is accessible to the game engine 60. Typically, the media source will be stored within the memory 40 of the computing device 10. However, this is not essential, and the media source could be located on an external device accessible to the computing device 10, such as a server in the cloud or a local memory device such as a solid-state memory device.
Each virtual fixture may comprise a virtual model or representation of the real fixture. The virtual model is usually designed to have some visual similarity to the real fixture, although this is not essential. However, the virtual model will be provided with the relevant real-world properties and characteristics of the real fixture. Such real-world properties and characteristics may include geometry, colour mixing capabilities, beam angles, electrical specifications, control protocols (such as DMX channels), amongst other things. These properties and characteristics of each object can be described using a standardized file format, or Fixture Information Interchange Format (FIIF), such as the General Device Type Format (GDTF) and My Virtual Rig (MVR), which may also include a 3D model for producing a visual representation of the virtual fixture in the scene 50. GDTF/MVR import is supported natively by many game engines, including Unreal Engine, and so it is possible to import virtual models into the game engine 60.
After the properties of the fixtures have been imported into the game engine 60, the user can set up and 'pre-visualise' audio and/or visual effects using the game engine 60 to see how the show will look before it is run either in rehearsal or as a live performance in the real space 30. In particular, the user may use the Sequence Editor of the game engine 60 to set up the cues for the fixtures, and cues for any media files accessible from the media source 59. Changes can be made to the show in the game engine to account for any aspects of the real space and/or the available fixtures, and then pre-visualised again to see how the show looks in the virtual space 20.
Of note, the virtual fixtures in the virtual space 20 do not need to replicate the various communication protocols required between the or each control device 43 (e.g. QLab, lighting desks, etc) and the real fixtures in the real space 30. In the scene 50 in the virtual space 20, the necessary control of the virtual fixtures is performed by the sequencer of game engine 60 according to the setup by the user in the Sequence Editor. There is thus no need to establish various virtual communication protocols inside the game engine that correspond to the various real communication protocols needed in the real space, such as DMX, OSC, Art-Net, USB, etc. This makes it straightforward to set up, and adjust if needed, a show in the virtual space 20 without the need to consider how the corresponding real fixtures are to be controlled.
As will be explained in the following, the computing device 10 is provided with computer-executable instructions to enable the real fixtures to be controlled in real-time by the game engine's sequencer, such as receiving the necessary cues and data. In this way the performance of the show can be controlled from the computing device 10 with reduced technical set up time in the real space 30, despite the differing communication protocols used by the real fixtures in the real space 30.
Referring to Fig. 4, a schematic diagram of the functions performed by the computer-executable instructions is generally indicated by reference numeral 100. The computer-executable instructions comprise two general parts: a plugin 102 and a computer program 104. The plugin 102 is adapted to be installed within the game engine 60 to extend its functionality. When run, the computer program 104 may operate as a standalone application on the computing device 10. In use, the plugin 102 and the computer program 104 communicate with one another as will be described in greater detail below.
Game Engine Plugin 102
The plugin 102 is adapted to be installed in the game engine 60 operating on the computing device 10. The plugin could be made available through an online marketplace, such as the Epic Marketplace, and installed in the same way as existing plugins for a game engine.
Alternatively the functionality of the plugin 102 could be built-in to the core functionality of the game engine.
The scene 50 (see Fig. 2) of the game engine 60 is shown in Fig. 4 to aid understanding, although it is to be noted that it does not form part of the plugin. At step S1, the plugin enables a user to 'tag' the virtual fixtures in the scene 50. The purpose of this is to enable the game engine to know which virtual fixtures to monitor for changes as the sequence is played in the game engine 60, to process the information, and then communicate details of the changes to the computer program 104.
To enable tagging of virtual fixtures in the game engine 60, the plugin 102 could modify the user interface to include a checkbox, radio button, etc. permitting the user to tag/un-tag each virtual fixture. In Unreal Engine one way this could be performed is to configure the plugin to use the Editor Utility Widgets to modify the user interface of the 'actors' in the game engine to include a checkbox, radio button, etc. When checked, the actor ticking function is enabled for that particular virtual fixture. In Unreal Engine, virtual fixtures can be tagged using the 'actor ticking' function for example.
When a virtual fixture is tagged (a 'tagged virtual fixture'), its object properties are inspected at regular intervals to check for any changes since the previous interval. Example object properties that may be inspected include any one or more of: light intensity, object transformation (translate, rotate), media file path, triggered play events, play head time differences, volume, RGB values, etc. In principle, however, any real-world property of the corresponding real-world fixture can be a corresponding object property of the corresponding virtual fixture and be inspected. To perform the object property inspection, a script is run on the tagged virtual fixture at regular intervals as the sequence is played in the game engine. The regular interval may be once per frame, for example. In some embodiments, for example with large numbers of virtual fixtures, tick groups could be used to control when in a frame a virtual fixture should tick relative to other in-engine frame processes. This function could be used to group different kinds of virtual fixtures together, for example.
At step S2, the plugin 102 determines how many virtual fixtures have been tagged for monitoring. The plugin 102 may further group the tagged virtual fixtures by category, e.g. as N lighting fixtures, M speakers, K projectors, L displays, etc., and these could be formed into respective tick groups of the same fixture kind, if needed.
In Fig. 4 a box 106 contains steps performed by the plugin 102 in real-time as the show sequence is played in the game engine 60. At step S3 the plugin 102 monitors each virtual fixture category for relevant changes to any individual virtual fixture in the current frame compared to the previous frame of the sequence. By 'relevant changes', it is meant changes to virtual fixture parameters affecting any individual virtual fixture that need to be passed to the corresponding real fixture. For example, relevant changes of virtual fixture parameters in the lighting virtual fixture group may include any one or more of transformation, colour, intensity and fixture feature updates. Relevant changes of virtual fixture parameters in the speaker virtual fixture group may include any one or more of volume, media path, and playhead position. Relevant changes of virtual fixture parameters in the projector fixture group may include any one or more of media path and playhead position. It is noted that any one or more virtual fixture groups may be monitored by the plugin 102, including other virtual fixture groups not listed above.
To detect that a virtual fixture parameter has changed, the plugin 102 stores, for each tagged virtual fixture, each last known virtual fixture parameter value (or first virtual fixture parameter value) from a first frame. At a second frame, later than the first frame, the plugin 102 compares the new virtual fixture parameter value(s) (or second virtual fixture parameter value) with the stored last known values to determine if any have changed. The comparison is typically made frame by frame as the sequence is played in the game engine 60, but longer gaps between comparisons are possible to reduce computational overhead.
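The frame-by-frame comparison described above can be sketched as follows. This is an illustrative Python sketch only (the actual inspection would run inside the game engine's tick), and the class, fixture and parameter names are hypothetical:

```python
class FixtureChangeTracker:
    """Stores the last known parameter values of each tagged virtual
    fixture and reports only the parameters that changed this frame."""

    def __init__(self):
        self._last_known = {}  # fixture id -> {parameter: value}

    def diff(self, fixture_id, current_params):
        previous = self._last_known.get(fixture_id, {})
        # Report only parameters whose value differs from the stored one.
        changed = {name: value
                   for name, value in current_params.items()
                   if previous.get(name) != value}
        # Remember the new values for the next frame's comparison.
        self._last_known[fixture_id] = dict(current_params)
        return changed


tracker = FixtureChangeTracker()
# Frame 1: no stored values yet, so every parameter is reported as changed.
frame1 = tracker.diff("light1", {"pan": 30, "tilt": 20, "intensity": 1})
# Frame 2: only 'pan' has moved, so only 'pan' is reported.
frame2 = tracker.diff("light1", {"pan": 35, "tilt": 20, "intensity": 1})
```

Only the parameters returned by such a diff would then need to be turned into messages, keeping the per-frame communication load proportional to what actually changed.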
The plugin 102 is configured to implement a number of Custom Events inside the game engine 60. Following the user's tagging of virtual fixtures as described above, a Custom Event is created for each virtual fixture parameter of each virtual fixture. Each Custom Event is configured to fire when a change is detected in its associated virtual fixture parameter value, as described above. The event triggered either creates an OSC client in the game engine 60 or uses an existing OSC client if one already exists for the IP address and port number specified for the computer program 104 in the plugin configuration (see the example in Figure 6 for more details).
The OSC client is typically made available in the game engine 60 by installation of another plugin, for example the OSC Plugin in Unreal Engine. In other game engines a plugin may be needed, or it may be that an OSC client is already part of the core software of the game engine.
At step S4, the OSC client builds an OSC message to communicate the change in the virtual fixture parameter to the computer program 104. Each OSC message takes the form of an address pattern plus a type tag string (specifying the data types of the arguments that follow) and arguments. For example, there may be the following virtual fixture parameter changes between one frame and the next for a virtual lighting fixture, Light 1: Pan 30; Tilt 20; Intensity 1; RGB 40 50 60. These changes may be transformed into OSC messages comprising:
/light1/pan/,i 30
/light1/tilt/,i 20
/light1/intensity/,i 1
/light1/rgb/,iii 40 50 60
There may be changes in fixture parameter value of a virtual speaker, Speaker 1: Volume 1; Playhead 0; MediaPath /home/resources/sound.ogg. These changes may be transformed into the following OSC messages:
/speaker1/volume/,i 1
/speaker1/playhead/,i 0
/speaker1/mediapath/,s /home/resources/sound.ogg
There may be changes in fixture parameter value of a projector, Projector 1: Playhead 0; MediaPath /home/resources/video.mp4. These changes may be transformed into the following OSC messages:
/projector1/playhead/,i 0
/projector1/mediapath/,s /home/resources/video.mp4
-One shot momentary Cues
To trigger playback of media assets (e.g. audio and/or video files), the user may include in the pre-programmed sequence of events in the game engine 60 'one shot momentary cues'.
These cues trigger creation of an OSC message as described above, but which may reference a media asset (for example by a relative directory location). The OSC message may be in a format ready to be used by external software such as QLab. For example, the OSC message could be generated in a form that complies with the QLab OSC Dictionary.
-Positional information of Media Sources
The positioning of media sources (for example loudspeakers and projectors) may be controlled by a system that ensures correlation between the setup of the real fixtures in the real space 30 and the encoding of the GDTF/MVR file. This file may then be used in the virtual scene to ensure that the setup for the scene rig and fixtures in the virtual space matches that in the real space.
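The message construction at step S4 above can be sketched as follows. This hypothetical Python sketch builds the textual representation of the messages shown in the examples (address pattern, type tag string, arguments); note that on the wire OSC messages are binary-encoded, so this is an illustration of the format, not the actual encoding, and the helper names are assumptions:

```python
# Map Python argument types to the OSC type tag characters used above:
# 'i' for 32-bit integers, 'f' for floats, 's' for strings.
OSC_TYPE_TAGS = {int: "i", float: "f", str: "s"}


def build_osc_message(fixture_id, parameter, *args):
    """Return a textual OSC-style message: /fixture/parameter/,tags args."""
    tags = "".join(OSC_TYPE_TAGS[type(a)] for a in args)
    arg_text = " ".join(str(a) for a in args)
    return f"/{fixture_id}/{parameter}/,{tags} {arg_text}"


msg_pan = build_osc_message("light1", "pan", 30)
msg_rgb = build_osc_message("light1", "rgb", 40, 50, 60)
msg_path = build_osc_message("speaker1", "mediapath",
                             "/home/resources/sound.ogg")
```

One such message would be built per changed parameter per frame, so a fixture whose pan, tilt and colour all change in one frame would emit three messages.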
Computer program 104
The purpose of the computer program 104 is to receive OSC messages from the plugin 102, reformat these messages into a format that can be interpreted by the real fixtures (which often use different communication protocols, as described above) in the real space 30, and then transmit the reformatted messages to the real fixtures and/or any control devices (e.g. QLab).
-Mapping Virtual Fixtures to Real Fixtures
The computing device 10 has access to a fixture specification library, which is global and relatively static. The fixture specification library provides certain specification information (or parameters) about real fixtures. This fixture specification library may be stored locally (e.g. in the memory 40 of the computing device 10) or remotely (e.g. on a server accessible to the computing device 10). Fig. 5A shows an example of how a fixture specification library 200 may be organised. The fixture specification library comprises a plurality of individual fixture specifications that can be looked up by the computing device 10. Each individual fixture specification 210 comprises the manufacturer and model of the fixture, and information about how the fixture can be addressed and controlled. For example, a fixture specification for a DMX fixture may comprise a mapping between each listed parameter of the fixture and its corresponding DMX offset value to be used when addressing the fixture.
The computing device 10 is configured with a mapping that indicates how the virtual fixtures map to the real fixtures in the real space 30. This is referred to herein as a project specific map.
It will be recalled from the description of Fig. 2 that the user can tag virtual fixtures in the virtual space 20 to be monitored for virtual fixture parameter changes when the sequence is played by the game engine 60. When arriving at the real space 30 with the computing device 10, the user starts the computer program 104. The computer program 104 provides a Graphical User Interface (GUI), not shown. The GUI facilitates a walkthrough process that configures the computer program 104 to know how the tagged virtual fixtures correspond to the real fixtures in the real space 30. The GUI may ask the user to confirm how the tagged virtual fixtures are set up and controlled in the real space 30. For example, via the GUI, a user would indicate that the real fixtures 44A-44C corresponding to virtual fixtures 54A-54C are controlled by QLab, that the real fixtures 44D-44F corresponding to virtual fixtures 54D-54F are connected by Art-Net, and that the real fixtures 44G-44I corresponding to virtual fixtures 54G-54I have direct DMX connections, etc. Note that the subscripts used on reference numerals 44 and 54 are not shown in the drawings but are used here to explain how different groups of virtual fixtures and real fixtures are controlled respectively. The GUI may also prompt the user to indicate any other required information, such as the IP address and port number of QLab. It may be that, in some circumstances, the computer program 104 can auto-discover some of this information at the real space 30. If so, the user may be asked simply to confirm it and supplement as needed. The result is that the computer program 104 creates and stores in the memory 40 a mapping between each virtual fixture 54, each real fixture 44, and the details of the communication protocol to be used for each real fixture 44. Fig. 5B shows a project-specific map 300 stored by the computing device 10. An extract 310 is shown within project-specific map 300 for real fixtures 44 that are controlled by DMX.
It is to be noted that other parts of the project-specific map may include mappings to other communication protocols, APIs, etc. The extract 310 illustrates how virtual fixture identities are mapped to the real production control protocol of the corresponding real fixture, to the information available in the fixture specification library 200, and to a DMX base address that is used to address the real fixture 44.
An example fixture specification for two fixtures is given below:
fixtures {"simple" {"brightness" [1]}
          "LM70-mode-9" {"x-axis" [1] "y-axis" [2] "master-control" [3]
                         "rgbw" [4 5 6 7] "speed" [8] "reset" [9]}}
In this example, the two fixtures are called "simple" and "LM70-mode-9". The fixture "simple" has a parameter "brightness" with a DMX offset of [1]. The fixture "LM70-mode-9" has 9 parameters as listed, each with a respective DMX offset. For example, the parameter "rgbw" has four DMX offsets indicated by [4 5 6 7].
An example of a project-specific map that references the example fixture specification is shown below:
rig {"dummy" ["simple" 0]
     "my-mover-1" ["LM70-mode-9" 10]
     "my-mover-2" ["LM70-mode-9" 20]}
In this example, the project-specific map is called 'rig'. There is an instance of the "simple" fixture specification called "dummy" that has a DMX base address of 0. There are two instances of the "LM70-mode-9" fixture specification, called "my-mover-1" and "my-mover-2" respectively. "my-mover-1" has a DMX base address of [10] and "my-mover-2" has a base address of [20]. The address ranges of "my-mover-1" and "my-mover-2" do not overlap.
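The relationship between the fixture specification and the project-specific map can be illustrated with the following hypothetical Python sketch, which represents both as dictionaries and resolves a rig instance and parameter to absolute DMX channels. The addressing convention assumed here (base address plus per-parameter offset) is an assumption for illustration:

```python
# Fixture specifications: parameter name -> DMX offsets within the fixture.
FIXTURES = {
    "simple": {"brightness": [1]},
    "LM70-mode-9": {"x-axis": [1], "y-axis": [2], "master-control": [3],
                    "rgbw": [4, 5, 6, 7], "speed": [8], "reset": [9]},
}

# Project-specific map ('rig'): instance name -> (specification, base address).
RIG = {
    "dummy": ("simple", 0),
    "my-mover-1": ("LM70-mode-9", 10),
    "my-mover-2": ("LM70-mode-9", 20),
}


def dmx_channels(rig_name, parameter):
    """Resolve a rig instance and parameter to absolute DMX channels."""
    spec_name, base_address = RIG[rig_name]
    offsets = FIXTURES[spec_name][parameter]
    return [base_address + offset for offset in offsets]


# "my-mover-1" has base address 10 and "rgbw" has offsets [4 5 6 7].
channels = dmx_channels("my-mover-1", "rgbw")
```

Because "LM70-mode-9" spans nine offsets, the base addresses 10 and 20 chosen for the two movers keep their channel ranges from overlapping, as noted above.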
-Processing of OSC messages
Returning to Fig. 4, following the generation of an OSC message by the OSC client in the game engine 60, the message is dispatched for delivery to the computer program 104. In this example, the plugin 102 and the computer program 104 are running on the computing device 10 and OSC messages simply require internal routing on the computing device 10. If the plugin 102 and computer program 104 were running on different devices, it would be possible to route the OSC messages over a suitable network, such as a LAN.
At step S5, the computer program 104 parses incoming OSC messages to determine how they are to be processed. Broadly, the computer program 104 determines whether each OSC message can be transmitted without any reformatting (e.g. for some kinds of media metadata and media cues) or whether the OSC message relates to a virtual fixture update, whereby reformatting is required so that the information content of the OSC message can be sent to the real fixtures 44 in the real space 30 using a compatible communication protocol.
To determine whether an OSC message relates to a virtual fixture update and requires reformatting, the computer program 104 takes the first argument of the OSC message and looks it up in the project-specific map. Continuing with the example above, the following example OSC messages could be received relating to the project-specific map 'rig':
/fixture/set dummy brightness 100
/fixture/set my-mover-1 rgbw 255 100 0 100
The computer program 104 examines the first OSC message listed above and finds that its first argument, "dummy", exists in the project-specific map 'rig'. On that basis, the computer program 104 routes the first OSC message to be reformatted. Similarly, the computer program 104 examines the second OSC message and finds that its first argument, "my-mover-1", exists in the project-specific map 'rig'. On that basis, the computer program routes the second OSC message to be reformatted as described in greater detail below.
On the other hand, if the computer program 104 does not find a match for the first argument of an OSC message in the project-specific map, the message can be routed without reformatting. In this case the OSC message may relate to some media metadata and media cues, for example, which can be sent unchanged to QLab at step S6. This is because the cue outputs from the plugin 102 and the game engine 60 are, for example, OSC messages in the form /cue/{cue number}/ ... which is a format recognised according to QLab's OSC dictionary.
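The routing decision at step S5 can be sketched as follows. This hedged Python sketch simply checks whether the first argument of a textual message names a fixture in the project-specific map; the function and message forms are illustrative assumptions:

```python
# Fixture instance names present in the project-specific map 'rig'.
RIG_FIXTURES = {"dummy", "my-mover-1", "my-mover-2"}


def route(osc_message):
    """Return 'reformat' for fixture updates, 'passthrough' otherwise."""
    parts = osc_message.split()
    # The first argument follows the address pattern, if present.
    first_argument = parts[1] if len(parts) > 1 else None
    if first_argument in RIG_FIXTURES:
        return "reformat"
    return "passthrough"


a = route("/fixture/set dummy brightness 100")  # known fixture instance
b = route("/cue/12/start")                      # media cue, e.g. for QLab
```

A fixture update is queued for protocol conversion, while anything unrecognised (such as a /cue/... message already in QLab's format) is forwarded unchanged.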
For OSC messages that are to be reformatted, at step S7 the incoming virtual parameter values in the OSC messages are normalised to a scale between 0 and 1 to ease processing in later steps. At step S8, tracking and punch-in/trim/shift-and-scale are handled, which is described in greater detail below.
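The normalisation and trim handling described above can be sketched as follows. The parameter ranges and the exact trim semantics (a scale and shift clamped back into range) are assumptions for illustration, not taken from the document:

```python
def normalise(value, lo, hi):
    """Map a raw parameter value from [lo, hi] onto the range [0, 1]."""
    return (value - lo) / (hi - lo)


def apply_trim(normalised, scale=1.0, shift=0.0):
    """Scale and shift a normalised value, clamped back into [0, 1]."""
    return min(1.0, max(0.0, normalised * scale + shift))


# E.g. a pan angle of 30 degrees on an assumed 0..360 degree range:
pan = normalise(30, 0, 360)
# A venue-specific trim boosting the value by 20%:
trimmed = apply_trim(pan, scale=1.2)
```

Working on normalised values means the later protocol-specific stages (for example DMX, which uses 8-bit channel values) only need one well-defined input range, and a venue trim can be applied uniformly regardless of each fixture's native units.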
At step S9 the computer program 104 then determines which reformatting is required by using the first argument of each OSC message to look up the reformatting needed according to the project-specific map, which maps the first argument (i.e. virtual fixture identity) to the real fixture production control protocol. In this example, there are three possibilities: the first argument relates to a real fixture that is controlled by QLab, step S9-1; the first argument relates to a real fixture controlled directly by DMX, step S9-2; and the first argument relates to a real fixture controlled by other systems and APIs (for example L-Acoustics, Isadora), step S9-3.
At step S9-1, the computer program 104 reformats the message to conform to QLab's OSC API, and possibly performs some processing to remap fixture and parameter names, and virtual parameters (for example if QLab uses different device instances and parameters; in this case the project-specific map 300 may comprise details of the different device instances and parameters). The reformatted message is then transmitted by the computing device as an OSC message to QLab.
At step S9-2, the computer program 104 reformats the message to DMX message format. In particular, the computer program uses the first argument of the OSC message to look up the corresponding fixture specification and DMX base address in the project-specific map. With that information, the fixture and parameter can be converted into a DMX channel and parameter value for transmission using the DMX protocol. If there is more than one DMX universe in use in the real space, the project-specific map may also comprise an identity of the DMX universe of the fixture.
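A minimal sketch of this DMX conversion is given below. The fixture specification contents, channel offsets and the universe field are assumptions made for illustration; a real fixture specification would typically come from the fixture's DMX chart or GDTF data.

```python
# Assumed fixture specification: parameter name -> channel offset from base.
FIXTURE_SPECS = {
    "LM70-mode-9": {"pan": 0, "tilt": 2, "brightness": 5},
}
# Assumed project-specific map entry for one virtual fixture.
PROJECT_MAP = {
    "my-mover-1": {"spec": "LM70-mode-9", "dmx_base": 0, "universe": 0},
}

def to_dmx(virtual_id, param, norm_value):
    """Convert a normalised (0..1) virtual parameter change into a
    (universe, DMX channel, 8-bit value) triple using the project map."""
    entry = PROJECT_MAP[virtual_id]
    offset = FIXTURE_SPECS[entry["spec"]][param]
    channel = entry["dmx_base"] + offset   # channel of this parameter
    value = round(norm_value * 255)        # rescale 0..1 to 0..255
    return entry["universe"], channel, value
```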
At step S9-3, the computer program 104 reformats the message to another format or API.
Finally, at step S10, the computer program 104 outputs the reformatted OSC messages to effect changes to the real fixtures 44 corresponding to those made by the game engine 60 to the virtual fixtures 54.
-Punch in
As mentioned above, at step S8 the computer software 104 can track "punch in" status during running of a sequence of events from a game engine. Punch-in means that, rather than control of real fixtures happening from the game engine, control of any one or more real fixtures can be overridden either entirely or in part by the computer software 104. For example, it may be that an offset or correction needs to be applied to one or more real fixtures to keep better spatial alignment and synchronisation with the game engine (there could be slight spatial misalignments between virtual fixtures in the virtual space and the real fixtures in the real space, and/or latency between the game engine and the real fixtures could cause parameter changes of the real fixtures to lag parameter changes of the virtual fixtures).
In one example, one or more MIDI controllers 108 can receive input from MIDI control surfaces (not shown). The MIDI control surfaces can be used for two purposes: manual firing of media playback cues in QLab, and "punch-in" of control messages in real time (whether they are destined for QLab, DMX / Art-Net, L-ISA etc.). At step S12 (for manual firing of media playback cues in QLab), MIDI Continuous Controller ('CC') numbers and note values are received by the computer program 104 from the MIDI controller 108. The computer program 104 maintains a mapping from MIDI CC numbers and note values to QLab cues. As the MIDI CC numbers and note values are received, the computer program 104 uses the mapping to convert them into the corresponding QLab cues. The QLab cues are then sent to step S7 where they are transmitted from the computing device to QLab.
For punch-in of control messages in real time, at step S12 the computer program 104 receives MIDI CC numbers from the MIDI controller 108. The computer program 104 has previously stored a mapping in the project-specific map from MIDI CC numbers, controller values and/or MIDI channels to real DMX fixture and parameter names (and, optionally, some scaling and ranging information), or sound source names. At step S12 the computer program 104 uses this mapping to look up the corresponding real fixture and parameter to be used in a DMX message. The aforementioned GUI may be used to establish and maintain the MIDI mappings in the project-specific map. For sound control and control using DMX, every parameter in the project-specific map has a "punch-in status" or control status value that determines whether the output is tracking OSC messages coming from the game engine 60 or has been taken over by MIDI. The GUI may also be used to maintain this status in the project-specific map (for each parameter, whether it is under game engine control or MIDI control). The punch-in status could be updated manually by the user, automatically by the computer software 104, or a combination of both.
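The MIDI mapping and the per-parameter punch-in status described above could be modelled as in the sketch below. The mapping shape, key names and status values are illustrative assumptions.

```python
# Assumed mapping: (MIDI channel, CC number) -> (real fixture, parameter).
MIDI_MAP = {
    (1, 20): ("my-mover-1", "brightness"),
}
# Per-parameter punch-in/control status: "game-engine" or "midi".
control_status = {("my-mover-1", "brightness"): "game-engine"}

def on_midi_cc(channel, cc, value):
    """Handle an incoming MIDI CC: the mapped parameter is taken over
    by MIDI and the 7-bit controller value is normalised to 0..1."""
    fixture, param = MIDI_MAP[(channel, cc)]
    control_status[(fixture, param)] = "midi"  # MIDI takes over this parameter
    return fixture, param, value / 127
```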
The output data from step S12 is then passed to step S8, where the punch-in status is checked by the computer program 104: if the punch-in status for a fixture and parameter indicates that it has been taken over by MIDI, the output data from step S12 is passed to step S9 for reformatting and output in the correct message format for the real fixture as described previously (and either any OSC messages from the game engine 60 are ignored whilst the punch-in status indicates takeover by MIDI, or OSC messages are processed to provide the necessary 'trim'). On the other hand, if the punch-in status for a fixture and parameter indicates that it is under game engine control, the output data from step S12 is not passed to step S9, and the computer program continues to pass the OSC messages from the game engine 60 to step S9.
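The gate applied at this step might be sketched as follows (self-contained, with an assumed status store; names are illustrative):

```python
# Assumed per-parameter control status: one parameter taken over by MIDI.
control_status = {("my-mover-1", "brightness"): "midi"}

def accept_game_engine_update(fixture, param):
    """Return True if a game-engine OSC update for this parameter should
    be passed on; updates for MIDI-controlled parameters are dropped."""
    return control_status.get((fixture, param), "game-engine") == "game-engine"
```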
-Additional/alternative way to map virtual fixtures to real fixtures
At step S13, an additional and/or alternative way to map real fixtures to virtual fixtures is shown. Fixture and parameter information may be received and stored in memory by the computer program 104 from an external source. The fixture and parameter information may comprise a file containing data that can be used to map the virtual fixture identity to the real-world fixture identity and corresponding parameters. In the embodiment described above, the virtual fixture identity is the first argument in the OSC message, which is mapped to the relevant fixture information in the project-specific map. The computer program 104 may read the file to extract real fixture information and insert it into the project-specific map 300. For example, the file may be in the General Device Type Format (GDTF) and the data in this format may be extracted (entirely or in part) and used in the project-specific map 300. This could be a one-off extraction into the project-specific map, or the information could be received from the game engine. For example, a user may import a GDTF/MVR specification into a scene running the plugin 102 within the game engine 60. The plugin 102 may be configured to autogenerate an .xml file in a file format adapted for the computer program 104, which is then sent to the computer program 104.
-Trim and/or Shift-and-Scale
In some embodiments, the computer program 104 may enable the user to control a 'trim' function and/or a 'shift-and-scale' function for one, some or all of the fixtures. These features enable the user to adjust the signals from the game engine 60 as they are being sent to real fixtures, to correct for small misalignments or fixture misconfiguration. For trim, the computer program 104 may allow the user to set (e.g. via the GUI) minimum and maximum values for any parameter of a fixture, outside of which parameter values will not be communicated. Once these values have been set by the user, the trim may be implemented by dropping OSC messages received from the game engine 60 containing parameter changes outside the range.
For the 'shift-and-scale' function, the computer program 104 may allow the user to shift and/or scale fixture parameter ranges. For example, it may be that once at the real space the user finds that there are differences between the GDTF/MVR data used in the virtual space and the real fixtures. Additionally or alternatively, the user may find that there are other differences between the virtual fixtures and the real fixtures, including but not limited to rotation ranges and illumination patterns. The shift-and-scale functions allow the user to adjust the virtual fixtures 'on the fly' to match the real fixtures in the real space. This may be done by shifting the virtual fixture ranges by a certain amount and/or by scaling values. Once these values have been set by the user, any shift and scale may be implemented by adjusting parameter values in OSC messages received from the game engine 60 as part of the message re-formatting process described above.
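A minimal shift-and-scale sketch follows; the per-parameter settings are assumptions chosen for illustration, and the result is clamped back to the 0..1 range used internally.

```python
# Assumed user-set shift/scale settings per (fixture, parameter).
SHIFT_SCALE = {("my-mover-1", "pan"): {"shift": 0.05, "scale": 0.9}}

def shift_and_scale(fixture, param, value):
    """Scale and offset a normalised game-engine value to match the
    real fixture's actual range, clamping the result to 0..1."""
    s = SHIFT_SCALE.get((fixture, param), {"shift": 0.0, "scale": 1.0})
    return min(1.0, max(0.0, value * s["scale"] + s["shift"]))
```

With the illustrative settings above, a mid-range value of 0.5 maps to 0.5 * 0.9 + 0.05 = 0.5, while the extremes 0.0 and 1.0 map to 0.05 and 0.95.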
Synchronisation of State
In some embodiments a real-time communication protocol may be used to provide synchronisation of state between the game engine 60 and the real fixtures in the real space 30, via the computer program 104. This involves a two-way data flow, allowing the game engine 60 not only to control the real fixtures directly (virtual to real) but also to receive feedback from the real fixtures (real to virtual). For example, the computer program 104 may monitor network traffic (e.g. DMX network traffic) to ensure accurate state representation. Changes of state indicated in the network traffic and detected by the computer program 104 may be communicated to the game engine 60 using OSC messages in a similar way to that described above.
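The real-to-virtual feedback path might be sketched as follows: observed DMX frames are compared against the last known state, and changed channels are reported back towards the game engine as OSC-style updates. The OSC address, state store and function names are illustrative assumptions.

```python
last_seen = {}  # (universe, channel) -> last known DMX value

def detect_changes(universe, dmx_frame):
    """Compare an observed 512-byte DMX frame against the last known
    state and return OSC-style update tuples for changed channels."""
    updates = []
    for channel, value in enumerate(dmx_frame, start=1):
        key = (universe, channel)
        if last_seen.get(key) != value:
            last_seen[key] = value
            updates.append(("/fixture/state", universe, channel, value))
    return updates
```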
In some embodiments file paths etc. (e.g. for media files) may need to be communicated by relative directory rather than absolute directory. The computing device running the computer program 104, along with the computing device running the game engine 60, may both comprise an identical file structure for media used, and each should work relatively from a set project directory to ensure consistency of operation.
Example set-up
Fig. 6 illustrates an example of at least some embodiments of the invention to control a sequence of events or actions of a real fixture from a game engine. A local area network ('LAN') generally identified by reference numeral 500 comprises a first computing device 510 running the Windows (RTM) operating system, a second computing device 520 running the MacOS operating system, and an Art-Net node 530. It is not essential that the first and second computing devices utilise the mentioned operating systems. Other operating systems can be used, the examples being illustrative. In this example the first and second computer devices are desktop computers. However, this is not essential, and it is envisaged that any suitable form of computer device may be used instead including, but not limited to, a server, a cloud-server, laptop, tablet, smartphone, etc. In this example, the LAN 500 comprises a wired LAN, but this is not essential. Communication at the physical layer between network interfaces of the first and second computing devices in the LAN 500 may utilise any wired or wireless communication protocol, and this communication is illustrated by lines 515 and 525. Examples of such communication protocols include Ethernet, Wi-Fi and Art-Net.
The Art-Net node 530 is connected to the second computing device 520 by an Ethernet cable 525 and to a light fixture 540 via a DMX cable 550. Communication between the second computing device 520 and the Art-Net node 530 utilises the Art-Net communication protocol (e.g. Art-Net 4), which enables transmission of DMX512 lighting control data over Ethernet networks. In this example, the Art-Net node 530 is a Showtec NET-2/3 router that converts Art-Net packets to DMX512 data and vice-versa. In this example the light fixture 540 comprises a moving head light, such as a BETOPPER LM705 LED moving head stage light. Typically, a moving head light has various parameters that are controllable over DMX512, including brightness, pan/tilt (or x and y axis), RGBW, and speed.
Network interfaces of the first computing device 510, the second computing device 520 and the Art-Net node 530 have IP addresses in the subnet 192.168.1.0/24; in this case the second computing device 520 has the IP address 192.168.1.117 and the Art-Net node 530 has the IP address 192.168.1.83. There is nothing important about the particular IP addresses, and they are used here simply to aid understanding. It is also envisaged that the various components could be on entirely different networks and communicate over a WAN or the like.
A game engine 60 (such as Unreal Engine) is installed on the first computing device 510.
The functionality of the game engine has been expanded by means of the plugin 106, as described previously. It is envisaged that the functionality of the plugin 106 could be provided as part of the core software of the game engine 60, such that plugin 106 does not need to be added after the game engine 60 is installed on the first computing device 510. Fig 6A is a screenshot 600 of the desktop environment of the first computing device 510 with game engine 60. A window 601 within the game engine 60 (that is generated by the plugin 106) enables the user to input some configuration for the plugin 106, including a 'send IP' address and 'send port' where output OSC messages are to be sent by the OSC client of the game engine 60. In this case, OSC messages are to be sent to the IP address 192.168.1.117, which is the IP address of the second computing device 520.
The computer program 104 is installed on the second computing device 520. Fig. 6B is a screenshot 610 of the desktop environment of the second computing device 520. A window 611 in the desktop environment provides information about the computer program 104 running on the second computing device 520. Each time it is opened, the computer program 104 reads a first config file, which is shown in an explorer window 612 in Fig. 6C.
The first config file specifies the port number that the computer program 104 should listen to and the IP address to which it should send Art-Net packets. In this example, the first config file indicates that the computer software 104 should listen for incoming OSC messages on port 53001 and send Art-Net packets to IP address 192.168.1.83 (which is the IP address of the Art-Net node 530).
The computer program 104 also has access to a second config file, which is shown within an explorer window 613 in Fig. 6D. The second config file comprises the fixture specifications and a project-specific map, the latter mapping each virtual fixture to a corresponding fixture specification. The second config file contains the same information explained above in connection with Fig. 5B. In particular there is a virtual fixture called 'my-mover-1' which is defined as having the fixture specification 'LM70-mode-9' with a DMX offset of zero.
Fig. 6E is another screenshot 620 of the desktop environment of the first computing device 510 with game engine 60 running in window 621. The game engine 60 comprises a virtual space in the form of a scene 614 that is a digital twin representation of a real space. In this example the real space is a room containing the first computing device 510, second computing device 520 and the light fixture 540. The scene 614 in the game engine comprises a 3D object of the room obtained, for example, using a LIDAR scan as described elsewhere herein. A virtual fixture (not visible in the scene 614) representing the real light fixture 540 has also been placed in the scene 614 and has been tagged so that its object properties are inspected at regular intervals by the game engine 60 to check for any changes since the previous interval. The game engine 60 simulates the lighting of the scene 614 by the virtual light fixture, as indicated approximately by the illuminated region 616. An area 615 in the screenshot 620 shows a time-based sequence of events of the virtual light fixture in the game engine 60. In this example, the virtual light fixture is sequenced to move and to change colour over time. In other words, the object properties of the virtual light fixture are configured to change with time, and this is encoded in the sequence of the game engine. When the sequence is run within the game engine 60, the game engine 60 changes the properties of the virtual fixture according to the pre-programmed sequence of events. In Fig. 6E at time t1 the virtual light fixture is pointing towards the corner of the scene 614 and is pink in colour. Referring to Fig. 6F, at a later time t2 the illuminated region 616 has moved to the right in the scene 614 to illuminate the door and is still pink in colour. Referring to Fig. 6G, at a later time t3 the illuminated region 616 has moved further to the right in the scene 614 to illuminate the door and has changed to a yellow colour.
In this way the game engine 60 is configured to simulate a sequence of events in the scene 614, and this sequence of events can be played back in real-time and watched by a user on screen.
Referring again to Fig. 5, the combination of the plugin 106 and the computer program 104 enables the light fixture 540 (in the real space) to perform the same pre-programmed sequence of events as in the game engine, as the sequence of events is played by the sequencer of the game engine. In other words, in this example, as the virtual light moves and changes colour within the scene 614 in the game engine, the real light fixture 540 moves and changes colour in the same way. The invention is not limited to this particular sequence of movement and colour changes and many different real world fixtures can perform a pre-programmed sequence of events of corresponding virtual fixtures in a game engine.
Fig. 6H shows a view of the screen displaying the game engine 60, in which the scene 614 can be seen, and at the top of Fig. 6H a curtain, door and wall in the corresponding real space is visible. At time t1 the virtual light fixture is pointing towards the left corner of the scene 614 and is pink in colour. It illuminates a region 630. Looking at the real space, the light fixture 540 is also pointing towards the left corner of the scene 614 and is pink in colour. It illuminates a region 632 that is approximately the same as the region 630 in the virtual space (scene 614).
Fig. 6I shows a later time t2 in the sequence. At time t2 the virtual light fixture is pointing towards the right corner of the scene 614 and is green in colour. It illuminates a region 634.
Looking at the real space, the light fixture 540 is also pointing towards the right corner of the scene 614 and is green in colour. It illuminates a region 636 that is approximately the same as the region 634 in the virtual space (scene 614).
As the sequencer in the game engine 60 is running, the plugin 106 causes the game engine to inspect the virtual light fixture object at each time interval to look for changes to the object's parameters. In this case the parameters 'light intensity', 'RGB' and 'object transformation' are monitored. When the game engine 60 detects a change in one or more of these parameters, the corresponding Custom Event of the or each parameter is fired. Upon firing of a Custom Event, the OSC client builds an OSC message encoding the change in the parameter and sends it to the IP address and port number set by the user (see window 601 in Fig. 6A). In this example, OSC messages are sent to the second computing device 520 (see Fig. 5).
Upon receiving the OSC messages, the computer program 104 processes them as described in connection with Fig. 4. In this example, the second computing device is connected only to the Art-Net node 530, and incoming OSC messages relate only to the virtual light fixture (and by implication to the real light fixture 540). Accordingly, the computer program 104 performs the DMX branch of the reformatting step S9, i.e., step S9-2: the computer program 104 uses the information in each OSC message to look up the corresponding fixture specification and DMX base address in the project-specific map. The computer program 104 then prepares a message conforming to the Art-Net protocol which contains the DMX channel and new parameter value, and then sends it to the Art-Net node 530. Upon receipt of the message, the Art-Net node 530 converts the Art-Net message to DMX512 data and sends it to the light fixture 540. The light fixture 540 receives the DMX512 data and implements the changed parameter. In this way the light fixture 540 in the real space is effectively controlled by the sequence programmed for the virtual fixture in the game engine.
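The kind of Art-Net message sent to the Art-Net node 530 can be sketched as below. The field layout follows the published Art-Net ArtDmx packet format; the function name and the channel/value inputs are assumptions for illustration only.

```python
import struct

def artdmx_packet(universe, channel, value, seq=0):
    """Build an ArtDmx packet carrying a single changed DMX channel
    within a full 512-byte DMX frame."""
    data = bytearray(512)
    data[channel - 1] = value                 # DMX channels are 1-based
    return (b"Art-Net\x00"                    # protocol identifier
            + struct.pack("<H", 0x5000)       # OpCode: ArtDmx (little-endian)
            + struct.pack(">H", 14)           # protocol version 14
            + bytes([seq, 0])                 # sequence, physical port
            + struct.pack("<H", universe)     # 15-bit port-address (little-endian)
            + struct.pack(">H", len(data))    # data length (big-endian)
            + bytes(data))
```

The resulting bytes would typically be sent as a UDP datagram to the Art-Net node's IP address (192.168.1.83 in this example) on the standard Art-Net port.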
It will be noted that there is a slight misalignment between the position of the illuminated part of the scene in the game engine and the position of the illuminated part of the room at any given time. This is a deliberate misalignment of the real lighting fixture to illustrate how a correction (as described above in relation to 'punch in') can be used by the computer program 104 to correct values received in OSC messages, by applying an 'offset' for example. This offset may be configurable by the user on the fly through the computer software to help deal with small differences between the virtual fixtures in the virtual space and the real fixtures in the real space.
This example illustrates how the game engine can be used to prepare, review and change, or 'pre-visualise', a sequence of events of a virtual fixture in virtual space before implementing the sequence of events with a corresponding real fixture in a corresponding real space. It will be appreciated that this example serves to illustrate a general principle, and that embodiments may be used to pre-visualise a sequence of events for a number of virtual devices in a virtual space. For example, a digital twin of a theatre space may be set up in the game engine 60 with virtual fixtures corresponding to the real fixtures in the real theatre. The game engine may be programmed with a sequence of events for any one or more, or any combination, of the virtual fixtures in order to pre-visualise lighting, sound, images, etc. of a show to be performed in the theatre. The show can then be watched by a user in the virtual space and changes made as needed. Once the show is finalised in the virtual space, the computing device operating the game engine can be connected to a computer device at the theatre venue that is running the computer program 104. After configuration as described above, the game engine runs the show in the virtual space following the pre-programmed sequence of events, and the plugin 106 and the computer program 104 allow the real world fixtures at the theatre venue to follow the same pre-programmed sequence. In this way, the game engine effectively runs the live show at the theatre.
-Other inputs to computer program 104
It is envisaged that the computer program 104 may receive other inputs that do not originate in the game engine 60. In this way the computer program may act as a replacement for other devices/software (such as QLab). For example, some systems (such as TouchOSC) may output OSC-style messages, which are intended to be received by other systems such as QLab. The computer program 104 may be adapted to receive those messages, convert them into DMX messages and transmit them to DMX devices (thereby doing the DMX driving itself).
It will be appreciated that there are various advantages of at least some embodiments of the invention. For example, much of the work of preparing a pre-programmed sequence of events for fixtures in a show, event, concert, etc. can be performed on the game engine in advance and away from the real space. This enables the show to be pre-visualised by as many people as required, who may be in different locations. Furthermore, it is easier to create live shows and events inside a game engine without the need for specialised technical knowledge.

Claims (48)

CLAIMS
1. A computer-implemented method of controlling with a game engine a real fixture in a real space, which real fixture comprises a real fixture identity and at least one real parameter that is adjustable to change an effect provided by the real fixture within the real space, which method comprises the steps of: (a) providing a virtual fixture within a virtual space in the game engine, said virtual fixture corresponding to said real fixture and which comprises a virtual fixture identity and at least one virtual parameter that is adjustable to change an effect provided by the virtual fixture within the virtual space in the game engine; (b) with the game engine, playing a sequence of events programmed in the game engine, which sequence of events comprises at least one adjustment in said at least one virtual parameter from a first virtual parameter value to a second virtual parameter value; (c) determining whether said at least one virtual parameter is adjusted as the sequence of events is executed and, if so, outputting from the game engine an indication comprising said virtual fixture identity and said second virtual parameter value; (d) obtaining, using said indication, a real fixture identity corresponding to said virtual fixture identity and an updated real parameter value corresponding to said second virtual parameter value; (e) generating a reformatted indication comprising said real fixture identity and said updated real parameter value; (f) transmitting said reformatted indication towards said real fixture; and said real fixture adjusting said real parameter value to said updated real parameter value according to said reformatted indication; whereby, as said sequence of events is played by the game engine, the effect provided by the real fixture in the real space imitates changes to the effect provided by the virtual fixture in the virtual space.
2. A computer-implemented method as claimed in claim 1, further comprising the steps of: providing a virtual space object within the virtual space in the game engine, which virtual space object corresponds to said real space; and wherein said virtual fixture is positioned in substantially the same position in the virtual space object as the real fixture in the real space.
3. A computer-implemented method as claimed in claim 2, wherein said virtual space object comprises a digital twin of said real space.
4. A computer-implemented method as claimed in claim 1, 2 or 3, wherein said virtual fixture comprises a virtual model of said corresponding real fixture.
5. A computer-implemented method as claimed in any of claims 1 to 4, further comprising the steps of: providing a plurality of virtual fixtures in the virtual space, each of which corresponds to a respective real fixture in the real space; tagging each of said plurality of virtual fixtures; and wherein said step of determining further comprises checking each tagged virtual fixture for adjustment of said at least one virtual parameter as the sequence of programmed events is played by the game engine.
6. A computer-implemented method as claimed in claim 5, wherein said playing of the sequence of events comprises generating a series of frames within the game engine and said step of checking comprises comparing at least two frames of said series to determine whether said adjustment of said at least one virtual parameter has occurred, and optionally wherein said at least two frames are adjacent frames in said series, such as adjoining frames.
7. A computer-implemented method as claimed in claim 5 or 6, further comprising the step of: storing, for each tagged virtual fixture, the last known first virtual parameter value for use in said step of determining.
8. A computer-implemented method as claimed in any preceding claim, wherein said at least one virtual parameter comprises a plurality of virtual parameters of said virtual fixture, and wherein said step of outputting an indication from the game engine comprises: generating a respective indication for each virtual parameter of said plurality of virtual parameters that is determined to be adjusted.
9. A computer-implemented method as claimed in any preceding claim, wherein said indication comprises at least one datagram comprising said virtual fixture identity and said second virtual parameter value, optionally wherein said at least one datagram is constructed in accordance with the Open Sound Control (OSC) protocol.
10. A computer-implemented method as claimed in any preceding claim, further comprising the step of parsing said indication to identify said virtual fixture identity and said second virtual parameter value.
11. A computer-implemented method as claimed in any preceding claim, further comprising the step of: using said virtual fixture identity to look up a type of reformatting of said indication, which lookup utilises a mapping between said virtual fixture identity and a real fixture production control protocol indicating the type of reformatting required.
12. A computer-implemented method as claimed in claim 11, wherein said step of generating a reformatted indication comprises generating at least one message in a format complying with said real fixture production control protocol, said at least one message comprising data representing said real fixture identity and said updated real parameter value.
13. A computer-implemented method as claimed in claim 11 or 12, wherein said mapping further comprises a fixture specification of the real fixture corresponding to said virtual fixture identity.
14. A computer-implemented method as claimed in claim 12 or 13, wherein said real fixture production control protocol comprises a DMX-type protocol and a DMX base address of the real fixture, the method further comprising the steps of: using said DMX base address to convert said virtual fixture identity to a DMX channel and to convert the second virtual parameter value to a DMX parameter value; including the DMX channel and the DMX parameter value in said at least one message as said data representing said real fixture identity and said updated real parameter value.
15. A computer-implemented method as claimed in claim 12 or 13, wherein said real fixture production control protocol comprises configuration information about an external API, the method further comprising the steps of: using the configuration information about the external API to generate said at least one message in a format complying with the external API, said at least one message comprising data representing said real fixture identity and said updated real parameter value.
16. A computer-implemented method as claimed in claim 15, wherein said configuration information about said external API comprises information for a QLab API, L-Acoustics API or Isadora API.
17. A computer-implemented method as claimed in any of claims 12 to 16, further comprising the steps of: selecting from a plurality of reformatting options a reformatting option for said indication, wherein said reformatting options comprise: (i) no reformatting and transmitting said indication substantially in the form output from the game engine; and (ii) reformatting said indication.
18. A computer-implemented method as claimed in claim 17, further comprising the steps of: selecting option (i) when said virtual fixture identity does not map to a real fixture production control protocol; and selecting option (ii) when said virtual fixture identity does map to a real fixture production control protocol.
19. A computer-implemented method as claimed in claim 17 or 18, further comprising the step of selecting option (i) for indications output from the game engine that comprise media metadata and/or media cues.
20. A computer-implemented method as claimed in any preceding claim, further comprising the steps of: as each indication is received from the game engine, checking a control status for the virtual fixture identity in said indication, which control status indicates whether said real fixture is controlled by the indications output by the game engine or is not controlled by the indications output by the game engine; and dropping said indication received from the game engine if said control status indicates that said real fixture is not controlled by the game engine.
21. A computer-implemented method as claimed in claim 20, wherein said control status indicates that said real fixture is not controlled by the game engine, the method further comprising the steps of: receiving an external parameter value, for example an external parameter value input by a user through a control device, which external parameter value overrides the updated real fixture parameter value from the game engine; and using a representation of said external parameter value in place of said updated real fixture parameter value in said reformatted indication.
22. A computer-implemented method as claimed in claim 21, further comprising the step of mapping said external parameter value to said real fixture identity and said at least one real parameter.
23. A computer-implemented method as claimed in claim 21 or 22, wherein said external parameter value comprises at least one of a MIDI CC number, a MIDI controller value, and a MIDI channel.
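The control-status gating and external override of claims 20 to 23 can be sketched as follows. The function name, arguments, and the decision to model the external value as an already-mapped number (e.g. derived from a MIDI controller value) are assumptions made for this sketch:

```python
# Illustrative gate for claims 20-23: indications for fixtures not under engine
# control are dropped, unless an external value (e.g. from a MIDI control
# device) is supplied to take their place. All names are hypothetical.

def resolve_value(indication, controlled_by_engine, external_value=None):
    """Return the parameter value to transmit, or None to drop the indication."""
    if controlled_by_engine:
        return indication["value"]        # engine-controlled: engine value passes on
    if external_value is not None:
        return external_value             # claim 21: external value used in its place
    return None                           # claim 20: otherwise the indication is dropped
```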
24. A computer-implemented method as claimed in any of claims 20 to 23, further comprising the steps of: if said control status indicates that said real fixture is controlled by the game engine, adjusting said updated real fixture parameter value before transmitting it in said reformatted indication.
25. A computer-implemented method as claimed in any of claims 20 to 24, wherein said control status indicates that said real fixture is controlled by the game engine, the method further comprising the steps of: receiving an adjustment value, for example an adjustment value input by a user through a control device, which adjustment value indicates an adjustment to said at least one real parameter value of the real fixture; and using said adjustment value to change said updated real fixture parameter value before transmitting it in said reformatted indication.
26. A computer-implemented method as claimed in claim 25, wherein said adjustment value represents a correction to be applied to said at least one parameter value, for example to correct for a misalignment between the spatial orientation of said virtual fixture in said virtual space and the spatial orientation of said real fixture in said real space.
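The adjustment step of claims 24 to 26 amounts to applying a user-supplied offset (e.g. a pan correction for a misaligned moving head) before transmission. The function name and the 0–255 clamp range, typical of 8-bit DMX channel levels, are assumptions for the sketch:

```python
# Sketch of the adjustment in claims 24-26: a correction offset is applied to
# the updated parameter value before it is transmitted, clamped to a fixture
# range. The 0-255 default range is an assumption (8-bit DMX-style levels).

def apply_adjustment(value, adjustment=0.0, lo=0, hi=255):
    """Add a correction offset to a parameter value, clamped to [lo, hi]."""
    return max(lo, min(hi, value + adjustment))
```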
27. A computer-implemented method as claimed in any preceding claim, wherein said effect provided by said real fixture and by said virtual fixture corresponding to said real fixture, comprises an effect used in a live production environment including, but not limited to, any one or more of the following: light, sound, audio, still images, moving images, smoke, motion, digital display.
28. A computer-implemented method as claimed in any preceding claim, further comprising the step of providing a user-adjustable trim function that enables the user to change a maximum and/or minimum value of said real parameter value that will be transmitted to the real fixture.
29. A computer-implemented method as claimed in claim 28, further comprising the step of dropping any indication comprising said second virtual parameter value that corresponds to a value greater than said maximum value or less than said minimum value of said real parameter value.
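The trim function of claims 28 and 29 drops, rather than clamps, indications whose value falls outside the user-set window. A minimal sketch, with hypothetical names:

```python
# Minimal sketch of the user-adjustable trim in claims 28-29: a value outside
# the [trim_min, trim_max] window causes the indication to be dropped (None)
# rather than clamped. Names are illustrative only.

def trim(value, trim_min, trim_max):
    """Return the value if inside the trim window, else None (drop the indication)."""
    if value < trim_min or value > trim_max:
        return None                       # claim 29: out-of-range indication dropped
    return value
```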
30. A computer-implemented method as claimed in any preceding claim, further comprising the step of providing a user-adjustable shift and/or scale function that enables the user to apply a shift and/or a scale factor to a fixture range of said real parameter value.
31. A computer-implemented method as claimed in claim 30, further comprising the step of applying a fixed shift to said second virtual parameter value and/or multiplying said second virtual parameter value by said scale factor during said reformatting step.
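The shift/scale function of claims 30 and 31 is a simple affine mapping applied during reformatting. The claims leave the order of shift and scale open; the sketch below assumes scale first, then shift, and all names are illustrative:

```python
# Sketch of the shift/scale mapping in claims 30-31: the virtual parameter
# value is multiplied by a scale factor and offset by a fixed shift during
# reformatting. Applying scale before shift is an assumption of this sketch.

def shift_scale(value, shift=0.0, scale=1.0):
    """Apply a scale factor and then a fixed shift to a parameter value."""
    return value * scale + shift
```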
32. A computer-implemented method as claimed in any preceding claim, wherein each real fixture comprises a communication interface allowing that real fixture to receive digital communications, which real fixture is adapted to act on instructions contained in those communications, and further comprises a computer-controlled apparatus for implementing those instructions as a real-world effect.
33. A computer-implemented method as claimed in any preceding claim, further comprising the steps of: receiving an external indication for controlling a change to an effect provided by a real fixture, wherein said external indication is received from software, a system or a device external of said game engine; processing said external indication to generate an adapted indication suitable for controlling a real fixture using a production control protocol, such as DMX; and transmitting said adapted indication toward said real fixture.
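The external-indication path of claim 33 can be sketched as a small adapter that repackages an indication from outside the game engine into a DMX-style (universe, channel, level) triple. The mapping table, identifiers, and dictionary layout are assumptions for illustration; only the 8-bit level range reflects standard DMX:

```python
# Illustrative adapter for claim 33: an indication received from external
# software is processed into a DMX-style adapted indication. The EXTERNAL_MAP
# table and all field names are hypothetical.

EXTERNAL_MAP = {"houselights": (1, 12)}   # external id -> (universe, channel)


def adapt_external(indication):
    """Adapt an external indication into a DMX-style message dictionary."""
    universe, channel = EXTERNAL_MAP[indication["id"]]
    level = max(0, min(255, int(indication["level"])))   # DMX channel levels are 8-bit
    return {"universe": universe, "channel": channel, "level": level}
```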
34. A computer-implemented method of providing effects in a production environment comprising real fixtures in a real space in which the effects are intended to be experienced by one or more of the human senses and/or recorded by audio-visual recording equipment, which method comprises the steps of performing a computer-implemented method of any of claims 1 to 33 whereby the effects in the real space are controlled by the sequence of events programmed in the game engine.
35. The computer-implemented method of claim 34, wherein said production environment comprises a film or television studio or set, an indoor or outdoor theatre, a concert venue, a stadium, a cinema, an immersive show or experience, a museum, an art gallery, a building interior or exterior.
36. A computer program comprising instructions which, when the program is executed by the computing device, cause the computing device to perform the steps of: providing a game engine on said computing device; providing a virtual fixture within a virtual space in the game engine, said virtual fixture corresponding to said real fixture and which comprises a virtual fixture identity and at least one virtual parameter that is adjustable to change an effect provided by the virtual fixture within the virtual space in the game engine; with the game engine, playing a sequence of events programmed in the game engine, which sequence of events comprises at least one adjustment in said at least one virtual parameter from a first virtual parameter value to a second virtual parameter value; and determining whether said at least one virtual parameter is adjusted as the sequence of events is executed and, if so, outputting from the game engine an indication comprising said virtual fixture identity and said second virtual parameter value.
37. A computer program as claimed in claim 36, wherein said program further comprises instructions which, when the program is executed by the computing device, cause the computing device to perform the steps of any of claims 2 to 9.
38. A computer program product comprising a computer program as claimed in claim 36 or 37.
39. The computer program product of claim 38, in the form of a plugin for a game engine.
40. A computer-readable data carrier having stored thereon the computer program product of claim 38 or 39.
41. A data carrier signal carrying the computer program product of claim 38 or 39.
42. A game engine comprising the instructions of the computer program of claim 36 or 37.
43. A computer program for facilitating performance of the computer-implemented method of any of claims 1 to 33, which computer program comprises instructions which, when the program is executed by a computing device, cause the computing device to perform steps (d), (e) and (f) of claim 1.
44. A computer program as claimed in claim 43, wherein said program further comprises instructions which, when the program is executed by the computing device, cause the computing device to perform the steps of any of claims 10 to 33.
45. A computer program product comprising a computer program as claimed in claim 36 or 43 or 44.
46. The computer program product of claim 45, in the form of an app downloadable from an app store.
47. A computer-readable data carrier having stored thereon the computer program product of claim 45 or 46.
48. A data carrier signal carrying the computer program product of claim 45 or 46.
GB2403411.8A 2024-03-08 2024-03-08 Computer-implemented method and system for controlling real fixtures Pending GB2640117A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2403411.8A GB2640117A (en) 2024-03-08 2024-03-08 Computer-implemented method and system for controlling real fixtures
PCT/GB2025/050467 WO2025186582A1 (en) 2024-03-08 2025-03-07 Computer-implemented method and system for controlling real fixtures

Publications (2)

Publication Number Publication Date
GB202403411D0 GB202403411D0 (en) 2024-04-24
GB2640117A true GB2640117A (en) 2025-10-15


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003067934A2 (en) * 2002-02-06 2003-08-14 Color Kinetics Incorporated Controlled lighting methods and apparatus
WO2005084339A2 (en) * 2004-03-02 2005-09-15 Color Kinetics Incorporated Entertainment lighting system
US20050248299A1 (en) * 2003-11-20 2005-11-10 Color Kinetics Incorporated Light system manager
US20050275626A1 (en) * 2000-06-21 2005-12-15 Color Kinetics Incorporated Entertainment lighting system
US7231060B2 (en) * 1997-08-26 2007-06-12 Color Kinetics Incorporated Systems and methods of generating control signals
US20110115413A1 (en) * 2009-11-14 2011-05-19 Wms Gaming, Inc. Configuring and controlling casino multimedia content shows

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200257831A1 (en) * 2019-02-13 2020-08-13 Eaton Intelligent Power Limited Led lighting simulation system

Also Published As

Publication number Publication date
WO2025186582A8 (en) 2025-10-02
GB202403411D0 (en) 2024-04-24
WO2025186582A1 (en) 2025-09-12
