GB2500304A - Tracking system having multiple orientable cameras - Google Patents
- Publication number
- GB2500304A (application GB1301477.4A / GB201301477A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- operative
- interest
- master
- units
- orientable
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000000034 method Methods 0.000 claims abstract description 9
- 238000005286 illumination Methods 0.000 claims abstract 2
- 241001465754 Metazoa Species 0.000 claims description 4
- 230000003068 static effect Effects 0.000 claims description 2
- 238000005259 measurement Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
A system for tracking an object of interest O in a working environment comprises a plurality of operative units 3a-d, at least some of which are orientable in three-dimensional space to follow the object of interest. A system controller 19 controls the operative units. Preferably the operative units are cameras or illumination means such as spotlights. There may be a master or lead operative unit 3a and slave operative units 3b-d, the slave units being automatically controlled to follow the operation of the master. The system may model the terrain of the working environment to pinpoint the position of the object of interest from the pan and tilt of the master unit. The master unit may be programmed to follow a pre-defined path at a speed controlled by an operator or user. Method claims for broadcasting video of an object, and for illuminating an object, are included.
Description
Tracking System and Method
The present invention relates to a system for and a method of tracking an object of interest, in particular a camera tracking system for tracking an object using a plurality of video cameras.
In one aspect the present invention provides a tracking system for tracking an object of interest in a working environment, the system comprising: a plurality of operative units, at least ones of which are orientable in three-dimensional space to follow an object of interest within a working environment; and a system controller for controlling operation of the operative units such that the orientable operative units follow the object of interest.
In another aspect the present invention provides a method of broadcasting video of an object of interest from within a working environment, comprising the steps of: providing the above-described tracking system; and tracking an object of interest with the plurality of orientable operative units; whereby, at any given time, cameras on any of the orientable operative units can be selected to provide video of the object of interest.
In a further aspect the present invention provides a method of illuminating an object of interest from within a working environment, comprising the steps of: providing the above-described tracking system; and tracking an object of interest with the plurality of orientable operative units; whereby, at any given time, lights, optionally spotlights, on any of the orientable operative units can be selected to illuminate the object of interest.
Preferred embodiments of the present invention will now be described hereinbelow by way of example only with reference to the accompanying drawings, in which:
Figure 1 illustrates a camera tracking system in accordance with a first embodiment of the present invention;
Figure 2 represents a pan-tilt head of the cameras of the camera tracking system of Figure 1;
Figure 3 illustrates a camera tracking system in accordance with a second embodiment of the present invention;
Figure 4 illustrates a camera tracking system in accordance with a third embodiment of the present invention;
Figure 5 illustrates a camera tracking system in accordance with a fourth embodiment of the present invention; and
Figure 6 illustrates a camera tracking system in accordance with a fifth embodiment of the present invention.
Figures 1 and 2 illustrate a camera tracking system in accordance with a first embodiment of the present invention.
The camera tracking system comprises a plurality of camera units 3a-e, at least ones of which are each orientable in three-dimensional space, such as to allow an object of interest O to be followed within an environment, and a system controller 4 for controlling operation of the camera units 3a-e, as will be described in more detail hereinbelow.
In this embodiment the environment is an outdoor environment, here a racetrack, and the object O being tracked is an animal, here a racehorse.
It will be understood that the present invention has application to any outdoor environment and any manner of object, such as people, animals, vehicles and sports objects, including balls. For example, the environment could include other sports, including field sports, such as football or rugby, water sports, such as rowing, track sports, such as running, and road sports, such as motor racing, or events, such as musical or theatrical events, for example, concerts or plays, as held in arenas or on stages.
In another embodiment the environment could be an indoor environment, such as indoor events, for example, theatrical or musical events, for example, concerts or plays, as held in arenas or on stages, or sporting events, for example, as held on an ice rink.
In this embodiment the system comprises five camera units 3a-e, but the system could comprise any number of camera units 3a-e.
In this embodiment one of the camera units 3a is a master, and the other of the camera units 3b-3e are slaves which automatically follow the operation of the master camera unit 3a under the control of the controller 4, as will be described further hereinbelow.
In an alternative embodiment a plurality of the camera units 3a, b, here two of the camera units 3a, b, are operable as a master, with one of the master camera units 3a, b being selected at any given time as a master and operated by an operator, and the one or more other camera units 3c-e are slaves, which, together with the one or more non-selected master camera units 3a, b, automatically follow
the operation of the selected master camera unit 3a, b under the control of the controller 4.
In another alternative embodiment any of the camera units 3a-e are operable selectively as a master, with one of the master camera units 3a-e being selected at any given time as a master and operated by an operator, and the one or more non-selected camera units 3a-e automatically follow the operation of the selected master camera unit 3a-e under the control of the controller 4.
In one embodiment the master camera unit 3a-e is operated locally by an operator, by using an input device, such as a keypad, joystick, hand wheels or a pan-bar.
In another embodiment the master camera unit 3a-e is operated remotely by an operator, by using an input device, such as a keypad, joystick, hand wheels or a pan-bar.
In this embodiment the camera units 3a-e each comprise a camera 5 for capturing an image, here a video image, and a movable head 7, here a pan-tilt head, which supports the camera 5 and can be panned and tilted to follow an object of interest O.
In this embodiment the pan-tilt head 7 includes a plurality of positioning actuators 11a, b for orienting the pan-tilt head 7.
In one embodiment the pan-tilt head 7 comprises first and second actuators 11a, b, one for adjusting pan and the other for adjusting tilt.
In one embodiment the pan-tilt head 7 includes a zoom actuator 15 for adjusting the zoom of the optics of the camera 5. In an alternative embodiment the zoom actuator 15 can be incorporated into the lens assembly of the camera 5.
In one embodiment the pan-tilt head 7 includes a focus actuator 17 for adjusting the focus of the optics of the camera 5. In an alternative embodiment the focus actuator 17 can be incorporated into the lens assembly of the camera 5.
In this embodiment the actuators 11a, b, 15, 17 comprise electrically operated actuators.
In one embodiment the actuators 11a, b, 15, 17 comprise stepper motors.
In another embodiment the actuators 11a, b, 15, 17 comprise servomotors, brushed or brushless motors, which include an encoder for controlling the position of the motor.
In further embodiments the actuators 11a, b, 15, 17 could comprise hydraulic or magnetic actuators.
In this embodiment the pan-tilt head 7 includes a local controller 19, which is connected to the system controller 4. The local controllers 19 of the pan-tilt heads 7 can be connected to the system controller 4 in any manner. In this embodiment the local controllers 19 of the pan-tilt heads 7 are connected to the system controller 4 in a star network configuration. In alternative embodiments a ring network configuration or a bus network configuration could be employed.
With this configuration, the orientation of the pan-tilt head 7 of the master camera unit 3a-e, which is provided by the position of the positioning actuators 11a, b thereof, determines the location of the object of interest O within the working environment, and the one or more slave camera units or non-selected master camera units 3a-e are automatically oriented, by control of the positioning actuators 11a, b thereof under the control of the system controller 4, to follow the same object of interest O as being followed by the master camera unit 3a-e.
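The geometry behind this master/slave arrangement can be sketched briefly; this is an illustrative aside rather than part of the patent, and the coordinate convention (z up, pan measured from the +x axis) and the function name `aim_at` are assumptions:

```python
import math

def aim_at(unit_pos, target_pos):
    """Return (pan, tilt) in radians for a unit at unit_pos to point
    at target_pos. Positions are (x, y, z) tuples with z up; pan is
    measured from the +x axis, tilt from the horizontal plane."""
    dx = target_pos[0] - unit_pos[0]
    dy = target_pos[1] - unit_pos[1]
    dz = target_pos[2] - unit_pos[2]
    pan = math.atan2(dy, dx)
    tilt = math.atan2(dz, math.hypot(dx, dy))
    return pan, tilt
```

Once the master's pan and tilt have fixed the object's position, each slave simply evaluates this for its own mounting location.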
The camera tracking system of the present invention has a number of operative modes, as follows.
In one mode, where the object of interest O moves substantially only on a flat surface, which can be horizontal or inclined, but within a two-dimensional working environment, such as a football field, a rugby field, an ice rink or an indoor/outdoor stage, the position of the object of interest O is determined from the intersection of the vector given by the pan and tilt angles of the master camera unit 3a-e and the plane defined by the flat surface.
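The ray-plane intersection described here can be sketched for the simplest case of a horizontal ground plane; the function name and coordinate convention (z up, tilt negative when looking down) are assumptions for illustration:

```python
import math

def locate_on_plane(cam_pos, pan, tilt, ground_z=0.0):
    """Intersect the master unit's viewing ray with the horizontal
    plane z = ground_z and return the (x, y, z) of the object of
    interest. cam_pos is the camera's (x, y, z); pan and tilt are in
    radians, tilt measured from the horizontal (negative = downward)."""
    # Unit direction vector of the viewing ray.
    dx = math.cos(tilt) * math.cos(pan)
    dy = math.cos(tilt) * math.sin(pan)
    dz = math.sin(tilt)
    if dz >= 0 and cam_pos[2] > ground_z:
        raise ValueError("ray never reaches the ground plane")
    t = (ground_z - cam_pos[2]) / dz
    return (cam_pos[0] + t * dx, cam_pos[1] + t * dy, ground_z)
```

An inclined surface works the same way with a general plane equation; a pre-defined height offset (body or head of a player) simply shifts `ground_z`.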
In one embodiment height offsets from the ground surface can be pre-defined, for example, to the body or head of a player, thereby allowing the master camera unit 3a-e to follow the head, body or feet of players.
In another mode, where the object of interest O moves over an uneven surface, either a surface having different discrete heights, such as a concert stage or theatre stage having sections of different height, or an undulating terrain, such as a cross-country track or a downhill ski track, the position of the object of interest O is determined from the intersection of the vector given by the pan and tilt angles of the master camera unit 3a-e and a predetermined three-dimensional map of the surface.
In one embodiment, where the surface of the working environment has different heights, the zones of different height can be modelled, such as by measurement using the master camera unit 3a-e with reference to measured heights, in order to provide a surface map for the working environment.
In another embodiment, where the surface of the working environment is an undulating terrain, either existing profile data for the terrain is used as a surface map for the working environment or the terrain is modelled using two or more of the camera units 3a-e to plot out terrain points, either key terrain points or a grid of terrain points or a combination thereof, in order to provide a surface map for the working environment.
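Intersecting the viewing ray with such a surface map can be done by marching along the ray until it drops below the terrain height; a minimal sketch, assuming the map is exposed as a `height_at(x, y)` function (a hypothetical interface, not from the patent):

```python
import math

def locate_on_terrain(cam_pos, pan, tilt, height_at, step=0.25, max_range=2000.0):
    """March along the viewing ray until it passes below the terrain
    surface given by height_at(x, y); return the approximate (x, y, z)
    hit point, or None if nothing is hit within max_range. A coarse
    sketch -- a production system would refine the hit by bisection."""
    dx = math.cos(tilt) * math.cos(pan)
    dy = math.cos(tilt) * math.sin(pan)
    dz = math.sin(tilt)
    t = 0.0
    while t < max_range:
        x = cam_pos[0] + t * dx
        y = cam_pos[1] + t * dy
        z = cam_pos[2] + t * dz
        if z <= height_at(x, y):  # ray has reached the surface
            return (x, y, z)
        t += step
    return None
```

With a flat map this degenerates to the plane intersection of the previous mode, which makes it easy to validate.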
In a further mode, where the object of interest O has to follow a pre-defined path within the working environment, such as at a racecourse or racetrack, the pre-defined path is pre-programmed, and the master camera unit 3a-e is operative to follow the pre-defined path and controlled only in respect of the speed of movement of the master camera unit 3a-e, such as achieved by the operator using the input device, whereby the speed of movement of the master camera unit 3a-e can be speeded up or slowed down in dependence upon the actual speed of the object of interest O, in order to maintain the object of interest O within the field of view.
Alternatively the master operative unit could include an image recognition camera which is programmed to follow a defined object of interest, i.e. the master operative unit follows the pre-defined path and is speeded up or slowed down in dependence upon the actual speed of the object of interest, controlled by the image recognition camera, in order to maintain the object of interest within the field of view. Generally, in these embodiments, the master operative unit, e.g. a master operative camera, will comprise an image recognition camera. This could simply be a general purpose camera running image recognition software.
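The pre-defined-path mode reduces to advancing along a stored sequence of head orientations at an externally supplied speed. The following sketch is hypothetical (class name, waypoint representation and linear interpolation are all assumptions), but shows the one-degree-of-freedom control the text describes:

```python
class PathFollower:
    """Advance a pre-programmed path of (pan, tilt) waypoints at an
    operator-controlled speed; a hypothetical sketch of the
    pre-defined-path mode, with linear interpolation between samples."""

    def __init__(self, waypoints):
        self.waypoints = waypoints  # list of (pan, tilt) samples along the path
        self.position = 0.0         # fractional index along the path

    def advance(self, speed, dt):
        # speed comes from the operator's input device (or from image
        # recognition) so the object stays in frame; dt is the loop period.
        self.position = min(self.position + speed * dt, len(self.waypoints) - 1)
        i = int(self.position)
        frac = self.position - i
        j = min(i + 1, len(self.waypoints) - 1)
        pan = self.waypoints[i][0] * (1 - frac) + self.waypoints[j][0] * frac
        tilt = self.waypoints[i][1] * (1 - frac) + self.waypoints[j][1] * frac
        return pan, tilt
```

The operator never steers the head directly; only `speed` is adjusted, which is what keeps the shot locked to the pre-programmed trajectory.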
In this way, any of the camera units 3a-e can be selected to provide a live broadcast feed, in the knowledge that the selected camera unit 3a-e will be focussed at the object of interest O.
In one embodiment the camera units 3a-e can be controlled such that the image of the object of interest O is of substantially the same size and in focus at all camera units 3a-e; this being provided from the zoom and focus actuators 15, 17 on the master camera unit 3a-e and reference to the determined distances of the master camera unit 3a-e and the one or more slave camera units 3a-e to the object of interest O.
With this configuration, any of the camera units 3a-e can be selected to provide a live broadcast feed, in the knowledge that the selected camera unit 3a-e will both be focussed at the object of interest O and present an image of substantially similar size, irrespective of the camera unit 3a-e which is selected.
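Under a simple pinhole model the image size of an object scales as focal length divided by distance, so matching image size across units amounts to scaling each slave's focal length by its distance ratio. A minimal sketch under that assumption (the function name is hypothetical):

```python
def matched_zoom_and_focus(master_focal, master_dist, slave_dist):
    """Pinhole-model sketch: image height ~ focal_length / distance,
    so a slave matches the master's image size by scaling its focal
    length by the ratio of distances. Focus is simply set to the
    slave's own distance to the object. Units are arbitrary but must
    be consistent (e.g. mm for focal length, metres for distance)."""
    slave_focal = master_focal * slave_dist / master_dist
    slave_focus_dist = slave_dist
    return slave_focal, slave_focus_dist
```

For example, a slave twice as far from the object as the master needs twice the master's focal length to present a similarly sized image.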
In another embodiment one or more of the slave camera units or non-selected master camera units 3a-e can be controlled to operate at one or more different magnifications, thereby providing for a pre-zoomed-in or pre-zoomed-out image, in dependence upon which camera unit 3a-e is selected. In this embodiment the relative location of the camera units 3a-e is calibrated by directing the camera units 3a-e at one or more, typically two or three calibration targets within the working environment.
In one embodiment the calibrated positions can be checked by pointing each camera unit 3a-e in turn at the other camera units 3a-e, and specifically a calibration feature thereon, in order to determine the relative positions, and this is checked against the calibrated positions.
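The underlying check is a bearing intersection: two units that both sight the same feature define two rays whose crossing point should agree with the calibrated layout. A two-dimensional sketch of that geometry (function name and plan-view simplification are assumptions):

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Locate a point in plan view from two unit positions p1, p2
    (x, y) and the pan bearings (radians from the +x axis) each
    measures towards it -- the geometry used to cross-check
    calibrated unit positions against measured ones."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve p1 + t1*d1 = p2 + t2*d2 via the 2D cross product.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; no unique intersection")
    t1 = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

Any discrepancy between the triangulated point and the calibrated position flags a unit that has drifted since calibration.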
Figure 3 illustrates a camera tracking system in accordance with a second embodiment of the present invention.
The camera tracking system of this embodiment is quite similar to the camera tracking system of the above-described embodiment, and thus, in order to avoid unnecessary duplication of description, only the differences will be described in detail with like parts being designated by like reference signs. The camera tracking system of this embodiment differs from that of the above-described embodiment in that the camera 5 of the master camera unit 3a is an image recognition camera which, through operation of the pan-tilt head 7 thereof, is programmed to follow a defined object of interest O, which could be a person, animal, vehicle or sports object, with the other camera units 3b-3d being slaves which follow the operation of the master camera unit 3a under the control of the controller 4. Also, in this embodiment the working environment is an indoor environment, but in other embodiments the working environment could be an outdoor environment.
In this embodiment the slave camera units 3b-3d are configured automatically to follow the operation of the master camera unit 3a, such as to maintain the object of interest O in a central position in the field of view. In one embodiment the image recognition camera 5 determines the position of the object of interest O, and particularly the distance from the master camera unit 3a, by reference to a dimension of the object of interest O and/or pre-defined objects within the working environment.
Figure 4 illustrates a camera tracking system in accordance with a third embodiment of the present invention.
The camera tracking system of this embodiment is quite similar to the camera tracking system of the second-described embodiment, and thus, in order to avoid unnecessary duplication of description, only the differences will be described in detail with like parts being designated by like reference signs. The camera tracking system of this embodiment differs from that of the second-described embodiment in that the master camera unit 3a has no movable head 7, and the image recognition camera 5 is static.
In this embodiment the image recognition camera 5 has a wide field of view and is programmed to follow a defined object of interest O over the field of view, with the other camera units 3b-3d being slaves which follow the operation of the master camera unit 3a under the control of the controller 4. In this embodiment the slave camera units 3b-3d are configured automatically to follow the operation of the master camera unit 3a, such as to maintain the object of interest O in a central position in the field of view.
In the set of embodiments comprising an image recognition camera, e.g. as described in relation to Figs. 3 and 4, the image recognition camera may detect multiple objects of interest in its field of view, for example players on a football pitch. The other orientable slave units could, between them, be controlled to follow multiple objects of interest. Each orientable slave unit could follow a different object of interest, or multiple orientable slave units could follow the same object of interest with one or more orientable slave units following other object(s) of interest. That is, one or more orientable slave units are controlled to follow a first object of interest and one or more orientable slave units are controlled to follow a second object of interest.
Figure 5 illustrates a camera tracking system in accordance with a fourth embodiment of the present invention. The camera tracking system of this embodiment is quite similar to the camera tracking system of the first-described embodiment, and thus, in order to avoid unnecessary duplication of description, only the differences will be described in detail with like parts being designated by like reference signs.
The camera tracking system of this embodiment differs from that of the first-described embodiment in that the master camera unit 3a, instead of the motorized pan-tilt head 7, includes a positioning mechanism 21, to which the camera 5 is mounted and which is manipulated directly by an operator.
The positioning mechanism 21 includes positioning sensors 23a, b for sensing the orientation of the positioning mechanism 21, in this embodiment first and second sensors 23a, b which detect the pan and tilt of the positioning mechanism 21, and provide positioning feedback to the system controller 4 for enabling control of the slave camera units 3b-d.
In this embodiment the positioning mechanism 21 comprises pan-bars.
Figure 6 illustrates a camera tracking system in accordance with a fifth embodiment of the present invention. The camera tracking system of this embodiment is quite similar to the camera tracking system of the first-described embodiment, and thus, in order to avoid unnecessary duplication of description, only the differences will be described in detail with like parts being designated by like reference signs.
The camera tracking system of this embodiment differs from that of the first-described embodiment in comprising one or more additional cameras 31 which are located at a height above the working environment and provide for a determination of the position of the object of interest O in a horizontal plane. Where the working environment is undulating, this determined horizontal position is utilized together with the vector determined from the pan and tilt angles of the master camera unit 3a-e, in order to fix the position of the object of interest O.
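Fusing the overhead camera's horizontal fix with the master's pan/tilt ray recovers the missing height: walk along the ray to the point whose horizontal projection matches the overhead (x, y). A sketch under the same assumed coordinate convention as before (z up, pan from the +x axis; the function name is hypothetical):

```python
import math

def fix_position(xy_overhead, cam_pos, pan, tilt):
    """Combine the horizontal (x, y) reported by an overhead camera
    with the master unit's pan/tilt ray to recover the object's
    height over undulating ground: advance along the ray until its
    horizontal projection reaches (x, y), then read off z."""
    dx = math.cos(tilt) * math.cos(pan)
    dy = math.cos(tilt) * math.sin(pan)
    dz = math.sin(tilt)
    horiz = math.hypot(xy_overhead[0] - cam_pos[0], xy_overhead[1] - cam_pos[1])
    # Ray parameter whose horizontal travel equals the overhead range.
    t = horiz / math.hypot(dx, dy)
    z = cam_pos[2] + t * dz
    return (xy_overhead[0], xy_overhead[1], z)
```

This needs no terrain map at all, which is why the embodiment is useful where the ground profile is unknown or changing.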
Finally, it will be understood that the present invention has been described in its preferred embodiments and can be modified in many different ways without departing from the scope of the invention as defined by the appended claims. For example, although the present invention has been described in relation to the tracking of video cameras, the present invention has application to any device which requires the tracking of an object, such as still cameras, and also spotlights, where the light source replaces the camera sensor, for example, as used in a theatre or on a concert stage.
Claims (1)
Claims
1. A tracking system for tracking an object of interest in a working environment, the system comprising: a plurality of operative units, at least ones of which are orientable in three-dimensional space to follow an object of interest within a working environment; and a system controller for controlling operation of the operative units such that the orientable operative units follow the object of interest.
2. The system of claim 1, wherein the environment is an outdoor environment.
3. The system of claim 1, wherein the environment is an indoor environment.
4. The system of any of claims 1 to 3, wherein the object of interest is a person, animal, vehicle or sports object.
5. The system of any of claims 1 to 4, wherein one of the operative units is a master, and the other of the operative units are slaves which automatically follow the operation of the master operative unit under the control of the system controller.
6. The system of any of claims 1 to 4, wherein a plurality of the operative units are operable as a master, with one of the master operative units being selected at any given time as a master and operated by an operator, and the one or more other operative units are slaves, which, together with the one or more non-selected master operative units, automatically follow the operation of the selected master operative unit under the control of the system controller.
7. The system of any of claims 1 to 4, wherein any of the operative units are operable selectively as a master, with one of the master operative units being selected at any given time as a master and operated by an operator, and the one or more non-selected operative units automatically follow the operation of the selected master operative unit under the control of the system controller.
8. The system of any of claims 1 to 7, wherein the master operative unit is operated locally by an operator, by using an input device, such as a keypad, joystick, hand wheels or a pan-bar.
9. The system of any of claims 1 to 7, wherein the master operative unit is operated remotely by an operator, by using an input device, such as a keypad, joystick, hand wheels or a pan-bar.
10. The system of any of claims 1 to 9, wherein the operative units each comprise an operative device and a movable head which supports the operative device and can be moved to follow the object of interest, optionally the movable head is a pan-tilt head which can be panned and tilted to follow the object of interest.
11. The system of claim 10, wherein the pan-tilt head includes a plurality of positioning actuators for orienting the pan-tilt head.
12. The system of claim 10 or 11, wherein the operative device is a camera, optionally a video or still camera.
13. The system of claim 12, wherein the camera includes a zoom actuator for adjusting a zoom of the optics of the camera.
14. The system of claim 12, wherein the camera includes a focus actuator for adjusting a focus of the optics of the camera.
15. The system of claim 10 or 11, wherein the operative device is a light, optionally a spotlight.
16. The system of claim 15, wherein the light includes a zoom actuator for adjusting a zoom of the illumination beam.
17. The system of any of claims 10 to 16, where the object of interest moves on a generally flat surface, and the position of the object of interest is determined from the intersection of the vector given by the pan and tilt angles of the master operative unit and a plane defined by the flat surface.
18. The system of any of claims 10 to 16, wherein the object of interest moves over an uneven surface, and the position of the object of interest is determined from the intersection of the vector given by the pan and tilt angles of the master operative unit and a predetermined three-dimensional map of the surface.
19. The system of claim 18, wherein the surface of the working environment has different discrete heights, and zones of different height are modelled to provide the surface map for the working environment.
20. The system of claim 18, wherein the surface of the working environment is an undulating terrain, and profile data for the terrain is utilized as the surface map for the working environment.
21. The system of claim 18, wherein the surface of the working environment is an undulating terrain, and terrain points are modelled to provide the surface map for the working environment.
22. The system of claim 21, wherein the terrain points are key terrain points or a grid of terrain points or a combination thereof.
23. The system of any of claims 10 to 22, further comprising: one or more cameras which are located at a height above the working environment and provide for a determination of the position of the object of interest in a horizontal plane, wherein this determined horizontal position is utilized together with the vector determined from the pan and tilt angles of the master operative unit to fix the position of the object of interest.
24. The system of any of claims 10 to 23, wherein the master operative unit includes a positioning mechanism to which the operative device is mounted and which is manipulated directly by an operator, and the other orientable operative units are slaves which follow the operation of the master operative unit under the control of the system controller.
25. The system of claim 24, wherein the positioning mechanism comprises pan-bars.
26. The system of any of claims 1 to 23, wherein the master operative unit is pre-programmed to follow a pre-defined path, and controlled by an operator in respect of the speed of movement of the master operative unit, whereby the speed of movement of the master operative unit can be speeded up or slowed down in dependence upon the actual speed of the object of interest.
27. The system of any of claims 1 to 23 and 26, wherein the master operative unit includes an image recognition camera which is programmed to follow a defined object of interest, and the other orientable operative units are slaves which follow the operation of the master operative unit under the control of the system controller.
28. The system of claim 27, wherein the master operative unit is an orientable operative unit.
29. The system of any of claims 1 to 23, wherein the image recognition camera is static.
30. The system of claim 27, 28 or 29, wherein the orientable slave units are controlled to follow multiple objects of interest.
31. The system of any of claims 27 to 30, wherein the slave operative units are configured automatically to follow the operation of the master operative unit and maintain the object of interest in a central position in the respective fields of view.
32. The system of any of claims 27 to 31, wherein the image recognition camera determines the position of the object of interest by reference to a dimension of the object of interest and/or pre-defined objects within the working environment.
33. The system of any of claims 1 to 32, further comprising: one or more position sensors for determining the position of the object of interest.
34. The system of any of claims 1 to 33, wherein the relative location of the operative units is calibrated by directing the orientable operative units at one or more calibration targets within the working environment.
35. The system of claim 34, wherein the calibrated positions are checked by orienting each orientable operative unit at the other operative units to determine the relative positions of the operative units.
36. A method of broadcasting video of an object of interest from within a working environment, comprising the steps of: providing the tracking system of any of claims 1 to 35; and tracking an object of interest with the plurality of orientable operative units; whereby, at any given time, cameras on any of the orientable operative units can be selected to provide video of the object of interest.
37. The method of claim 36, wherein the orientable operative units are controlled such that the image of the object of interest is of substantially the same size and in focus at each of the cameras of the orientable operative units.
38. The method of claim 36, wherein ones of the orientable operative units are controlled to operate at one or more different zooms, thereby providing for a pre-zoomed-in or pre-zoomed-out image, in dependence upon which operative unit is selected.
39. A method of illuminating an object of interest from within a working environment, comprising the steps of: providing the tracking system of any of claims 1 to 35; and tracking an object of interest with the plurality of orientable operative units; whereby, at any given time, lights, optionally spotlights, on any of the orientable operative units can be selected to illuminate the object of interest.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1401342.9A GB2507431B (en) | 2012-01-27 | 2013-01-28 | Tracking system and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GBGB1201408.0A GB201201408D0 (en) | 2012-01-27 | 2012-01-27 | Tracking system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201301477D0 GB201301477D0 (en) | 2013-03-13 |
GB2500304A true GB2500304A (en) | 2013-09-18 |
Family
ID=45876190
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GBGB1201408.0A Ceased GB201201408D0 (en) | 2012-01-27 | 2012-01-27 | Tracking system and method |
GB1401342.9A Active GB2507431B (en) | 2012-01-27 | 2013-01-28 | Tracking system and method |
GB1301477.4A Withdrawn GB2500304A (en) | 2012-01-27 | 2013-01-28 | Tracking system having multiple orientable cameras |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GBGB1201408.0A Ceased GB201201408D0 (en) | 2012-01-27 | 2012-01-27 | Tracking system and method |
GB1401342.9A Active GB2507431B (en) | 2012-01-27 | 2013-01-28 | Tracking system and method |
Country Status (1)
Country | Link |
---|---|
GB (3) | GB201201408D0 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9788906B2 (en) | 2013-03-15 | 2017-10-17 | Synaptive Medical (Barbados) Inc. | Context aware surgical systems for intraoperatively configuring imaging devices |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH099124A (en) * | 1995-06-16 | 1997-01-10 | Noriaki Konuma | Automatic control type spot light |
JPH0963314A (en) * | 1995-08-28 | 1997-03-07 | Matsushita Electric Works Ltd | Automatic tracking lighting system |
JPH0963313A (en) * | 1995-08-28 | 1997-03-07 | Matsushita Electric Works Ltd | Automatic tracking lighting system |
JPH1012005A (en) * | 1996-06-19 | 1998-01-16 | Matsushita Electric Works Ltd | Automatic tracking lighting system |
US20030210329A1 (en) * | 2001-11-08 | 2003-11-13 | Aagaard Kenneth Joseph | Video system and methods for operating a video system |
US20060056056A1 (en) * | 2004-07-19 | 2006-03-16 | Grandeye Ltd. | Automatically expanding the zoom capability of a wide-angle video camera |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5912700A (en) * | 1996-01-10 | 1999-06-15 | Fox Sports Productions, Inc. | System for enhancing the television presentation of an object at a sporting event |
US20020005902A1 (en) * | 2000-06-02 | 2002-01-17 | Yuen Henry C. | Automatic video recording system using wide-and narrow-field cameras |
2012
- 2012-01-27 GB GBGB1201408.0A patent/GB201201408D0/en not_active Ceased

2013
- 2013-01-28 GB GB1401342.9A patent/GB2507431B/en active Active
- 2013-01-28 GB GB1301477.4A patent/GB2500304A/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
GB2507431B (en) | 2016-05-11 |
GB201401342D0 (en) | 2014-03-12 |
GB201201408D0 (en) | 2012-03-14 |
GB2507431A (en) | 2014-04-30 |
GB201301477D0 (en) | 2013-03-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2010286316B2 (en) | A method and apparatus for relative control of multiple cameras | |
US9813610B2 (en) | Method and apparatus for relative control of multiple cameras using at least one bias zone | |
US10670246B2 (en) | Follow spot control system | |
US8723956B2 (en) | Method and apparatus of camera control | |
JP2008520016A (en) | Image-based motion tracking | |
US20160246039A1 (en) | Interactive projection system | |
JP7133520B2 (en) | Computer eyes (PCEYE) | |
WO2018004354A1 (en) | Camera system for filming sports venues | |
GB2500304A (en) | Tracking system having multiple orientable cameras | |
CN107835384B (en) | Navigation system and method | |
Thomas | Sports TV applications of computer vision | |
GB2559003A (en) | Automatic camera control system for tennis and sports with multiple areas of interest | |
JP7282964B2 (en) | Computer eyes (PCEYE) | |
JP7338015B2 (en) | Computer eyes (PCEYE) | |
JP7282965B2 (en) | Computer eyes (PCEYE) | |
JP7312521B2 (en) | Computer eyes (PCEYE) | |
JP7291452B2 (en) | Computer eyes (PCEYE) | |
JP7282967B2 (en) | Computer eyes (PCEYE) | |
WO2022168995A1 (en) | Computer eye (pc eye) | |
JP2022172185A (en) | Computer eye (pceye) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |