GB2589080A - Surveillance system - Google Patents
- Publication number
- GB2589080A (application GB1916245.2)
- Authority
- GB
- United Kingdom
- Prior art keywords
- beacon
- video
- light
- light emitting
- surveillance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N7/18 — Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- G01S1/70 — Beacons or beacon systems transmitting signals capable of being detected by non-directional receivers, using electromagnetic waves other than radio waves
- G01S5/16 — Position-fixing by co-ordinating two or more direction or position line determinations, using electromagnetic waves other than radio waves
- G08B13/196 — Intruder alarms actuated by passive radiation detection systems using image scanning and comparing systems, using television cameras
- G08B13/19602 — Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608 — Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and/or velocity to predict its new position
- G08B13/19613 — Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
- G08B13/19645 — Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
Abstract
A video surveillance system comprises: at least one camera 50a-b to observe a surveillance field, providing resulting video data; at least one beacon 200 (e.g. a wearable device such as an identification badge, lanyard or insignia) comprising at least one light emitting element 230 (e.g. a non-visible infrared LED), and controller 220 to provide an actuation sequence to the light emitting element; a video analytics system 100 comprising a processor 110 to: receive video data captured by the camera(s) and analyse the video for the presence of beacon light emissions. Upon detection of beacon light emissions, the analytics system decodes the actuation sequence of the light emissions, verifying the identity of the beacon, and outputs a record of the transition of a beacon within the surveillance field. A beacon may be tracked between first and second surveillance fields. Depth location of beacons within images may be determined based on a known spacing distance of light emitting elements on beacons. An analogous broad video surveillance method, and wearable beacon (using a unique beacon identification key), are also disclosed.
Description
Surveillance System
FIELD OF INVENTION
The invention relates to an apparatus and methods of video surveillance monitoring.
BACKGROUND
Video surveillance monitoring is often used to track the movement of people or objects through specific areas, for reasons of safety and/or security (both of which are intended to be included by the term "video surveillance monitoring", used for convenience herein). For example, video surveillance monitoring may be used in: warehousing and storage facilities; residential, medical or elderly care; industrial and construction sites; or agricultural premises. Whilst some video surveillance monitoring may be carried out by operators manually viewing live video streams, the prodigious volume of data generated by a proliferating number of installed cameras, in concert with the affordability of computer resources, means there is an increasing demand for automated monitoring. Accordingly, video surveillance cameras may be connected over a local or distributed network to a video analytics system, specifically adapted to process video streams as a means to provide automated video surveillance monitoring.
An automated advanced video surveillance system is exemplified by the Ethersec "A-Eye" platform, which is designed to detect human subjects present within a video stream in real time, using artificial intelligence techniques to actively recognise subjects who are expected/permitted to enter an area and thus only raise an alarm/notification for unauthorised subjects.
Current systems may be particularly effective when used to monitor an area which is "sterile", in other words an area in which human subjects should not be present (or are only very rarely present). In practice, true "sterile" zones are relatively scarce, and it is more common that the monitored zone allows access to certain people or categories of people, whilst access by others may need to be monitored or may trigger a notification or alarm. For example, only skilled and trained operatives may be permitted in an area containing industrial machinery, or only security guards may normally be permitted to patrol a storage facility containing valuable items.
Whilst improved analysis methods (for example deep learning techniques) are enabling more fine-grained identification of subjects in video streams, issues such as environmental factors may still result in an undesirably high threshold being required to trigger true positive identifications whilst avoiding an unacceptable number of false positive identifications. There are also contemporary problems with the public's perception of, and tolerance towards, the use of facial and other artificial-intelligence-based automated recognition technologies, which may ultimately limit the potential scope of their application.
One solution to monitoring personnel in a specific zone is to use a combination of technologies, such as employing a video surveillance system to provide video capture and an access control system to identify and classify an individual. For example, access control systems may use passive RFID swipe cards or smart card devices to control access at entry and exit points, with the uniquely identifiable card acting as a proxy for an individual user in a database. Data from such a system may be cross-referenced (for example based upon time records) with streams from security cameras to provide live and/or recorded video capture. However, such systems have some disadvantages: in an emergency situation it is common for access controls to be deactivated, and as a result the ability to match individuals to video feeds may be lost. Further, such systems have limited monitoring ability between access control points.
Another solution is to replace video and access control based systems with wireless identification tags such as active RFID identifiers. However, the take-up of such systems has been limited due to the relatively high cost of installing and maintaining these systems. Further, radio frequency technologies are prone to erroneous results, and active RFID tags typically draw considerable battery power in use.
It will be appreciated that there is a desire to provide video surveillance monitoring systems which overcome one or more of the disadvantages of the systems described above. At least some embodiments of the present disclosure are intended to provide improved video surveillance monitoring systems capable of identifying individual subjects or groups of subjects within a monitored area.
SUMMARY OF INVENTION
In accordance with a first aspect, the invention provides a video surveillance system comprising: at least one camera to observe a surveillance zone and thus provide video data; at least one beacon comprising at least one light emitting element and a controller to provide a coded actuation sequence to the light emitting element; and a video analytics system comprising a processor configured to: ingest video data captured by the at least one camera; and analyse the video for the presence of light emissions from a beacon. In the event of the detection of a beacon, the processor may further decode the actuation sequence of the light emissions to verify the identity of the beacon, and output a record of the movement of the beacon within the surveillance zone.
It may be appreciated that embodiments of the invention may enable a video surveillance system to automatically identify known users entering and passing through the surveillance zone. Embodiments can operate in a fully automated environment without the need for human input, since the beacon and video analytics system of embodiments advantageously provide a machine-to-machine verification/identification approach.
The beacon may use any convenient light emitting element, and the selection may depend upon environmental factors (for example the range of detection required in a particular application, or ambient lighting conditions). However, in some embodiments the light emitting element may emit light from the non-visible spectra. References herein to non-visible spectra light may be understood to refer to electromagnetic spectra which fall outside of the portion which is normally visible to the human eye (which may, for example, be wavelengths between approximately 380 and 740 nm). Infrared light emitted from a source, such as an infrared LED, is, for example, generally considered non-visible light, even though some secondary wavelengths emitted by the LED may be observable as a dull red glow to the human eye. In embodiments, the camera may comprise a sensor for detecting non-visible spectra light. The use of non-visible light is advantageous in ensuring that the beacon is not distracting or a nuisance. Further, a non-visible light beacon may be preferable for security purposes, since the coded actuation sequence of the beacon is concealed from normal human observation.
The non-visible light employed may be of an infrared wavelength. As such, the light emitting element may emit light in the infrared spectrum. For example, one wavelength of the infrared spectrum may be at or around 850nm. The camera may include a sensor that is attuned to be sensitive to light transmitted at a wavelength of 850nm.
Advantageously, camera equipment which can detect 850nm-wavelength infrared is readily commercially available, since "commercial off-the-shelf" CCTV cameras use LEDs at this wavelength as floodlights (often called "black light" in marketing material) to covertly illuminate low-light scenes. If it is desirable to further reduce the visibility of the beacon to the human eye, light emissions of a longer wavelength (such as the 920nm range) could be utilised. It will, however, be appreciated that photons of such longer wavelengths carry less energy and therefore provide less illumination.
The system may comprise a plurality of beacons, for example for attachment to, or use by, a plurality of persons or objects. In such embodiments the controller of each beacon may provide a distinct coded actuation sequence. For example, the controller may allow one of a predetermined plurality of coded actuation sequences to be selected, for example during an initial configuration or updating of the beacon. Alternatively, the controller of each beacon may be pre-programmed with a uniquely coded actuation sequence. The processor of the analytics system may access a machine-readable storage comprising a database of unique identification keys for a plurality of beacons.
In some embodiments the controller may be provided with an encoding algorithm to provide a changing actuation sequence. The video analytics system processor may use the same encoding algorithm to generate an expected sequence, which may be compared to the observed actuation sequence to verify the identity of the beacon. An advantage of using an encoding algorithm may be that it prevents simple "cloning" by observation of the output from a beacon and copying of this to a new device. Changing actuation sequences suitable for use in such embodiments are known, for example, from authentication systems used for generating one-time passwords or pass-codes.
In some embodiments the controller may use the encoding algorithm to derive an actuation sequence based upon an identification key and a sequence-based value. For example, the sequence-based value may be generated by a counter or use a pre-determined number sequence. The processor may verify the identification key using a current sequence value (for example, the processor may have a corresponding counter or stored pre-determined number sequence).
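The counter-based derivation described above can be sketched as follows. This is a minimal illustration only: the patent does not specify an algorithm, so the SHA-256/HOTP-style construction, the function names and the 32-bit sequence length are all assumptions.

```python
import hashlib

def derive_actuation_sequence(identification_key: bytes, counter: int,
                              n_bits: int = 32) -> list[int]:
    """Derive a pseudo-random on/off sequence from a secret key and a
    monotonically increasing counter (HOTP-style construction)."""
    digest = hashlib.sha256(identification_key + counter.to_bytes(8, "big")).digest()
    # Expand the digest into a list of 1/0 actuation steps for the LED.
    bits = []
    for byte in digest:
        for i in range(8):
            bits.append((byte >> i) & 1)
            if len(bits) == n_bits:
                return bits
    return bits

def verify(observed: list[int], identification_key: bytes, counter: int) -> bool:
    """Analytics-side check: recompute the sequence from the stored key and
    the processor's own counter, and compare with the observed emissions."""
    return observed == derive_actuation_sequence(identification_key, counter, len(observed))
```

Because both sides only share the key and the counter, an observer who records one emitted sequence cannot predict the next one without the key.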
In some embodiments the controller may use the encoding algorithm to derive an actuation sequence based upon an identification key and a time-based value. For example, the controller may further comprise a clock. The processor may verify the identification key using a current time value. The processor may therefore include a clock, which may, for example, be synchronised with the controller.
The video surveillance system of some embodiments may comprise at least a first camera to observe a first surveillance zone and at least a second camera to observe a second surveillance zone. The processor may be further configured to use the verified identity of the beacon to track movement over time through the first and second surveillance zones. Thus, some embodiments may advantageously provide a system which is able to track the movement of a single subject from camera to camera across a time period and across security zones. Further embodiments could be arranged to carry out such multi-camera tracking even in a crowded environment in which several beacons are present, by verifying the individual beacon identities.
The beacon may comprise a plurality of light emitting elements. The actuation sequence of each light emitting element may be synchronised. For example, the light emitting elements may each light simultaneously in accordance with the coded actuation sequence (such that the elements are effectively acting as a combined light source). In some embodiments the synchronisation of the light emitting elements may use one or more elements as individual light sources in providing a coded actuation sequence (for example a row of elements could be utilised together to provide a number of "bits" when activated).
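The idea of a row of elements providing a number of "bits" per activation can be sketched as follows. The element count and the frame-to-bits mapping are illustrative assumptions; the patent does not fix either.

```python
def frame_states(code: int, n_elements: int, n_frames: int) -> list[list[int]]:
    """Map a code onto a row of LEDs: each video frame, the controller drives
    the n_elements-wide row with the next n_elements bits of the code
    (least significant bits first)."""
    states = []
    for f in range(n_frames):
        chunk = (code >> (f * n_elements)) & ((1 << n_elements) - 1)
        states.append([(chunk >> i) & 1 for i in range(n_elements)])
    return states
```

Driving four elements in parallel transmits a given code in a quarter of the frames a single flashing element would need, at the cost of the camera having to resolve the elements individually.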
When a beacon includes a plurality of light emitting elements this may also provide further advantages to the system. For example, multiple elements may have different positions or orientations to increase detectability by the video camera. If at least two of a plurality of light emitting elements are spaced apart by a known spacing distance the processor of the video analytics system may be configured to derive a depth location of the beacon within images from the video data using the spacing distance of the light emitting elements in the video data. In some examples a single beacon could include light emitting elements with a set spacing or alternatively multiple beacons could be placed on a subject or object at a set spacing apart.
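The depth derivation from a known spacing distance follows the standard pinhole-camera relation: the further away the beacon, the smaller the known spacing appears in pixels. The sketch below assumes a calibrated focal length in pixels, a detail the patent leaves open.

```python
def beacon_depth(pixel_spacing: float, real_spacing_m: float,
                 focal_length_px: float) -> float:
    """Pinhole-camera depth estimate from two light emitting elements a known
    distance apart: Z = f * real_spacing / pixel_spacing."""
    if pixel_spacing <= 0:
        raise ValueError("elements not resolved as distinct points")
    return focal_length_px * real_spacing_m / pixel_spacing
```

For example, with a 1000-pixel focal length, elements mounted 10cm apart that appear 100 pixels apart in the image place the beacon roughly 1m from the camera.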
In some embodiments the beacon may be a wearable device. For example, the beacon may be a lanyard. The beacon may be a unit which is connected to an existing lanyard or may be integral with a lanyard. The beacon may be configured to attach to an item of clothing. For example, the beacon could be incorporated into an identification badge, or could be attached to or integrated in epaulettes in, by way of example, the uniforms of security personnel.
The video analytics system may provide an alert or notification in response to meeting one or more predetermined criteria, derived from observation of the beacon, within the surveillance zone. Alerts or notifications may be triggered based upon beacons being identified within the entire surveillance field of a particular camera, or based upon a defined sub-zone or area (which could extend at least partially across the surveillance fields of a plurality of cameras).
The alert or notification could be triggered either remotely or locally and could, for example, be dependent upon the identity of the beacon. Advantageously, since embodiments of the invention may enable unambiguous identification with a high degree of reliability for any given beacon, the alert or notification could be based upon a particular group of identities (for example certain personnel roles) or a specific individual's identity. The video analytics system of embodiments could be connected to a machine-readable storage (for example a database stored on a server), which may provide heuristic rules specifying the criteria to grant access to the area or zones covered by the at least one surveillance field. For example, individual records could be maintained indicating days and/or times when a particular identification is or is not permitted within the area. As such, notifications or alerts provided by embodiments of the invention may be highly tailored to the system user's requirements for alert provision.
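Such heuristic rules could be as simple as per-identity permitted weekdays and hours. The schema and identifiers below are illustrative assumptions, not taken from the patent:

```python
from datetime import datetime

# Hypothetical rule records: per-beacon permitted weekdays (Mon=0) and hours.
ACCESS_RULES = {
    "beacon-17": {"days": {0, 1, 2, 3, 4}, "hours": range(8, 18)},  # Mon-Fri, 08:00-17:59
}

def should_alert(beacon_id: str, when: datetime) -> bool:
    """Alert when a beacon is observed outside its permitted window, or when
    the beacon identity is not present in the rules database at all."""
    rule = ACCESS_RULES.get(beacon_id)
    if rule is None:
        return True  # unknown beacon: always notify
    return when.weekday() not in rule["days"] or when.hour not in rule["hours"]
```

A real deployment would presumably load such records from the server-side database rather than an in-memory dictionary, but the lookup logic is the same.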
Whilst embodiments of the invention could be implemented on a local area network basis (and indeed, for high security environments, an air-gapped network may be a preferred configuration), typically greater flexibility may be achieved by configuring a system in accordance with embodiments to operate over a wider network. For example, the system according to some embodiments may further comprise a network for receiving video output from the at least one camera and transmitting video data to the video analytics system. The video analytics system could, therefore, be remotely located relative to the area under surveillance. For example, a cloud-based system could be provided, in which a single centralised video analytics system could be employed, consuming video data from a plurality of cameras which themselves are positioned in a plurality of surveillance locations.
In some embodiments, the security zone is observed by at least two cameras, which may, for example, be arranged in close physical proximity. A first camera may contain an "IR cut filter" that blocks light at the same wavelength as the beacon emits. The second camera may contain no such filter and so is sensitive to light in the beacon's IR wavelengths. The video analytics system may be configured to make a comparison of synchronised video output from the two cameras. For example, the comparison may calculate a differential in the luminance of pixels in the videos. This may enable the video analytics system to locate bodies of high-intensity IR pixel locations, and the luminance frame can be scanned for such locations. These locations may represent a 1 in the beacon encoding scheme. The absence of the beacon at a previous locus of observation represents a 0 in the encoding scheme. By capturing the 1s and 0s of the beacon over a time series, complex encoding schemes transmitted by the beacon can be ingested by the video analytics system.
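The two-camera luminance differential might be sketched as follows. This is a simplified illustration assuming synchronised greyscale frames of equal size; the threshold value is arbitrary.

```python
import numpy as np

def locate_beacon_pixels(frame_unfiltered: np.ndarray,
                         frame_ir_cut: np.ndarray,
                         threshold: int = 60) -> list[tuple[int, int]]:
    """Compare synchronised greyscale frames from the two cameras: light at the
    beacon's IR wavelength reaches only the unfiltered camera, so a large
    luminance differential marks candidate beacon locations (x, y)."""
    # Promote to a signed type so the subtraction cannot wrap around.
    diff = frame_unfiltered.astype(np.int16) - frame_ir_cut.astype(np.int16)
    ys, xs = np.nonzero(diff > threshold)
    return list(zip(xs.tolist(), ys.tolist()))
```

Ambient light appears in both frames and cancels in the subtraction, which is what makes this approach robust to scene illumination.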
In alternative embodiments an IR cut filter may not be necessary. For example, a "running average frame" may be determined by the video analytics system. Subtracting this running average frame from the current frame (and, for example, then applying a luminance threshold) may provide a difference frame that represents the moving foreground objects in the frame, with background clutter removed. If the threshold is of a high value then the resultant difference frame will indicate solely the location of the beacons within the frame, from which the signal can be extracted in a time series by the video analytics system.
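A running-average background model of the kind described might look like the following sketch; the blending factor and luminance threshold are illustrative assumptions.

```python
import numpy as np

class RunningAverageDetector:
    """Maintain an exponentially weighted background frame; thresholding the
    difference between the current frame and the background isolates bright
    moving objects such as beacon flashes."""

    def __init__(self, alpha: float = 0.05, threshold: float = 80.0):
        self.alpha = alpha          # background blending factor
        self.threshold = threshold  # high value: keep only very bright pixels
        self.background = None

    def update(self, frame: np.ndarray) -> np.ndarray:
        f = frame.astype(np.float32)
        if self.background is None:
            self.background = f.copy()
        # Current frame minus the running average: positive where a bright
        # foreground object has newly appeared.
        diff = f - self.background
        # Blend the current frame into the running average for next time.
        self.background = (1 - self.alpha) * self.background + self.alpha * f
        return diff > self.threshold  # boolean mask of candidate beacon pixels
```

Because a flashing beacon never stays lit long enough to be absorbed into the slowly updated background, its on-frames keep exceeding the threshold while static scenery is suppressed.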
In some embodiments, the movement of objects in the scene may be tracked. Known computer vision techniques could be incorporated into embodiments of the invention for this purpose. In some embodiments the encoded beacon sequence is attached to a single moving object, allowing the sequence to be decoded on a per-object basis over a time sequence and allowing identification of multiple beacons in a single frame in difficult environments.
Embodiments of the invention may provide a mechanism to store the data or metadata. This data may be kept in memory and/or analysed live by the video analytics system and/or placed into a database where it can be analysed at a later date. Such analysis could derive associations amongst the data that are only visible from a prolonged time series, such as the number of times in a day that a security guard has entered an area. In other embodiments, searching for and finding an incident in historical data then allows the partnered video data to be observed and interpreted by human operators.
The data or metadata collected by the video analytics system may include one or more of the following: the camera number; the pulsed flashes of the beacon; the beacon signal strength (number of pixels); the beacon centroid location (X, Y); the beacon's derived Z location (from a pinhole camera model, or from a 3D camera using stereoscopy, structured light or range-finding technology); the beacon's derived Z location from the separation of matching beacons on a single object, passed through a range-finding algorithm; and the time of the observations. In some embodiments, metadata such as the beacon identification tag derived from a lookup table could also be stored in the database.
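One possible shape for such an observation record, with field names that are illustrative rather than taken from the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BeaconObservation:
    """One row of the observation database (field names are illustrative)."""
    camera_number: int
    beacon_id: Optional[str]    # identification tag from the lookup table, if decoded
    signal_strength_px: int     # beacon signal strength as a pixel count
    centroid_x: float           # beacon centroid location (X)
    centroid_y: float           # beacon centroid location (Y)
    depth_z_m: Optional[float]  # derived Z location, if available
    timestamp: float            # time of the observation
```

Keeping the decoded identity and the derived depth optional reflects that a beacon may be detected in a frame before its sequence has been fully decoded or its depth estimated.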
In a further aspect of the invention, there is provided a method of video surveillance monitoring, the method comprising: providing at least one camera to capture video of a surveillance zone; providing at least one beacon to transmit an encoded light sequence; and analysing video from the at least one camera to identify beacon output within the surveillance field and decoding the light sequence from the beacon to identify the beacon.
The method may comprise providing an encoding algorithm for use by a controller of the at least one beacon to provide an encoded light sequence that changes over time. The method may also comprise using said encoding algorithm when decoding the light sequence from the beacon to identify the beacon. The algorithm may be a function of a unique identification key and a sequence-based and/or time-based input.
The method may comprise providing a plurality of beacons. The method may, therefore, further include: assigning a unique identification key to each beacon; encoding the identification key in the encoded light sequence; and providing a computer readable data store of beacon identification keys.
The method may further comprise providing at least a first camera to observe a first surveillance zone and at least a second camera to observe a second surveillance zone. Analysing the video from the cameras may then further comprise using the identity of the beacon to track movement over time through the first and second surveillance zones.
The method may comprise defining at least one access zone within the surveillance field and generating a notification or alert in response to the detection of a beacon within the access zone. As noted above, the access zone could comprise all or part of a surveillance field and could span at least part of the surveillance field of a plurality of cameras. The alert or notification triggered is dependent upon the identity of the beacon.
The at least one camera may transmit video output across a network. At least one processor may be configured for receiving video input from the network to perform the step of analysing video.
In a further aspect of the invention, there may be provided a wearable beacon comprising: a power source; at least one light emitting element; and a controller to provide an actuation sequence to the light emitting element, wherein the controller comprises a machine-readable storage containing an encoding algorithm and a unique identification key for the beacon; and a processor configured to: obtain the stored unique identification key for the beacon; obtain a time-based or sequence-based value; use the stored encoding algorithm to derive a unique code as a function of the identification key and value; and output an actuation sequence based upon said unique code for the light emitting element.
In some embodiments, the controller may include a machine readable storage medium containing instructions executable by a processor, the medium comprising: instructions to obtain the stored unique identification key for the beacon; instructions to obtain a time-based or sequence-based value; instructions to use the stored encoding algorithm to derive a unique code as a function of the identification key and value; and instructions to output an actuation sequence based upon said unique code for the light emitting element.
A wearable beacon in accordance with some embodiments may comprise a counter for providing a sequence-based value. In some embodiments, the wearable beacon may further comprise a clock for providing a time-based value.
The wearable beacon may include at least one light emitting element which emits infrared spectra light. In some embodiments the beacon may comprise a plurality of light emitting elements, the controller being further configured to synchronise the actuation sequence of the light emitting elements.
To increase the detectability of the emitted light, the wearable beacon may comprise a light emitting element including at least one light guide to diffuse light from one, or more, point sources. For example, a light diffusing element may be a guide member, optical fibre or light panel which may provide an increased area for light emission. For example, in a lanyard a light diffusing element could extend through or around a neck band.
In some embodiments the light emitting element may be positioned adjacent a contrasting material. Such an arrangement may assist in identification of the element by a video analytics system.
Whilst the invention has been described above, it extends to any inventive combination of the features set out above or in the following description or drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention may be performed in various ways, and embodiments thereof will now be described by way of example only, reference being made to the accompanying drawings, in which: Figure 1 is a schematic representation of a video surveillance system in accordance with an embodiment of the invention; Figure 2 is a flow chart representing a method of video surveillance in accordance with an embodiment; and Figure 3 is a schematic representation of an identity lanyard including a beacon in accordance with an embodiment of the invention.
DETAILED DESCRIPTION
A video surveillance system 1, in accordance with an embodiment of the invention, is shown in figure 1. The system comprises a video analytics system 100, a plurality of video cameras 50 connected to the video analytics system 100 via a network 300 (which could, for example, include the Internet) and at least one beacon 200. In the illustrated embodiment two cameras 50a and 50b are provided, each capturing video covering a related surveillance field 55a and 55b. The video data is transmitted over the network 300, which could be a wired or wireless network and may be local or cloud based.
The, or each, camera 50 may be sensitive to infrared wavelength light. Security cameras which are optimised for sensing Infrared radiation are commercially available (and may typically be used in prior art systems, with an infrared spectrum spotlight illuminating the scene with "black" light which though invisible to the human eye can be detected by the specifically attuned sensor in the camera). Infrared video cameras may be tailored to be sensitive to a specific region of the infrared spectrum. For example, Infrared cameras sensitive to 850nm range wavelengths are particularly common.
The video analytics system 100 comprises a processor 110 in communication with a data store 130 and a storage medium 120. It will be appreciated that the video analytics system 100 could be a network connected computer. The analytics system could be in communication with a plurality of separate camera systems and separate networks (with each network comprising one or more cameras). It will be appreciated that, for example, a single networked video analytics system could be provided (for example, at a service provider) to operate a plurality of distinct surveillance systems (for example, each at separate client locations). As networked video surveillance systems are now relatively well established, it will be appreciated that embodiments of the invention could be implemented by taking advantage of existing systems with relatively simple modification and/or minimal hardware changes required.
The storage medium 120 may contain executable instructions 122, 124 for the processor 110. The storage may, for example, include random access memory (RAM) or read-only memory (ROM) and may be any convenient electronic, magnetic, optical or other physical storage device. The data store 130 may, for example, include locally stored data files, databases, websites, FTP servers and the like. As will be explained further below, the data store may be used to keep records relating to the security system for use by the processor 110, such as access rules and beacon identities.
The beacon 200 may typically be a wearable device such as an identification badge or insignia. The beacon 200 may include a power source 210 such as a battery. A controller 220 is included to control the beacon and a light emitting element such as at least one LED 230. The controller 220 may, for example, be an integrated circuit. The controller may include an input, which could for example be a wireless connection such as Bluetooth or Wi-Fi, to enable the initial configuration and/or set-up of the beacon.
The light emitting element 230 may be selected depending upon the light for which the camera(s) 50 are attuned. As explained further below, the controller 220 will generally cause the LED 230 to flash or pulse. By using an infrared LED, the pulsing of the LED is generally of low observability to the human eye. Infrared LEDs are increasingly compact and powerful, enabling easy integration into a beacon which itself will be relatively compact and therefore wearable. Infrared LEDs of the type used, for example, in "black light" spotlights for CCTV equipment radiate large volumes of light in the infrared range of around 850nm. In operation they generally also produce a very dull red glow that is barely visible to the human eye. As such, whilst a pulsing LED 230, used in an embodiment of the invention, would be barely visible, it is unlikely to remain completely invisible to an observant subject. A user may find such a visible emission distracting. As such, in some embodiments, the beacon 200 may further comprise at least one masking LED at the same wavelength as the human observable red glow emitted by the infrared LED 230. Such a masking LED would provide a means to hide the pulsing infrared light emitted from the infrared LED 230 from human observation. A further advantage of including a masking LED would be that it could provide visual confirmation that the wearable is operational.
Alternatively, some embodiments of the beacon 200 could employ infrared LEDs transmitting data in the 920nm range. Such LEDs are completely invisible to the human eye but suffer from the disadvantage that video cameras that can receive signals at this wavelength are less readily available than their 850nm counterparts. Further, 920nm LEDs, having a longer wavelength, have less energy than those at 850nm. As such, 920nm range LEDs generally provide less illumination to act as a beacon.
Advantageously, modern infrared LEDs are typically visible in daylight to a distance of 30m, even in full sunlight. Further, the use of multiple LEDs can provide pulsing illumination in parallel and thereby increase the operational visibility and range of embodiments of the system. In some embodiments it may be desirable for the controller 220 to adjust the brightness of the emissions from the light source 230. For example, the beacon 200 may further comprise a light sensor to allow ambient light information to be provided to the controller. Such an arrangement may enable the brightness of the LEDs to be maximised in well-lit situations (to increase range) but avoid the LEDs flooding the scene when ambient light is low (which may otherwise reduce the ability of the system to isolate the beacon signals).
The emission pattern of the LED(s) 230 is governed by the controller 220 and varies over time to provide an encoded signal which can provide an identification of the beacon. For example, the LED(s) 230 can be commanded to emit a signal such as a simple series of on/off sawtooth switches over a time series. Personnel of one category could have one set of signals encoded in the lights' on/off pattern whilst others would have a different pattern. It will be appreciated that brightness, duration and many other factors could be employed to produce these different patterns. Further, if multiple LEDs are provided and controlled individually (but in a synchronised manner), then this may provide an additional level of complexity which can be encoded into the light emissions (for example, each light could be treated as a separate "bit" of information).
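By way of illustration (not part of the patent text), the multi-LED "bit" scheme described above can be sketched as follows, treating three synchronised LEDs as one octal digit per time slot; the function names are hypothetical:

```python
# Hypothetical sketch: three synchronised LEDs treated as parallel bits,
# so each time slot transmits one octal digit (0-7).

def code_to_led_frames(code_digits, n_leds=3):
    """Map each digit to the on/off states of n_leds for one time slot."""
    frames = []
    for d in code_digits:
        assert 0 <= d < 2 ** n_leds
        # LED i carries bit i of the digit (1 = on, 0 = off)
        frames.append(tuple((d >> i) & 1 for i in range(n_leds)))
    return frames

def led_frames_to_code(frames):
    """Inverse mapping, as performed on the video analytics side."""
    return [sum(bit << i for i, bit in enumerate(states)) for states in frames]

frames = code_to_led_frames([5, 0, 7])
# 5 -> (1, 0, 1), 0 -> (0, 0, 0), 7 -> (1, 1, 1)
assert led_frames_to_code(frames) == [5, 0, 7]
```

With three LEDs each time slot carries three bits rather than one, which is the "additional level of complexity" the passage refers to.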
Advantageously, a fine-grained identification pattern could enable the encoded signal from the beacon to identify not only a group of personnel but also a unique individual.
For the purposes of increased security, it may be preferred to provide encoded signals for the beacon which are not constant over time. This may avoid third parties observing and duplicating a signal to "clone" the identity of that beacon. As such, the controller 220 may use an encoding algorithm to generate the output signal for the emitter 230. It may be appreciated that such an encoding algorithm could be similar to those used to generate one-time identification/access codes (for example, for secure access to websites over the internet). Such algorithms are well established and can, for example, combine a unique identification key allocated to the beacon with a time or sequence based value, to generate an output to be used as a signal. For example, the signal may be a numerical value which is then emitted by using a particular coded sequence of pulses or flashes for each digit.
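A minimal sketch of such a rolling-code scheme, in the style of one-time-password generators and using Python's standard `hmac` library; the key, 30-second window and 6-digit truncation are illustrative assumptions, not specified by the patent:

```python
import hashlib
import hmac
import time

def beacon_code(secret_key, interval=30, now=None):
    """Derive a short rolling code from the beacon's secret key and the
    current time window (assumed scheme, similar to one-time passwords)."""
    window = int((time.time() if now is None else now) // interval)
    digest = hmac.new(secret_key, window.to_bytes(8, "big"), hashlib.sha1).digest()
    # Truncate to a 6-digit decimal code that can be flashed digit by digit
    return f"{int.from_bytes(digest[:4], 'big') % 1_000_000:06d}"

# Beacon and analytics system derive the same code within one time window,
# so an observer replaying yesterday's pulse pattern would not match today's.
tx = beacon_code(b"beacon-0042", now=1_700_000_000.0)
rx = beacon_code(b"beacon-0042", now=1_700_000_005.0)  # same 30 s window
assert tx == rx
```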
Operation of the system will now be described further, with reference to the flow chart 500 of Figure 2. It will be appreciated that the method may be most effective in a live system but may also work on historical/stored video data. In step 510 the camera(s) 50 monitor a surveillance area 55, capturing video data. In order to maximise the available bandwidth, the video data may be encoded to a compressed bit stream, such as a video stream conforming to the H.264 format, which may then be transmitted, over network 300, to a video analysis system 100. Should the video analytics system receive the video stream in an encoded/compressed format, it is then decoded back to raw video, in hardware or software, by the processor 110, prior to image processing as the first step to extract useable data for machine understanding of the scene.
As represented in box 530, the processor 110 of the video analysis system 100 interrogates the video data to identify any beacon signals from the beacons 200 that are present in the captured, transmitted and decoded video frames. The processor 110 may, for example, use a first set of process instructions 122 stored on the memory 120 to carry out the video analysis steps required to identify a beacon emission in the video frame.
Once the presence of a beacon 200 has been identified in a video stream/recording, the analysis system 100 will proceed to decode the light sequence emitted by the beacon 200. The processor 110 executes the decoding instructions 122 stored on the memory 120. The decoding instructions 122 will include the same algorithm as used by the controller 220 and will have a corresponding counter or clock synchronised with that of the controller. The decoding step 540 may, for example, provide an identification key which the processor 110 can then look up in a table 132 in the data storage 130 to provide an identification of the beacon 200 in step 550.
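Under the same assumed rolling-code scheme, the decode-and-lookup step might be sketched as follows; the one-window drift tolerance and the key-table layout are illustrative assumptions:

```python
import hashlib
import hmac

def expected_code(key, window):
    """Recompute the code a beacon with this key would emit in this window."""
    digest = hmac.new(key, window.to_bytes(8, "big"), hashlib.sha1).digest()
    return f"{int.from_bytes(digest[:4], 'big') % 1_000_000:06d}"

def identify(observed, window, key_table):
    """Match an observed code against registered beacon keys,
    tolerating one window of clock drift between beacon and decoder."""
    for beacon_id, key in key_table.items():
        candidates = {expected_code(key, w) for w in (window - 1, window, window + 1)}
        if observed in candidates:
            return beacon_id
    return None

table = {"guard-07": b"secret-a", "visitor-12": b"secret-b"}
code = expected_code(b"secret-a", 12345)
assert identify(code, 12345, table) == "guard-07"
```

This mirrors step 540/550: the decoded light sequence yields a code, which the processor resolves to an identity via the stored table.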
Once the beacon 200 is identified, the processor 110, may further look up the identity in a set of access control rules 134 stored in the data storage 130. If appropriate the analysis system 100 may then trigger a predetermined notification and/or alert in step 570. For example, an alert could be issued if a person without the required access rights enters a particular area or if a person enters an area at a non-authorised time. It should also be noted that some alerts could be triggered (as shown by line 575) in the event of a subject, for example identified by image recognition, being detected when a beacon signal has not been identified. It will be appreciated that the notifications/alerts triggered may depend upon the installation and/or user preferences such that the control rules 134 may also include a record of types of alert to be issued depending upon the situation detected.
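A toy illustration of access-control rules 134 of the kind described; the rule schema, zone names and alert strings are purely hypothetical:

```python
# Hypothetical access rules: permitted zones and hours per beacon identity.
ACCESS_RULES = {
    "guard-07":   {"zones": {"lobby", "vault"}, "hours": range(0, 24)},
    "visitor-12": {"zones": {"lobby"},          "hours": range(9, 17)},
}

def check_access(beacon_id, zone, hour):
    """Return an alert string if the sighting violates the rules, else None."""
    rule = ACCESS_RULES.get(beacon_id)
    if rule is None:
        return f"alert: unknown beacon {beacon_id} in {zone}"
    if zone not in rule["zones"]:
        return f"alert: {beacon_id} not permitted in {zone}"
    if hour not in rule["hours"]:
        return f"alert: {beacon_id} in {zone} outside permitted hours"
    return None

assert check_access("guard-07", "vault", 3) is None          # authorised
assert check_access("visitor-12", "vault", 10) is not None   # wrong zone
assert check_access("visitor-12", "lobby", 20) is not None   # wrong time
```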
In some embodiments, the identification of the beacon in step 550 may be further used to identify or track movements of the related user/objects. Such a step 555 may be used to track movement of a subject across areas monitored by multiple cameras (which may or may not be located in proximal areas). Advantageously this may enable tracking of multiple subjects across multiple locations even in a crowded environment.
Embodiments of the invention provide a method to extract the beacon pulses from the background elements of the video stream. One skilled in the art would understand that the computer vision algorithms described herein are merely exemplars of a plurality of algorithms that could be employed to this effect and that other algorithms, in addition to those specifically described, are available to achieve the same ends. In one embodiment, the security zone is observed by two cameras in close physical proximity. The first camera contains an "IR cut filter" that blocks light at the same wavelength as the beacon emits light. The second camera contains no such filter and so is sensitive to light in the beacon's IR wavelengths. Frames are derived from the two cameras, with each frame taken within a short time period relative to the other (so that the frames are synchronised in time series). Pixel values from the luminance of these frames are then subtracted from each other and a luminance cut-off threshold applied to the resultant pixel difference frame. The threshold value applied can be chosen by numerous techniques known to one skilled in the art, such as the mean global pixel value of the difference frame. The thresholded difference frame will now show the locations of high-intensity IR pixels, and the luminance frame can be scanned for such locations. These locations represent a 1 in the beacon encoding scheme. The absence of the beacon at a previous locus of observation represents a 0 in the encoding scheme. By capturing the 1s and 0s of the beacon over a time series, complex encoding schemes transmitted by the beacon can be ingested by the video analytics system.
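The two-camera differencing and thresholding steps can be sketched on toy luminance frames as follows (plain Python lists stand in for real video frames; an optimised library such as OpenCV would be used in practice):

```python
# Toy sketch of the two-camera differencing step on 4x4 luminance frames.
# frame_ir: camera without IR cut filter (sees the beacon).
# frame_cut: camera with IR cut filter (beacon wavelength blocked).

def difference_threshold(frame_ir, frame_cut):
    """Subtract the filtered frame from the unfiltered one, then apply a
    cut-off at the mean of the difference frame (one possible threshold)."""
    diff = [[max(b - a, 0) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_cut, frame_ir)]
    flat = [p for row in diff for p in row]
    thresh = sum(flat) / len(flat)
    return [[1 if p > thresh else 0 for p in row] for row in diff]

# Identical scenes except for one bright IR beacon pixel at (1, 2)
cut = [[10, 10, 10, 10] for _ in range(4)]
ir = [[10, 10, 10, 10],
      [10, 10, 250, 10],
      [10, 10, 10, 10],
      [10, 10, 10, 10]]
mask = difference_threshold(ir, cut)
assert mask[1][2] == 1 and sum(map(sum, mask)) == 1  # only the beacon survives
```

A `1` at a location in the mask corresponds to a 1 in the beacon's encoding scheme for that frame; its absence at a previously observed locus corresponds to a 0.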
In a further embodiment a single camera captures the luminance frame without an IR cut filter. The previous frames of the time series are kept as a "running average frame"; subtracting the current frame from this running average frame and then applying a luminance threshold yields a difference frame that represents the moving foreground objects in the frame with background clutter removed. Applying a Boolean "AND" between the pixels of high luminance in the original frame and the resultant difference frame will indicate solely the location of the beacons within the frame, from which the signal can be extracted in a time series by the video analytics system.
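The single-camera variant might look like this in outline; the update weight and both thresholds are assumed values for illustration:

```python
# Toy sketch of the single-camera variant: running-average background,
# a difference threshold, and a Boolean AND with the bright pixels.

ALPHA = 0.1  # running-average update weight (assumed value)

def update_background(bg, frame):
    """Blend the current frame into the running average frame."""
    return [[(1 - ALPHA) * b + ALPHA * f for b, f in zip(br, fr)]
            for br, fr in zip(bg, frame)]

def beacon_mask(frame, bg, diff_thresh=50, bright_thresh=200):
    moving = [[abs(f - b) > diff_thresh for f, b in zip(fr, br)]
              for fr, br in zip(frame, bg)]
    bright = [[f > bright_thresh for f in fr] for fr in frame]
    # Beacon = moving AND bright: changed versus background and high luminance
    return [[m and l for m, l in zip(mr, lr)] for mr, lr in zip(moving, bright)]

bg = [[10.0] * 4 for _ in range(4)]
frame = [[10, 10, 10, 10],
         [10, 255, 10, 10],   # beacon: bright and newly appeared
         [10, 10, 120, 10],   # moving object: changed but not bright
         [10, 10, 10, 10]]
mask = beacon_mask(frame, bg)
assert mask[1][1] and not mask[2][2]
```

The AND step is what separates the beacon from other moving foreground objects, which change versus the background but are not at beacon luminance.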
It is noted that the colour value of the IR beacon may be bright white. Thus, in a further embodiment of the system, the difference frames derived as described above are calculated using colour values (in RGB/YUV/HSV or other format), rather than luminance pixel values. The difference frame is then filtered to only show near-white RGB values, and the locations of these white values are used to denote the beacon signal derived from colour frames.
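A near-white filter over RGB pixels could be sketched as follows; the channel thresholds are illustrative assumptions:

```python
# Toy near-white filter over RGB pixels, as an alternative to pure luminance.

def near_white(pixel, floor=220, spread=30):
    """A pixel is 'near white' if all channels are high and roughly equal."""
    r, g, b = pixel
    return min(r, g, b) >= floor and max(r, g, b) - min(r, g, b) <= spread

assert near_white((250, 245, 240))      # IR beacon rendered as bright white
assert not near_white((250, 30, 30))    # bright but strongly red
assert not near_white((120, 120, 120))  # balanced but too dim (grey)
```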
It is also noted that the algorithm described above is a functional solution when employed to find a beacon signal for a single object in a simplistic scene, but this is unlikely to be applicable to all cases, and scenes may be encountered in which more than one beacon is present and thus more than one encoding pulse time series will also be present. Moreover, in the observed frames, multiple moving objects carrying the beacons may cross over one another. In this case, the construction of the time series for a single beacon becomes problematic. Thus, in other embodiments, the movement of objects in the scene is tracked. One skilled in the art would know of the multiple computer vision techniques that are available to perform this function. Background/foreground segmentation is often used as the first step of a tracking algorithm; these segmentation techniques split pixels into being a member of a background clutter model or a moving foreground object, based on statistical techniques such as Gaussian Mixture Modelling or Code-Book pixel look-ups. Foreground pixels are then passed into sophisticated tracking engines that combine individual pixels into single tracking objects by using connected-component labelling algorithms, in concert with hole filling algorithms (morphological operations). The individual objects are then tracked and compared to a model of expected movement to remove statistically outlying movements, and the satisfactory movements are recorded by the video analytics system. Yet more computationally advanced algorithms such as optical flow methods or segmentation by Convolutional Neural Networks may also be used in tracking functionalities but, in sum, these techniques provide a mechanism to automatically split and track moving objects.
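As one example of the building blocks mentioned, a minimal 4-connected component labelling over a binary foreground mask might look like this (real systems would use an optimised library implementation):

```python
# Minimal 4-connected component labelling of a binary foreground mask,
# one building block of the tracking pipeline described above.

def label_components(mask):
    """Flood-fill labelling; returns (label frame, component count).
    Label 0 means background; each foreground blob gets its own label."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and labels[y][x] == 0:
                next_label += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and mask[cy][cx] and labels[cy][cx] == 0:
                        labels[cy][cx] = next_label
                        stack += [(cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)]
    return labels, next_label

mask = [[1, 1, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 1]]
labels, count = label_components(mask)
assert count == 2
assert labels[0][0] == labels[1][1] != labels[3][3]  # two distinct objects
```

Each labelled blob can then be treated as a single tracking object whose beacon pulses are read out on a per-object basis.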
In some embodiments of the device the encoded beacon sequence is attached to a single moving object to allow decoding of the sequence on a per-object basis over a time sequence and allow identification of multiple beacons in a single frame in difficult environments.
Embodiments of the invention provide a mechanism to store the data or metadata. This data can be kept in memory and analysed on the fly by the video analytics system or placed into a database where it can be analysed at a later date. This analysis could be to derive associations amongst the data that are only visible from a prolonged time series, such as the number of times in a day that a security guard has entered an area. In other embodiments, searching for and finding an incident in historical data then allows the partnered visible spectrum video data to be observed and interpreted by human operators.
In embodiments it is envisaged that the following data and metadata could be collected by the video analytics system. One skilled in the art would recognise that this list is not exhaustive and that additional elements of data and metadata could prove useful in real-time analysis or be stored for subsequent review. Such data includes: the camera number, the pulsed flashes of the beacon, the beacon signal strength (number of pixels), the beacon centroid location (X,Y), the beacon's derived Z location (from a pinhole camera model, or from a 3D camera using stereoscopy, structured light or range-finding technology), the beacon's derived Z location computed from the separation of matching beacons on a single object and passed through a range-finding algorithm, and the time of the observations. In some embodiments, metadata such as the beacon identification tag derived from a lookup table could also be stored in the database.
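One possible shape for such a per-observation record, sketched as a Python dataclass; the field names are assumptions, not defined by the patent:

```python
from dataclasses import asdict, dataclass

@dataclass
class BeaconObservation:
    """Hypothetical per-frame record of the data listed above."""
    camera_number: int
    pulse_state: bool      # beacon flash on/off in this frame
    signal_strength: int   # beacon signal strength as a pixel count
    centroid_xy: tuple     # (X, Y) pixel location of the beacon centroid
    depth_z: float         # derived Z location, where available
    timestamp: float       # time of the observation
    beacon_tag: str = ""   # identification tag from the lookup table, if known

obs = BeaconObservation(1, True, 42, (320, 180), 3.5, 1_700_000_000.0, "guard-07")
assert asdict(obs)["signal_strength"] == 42  # ready for database insertion
```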
As noted above, the beacon 200 may typically be a wearable device. This could for example, include a badge or wrist band. However, typically the cameras of video surveillance systems are mounted high up, over the area being monitored and therefore it is advantageous to provide a beacon 200 which is located above a subject's waist. For example, in some embodiments, a beacon 200, could be incorporated into an epaulette, which are often present in the uniforms of security, police or military personnel.
Epaulettes are worn on both shoulders and, as such, at least one is generally visible from an orientation likely to be observed by a CCTV camera in a surveillance system; as such, embodiments may take advantage of this by having beacons with synchronised signals on each epaulette (for example, a connection such as a wireless Bluetooth or Wi-Fi link may be provided between the pair of epaulettes to enable such set-up, or a time-based signal may be employed, which would always be synchronised between time-matched wearables). As epaulettes are close to eye level it may be desirable to provide a shield or mirror to direct light from the light emitting elements 230 in a preferred emission direction, typically away from the user's face.
One implementation of the beacon 200 is represented in figure 3. In this embodiment the beacon is incorporated into a lanyard of the type worn around the neck and commonly used in workplaces or at functions and events. Such lanyards generally comprise a cord or strap 610 which is worn around the neck and an identification badge 600 attached thereto. A beacon 200 for use in embodiments of the invention could be integrated into a lanyard or provided as an add-on/modification to an existing lanyard. Components such as the battery 210 and controller 220 may, for example, be conveniently located within the badge 600 or a holder associated with the badge.
In some embodiments the badge (or associated badge holder) 600 could be provided with one or more light emitting element(s) 232 of the beacon. In other embodiments, the light emitting element(s) 234 could be incorporated into the strap 610 of the lanyard. Such an arrangement is particularly useful as it can enable the element and signal to be visible along a greater area. For example, a transparent or translucent portion of the strap 610 may be provided with multiple distributed LEDs or with LEDs connected to a light diffusing arrangement. This may enable the light emitting element 234 to be visible from all directions, including the rear. The light diffusing element could, for example, be formed using optical fibres, light pipes or flexible light diffusing sheets. An adjacent region of the strap 610 may have a contrasting colour to the light emitting element or associated light diffuser. This may provide increased contrast for the video analysis to assist in identification of the beacon 200 and the pulsed signal.
Although the invention has been described above with reference to preferred embodiments, it will be appreciated that various changes or modifications may be made without departing from the scope of the invention as defined in the appended claims. For example, in some embodiments a beacon or beacons used on a subject may be used for additional image processing purposes. For example, if a beacon or beacons are provided which are a fixed or known distance apart on a subject, it would be possible to measure this distance and thus generate the depth location (Z plane location) of the beacons when processing the 2D frames of the captured video.
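The depth-from-spacing idea follows the pinhole-camera relation Z = f·D/d (focal length in pixels, real spacing D, pixel spacing d); a minimal sketch with illustrative numbers:

```python
# Pinhole-model sketch: two beacons a known real-world distance apart
# give the depth (Z) of the subject from their pixel separation.

def depth_from_spacing(focal_px, real_spacing_m, pixel_spacing):
    """Z = f * D / d for a pinhole camera (f in pixels, D in metres)."""
    return focal_px * real_spacing_m / pixel_spacing

# Illustrative numbers: epaulette beacons 0.4 m apart appear 80 px apart
# in a camera with an assumed focal length of 1000 px.
z = depth_from_spacing(1000.0, 0.4, 80.0)
assert abs(z - 5.0) < 1e-9  # subject is 5 m from the camera
```

As the relation suggests, the same pair of beacons appears closer together in pixels the further the subject is from the camera.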
Claims (32)
- CLAIMS

What is claimed is:

1. A video surveillance system comprising: at least one camera to observe a surveillance field and provide resulting video data; at least one beacon comprising at least one light emitting element and a controller to provide an actuation sequence to the light emitting element; a video analytics system comprising a processor configured to: receive video data captured by the at least one camera; analyse the video for the presence of light emissions from a beacon; and, upon detection of light emissions, decode the actuation sequence of the light emissions to verify the identity of the beacon and output a record of the transition of a beacon within the surveillance field.
- 2. A video surveillance system as claimed in claim 1, wherein the light emitting element emits non-visible spectra light and wherein the camera comprises a sensor for detecting non-visible spectra light.
- 3. A video surveillance system as claimed in claim 2, wherein the light emitting element emits infrared spectrum light.
- 4. A video surveillance system as claimed in claim 3, wherein the infrared spectrum light is in the range of 850nm and the camera comprises a sensor attuned to 850nm wavelengths.
- 5. A video surveillance system as claimed in claim 1, wherein the system comprises a plurality of beacons and wherein the controller of each beacon provides a unique actuation sequence.
- 6. A video surveillance system as claimed in claim 1, wherein the controller is provided with an encoding algorithm to provide a changing actuation sequence and the video analytics system processor uses the same encoding algorithm when decoding the actuation sequence to verify the identity of the beacon.
- 7. A video surveillance system as claimed in claim 6, wherein the processor comprises a machine-readable storage comprising a database of unique identification keys for a plurality of beacons.
- 8. A video surveillance system as claimed in claim 6, wherein the controller uses the encoding algorithm to derive an actuation sequence based upon an identification key and a sequence-based value and the processor verifies the identification key using a current sequence value.
- 9. A video surveillance system as claimed in claim 6, wherein the controller uses the encoding algorithm to derive an actuation sequence based upon an identification key and a time-based value and the processor verifies the identification key using a current time value.
- 10. A video surveillance system as claimed in claim 1, comprising at least a first camera to observe a first surveillance field and at least a second camera to observe a second surveillance field; and wherein the processor is further configured to use the verified identity of the beacon to track movement over time through the first and second surveillance fields.
- 11. A video surveillance system as claimed in claim 1, wherein the beacon comprises a plurality of light emitting elements and wherein the actuation sequence of each light emitting element is synchronised.
- 12. A video surveillance system as claimed in claim 11, wherein at least two of the plurality of light emitting elements are spaced apart by a known spacing distance and the processor of the video analytics system is configured to derive a depth location of the beacon within images from the video data using the spacing of the light emitting elements in the video data.
- 13. A video surveillance system as claimed in claim 1, wherein the beacon is a wearable device.
- 14. A video surveillance system as claimed in claim 13, wherein the beacon is a lanyard.
- 15. A video surveillance system as claimed in claim 14, wherein the beacon is configured to attach to an item of clothing.
- 16. A video surveillance system as claimed in claim 1, wherein the video analytics system provides an alert or notification in response to predetermined criteria for transition of a beacon within the surveillance field.
- 17. A video surveillance system as claimed in claim 1, wherein the system further comprises a network for receiving video output from the at least one camera and transmitting video data to the video analytics system.
- 18. A method of video surveillance monitoring, the method comprising: providing at least one camera to capture video of a surveillance field; providing at least one beacon to transmit an encoded light sequence; analysing video from the at least one camera to identify beacon output within the surveillance field and decoding the light sequence from the beacon to identify the beacon.
- 19. The method of claim 18, further comprising providing an encoding algorithm for use by a controller of the at least one beacon to provide an encoded light sequence that changes over time and using said encoding algorithm when decoding the light sequence from the beacon to identify the beacon.
- 20. The method of claim 19, wherein the algorithm is a function of a unique identification key and a sequential-based and/or time-based input.
- 21. The method of claim 18, comprising providing a plurality of beacons and wherein the method further comprises: assigning a unique identification key to each beacon; encoding the identification key in the encoded light sequence; and providing a computer readable data store of beacon identification keys.
- 22. The method of claim 18, wherein the method further comprises: providing at least a first camera to observe a first surveillance field and at least a second camera to observe a second surveillance field; and wherein analysing the video from the cameras further comprises using the identity of the beacon to track movement over time through the first and second surveillance fields.
- 23. The method of claim 18, wherein the method comprises defining at least one access zone within the surveillance field and generating a notification or alert in response to the detection of a beacon within the access zone.
- 24. The method of claim 23, wherein the alert or notification triggered is dependent upon the identity of the beacon.
- 25. The method of claim 18, wherein the at least one camera transmits video output across a network and at least one processor is provided for receiving video input from the network to perform the step of analysing video.
- 26. A wearable beacon comprising: a power source; at least one light emitting element; and a controller to provide an actuation sequence to the light emitting element, wherein the controller comprises a machine-readable storage containing an encoding algorithm and a unique identification key for the beacon; and a processor configured to: obtain the stored unique identification key for the beacon; obtain a time-based or sequence-based value; use the stored encoding algorithm to derive a unique code as a function of the identification key and value; and output an actuation sequence based upon said unique code for the light emitting element.
- 27. A wearable beacon as claimed in claim 25, wherein the controller further comprises a counter for providing a sequence-based value.
- 28. A wearable beacon as claimed in claim 26, wherein the controller further comprises a clock for providing a time-based value.
- 29. A wearable beacon as claimed in claim 26, wherein the at least one light emitting element emits infrared spectrum light.
- 30. A wearable beacon as claimed in claim 26, wherein the beacon comprises a plurality of light emitting elements, the controller being further configured to synchronise the actuation sequence of the light emitting elements.
- 31. A wearable beacon as claimed in claim 26, wherein the light emitting element comprises at least one light guide to diffuse light from one or more point sources.
- 32. A wearable beacon as claimed in claim 26, wherein the at least one light emitting element is positioned adjacent a contrasting material to assist in identification of the element by a video analytics system.
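The rolling-code scheme of claim 26 (derive a unique code from a stored identification key and a time- or sequence-based value, then drive the light emitting element with it) can be sketched as follows. The claims do not specify the encoding algorithm or modulation, so the use of HMAC-SHA256, the on-off keying, and all names below are illustrative assumptions, not the patented implementation.

```python
import hmac
import hashlib

def derive_code(identification_key: bytes, counter: int, n_bits: int = 16) -> list[int]:
    """Derive a unique code from the beacon's identification key and a
    sequence-based value (claim 26). HMAC-SHA256 stands in for the
    unspecified stored encoding algorithm."""
    msg = counter.to_bytes(8, "big")
    digest = hmac.new(identification_key, msg, hashlib.sha256).digest()
    # Keep the top n_bits of the digest as the code.
    value = int.from_bytes(digest, "big") >> (len(digest) * 8 - n_bits)
    return [(value >> i) & 1 for i in reversed(range(n_bits))]

def actuation_sequence(code_bits: list[int]) -> list[str]:
    """Map code bits to light-element states. Simple on-off keying is
    assumed; the actual modulation is not given in the claims."""
    return ["ON" if b else "OFF" for b in code_bits]

# A video analytics system holding the same key store (claim 21) can
# recompute the expected code for each counter value and match it
# against the observed light transitions.
key_store = {"beacon-01": b"secret-key-01"}
code = derive_code(key_store["beacon-01"], counter=42)
assert derive_code(key_store["beacon-01"], 42) == code  # reproducible by verifier
assert derive_code(key_store["beacon-01"], 43) != code  # rolls with the counter
```

Because the code changes with every counter or clock step, a recorded light sequence cannot simply be replayed later, which is the usual motivation for keyed rolling codes in beacon identification.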
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1916245.2A GB2589080B (en) | 2019-11-08 | 2019-11-08 | Surveillance system |
Publications (3)
Publication Number | Publication Date |
---|---|
GB201916245D0 GB201916245D0 (en) | 2019-12-25 |
GB2589080A true GB2589080A (en) | 2021-05-26 |
GB2589080B GB2589080B (en) | 2022-01-19 |
Family ID: 69062302
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1916245.2A Active GB2589080B (en) | 2019-11-08 | 2019-11-08 | Surveillance system |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2589080B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2380883A (en) * | 2001-06-20 | 2003-04-16 | Roke Manor Research | Location and identification of participants in a sporting event by means of optically readable tags |
WO2011154949A2 (en) * | 2010-06-10 | 2011-12-15 | Audhumbla Ltd. | Optical tracking system and method for herd management therewith |
US20140225916A1 (en) * | 2013-02-14 | 2014-08-14 | Research In Motion Limited | Augmented reality system with encoding beacons |
EP3029972A1 (en) * | 2014-12-02 | 2016-06-08 | Accenture Global Services Limited | Smart beacon data security |
WO2016178572A1 (en) * | 2015-05-07 | 2016-11-10 | Eldolab Holding B.V. | Method for changing the identification code of a light source in visible light communication systems |
Legal events: 2019-11-08, GB application GB1916245.2A filed; granted as patent GB2589080B, status Active.
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2603131A (en) * | 2021-01-26 | 2022-08-03 | Ethersec Ind Ltd | Surveillance system |
WO2022161900A1 (en) * | 2021-01-26 | 2022-08-04 | Ethersec Industries Ltd | Surveillance system |
GB2603131B (en) * | 2021-01-26 | 2023-06-07 | Ethersec Ind Ltd | Surveillance system |
WO2024056153A1 (en) * | 2022-09-12 | 2024-03-21 | Eaton Intelligent Power Limited | A method of monitoring a device and synchronising sensor data |
Also Published As
Publication number | Publication date |
---|---|
GB2589080B (en) | 2022-01-19 |
GB201916245D0 (en) | 2019-12-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9536154B2 (en) | Monitoring method and camera | |
US20160019427A1 (en) | Video surveillence system for detecting firearms | |
US20210287469A1 (en) | System and method for provisioning a facial recognition-based system for controlling access to a building | |
US7479980B2 (en) | Monitoring system | |
US20180247504A1 (en) | Identification of suspicious persons using audio/video recording and communication devices | |
US20140307076A1 (en) | Systems and methods for monitoring personal protection equipment and promoting worker safety | |
CN109426798A (en) | A kind of border intrusion detection method, apparatus and system | |
CN110111515A (en) | A kind of border intrusion detection method, apparatus, server and system | |
KR102585066B1 (en) | Combined fire alarm system using stand-alone fire alarm and visible light camera | |
US20190073883A1 (en) | System for tracking the location of people | |
US20170309147A1 (en) | Lighting device and lighting system | |
GB2589080A (en) | Surveillance system | |
US20130308939A1 (en) | Infrared Communication System and Method | |
CA2393932C (en) | Human object surveillance using size, shape, movement | |
US11445340B2 (en) | Anomalous subject and device identification based on rolling baseline | |
WO2018019553A1 (en) | Monitoring an area using illumination | |
CN111914050A (en) | Visual 3D monitoring platform based on specific places | |
US10296790B2 (en) | Coded visual markers for a surveillance system | |
US20240303992A1 (en) | Surveillance system | |
US20240087324A1 (en) | Monitoring system | |
CN111723598A (en) | Machine vision system and implementation method thereof | |
WO2008127360A2 (en) | Real time threat detection system | |
JP2011227675A (en) | Notification device | |
Chan et al. | MI3: Multi-intensity infrared illumination video database | |
TWM572541U (en) | Dedicated device and system for anti-terrorism service |