
WO2024056153A1 - A method of monitoring a device and synchronising sensor data - Google Patents


Info

Publication number
WO2024056153A1
WO2024056153A1 · PCT/EP2022/075318
Authority
WO
WIPO (PCT)
Prior art keywords
signal
observation
synchronisation
observer
data acquisition
Prior art date
Application number
PCT/EP2022/075318
Other languages
French (fr)
Inventor
Daniel ZUCCHETTO
Padhraig RYAN
Original Assignee
Eaton Intelligent Power Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eaton Intelligent Power Limited filed Critical Eaton Intelligent Power Limited
Priority to PCT/EP2022/075318 priority Critical patent/WO2024056153A1/en
Publication of WO2024056153A1 publication Critical patent/WO2024056153A1/en


Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 — Television systems
    • H04N 7/18 — Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 — Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources

Definitions

  • the present disclosure relates generally to a method of monitoring a device using a monitoring system and synchronising sensor data. Aspects of the disclosure relate to a method, to a device monitoring system, and to a non-transitory, computer-readable storage medium.
  • Monitoring systems are often used to monitor a particular device and activities performed by or using that device. In many such applications, it is advantageous to acquire direct sensor data, as well as audio data and/or image data relating to the operation / activity being performed. In this manner, the sensor data and the audio/image data can be linked and causal relations can be inferred.
  • sensor data may be combined with the image data of a video recording to enable labelling of events, e.g. labelling image frames with relevant sensor data and/or labelling sensor data with event information derived from respective image frames.
  • the video recording system may also include an audio track attached to the video track, with the two tracks already being synchronized by the video recording system.
  • the sensor data is typically provided by separate sensor devices that operate on respective timing systems or clocks, which may be aligned with different time standards or differ due to a lack of reliability or precision.
  • Such sensor devices may also be embedded in the monitored device, or take other forms that preclude wired connections. Consequently, the sensor devices cannot rely on electrical pulses for synchronization. Instead, the sensor devices are often reliant on wireless communication systems, such as radio transmitters and/or receivers.
  • Such wireless communication signals may experience interference, delay, and may need to be retransmitted, introducing an uncertain amount of latency in the communication. Consequently, the timing of such wireless communication signals cannot be used to establish synchronization.
  • the monitoring system comprises a data acquisition system, an observer system and a device-coupled system.
  • the data acquisition system is connected to the observer system and wirelessly connected to the device-coupled system.
  • the observer system comprises one or more observation sensors for observing a space and generating observation signals comprising audio and/or image sensor data relating to the observed space for transmission to the data acquisition system.
  • the device-coupled system comprises: one or more device-coupled sensors for measuring respective device parameters and generating sensor data for wireless transmission to the data acquisition system; and a signal generating unit for outputting an observable synchronisation signal.
  • the method comprises: receiving a first observation signal from the observer system when the synchronisation signal is not observable by the one or more observation sensors; determining a baseline signal for the observed space based on the first observation signal; receiving a second observation signal from the observer system when the synchronisation signal is observable by the one or more observation sensors; comparing the second observation signal to the baseline signal to detect the synchronisation signal output from the monitored device in the observed space; and synchronising sensor data received from the device-coupled system and the observer system by matching timing data associated with the output and detection of the synchronisation signal.
  • the monitoring system solves the problem of having to synchronise timestamps from a device-coupled wireless sensor that provides ‘direct’ data relating to the use/operation of the device, with video/audio data from a remote sensor that observes the monitored device within a particular space, such as a designated work or use area, obtaining ‘indirect’ sensor data relating to the interaction of the device with its surroundings.
  • the synchronisation signal is not observable by the one or more observation sensors when the first observation signal is received because the signal generating unit is inactive, i.e. not generating the synchronisation signal, and/or the monitored device, or a part thereof, is absent or otherwise undetectable within the observed space from the perspective of the observer system.
  • the one or more observation sensors may, for example, include a sound sensor configured to generate an observation signal comprising audio sensor data, i.e. audio data that is indicative of audible noise in the observed space.
  • determining the baseline signal comprises determining a baseline noise signal based on the first observation signal. It shall be appreciated that the observed space may or may not relate to an explicitly delimited space and/or different observation sensors may monitor respective, overlapping, or matching spaces.
  • determining the baseline noise signal comprises determining frequency components of the first observation signal.
  • detecting the synchronisation signal comprises: determining frequency components of the second observation signal; and removing the frequency components of the baseline noise signal. The removal of the baseline noise signal in this manner provides for enhanced detection of the synchronisation signal.
  • the frequency components are determined using one or more spectral analysis algorithms.
  • the one or more spectral analysis algorithms include a Fast Fourier Transform (FFT).
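The baseline-removal steps above may be sketched as follows. This is an illustration only, not part of the disclosure: the sampling rate, tone frequency, and function names are assumptions made for the sketch.

```python
import numpy as np

def baseline_spectrum(first_signal: np.ndarray) -> np.ndarray:
    """Frequency-component magnitudes of the first observation signal."""
    return np.abs(np.fft.rfft(first_signal))

def detect_sync_spectrum(second_signal: np.ndarray,
                         baseline: np.ndarray) -> np.ndarray:
    """Remove the baseline components; the residual should contain the tone."""
    spectrum = np.abs(np.fft.rfft(second_signal))
    residual = spectrum - baseline
    return np.clip(residual, 0.0, None)  # negative values carry no energy

# Example: a 1 kHz synchronisation tone buried in 50 Hz background hum,
# sampled at 8 kHz for one second (all values illustrative).
fs = 8000
t = np.arange(fs) / fs
background = 0.5 * np.sin(2 * np.pi * 50 * t)
first = background                               # no sync signal observable
second = background + np.sin(2 * np.pi * 1000 * t)

residual = detect_sync_spectrum(second, baseline_spectrum(first))
peak_hz = np.fft.rfftfreq(fs, 1 / fs)[np.argmax(residual)]
print(peak_hz)  # → 1000.0
```

With the 50 Hz hum present in both observation signals, its component cancels in the residual, leaving the synchronisation tone as the dominant peak.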
  • the one or more observation sensors may, for example, include a camera configured to generate an observation signal comprising image sensor data, i.e. image data indicative of visual aspects of the observed space.
  • determining the baseline signal comprises determining a baseline image frame.
  • the baseline image frame is determined as an average of N image frames received from the camera, where N is a positive integer.
  • detecting the synchronisation signal may, for example, comprise removing the baseline image frame from an image frame of the second observation signal.
  • detecting the synchronisation signal comprises identifying the monitored device, or a part thereof, in a remaining image frame using one or more image processing techniques.
  • the one or more image processing techniques include applying a neural network.
  • identifying the monitored device further comprises filtering a background area from the remaining image frame.
  • the signal generating unit may comprise a light source for emitting the observable synchronisation signal in a reference light colour.
  • detecting the synchronisation signal further comprises: determining a number of pixels of the reference light colour in the background-filtered image portion.
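The image-based detection described in the preceding bullets (average baseline frame, baseline removal, reference-colour pixel counting) may be sketched as follows; the array shapes, tolerance, and colour values are assumptions for illustration only.

```python
import numpy as np

def baseline_frame(frames: np.ndarray) -> np.ndarray:
    """Average of N image frames (N, H, W, 3) observed without the device."""
    return frames.mean(axis=0)

def count_reference_pixels(frame: np.ndarray, baseline: np.ndarray,
                           ref_colour: np.ndarray, tol: float = 30.0) -> int:
    """Remove the baseline frame, then count pixels near the reference colour."""
    residual = np.abs(frame.astype(float) - baseline)
    changed = residual.sum(axis=-1) > tol            # pixels differing from baseline
    near_ref = np.abs(frame.astype(float) - ref_colour).sum(axis=-1) < tol
    return int(np.count_nonzero(changed & near_ref))

# Toy example: four identical background frames, then one frame in which a
# 2x2 patch of the (hypothetical) red reference light is switched on.
bg = np.full((4, 8, 8, 3), 20.0)
baseline = baseline_frame(bg)
frame = bg[0].copy()
frame[3:5, 3:5] = [255.0, 0.0, 0.0]                  # illustrative red LED pixels
n = count_reference_pixels(frame, baseline, np.array([255.0, 0.0, 0.0]))
print(n)  # → 4
```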
  • the device-coupled system may include a first timing device for acquiring timing data associated with the one or more device-coupled sensors and the observer system may include a second timing device for acquiring timing data associated with the audio and/or image sensor data.
  • the method further comprises commanding the signal generating unit to output the synchronisation signal, i.e. to activate the signal generating unit.
  • the device-coupled system may, for example, record timing data associated with the output synchronisation signal.
  • the method may further comprise commanding the observer system to determine each of the first and second observation signals.
  • the data acquisition system includes a memory storing one or more reference synchronisation signal patterns.
  • the signal generating unit may be configured to generate at least one of the one or more reference synchronisation signal patterns.
  • the method may further comprise matching the detected synchronisation signal to one of the reference synchronisation signal patterns.
  • the detected synchronisation signal may be matched to one of the reference synchronisation signal patterns by matching the frequency components of the first observation signal (e.g. once the frequency components of the baseline noise signal have been removed) to one or more frequency components of one of the reference synchronisation signal patterns.
  • the reference synchronisation signal patterns may comprise a light illumination pattern
  • matching the detected synchronisation signal to one of the reference synchronisation signal patterns may comprise: comparing a series of image frames of the second observation signal to the reference synchronisation signal patterns, the first image frame of the series being the image frame used to detect the synchronisation signal.
  • the sensor data received from the device-coupled system and the observer system is synchronised in dependence on matching the detected synchronisation signal to one of the reference synchronisation signal patterns.
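The matching of a detected signal to a stored reference pattern may be sketched as follows; the pattern names, threshold, and per-frame pixel counts are illustrative assumptions, not part of the disclosure.

```python
def extract_pattern(frame_counts, threshold=1):
    """Binarise per-frame reference-colour pixel counts into an on/off pattern."""
    return [1 if c >= threshold else 0 for c in frame_counts]

def match_pattern(detected, reference_patterns):
    """Return the name of the first reference pattern matching the detected one."""
    for name, pattern in reference_patterns.items():
        if detected[:len(pattern)] == pattern:
            return name
    return None

# Hypothetical reference patterns held in the data acquisition system's memory.
references = {"sync_A": [1, 0, 1, 1], "sync_B": [1, 1, 0, 0]}

# Per-frame pixel counts, starting at the frame in which the signal was detected.
counts = [12, 0, 9, 15, 0]
print(match_pattern(extract_pattern(counts), references))  # → sync_A
```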
  • a non-transitory, computer-readable storage medium storing instructions thereon that when executed by a processor cause the processor to perform a method as defined above.
  • the monitoring system comprises: an observer system; a system that is attachable to, or embeddable in, a device so as to form a device-coupled system, in use; and a data acquisition system connectable to the observer system and wirelessly connectable to the device-coupled system.
  • the observer system comprises one or more observation sensors for observing a space and generating observation signals comprising audio and/or image sensor data relating to the observed space for transmission to the data acquisition system, in use.
  • the device-coupled system comprises: one or more device-coupled sensors for measuring respective device parameters and generating sensor data for wireless transmission to the data acquisition system, in use; and a signal generating unit for outputting an observable synchronisation signal.
  • the data acquisition system is configured to: receive a first observation signal from the observer system when the synchronisation signal is not observable by the one or more observation sensors; determine a baseline signal for the observed space based on the first observation signal; receive a second observation signal from the observer system when the synchronisation signal is observable by the one or more observation sensors; compare the second observation signal to the baseline signal to detect the synchronisation signal output from the monitored device in the observed space; and synchronise sensor data received from the device-coupled system and the observer system by matching timing data associated with the output and detection of the synchronisation signal.
  • the monitoring system further comprises the monitored device into which the system is embedded, or to which the system is attached in use, to form the device-coupled system. It will be appreciated that preferred and/or optional features of each aspect of the disclosure may be incorporated alone or in appropriate combination in the other aspects of the disclosure also.
  • Figure 1 shows a schematic view of an exemplary device monitoring system
  • Figure 2 shows the steps of an example method of monitoring a device using the monitoring system shown in Figure 1;
  • Figure 3 shows exemplary sub-steps of the method shown in Figure 2;
  • Figure 4 shows further exemplary sub-steps of the method shown in Figure 2.
  • Embodiments of the disclosure relate to a method, and to a system, for monitoring a device and, in particular, a use / operation of the device.
  • the monitored device may take various suitable forms within the scope of this disclosure where it is desired to monitor the use/operation of the device by combining sensor data determined directly, i.e. acquired from one or more device-coupled sensors, with sensor data determined remotely or indirectly, i.e. acquired by a sensor (such as a camera or sound sensor), that observes the use/operation of the device within a particular space from a distance.
  • the device monitoring system therefore includes an observer system having one or more observation sensors, such as a camera and/or a microphone/sound sensor, for observing a space in which the monitored device is used/operated.
  • the observer system generates signals comprising audio and/or image sensor data relating to the observed space for transmission to a data acquisition system.
  • the observation sensor(s) may be arranged to observe a practice bay at a training facility or driving range and thereby determine audio and/or video recordings comprising audio/image data associated with the use of the golf club in the practice bay.
  • the monitoring system also includes a device-coupled system comprising one or more device-coupled sensors for measuring respective device parameters.
  • the golf club may include a device-coupled system that is attached to, or embedded into, the golf club, featuring an accelerometer, gyroscope, and/or force sensor, for measuring various parameters that may be indicative of the speed and rhythm of motion of the golf club during a golf swing.
  • the monitored device may take the form of a pneumatic drill and the device-coupled system may include an accelerometer attached to, or embedded into, the drill to measure an angle of orientation of the drill during use.
  • the device-coupled system wirelessly communicates the determined sensor data to the data acquisition system, for example using a wireless communication system such as a radio transceiver.
  • the device-coupled system also includes a signal generating unit for outputting a synchronisation signal that is observable by the observation sensor(s).
  • the signal generating unit may take the form of a light source attached to, or embedded in, the golf club for generating a light signal in an on/off pattern.
  • the data acquisition system receives a first observation signal from the observation sensor(s) when the synchronisation signal is unobservable by the observation sensor (e.g. when the monitored device is absent from the observed space and/or the signal generating unit is inactive) and determines a baseline signal for the observed space based thereon.
  • the baseline signal may take the form of an average image frame obtained by the camera, which depicts the practice bay without the golf club being present.
  • the data acquisition system subsequently receives a second observation signal from the observer system when the synchronisation signal is observable by the observation sensor(s), for example when the golf club is within the observed space and the signal generating unit is active.
  • the data acquisition system is configured to compare the second observation signal to the baseline signal to detect the synchronisation signal output from the signal generating unit in the observed space. If the synchronisation signal is detected, the data acquisition system synchronises the sensor data received from the device-coupled system and the observer system, for example by matching timing data associated with the output and detection of the synchronisation signal.
  • the monitoring system is able to synchronise data relating to the operation/use of the device from the device-coupled system and the observer system, allowing the sensor data and audio/image data to be linked and causal relations to be inferred. It is expected that the monitoring system therefore provides for improved monitoring of a device and the use/operation of the device.
  • the device monitoring system 1 includes a data acquisition system 2, a monitored device 4, an observer system 6, and a device-coupled system 8.
  • the monitored device 4 may take various suitable forms within the scope of this disclosure where it is desired to monitor the use/operation of the device by combining sensor data determined directly, i.e. acquired from the device-coupled system 8, with sensor data determined remotely or indirectly, i.e. acquired by the observer system 6, which observes the use/operation of the device 4 within a particular space.
  • the observer system 6 includes one or more observation sensors 10 configured to observe a space in which the monitored device 4 is used/operated and to determine corresponding audio and/or image sensor data. In this manner, the observer system 6 provides data for analysing the interaction of the device 4 with the surrounding environment, allowing for insights into the effects of the use/operation of the monitored device 4 on other elements to be determined.
  • the observation sensor(s) 10 may therefore take any suitable form for observing a space and acquiring corresponding audio and/or image sensor data, including a camera for recording a video as a series of image frames and/or a microphone, amongst other suitable sensor devices.
  • the observer system 6 may be equipped with one or more observation sensors 10, in the form of a video system for recording the swing of the golfer from one or multiple perspectives.
  • Each camera may therefore be arranged to observe a practice bay at a training facility or driving range and record audio and image data corresponding to the observed practice bay.
  • the image sensor data may subsequently be used to analyse the various aspects of the golfer’s interaction with the golf club during the golf swing, for example using automated analysis to record the angle of the golfer’s legs and torso relative to the ground at different phases of the swing.
  • the observer system 6 is therefore connected to the data acquisition system 2, to which the audio and/or image data is transmitted in the form of observation signals.
  • the observer system 6 may be connected to the data acquisition system 2 via a direct, wired connection system or otherwise include means or systems for wirelessly communicating the observation signals to the data acquisition system 2.
  • the device-coupled system 8 is attached to, or otherwise embedded in, the device 4 and includes one or more device-coupled sensors 12 and a wireless communication system 14.
  • the one or more sensors 12 are configured to obtain direct measurements of respective device parameters, i.e. physical characteristics of the device or measurable thereat.
  • the one or more sensors 12 may therefore take any suitable form for measuring physical characteristics of the device 4 including, amongst others, sensors for measuring power characteristics of an electrical device, such as current, voltage and/or power levels, sensors for measuring position, orientation, or motion of the device 4, such as accelerometers and/or gyroscopes, as well as force sensors or pressure transducers.
  • the wireless communication system 14 may take any suitable form for wirelessly transmitting the sensor data to the data acquisition system 2.
  • the wireless communication system 14 therefore allows for the transmission of sensor data to the data acquisition system 2 in the absence of any wired connection between the two systems.
  • the device-coupled system 8 may typically be embedded into the device 4 or the typical use of the device 4 may otherwise preclude wired connection to the device-coupled system 8 (or render such a connection impractical), for example impairing the use of the device 4.
  • the wireless communication system 14 may therefore take the form of a wireless transceiver, such as a radio frequency transceiver for communicating the sensor data using radio signals.
  • the device-coupled system 8 further includes a first clock or timing device 16 determining timing data, such as timestamps, associated with the sensor data, and the observer system 6 further includes a second clock or timing device 18 determining timing data, such as timestamps, associated with the image and/or audio sensor data determined by the observer system 6.
  • a video recording system may determine a video track and an audio track, attached to the video track, with the timing data for the two tracks being provided by the second timing device 18 such that the two tracks are already synchronized by the video recording system.
  • the first timing device 16 of the device-coupled system 8 is typically provided by an embedded device whose clock may not be aligned with UTC, or may otherwise differ due to a lack of reliability or precision.
  • the first and second timing devices 16, 18 may therefore be expected to operate on different timeframes, according to different timing standards, or otherwise be out of synchronisation. Consequently, the sensor data respectively received at the data acquisition system 2 from the observer system 6 and the device-coupled system 8 is asynchronous.
  • Although the wireless communication system 14 of the device-coupled system 8 enables the transmission of sensor data to the data acquisition system 2, the uncertain amount of latency associated with wireless communication, e.g. due to interference, delay, and/or the need to retransmit data, makes synchronisation of the sensor data with the audio and/or image sensor data of the observer system 6 difficult. For example, it is not possible to rely on electrical pulses to synchronise the timing devices 16, 18 since the device-coupled system 8 does not have a wired connection to the data acquisition system 2.
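Because wireless latency is uncertain, the disclosure instead matches the device-clock timestamp of the signal's output against the observer-clock timestamp of its detection. A minimal sketch of that offset computation (timestamp values and function names are illustrative assumptions):

```python
def clock_offset(t_output_device: float, t_detected_observer: float) -> float:
    """Offset between the device-coupled clock 16 and the observer clock 18.

    t_output_device:     timestamp (device clock) when the sync signal was emitted
    t_detected_observer: timestamp (observer clock) of the frame/sample in which
                         the signal was first detected
    """
    return t_detected_observer - t_output_device

def to_observer_time(t_device: float, offset: float) -> float:
    """Re-stamp a device-coupled sensor reading onto the observer timeline."""
    return t_device + offset

# Example: the device clock reads 12.000 s at emission; the observer first
# sees the signal at 4507.250 s on its own clock.
offset = clock_offset(12.000, 4507.250)
print(to_observer_time(12.500, offset))  # → 4507.75
```

Note that any observation latency (e.g. camera frame interval) bounds the accuracy of the offset; the wireless transmission latency, by contrast, no longer matters.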
  • the device-coupled system 8 further includes the signal generating unit 20, which is further attached to, or embedded into, the monitored device 4 and configured to output synchronisation signal(s) that are observable by the observer system 6.
  • the signal generating unit 20 may comprise a light source (such as an LED or a display), which may be configured to output one or more colours of light, and/or a sound source (such as a buzzer) configured to output a noise signal at one or more sound frequencies.
  • the signal generating unit 20 may include a buzzer for emitting a tone with a predefined frequency and/or an LED for emitting light of a particular colour.
  • the signal generating unit 20 corresponds to the sensor type of the observer system 6 and, in particular, to the physical characteristics sensed by the observation sensor(s) 10.
  • the signal generating unit 20 may be configured to generate an audio signal for detection by the observer system 6.
  • the signal generating unit 20 may be configured to generate a visual signal for detection by the observer system 6.
  • the synchronisation signal may therefore take various forms that are suitable for detection in the audio and/or image sensor data of the observer system 6.
  • the role of the data acquisition system 2 is to synchronise the sensor data received from the observer system 6 and the device-coupled system 8.
  • the data acquisition system 2 is configured to: (i) receive observation signals from the observer system 6 when the monitored device is absent from the observed space and/or the signal generating unit is inactive, (ii) determine a baseline signal for the observed space based on such observation signals, (iii) receive further observation signals from the observer system 6 when the synchronisation signal is observable by the observation sensor(s) 10, (iv) compare the further observation signals to the baseline signal to detect the synchronisation signal, and (v) synchronise the sensor data received from the device-coupled system 8 and the observer system 6 by matching timing data associated with the output synchronisation signal and the detection of the synchronisation signal.
  • the data acquisition system 2 may further act as a command and control interface for operating the observer system 6 and/or the device-coupled system 8, as shall be described in more detail.
  • the sensor data from the device-coupled system 8 can be synchronised with the audio and/or image sensor data from the observer system 6 and linked to infer causal relations. It is expected that the monitoring system 1 therefore provides for improved monitoring and observation data.
  • the data acquisition system 2 may take the form of a suitable computer system for carrying out the data processing, communication and commands as described herein.
  • the data acquisition system 2 may therefore incorporate a data processing module 22, a communication module 24, a memory module 26 and a control module 28, as shown in Figure 1.
  • each of these units or modules may be provided, at least in part, by suitable software running on any suitable computing substrate using conventional or custom processors and memory. Some or all of the units or modules may use a common computing substrate (for example, they may run on the same server) or separate substrates, or different combinations of the modules may be distributed between multiple computing devices.
  • the data processing module 22 is configured to perform the data processing tasks described herein and may therefore include one or more image and/or audio signal processing algorithms for processing the sensor data.
  • the data processing module 22 may include one or more spectral analysis algorithms, such as a Fast Fourier Transform (FFT), for determining frequency components of an audio signal.
  • the data processing module 22 may additionally or alternatively include one or more image processing algorithms or machine learning algorithms for object recognition in an image frame.
  • the data processing module 22 may include a neural network, such as a convolutional neural network trained for recognition of the monitored device, as shall be described in more detail.
  • the communication module 24 is configured to communicate with the observer system 6 and the device-coupled system 8 and to receive sensor data from each system 6, 8.
  • the communication module 24 may therefore include a corresponding wireless communication system for communication with the device-coupled system 8, along with a wired or wireless communication system for communication with the observer system 6.
  • the memory module 26 is arranged to store the sensor data received from the observer system 6 and the device-coupled system 8.
  • the memory module 26 may take the form of a computer-readable storage medium (e.g., a non-transitory computer-readable storage medium).
  • the computer-readable storage medium may comprise any mechanism for storing information in a form readable by a machine or electronic processors/computational device, including, without limitation: a magnetic storage medium (e.g., floppy diskette); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or electrical or other types of medium for storing such information/instructions.
  • the control module 28 is an optional module that may be included to generate command signals to the observer system 6 and/or the device-coupled system 8.
  • the control module 28 may be configured to command the observer system 6 to determine observation signals.
  • the control module 28 may be configured to command the device-coupled system 8, for example to activate the device-coupled sensor(s) 12, to command the output of the synchronisation signal and/or to configure the output synchronisation signal, as shall be described in more detail.
  • Figure 2 shows an example method 100 of operating the device monitoring system 1, in accordance with an embodiment of the disclosure.
  • the method 100 shall be discussed alongside an example application, namely for analysing a golfer’s swing at a driving range or training facility.
  • the observer system 6 may be equipped with one or more observation sensors 10, in the form of a video system for recording the swing of the golfer from one or multiple perspectives. Each camera may therefore be arranged to observe the practice bay and to record audio and image data corresponding to the observed practice bay.
  • the video sensor data may be used to determine information such as the angle of the golfer’s legs and torso relative to the ground at different phases of the swing.
  • the monitored device 4 takes the form of a golf club and the device-coupled system 8 may be embedded in a handle or a clubhead of the golf club, for example.
  • the device-coupled sensors 12 embedded in the golf club may comprise sensors such as an accelerometer, gyroscope, or force sensor.
  • the signal generating unit 20 may take the form of a light source embedded in the grip, clubhead, or other component of the instrument, and arranged to output a synchronisation signal in the form of a series of light pulses.
  • the device monitoring system 1 operates the observer system 6 to observe a space, such as the practice bay, and to determine a first observation signal for the observed space.
  • the first observation signal is determined while the synchronization signal is unobservable by the observer system 6.
  • the signal generating unit 20 may be inactive and/or the monitored device 4, such as the golf club (or the signal generating unit 20 part thereof), may be arranged outside of the field of view of the observation sensor(s) 10 or otherwise be concealed within the observed space from observation by the sensors 10.
  • the determination of the first observation signal may be controlled by a command signal from the control module 28, for example.
  • the observer system 6 may record a series of N image frames, where N is a positive integer and/or record audio for M seconds, where M is a positive integer.
  • the integers N and/or M may be pre-programmed or determined based on one or more user inputs to the data acquisition system 2. It shall be appreciated that in each case the image frames and/or audio recording will relate to background features of the practice bay.
  • the device monitoring system 1 determines a baseline signal for the observer system 6 based on the first observation signal.
  • the baseline signal is a representative signal for the observed space when there is no observable synchronisation signal.
  • the baseline signal therefore provides a reference for subsequent detection of the synchronisation signal.
  • the baseline signal may therefore take various suitable forms in dependence on the sensor type of the observer system 6, but may, for example, be determined as an average of the audio and/or image sensor data acquired in a prescribed observation period, such as a period of M seconds or containing N image frames.
  • when the observer system 6 includes a microphone, the observer system 6 may be operated to observe the space, such as the practice bay, and to record M seconds of audio forming the first observation signal, in step 102. Accordingly, in step 104, the data acquisition system 2 may perform a spectral analysis on the first observation signal to determine the frequency components of the audio signal, for example using one or more spectral analysis techniques that are known in the art, such as a Fast Fourier Transform (FFT). The determined frequency components may then be used as the baseline signal for subsequently detecting the synchronisation signal.
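By way of a non-limiting illustration only, the spectral-analysis step described above may be sketched as follows. Python with numpy is assumed; the function name, sampling rate, and synthetic background audio are illustrative and do not form part of the disclosure.

```python
import numpy as np

def audio_baseline(samples, sample_rate):
    """Determine baseline frequency components for M seconds of background audio."""
    spectrum = np.abs(np.fft.rfft(samples)) / len(samples)   # magnitude spectrum via FFT
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs, spectrum

# Example: 2 s of synthetic background noise sampled at 8 kHz
rate = 8000
rng = np.random.default_rng(0)
background = rng.normal(0.0, 0.1, 2 * rate)
freqs, baseline = audio_baseline(background, rate)
```

The returned magnitude spectrum may then be stored as the baseline signal against which later observation signals are compared.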
  • when the observer system 6 includes a camera, the observer system 6 may be operated to observe the space, such as the practice bay, and to record N image frames forming the first observation signal, in step 102. Accordingly, in step 104, the data acquisition system 2 may use one or more image processing techniques to determine the baseline signal as an average image frame based on the N image frames. It shall be appreciated that the determined baseline signal may therefore include the frequency components derived from M seconds of audio recorded at the practice bay and/or an average image frame determined based on N image frames, where each of the N image frames may depict the practice bay without the golf club.
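By way of a non-limiting illustration, the frame-averaging step may be sketched as follows, again assuming Python with numpy; the synthetic frames stand in for the N recorded image frames and all names are illustrative.

```python
import numpy as np

def baseline_frame(frames):
    """Average N image frames (H x W x C arrays) into a single baseline image."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Example: N = 5 synthetic 4x4 RGB frames of gradually varying brightness
frames = [np.full((4, 4, 3), v, dtype=np.uint8) for v in (10, 12, 14, 16, 18)]
baseline = baseline_frame(frames)   # uniformly 14.0 for this synthetic input
```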
  • the baseline signal may be stored in the memory module 26 of the data acquisition system 2 for subsequent use in detecting the synchronisation signal.
  • the monitored device 4 may be moved into the space observed by the observer system 6 and/or the signal generating unit 20 may be activated such that the observer system 6 observes the synchronisation signal within the observed space.
  • the method 100 may therefore further include step 106 for commanding the signal generating unit 20 of the device-coupled system 8 to output the synchronisation signal.
  • the data acquisition system 2 may be configured to transmit a command signal to the signal generating unit 20, via the wireless communication system 14, to cause the signal generating unit 20 to output the synchronisation signal.
  • unknown delays exist between the transmission of the command signal and its reception by the device-coupled system 8, e.g. due to signal modulation, demodulation, encoding, decoding, and retransmissions.
  • the device-coupled system 8 may be configured to store, reset or initiate a time recording at the first timing device 16, providing an accurate timestamp for reference points, such as the start time, of the synchronisation signal.
  • the command signal may also define a particular synchronisation signal pattern for the signal generating unit 20 or otherwise provide an on/off trigger to the signal generating unit 20.
  • the communicated synchronisation signal pattern could therefore be used for subsequent detection purposes, for example by creating a unique signal pattern associated with the monitored device that distinguishes that device from other devices in the vicinity, as shall be described in more detail below.
  • the signal generating unit 20 could alternatively output synchronisation signals continuously or periodically, where the synchronisation signals have a random or time-varying form.
  • the monitored device 4 may be moved into the observed space such that the synchronisation signal may be observed by the observation sensor(s) 10 once the baseline signal has been determined.
  • in step 108, the device monitoring system 1 operates the observer system 6 to determine a second observation signal once the synchronisation signal becomes observable within the space observed by the observation sensor(s) 10.
  • the observer system 6 may be commanded to observe the space, or may otherwise continue to do so, and determines a second observation signal, in step 108, that is distinguished from the first observation signal, determined in step 102, in that the synchronisation signal is now observable by the observation sensor(s) 10.
  • the golf club may have been moved into the practice bay and the signal generating unit 20 may output the synchronisation signal in a form that is observable by the observation sensor(s) 10 observing the practice bay.
  • the observation signals may be determined continuously or periodically, for example at a predetermined sampling frequency, or the determination of the second observation signal may be controlled by a command signal from the control module 28, for example.
  • the observer system 6 may, for example, record a series of P image frames, where P is a positive integer and/or record audio for Q seconds, where Q is a positive integer.
  • the integers P and/or Q may be pre-programmed or determined based on one or more user inputs to the data acquisition system 2.
  • in step 110, the device monitoring system 1 processes the second observation signal to detect the synchronisation signal.
  • the data acquisition system 2 receives the second observation signal from the observer system 6 and compares the second observation signal to the baseline signal to detect the synchronisation signal.
  • the methods used for detecting the synchronisation signal will vary in dependence on the type of sensors on the observer system 6 and, in particular, on whether the observation signal and baseline signal comprise audio and/or image sensor data. Although more than one method may be suitable for comparing the second observation signal to the baseline signal and detecting the synchronisation signal, as shall be appreciated by the person skilled in the art, the following non-limiting examples illustrate at least one method for detecting the synchronisation signal based on audio sensor data and, separately, based on image sensor data from the video recording.
  • the data acquisition system 2 may detect the synchronisation signal by checking whether there are any peak(s) in the frequency components of the second observation signal at the frequency(s) of the audio synchronisation signal.
  • the method 100 may therefore further comprise sub-steps 112 to 116 for detecting the synchronisation signal, as shown in Figure 3.
  • the data acquisition system 2 performs a spectral analysis on the second observation signal to determine the frequency components for comparison to the baseline signal.
  • the data acquisition system may use one or more spectral analysis techniques that are known in the art, such as a Fast Fourier Transform (FFT), for determining the frequency components in substantially the same manner as described in step 104.
  • the data acquisition system 2 removes or subtracts the baseline signal from the second observation signal. In other words, the data acquisition system 2 may subtract the background frequency components of the baseline signal from the frequency components determined for the second observation signal, leaving one or more remaining frequency components that distinguish the second observation signal from the baseline signal.
  • the data acquisition system 2 compares the one or more remaining frequency components to one or more reference frequency components corresponding to the synchronisation signal to determine whether there is a match. For example, the data acquisition system 2 may compare the remaining frequency component(s) of the second observation signal to a database of reference synchronisation signals and/or respective frequency components stored in the memory module 26 of the data acquisition system 2.
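Sub-steps 112 to 116 may be illustrated, in a non-limiting fashion, by the following sketch, assuming Python with numpy; the 1 kHz tone, tolerance, and threshold values are illustrative choices only and are not taken from the disclosure.

```python
import numpy as np

def detect_sync_tone(obs_spectrum, base_spectrum, freqs, ref_freq,
                     tol_hz=5.0, threshold=0.01):
    """Subtract the baseline spectrum and look for a residual peak near the
    reference synchronisation frequency."""
    residual = np.clip(obs_spectrum - base_spectrum, 0.0, None)  # baseline removal
    band = np.abs(freqs - ref_freq) <= tol_hz                    # frequency window
    return residual[band].max(initial=0.0) >= threshold

def spectrum(x):
    return np.abs(np.fft.rfft(x)) / x.size

# Synthetic example: background noise, then noise plus a 1 kHz sync tone
rate = 8000
t = np.arange(2 * rate) / rate
rng = np.random.default_rng(1)
noise = rng.normal(0.0, 0.05, t.size)
tone = 0.5 * np.sin(2 * np.pi * 1000.0 * t)   # hypothetical audio sync signal

freqs = np.fft.rfftfreq(t.size, d=1.0 / rate)
base = spectrum(noise)
obs = spectrum(noise + tone)
found = detect_sync_tone(obs, base, freqs, 1000.0)
```

With this input the residual peak at 1 kHz far exceeds the noise floor, so the tone is detected; comparing the baseline spectrum against itself yields no detection.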
  • if no match is determined, the device monitoring system 1 may operate the observer system 6 to determine a subsequent observation signal, in step 108, for comparison to the baseline signal, in step 110.
  • if a match is determined, the data acquisition system 2 may validate the detection of the synchronisation signal and the method 100 proceeds to synchronise the sensor data, in step 118.
  • the data acquisition system 2 may detect the synchronisation signal by detecting a visible signal, such as a light source of a predefined colour, in one or more image frames of the second observation signal.
  • the data acquisition system 2 may apply one or more image processing techniques to isolate and refine an image portion corresponding to the monitored device 4, or an expected position of the signal generating unit 20 on the device 4, for example to remove noise sources before checking for pixels within that image portion matching the colour output of the synchronisation signal.
  • the method 100 may therefore further comprise sub-steps 212 to 216 for detecting the synchronisation signal, as shown in Figure 4.
  • the data acquisition system 2 subtracts or removes the contents of the baseline signal, i.e. the average image frame, determined in step 104, from each image frame of the second observation signal, determined in step 108.
  • the data acquisition system 2 may be configured to detect the monitored device 4 in the remaining image frame.
  • the data acquisition system 2 may use one or more computer vision techniques, such as object recognition, to identify the device 4 in the remaining image frame.
  • One or more computer vision techniques that are known in the art may be used for this purpose, which shall not be described in detail here to avoid obscuring the disclosure. Nonetheless, it shall be appreciated that such computer vision techniques may involve the use of a neural network, such as a convolutional neural network, trained to detect the device of interest, such as a golf club.
  • a training dataset can be generated by collecting images of the device of interest, such as the golf club (i.e. either generic golf clubs or a particular make and model for example), with a variety of backgrounds that are representative of the physical environments in which the device monitoring system 1 may be deployed.
  • the training data should incorporate a range of lighting intensity and light source angles, to train a generalizable model.
  • the neural network may be a YOLO architecture, as is well known in the art, although various other architectures are envisaged, such as R-CNN and Fast R-CNN.
  • the training dataset may be split into three parts corresponding to a training, validation and test dataset.
  • optimal hyperparameters may be found by optimizing recall, precision, or another performance metric such as F-score.
  • Object detection may initially find the entire golf club, and a separate object detection model may then be used to find the club head or the grip (in which the signal generating unit 20 is embedded) in a constrained part of this image. Using these models, the expected coordinates of the signal generating unit 20 can be estimated based on the orientation of the device (such as a golf club) relative to the plane of the observer system 6.
  • if the monitored device 4 is not identified, the data acquisition system 2 may return to determining one or more further image frames, in step 108.
  • the parts of the remaining image portion that are considered background, and therefore not belonging to the device, are effectively removed from the image portion to produce a background-filtered image frame.
  • the combination of baseline frame removal and background removal helps to remove noise sources (i.e. extraneous light and/or audio sources) that may introduce inaccuracies into the synchronisation system.
  • the data acquisition system 2 proceeds to process the background-filtered image frame to detect the synchronisation signal.
  • the data acquisition system 2 may determine the number of pixels of a prescribed colour (corresponding to the synchronisation signal) in the background-filtered image frame, and compare the determined number of pixels to a threshold (e.g. K, where K is a positive integer) for detecting the synchronisation signal.
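The baseline-frame removal and pixel-counting steps may be illustrated, in a non-limiting fashion, as follows; Python with numpy is assumed, and the colour tolerance, difference threshold, and synthetic frames are illustrative assumptions only.

```python
import numpy as np

def count_signal_pixels(frame, baseline, colour, colour_tol=30, diff_thresh=25):
    """Remove the baseline frame, then count remaining pixels whose colour is
    within colour_tol of the prescribed synchronisation-light colour."""
    diff = np.abs(frame.astype(np.int32) - baseline.astype(np.int32))
    foreground = diff.max(axis=-1) > diff_thresh          # baseline removal
    target = np.array(colour, dtype=np.int32)
    matches = np.all(np.abs(frame.astype(np.int32) - target) <= colour_tol, axis=-1)
    return int(np.count_nonzero(foreground & matches))

# Hypothetical example: dark background with a 3x3 patch of green "LED" pixels
baseline = np.zeros((8, 8, 3), dtype=np.uint8)
frame = baseline.copy()
frame[2:5, 2:5] = (0, 255, 0)        # prescribed synchronisation colour
K = 5                                # detection threshold (positive integer)
detected = count_signal_pixels(frame, baseline, (0, 255, 0)) >= K
```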
  • if the determined number of pixels is below the threshold, the data acquisition system 2 may return to determining one or more further image frames, in step 108.
  • otherwise, the data acquisition system 2 detects the synchronisation signal and the method 100 proceeds to synchronise the sensor data, in step 118.
  • the data acquisition system 2 proceeds to synchronise the sensor data received from the device-coupled system 8 and the observer system 6.
  • the data acquisition system 2 is configured to achieve the synchronisation by matching timing data, such as timestamps, received from the first and second timing devices 16, 18, each being associated with a reference point, such as a start and detection point, of the synchronisation signal as output from the signal generating unit 20 and detected in the second observation signal.
  • the sensor data may be synchronised by matching a timestamp associated with the start of the output of the synchronisation signal from the first timing device 16, with a timestamp from the second timing device 18 associated with the first point of detection of the synchronisation signal in the audio/image sensor data of the second observation signal.
  • in other examples, different reference points may be used. Synchronising the sensor data in this manner ensures that the various sources of sensor data are matched to a common time frame, rather than to local timeframes associated with any one system 6, 8 in isolation.
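The timestamp-matching step may be illustrated, in a non-limiting fashion, by the following sketch; the clock values and sample format are illustrative, and the sketch assumes that the detection latency is negligible relative to the required timing accuracy.

```python
def synchronise(device_start_ts, observer_detect_ts, device_samples):
    """Shift device-clock timestamps onto the observer clock by matching the
    synchronisation-signal start time (first timing device 16) to its first
    detection time (second timing device 18)."""
    offset = observer_detect_ts - device_start_ts
    return [(ts + offset, value) for ts, value in device_samples]

# Hypothetical clocks: the device clock starts near zero, while the observer
# clock reports elapsed wall time in seconds
device_samples = [(0.10, 'accel_x=1.2'), (0.20, 'accel_x=1.5')]
aligned = synchronise(device_start_ts=0.00,
                      observer_detect_ts=1000.00,
                      device_samples=device_samples)
# device samples now carry observer-clock timestamps (~1000.10, ~1000.20)
```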
  • the device monitoring system 1 provides for improved data acquisition relating to the monitored device 4 and the use/operation of the device 4. It is expected that the method 100 will therefore provide for more accurate analysis of the device 4 and greater insights into causal relations between the device parameters and the audio/image data.
  • the matched sensor data could be used to gain enhanced insights into the interaction between body position and golf club motion for each swing phase, ultimately leading to better feedback for the golfer.
  • the device monitoring system 1 may be used to monitor a pneumatic drill, where the device-coupled sensor(s) 12 may include an accelerometer embedded into the device and the observation sensor(s) 10 may include a video camera recording a work area.
  • the accelerometer may offer insights into the angle of orientation of the drill to determine if this differs from guidelines.
  • it may be advantageous to align the timestamp of the accelerometer sensor data with the image sensor data, to enable automated analysis of events occurring in the work area (i.e. in the vicinity of the drilling) that may be associated with a change in the usage pattern of the drill. For example, a co-worker may approach the drill user and attract their attention; this may distract the user and result in a dangerous and unsuitable angle of orientation of the drill. The synchronisation of these timestamps can therefore lead to targeted initiatives to improve workplace safety.
  • the sensor data synchronisation can be performed based on the detection of the synchronisation signal in the image sensor data, the audio sensor data, or both. Where both are used, it is possible to cross-check the resulting timing data and to consider the synchronisation successful (in step 118) only if both agree on the results; otherwise, the device monitoring system 1 may be configured to repeat the synchronisation procedure 100.
  • the data acquisition system 2 may be further configured to determine a signal pattern of the detected synchronisation signal, such as an on/off sequence of a light source, and to compare the determined pattern to a database of one or more reference synchronisation signal patterns to check whether a match exists. In this manner, the comparison may be used to provide a more robust verification of the detected synchronisation signal.
  • the data acquisition system 2 may be configured to synchronise the sensor data received from the device-coupled system 8 and the observer system 6 in dependence on determining a match between the detected synchronisation signal and one of the reference synchronisation signal patterns.
  • the data acquisition system 2 may store one or more reference synchronisation signal patterns, e.g. in the memory module 26, for comparison to the detected synchronisation signal.
  • the reference synchronisation signals may be preprogrammed or predetermined, for example during a pairing process.
  • the data acquisition system 2 may proceed to determine a pattern of the detected synchronisation signal, for example by recording a start time of the detected signal and timing data associated with changes in the synchronisation signal, such as an on/off pattern of a light source or changes in frequency of an audio tone.
  • the data acquisition system 2 may then compare the detected synchronisation signal to a database of one or more reference synchronisation signal patterns and check whether a match exists.
  • the data acquisition system 2 may use one or more known signal comparison techniques that shall not be described here to avoid obscuring the disclosure. If a match is determined, the data acquisition system 2 may proceed to synchronise the sensor data, in step 118. If no match is detected, the data acquisition system 2 may return to acquiring further observation signals, in step 108, and may save the detected synchronisation pattern to a memory.
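A simple pattern comparison of the kind described above may be sketched, in a non-limiting fashion, as follows; the device IDs and on/off patterns are illustrative assumptions only.

```python
def match_pattern(detected, references, max_mismatches=0):
    """Compare a detected on/off synchronisation sequence against a database of
    reference patterns and return the matching device ID, or None."""
    for device_id, pattern in references.items():
        if len(pattern) != len(detected):
            continue                       # patterns of different length cannot match
        mismatches = sum(a != b for a, b in zip(pattern, detected))
        if mismatches <= max_mismatches:
            return device_id
    return None

# Hypothetical reference database, e.g. as stored in the memory module 26
references = {'club-A': [1, 0, 1, 1, 0], 'club-B': [1, 1, 0, 0, 1]}
match = match_pattern([1, 0, 1, 1, 0], references)       # matches 'club-A'
no_match = match_pattern([0, 0, 0, 0, 0], references)    # no matching pattern
```

Allowing a small number of mismatches, via max_mismatches, would make the comparison tolerant of occasional detection errors in individual frames.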
  • the data acquisition system 2 may also store a device ID and/or data record associated with each of the one or more reference synchronisation signal patterns such that the monitored device 4 can be identified by the detected synchronisation signal.
  • the data acquisition system 2 may also be configured to update the data record associated with the matched reference synchronisation signal pattern using the synchronised sensor data.
  • each data record may store historic sensor data associated with the monitored device 4 from which the synchronisation signal is detected and the data acquisition system 2 may update the data record once the sensor data has been synchronised in step 118.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Aspects of the disclosure relate to a method of monitoring a device using a monitoring system. The monitoring system comprises a data acquisition system, an observer system and a device-coupled system. The data acquisition system is connected to the observer system and wirelessly connected to the device-coupled system. The observer system comprises one or more observation sensors for observing a space and generating observation signals comprising audio and/or image sensor data relating to the observed space for transmission to the data acquisition system. The device-coupled system comprises: one or more device-coupled sensors for measuring respective device parameters and generating sensor data for wireless transmission to the data acquisition system; and a signal generating unit for outputting an observable synchronisation signal. The method comprises: receiving a first observation signal from the observer system when the synchronisation signal is not observable by the one or more observation sensors; determining a baseline signal for the observed space based on the first observation signal; receiving a second observation signal from the observer system when the synchronisation signal is observable by the one or more observation sensors; comparing the second observation signal to the baseline signal to detect the synchronisation signal output from the monitored device in the observed space; and synchronising sensor data received from the device-coupled system and the observer system by matching timing data associated with the output and detection of the synchronisation signal.

Description

A METHOD OF MONITORING A DEVICE AND SYNCHRONISING SENSOR DATA
TECHNICAL FIELD
The present disclosure relates generally to a method of monitoring a device using a monitoring system and synchronising sensor data. Aspects of the disclosure relate to a method, to a device monitoring system, and to a non-transitory, computer-readable storage medium.
BACKGROUND
Monitoring systems are often used to monitor a particular device and activities performed by or using that device. In many such applications, it is advantageous to acquire direct sensor data, as well as audio data and/or image data relating to the operation / activity being performed. In this manner, the sensor data and the audio/image data can be linked and causal relations can be inferred. For example, sensor data may be combined with the image data of a video recording to enable labelling of events, e.g. labelling image frames with relevant sensor data and/or labelling sensor data with event information derived from respective image frames.
Typically, the video recording system may also include an audio track attached to the video track, with the two tracks already being synchronised by the video recording system. The sensor data, by contrast, is typically provided by separate sensor devices that operate on respective timing systems or clocks, which may be aligned with different time standards or differ due to a lack of reliability or precision. Such sensor devices may also be embedded in the monitored device, or take other forms that preclude wired connections. Consequently, the sensor devices cannot rely on electrical pulses for synchronisation. Instead, the sensor devices are often reliant on wireless communication systems, such as radio transmitters and/or receivers.
However, such wireless communication signals may experience interference, delay, and may need to be retransmitted, introducing an uncertain amount of latency in the communication. Consequently, the timing of such wireless communication signals cannot be used to establish synchronization.
It is against this background that the disclosure has been devised.
SUMMARY OF THE DISCLOSURE
According to an aspect of the present disclosure there is provided a method of monitoring a device using a monitoring system. The monitoring system comprises a data acquisition system, an observer system and a device-coupled system. The data acquisition system is connected to the observer system and wirelessly connected to the device-coupled system. The observer system comprises one or more observation sensors for observing a space and generating observation signals comprising audio and/or image sensor data relating to the observed space for transmission to the data acquisition system. The device-coupled system comprises: one or more device-coupled sensors for measuring respective device parameters and generating sensor data for wireless transmission to the data acquisition system; and a signal generating unit for outputting an observable synchronisation signal. The method comprises: receiving a first observation signal from the observer system when the synchronisation signal is not observable by the one or more observation sensors; determining a baseline signal for the observed space based on the first observation signal; receiving a second observation signal from the observer system when the synchronisation signal is observable by the one or more observation sensors; comparing the second observation signal to the baseline signal to detect the synchronisation signal output from the monitored device in the observed space; and synchronising sensor data received from the device-coupled system and the observer system by matching timing data associated with the output and detection of the synchronisation signal.
In this manner, the monitoring system solves the problem of having to synchronise timestamps from a device-coupled wireless sensor that provides ‘direct’ data relating to the use/operation of the device, with video/audio data from a remote sensor that observes the monitored device within a particular space, such as a designated work or use area, obtaining ‘indirect’ sensor data relating to the interaction of the device with its surroundings. By synchronising the timestamps, causal relations can be inferred from the sensor data and the interactions of the device, or its use, within the external environment, leading to enhanced monitoring of the device.
It shall be appreciated that the synchronisation signal is not observable by the one or more observation sensors when the first observation signal is received because the signal generating unit is inactive, i.e. not generating the synchronisation signal, and/or the monitored device, or a part thereof, is absent or otherwise undetectable within the observed space from the perspective of the observer system.
The one or more observation sensors may, for example, include a sound sensor configured to generate an observation signal comprising audio sensor data, i.e. audio data that is indicative of audible noise in the observed space. Optionally, determining the baseline signal comprises determining a baseline noise signal based on the first observation signal. It shall be appreciated that the observed space may or may not relate to an explicitly delimited space and/or different observation sensors may monitor respective, overlapping, or matching spaces.
In an example, determining the baseline noise signal comprises determining frequency components of the first observation signal. In another example, detecting the synchronisation signal comprises: determining frequency components of the second observation signal; and removing the frequency components of the baseline noise signal. The removal of the baseline noise signal in this manner provides for enhanced detection of the synchronisation signal. Optionally, the frequency components are determined using one or more spectral analysis algorithms. Optionally, the one or more spectral analysis algorithms include a Fast Fourier Transform (FFT).
The one or more observation sensors may, for example, include a camera configured to generate an observation signal comprising image sensor data, i.e. image data indicative of visual aspects of the observed space. Optionally, determining the baseline signal comprises determining a baseline image frame.
In an example, the baseline image frame is determined as an average of N image frames received from the camera, where N is a positive integer.
In an example, detecting the synchronisation signal may, for example, comprise removing the baseline image frame from an image frame of the second observation signal.
Optionally, detecting the synchronisation signal comprises identifying the monitored device, or a part thereof, in a remaining image frame using one or more image processing techniques. Optionally, the one or more image processing techniques include applying a neural network. Optionally, identifying the monitored device further comprises filtering a background area from the remaining image frame. In examples, the signal generating unit may comprise a light source for emitting the observable synchronisation signal in a reference light colour. Optionally, detecting the synchronisation signal further comprises: determining a number of pixels of the reference light colour in the background-filtered image portion.
In examples, the device-coupled system may include a first timing device for acquiring timing data associated with the one or more device-coupled sensors and the observer system may include a second timing device for acquiring timing data associated with the audio and/or image sensor data.
Optionally, the method further comprises commanding the signal generating unit to output the synchronisation signal, i.e. to activate the signal generating unit. The device-coupled system may, for example, record timing data associated with the output synchronisation signal.
In examples, the method may further comprise commanding the observer system to determine each of the first and second observation signals.
Optionally, the data acquisition system includes a memory storing one or more reference synchronisation signal patterns. The signal generating unit may be configured to generate at least one of the one or more reference synchronisation signal patterns. The method may further comprise matching the detected synchronisation signal to one of the reference synchronisation signal patterns.
For example, the detected synchronisation signal may be matched to one of the reference synchronisation signal patterns by matching the frequency components of the first observation signal (e.g. once the frequency components of the baseline noise signal have been removed) to one or more frequency components of one of the reference synchronisation signal patterns.
In another example, the reference synchronisation signal patterns may comprise a light illumination pattern, and matching the detected synchronisation signal to one of the reference synchronisation signal patterns may comprise: comparing a series of image frames of the second observation signal to the reference synchronisation signal patterns, the first image frame of the series being the image frame used to detect the synchronisation signal.
Optionally, the sensor data received from the device-coupled system and the observer system is synchronised in dependence on matching the detected synchronisation signal to one of the reference synchronisation signal patterns.
According to another aspect of the present disclosure there is provided a non-transitory, computer-readable storage medium storing instructions thereon that when executed by a processor cause the processor to perform a method as defined above.
According to another aspect of the disclosure there is provided a monitoring system. The monitoring system comprises: an observer system; a system that is attachable to, or embeddable in, a device so as to form a device-coupled system, in use; and a data acquisition system connectable to the observer system and wirelessly connectable to the device-coupled system. The observer system comprises one or more observation sensors for observing a space and generating observation signals comprising audio and/or image sensor data relating to the observed space for transmission to the data acquisition system, in use. The device-coupled system comprises: one or more device-coupled sensors for measuring respective device parameters and generating sensor data for wireless transmission to the data acquisition system, in use; and a signal generating unit for outputting an observable synchronisation signal. The data acquisition system is configured to: receive a first observation signal from the observer system when the synchronisation signal is not observable by the one or more observation sensors; determine a baseline signal for the observed space based on the first observation signal; receive a second observation signal from the observer system when the synchronisation signal is observable by the one or more observation sensors; compare the second observation signal to the baseline signal to detect the synchronisation signal output from the monitored device in the observed space; and synchronise sensor data received from the device-coupled system and the observer system by matching timing data associated with the output and detection of the synchronisation signal.
Optionally, the monitoring system further comprises the monitored device into which the system is embedded, or to which the system is attached in use, to form the device-coupled system. It will be appreciated that preferred and/or optional features of each aspect of the disclosure may be incorporated alone or in appropriate combination in the other aspects of the disclosure also.
BRIEF DESCRIPTION OF THE DRAWINGS
Examples of the disclosure will now be described with reference to the accompanying drawings, in which:
Figure 1 shows a schematic view of an exemplary device monitoring system;
Figure 2 shows the steps of an example method of monitoring a device using the monitoring system shown in Figure 1;
Figure 3 shows exemplary sub-steps of the method shown in Figure 2; and
Figure 4 shows further exemplary sub-steps of the method shown in Figure 2.
DETAILED DESCRIPTION
Embodiments of the disclosure relate to a method, and to a system, for monitoring a device and, in particular, a use/operation of the device. The monitored device may take various suitable forms within the scope of this disclosure where it is desired to monitor the use/operation of the device by combining sensor data determined directly, i.e. acquired from one or more device-coupled sensors, with sensor data determined remotely or indirectly, i.e. acquired by a sensor (such as a camera or sound sensor) that observes the use/operation of the device within a particular space from a distance. To demonstrate the range of useful applications of the monitoring system, example applications to monitoring a golf club and a pneumatic drill shall be discussed herein.
The device monitoring system therefore includes an observer system having one or more observation sensors, such as a camera and/or a microphone/sound sensor, for observing a space in which the monitored device is used/operated. The observer system generates signals comprising audio and/or image sensor data relating to the observed space for transmission to a data acquisition system. For example, the observation sensor(s) may be arranged to observe a practice bay at a training facility or driving range and thereby determine audio and/or video recordings comprising audio/image data associated with the use of the golf club in the practice bay.
In addition to the observer system, the monitoring system also includes a device-coupled system comprising one or more device-coupled sensors for measuring respective device parameters. To give an example, when the monitored device takes the form of a golf club, the golf club may include a device-coupled system that is attached to, or embedded into, the golf club, featuring an accelerometer, gyroscope, and/or force sensor, for measuring various parameters that may be indicative of the speed and rhythm of motion of the golf club during a golf swing. In another example, the monitored device may take the form of a pneumatic drill and the device-coupled system may include an accelerometer attached to, or embedded into, the drill to measure an angle of orientation of the drill during use. The device-coupled system wirelessly communicates the determined sensor data to the data acquisition system, for example using a wireless communication system such as a radio transceiver.
Ordinarily it would not be possible to accurately synchronise the sensor data received from the device-coupled system and the observer system. Advantageously though, the device-coupled system also includes a signal generating unit for outputting a synchronisation signal that is observable by the observation sensor(s). For example, the signal generating unit may take the form of a light source attached to, or embedded in, the golf club for generating a light signal in an on/off pattern. In this manner, the data acquisition system receives a first observation signal from the observation sensor(s) when the synchronisation signal is unobservable by the observation sensor (e.g. when the monitored device is absent from the observed space and/or the signal generating unit is inactive) and determines a baseline signal for the observed space based thereon. For example, the baseline signal may take the form of an average image frame obtained by the camera, which depicts the practice bay without the golf club being present.
The data acquisition system subsequently receives a second observation signal from the observer system when the synchronisation signal is observable by the observation sensor(s), for example when the golf club is within the observed space and the signal generating unit is active. The data acquisition system is configured to compare the second observation signal to the baseline signal to detect the synchronisation signal output from the signal generating unit in the observed space. If the synchronisation signal is detected, the data acquisition system synchronises the sensor data received from the device-coupled system and the observer system, for example by matching timing data associated with the output and detection of the synchronisation signal.
In this manner, the monitoring system is able to synchronise data relating to the operation/use of the device from the device-coupled system and the observer system, allowing the sensor data and audio/image data to be linked and causal relations to be inferred. It is expected that the monitoring system therefore provides for improved monitoring of a device and the use/operation of the device.
The device monitoring system shall now be discussed in more detail with reference to Figure 1, which schematically illustrates an example device monitoring system 1.
As shown in Figure 1, the device monitoring system 1 includes a data acquisition system 2, a monitored device 4, an observer system 6, and a device-coupled system 8.
As noted previously, it is envisaged that the monitored device 4 may take various suitable forms within the scope of this disclosure where it is desired to monitor the use/operation of the device by combining sensor data determined directly, i.e. acquired from the device-coupled system 8, with sensor data determined remotely or indirectly, i.e. acquired by the observer system 6, which observes the use/operation of the device 4 within a particular space.
The observer system 6 includes one or more observation sensors 10 configured to observe a space in which the monitored device 4 is used/operated and to determine corresponding audio and/or image sensor data. In this manner, the observer system 6 provides data for analysing the interaction of the device 4 with the surrounding environment, allowing for insights into the effects of the use/operation of the monitored device 4 on other elements to be determined. The observation sensor(s) 10 may therefore take any suitable form for observing a space and acquiring corresponding audio and/or image sensor data, including a camera for recording a video as a series of image frames and/or a microphone, amongst other suitable sensor devices. For example, the observer system 6 may be equipped with one or more observation sensors 10, in the form of a video system for recording the swing of the golfer from one or multiple perspectives. Each camera may therefore be arranged to observe a practice bay at a training facility or driving range and record audio and image data corresponding to the observed practice bay. The image sensor data may subsequently be used to analyse the various aspects of the golfer’s interaction with the golf club during the golf swing, for example using automated analysis to record the angle of the golfer’s legs and torso relative to the ground at different phases of the swing.
The observer system 6 is therefore connected to the data acquisition system 2, to which the audio and/or image data is transmitted in the form of observation signals. The observer system 6 may be connected to the data acquisition system 2 via a direct, wired connection system or otherwise include means or systems for wirelessly communicating the observation signals to the data acquisition system 2.
In contrast, the device-coupled system 8 is attached to, or otherwise embedded in, the device 4 and includes one or more device-coupled sensors 12 and a wireless communication system 14. The one or more sensors 12 are configured to obtain direct measurements of respective device parameters, i.e. physical characteristics of the device or measurable thereat. The one or more sensors 12 may therefore take any suitable form for measuring physical characteristics of the device 4 including, amongst others, sensors for measuring power characteristics of an electrical device, such as current, voltage and/or power levels, sensors for measuring position, orientation, or motion of the device 4, such as accelerometers and/or gyroscopes, as well as force sensors or pressure transducers. The examples above are not intended to be limiting though and it shall be appreciated that the choice of device-mounted sensors will vary in dependence on the device and the desired data regarding its use/operation.
The wireless communication system 14 may take any suitable form for wirelessly transmitting the sensor data to the data acquisition system 2. The wireless communication system 14 therefore allows for the transmission of sensor data to the data acquisition system 2 in the absence of any wired connection between the two systems. For example, the device-coupled system 8 may typically be embedded into the device 4 or the typical use of the device 4 may otherwise preclude wired connection to the device-coupled system 8 (or render such a connection impractical), for example impairing the use of the device 4. For this purpose, the wireless communication system 14 may therefore take the form of a wireless transceiver, such as a radio frequency transceiver for communicating the sensor data using radio signals.
As shown in Figure 1, the device-coupled system 8 further includes a first clock or timing device 16 determining timing data, such as timestamps, associated with the sensor data, and the observer system 6 further includes a second clock or timing device 18 determining timing data, such as timestamps, associated with the image and/or audio sensor data determined by the observer system 6. For example, a video recording system may determine a video track and an audio track, attached to the video track, with the timing data for the two tracks being provided by the second timing device 18 such that the two tracks are already synchronised by the video recording system. By contrast, the first timing device 16 of the device-coupled system 8 is typically part of an embedded device whose clock may not be aligned with UTC, or may otherwise drift due to limited reliability or precision.
The first and second timing devices 16, 18 may therefore be expected to operate on different timeframes, according to different timing standards, or otherwise be out of synchronisation. Consequently, the sensor data respectively received at the data acquisition system 2 from the observer system 6 and the device-coupled system 8 is asynchronous.
Although the wireless communication system 14 of the device-coupled system 8 enables the transmission of sensor data to the data acquisition system 2, the uncertain amount of latency associated with wireless communication, e.g. due to interference, delay, and/or the need to retransmit data, makes synchronisation of the sensor data with the audio and/or image sensor data of the observer system 6 difficult. For example, it is not possible to rely on electrical pulses to synchronise the timing devices 16, 18 since the device-coupled system 8 does not have a wired connection to the data acquisition system 2.
An alternative mechanism is therefore required to synchronise the sensor data accurately. For this purpose, the device-coupled system 8 further includes the signal generating unit 20, which is attached to, or embedded into, the monitored device 4 and configured to output synchronisation signal(s) that are observable by the observer system 6. For example, the signal generating unit 20 may comprise a light source (such as an LED or a display), which may be configured to output one or more colours of light, and/or a sound source (such as a buzzer) configured to output a noise signal at one or more sound frequencies. For example, the signal generating unit 20 may include a buzzer for emitting a tone with a predefined frequency and/or an LED for emitting light of a particular colour.
It shall be appreciated, though, that the signal generating unit 20 corresponds to the sensor type of the observer system 6 and, in particular, to the physical characteristics sensed by the observation sensor(s) 10. For example, if the observation sensor(s) 10 include a microphone, then the signal generating unit 20 may be configured to generate an audio signal for detection by the observer system 6. Similarly, if the observation sensor(s) 10 include a camera, then the signal generating unit 20 may be configured to generate a visual signal for detection by the observer system 6. The synchronisation signal may therefore take various forms that are suitable for detection in the audio and/or image sensor data of the observer system 6.
The role of the data acquisition system 2 is to synchronise the sensor data received from the observer system 6 and the device-coupled system 8. In particular, the data acquisition system 2 is configured to: (i) receive observation signals from the observer system 6 when the monitored device is absent from the observed space and/or the signal generating unit is inactive, (ii) determine a baseline signal for the observed space based on such observation signals, (iii) receive further observation signals from the observer system 6 when the synchronisation signal is observable by the observation sensor(s) 10, (iv) compare the further observation signals to the baseline signal to detect the synchronisation signal, and (v) synchronise the sensor data received from the device-coupled system 8 and the observer system 6 by matching timing data associated with the output synchronisation signal and the detection of the synchronisation signal. In examples, the data acquisition system 2 may further act as a command and control interface for operating the observer system 6 and/or the device-coupled system 8, as shall be described in more detail.
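By way of illustration only, step (v) above can be sketched in a few lines of code. This is not part of the claimed method; all names are hypothetical, and the sketch assumes the emission timestamp from the first timing device 16 and the detection timestamp from the second timing device 18 refer to the same physical event, so the device clock can be mapped onto the observer timeline by a constant offset.

```python
# Hypothetical sketch of step (v): matching timing data to synchronise the
# device-coupled sensor stream with the observer stream.

def clock_offset(t_emit_device, t_detect_observer):
    """Offset mapping device-clock timestamps onto the observer timeline."""
    return t_detect_observer - t_emit_device

def align_samples(device_samples, offset):
    """Re-stamp device sensor samples (timestamp, value) onto the observer timeline."""
    return [(t + offset, v) for t, v in device_samples]

# Example: the device clock reads 2.0 s when the LED flashes; the observer
# detects the flash at 105.5 s on its own clock.
offset = clock_offset(2.0, 105.5)
aligned = align_samples([(2.0, 9.81), (2.1, 9.79)], offset)
```

In practice the offset would be re-estimated for each synchronisation event, since the two clocks may drift between events.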
In this manner, the sensor data from the device-coupled system 8 can be synchronised with the audio and/or image sensor data from the observer system 6 and linked to infer causal relations. It is expected that the monitoring system 1 therefore provides for improved monitoring and observation data.
For this purpose, the data acquisition system 2 may take the form of a suitable computer system for carrying out the data processing, communication and commands as described herein. The data acquisition system 2 may therefore incorporate a data processing module 22, a communication module 24, a memory module 26 and a control module 28, as shown in Figure 1.
That is, in the described example four functional elements, units or modules are shown. Each of these units or modules may be provided, at least in part, by suitable software running on any suitable computing substrate using conventional or custom processors and memory. Some or all of the units or modules may use a common computing substrate (for example, they may run on the same server) or separate substrates, or different combinations of the modules may be distributed between multiple computing devices.
The data processing module 22 is configured to perform the data processing tasks described herein and may therefore include one or more image and/or audio signal processing algorithms for processing the sensor data. For example, the data processing module 22 may include one or more spectral analysis algorithms, such as a Fast Fourier Transform (FFT), for determining frequency components of an audio signal. The data processing module 22 may additionally or alternatively include one or more image processing algorithms or machine learning algorithms for object recognition in an image frame. For example, the data processing module 22 may include a neural network, such as a convolutional neural network trained for recognition of the monitored device, as shall be described in more detail.
The communication module 24 is configured to communicate with the observer system 6 and the device-coupled system 8 and to receive sensor data from each system 6, 8. The communication module 24 may therefore include a corresponding wireless communication system for communication with the device-coupled system 8, along with a wired or wireless communication system for communication with the observer system 6.
The memory module 26 is arranged to store the sensor data received from the observer system 6 and the device-coupled system 8. For the purpose of receiving and/or storing such data, the memory module 26 may take the form of a computer-readable storage medium (e.g., a non-transitory computer-readable storage medium). The computer-readable storage medium may comprise any mechanism for storing information in a form readable by a machine or electronic processors/computational device, including, without limitation: a magnetic storage medium (e.g., floppy diskette); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or electrical or other types of medium for storing such information/instructions.
The control module 28 is an optional module that may be included to generate command signals to the observer system 6 and/or the device-coupled system 8. For example, the control module 28 may be configured to command the observer system 6 to determine observation signals. Additionally, or alternatively, the control module 28 may be configured to command the device-coupled system 8, for example to activate the device-coupled sensor(s) 12, to command the output of the synchronisation signal and/or to configure the output synchronisation signal, as shall be described in more detail.
The operation of the device monitoring system 1 shall now be described in more detail with additional reference to Figures 2 to 4.
Figure 2 shows an example method 100 of operating the device monitoring system 1, in accordance with an embodiment of the disclosure.
For context, the method 100 shall be discussed alongside an example application, namely for analysing a golfer’s swing at a driving range or training facility.
This example is not intended to be limiting on the scope of the disclosure though and, in other examples, it shall be appreciated that the device monitoring system 1 may be applied to monitor various other devices.
In this scenario, the observer system 6 may be equipped with one or more observation sensors 10, in the form of a video system for recording the swing of the golfer from one or multiple perspectives. Each camera may therefore be arranged to observe the practice bay and to record audio and image data corresponding to the observed practice bay. As is conventional with golf swing analysis, the video sensor data may be used to determine information such as the angle of the golfer’s legs and torso relative to the ground at different phases of the swing. In this example, the monitored device 4 takes the form of a golf club and the device-coupled system 8 may be embedded in a handle or a clubhead of the golf club, for example. In this application, the device-coupled sensors 12 embedded in the golf club may comprise sensors such as an accelerometer, gyroscope, or force sensor. These embedded sensors may offer insight into the speed and rhythm of motion of the golf club during different phases of the swing. Furthermore, the signal generating unit 20 may take the form of a light source embedded in the grip, clubhead, or other component of the instrument, and arranged to output a synchronisation signal in the form of a series of light pulses. In this context it shall be appreciated that there are advantages in aligning the timing data of the video and embedded sensors, as this provides insights into the interaction between body position and golf club motion for each swing phase.

In step 102, the device monitoring system 1 operates the observer system 6 to observe a space, such as the practice bay, and to determine a first observation signal for the observed space. Importantly, the first observation signal is determined while the synchronisation signal is unobservable by the observer system 6.
For example, the signal generating unit 20 may be inactive and/or the monitored device 4, such as the golf club, (or the signal generating unit 20 part thereof), may be arranged outside of the field of view of the observation sensor(s) 10 or otherwise be concealed within the observed space from observation by the sensors 10. The determination of the first observation signal may be controlled by a command signal from the control module 28, for example.
In order to determine the first observation signal, the observer system 6 may record a series of N image frames, where N is a positive integer and/or record audio for M seconds, where M is a positive integer. The integers N and/or M may be pre-programmed or determined based on one or more user inputs to the data acquisition system 2. It shall be appreciated that in each case the image frames and/or audio recording will relate to background features of the practice bay. Once the observer system 6 has recorded such images and/or audio, the first observation signal is communicated to the data acquisition system 2 for processing.
In step 104, the device monitoring system 1 determines a baseline signal for the observer system 6 based on the first observation signal. The baseline signal is a representative signal for the observed space when there is no observable synchronisation signal. In this manner, the baseline signal provides a reference for subsequent detection of the synchronisation signal. The baseline signal may therefore take various suitable forms in dependence on the sensor type of the observer system 6, but may, for example, be determined as an average of the audio and/or image sensor data acquired in a prescribed observation period, such as a period of M seconds or containing N image frames.
Various methods are known in the art for determining the baseline signal, which shall not be described in detail here to avoid obscuring the disclosure. Nonetheless, to give an example, when the observer system 6 includes a microphone, the observer system 6 may be operated to observe the space, such as the practice bay, and to record M seconds of audio forming the first observation signal, in step 102. Accordingly, in step 104, the data acquisition system 2 may perform a spectral analysis on the first observation signal to determine the frequency components of the audio signal, for example using one or more spectral analysis techniques that are known in the art, such as a Fast Fourier Transform (FFT). The determined frequency components may therefore be used as the baseline signal for subsequently detecting the synchronisation signal. In another example, when the observer system 6 includes a camera, the observer system 6 may be operated to observe the space, such as the practice bay, and to record N image frames forming the first observation signal, in step 102. Accordingly, in step 104, the data acquisition system 2 may use one or more image processing techniques to determine the baseline signal as an average image frame based on the N image frames. It shall be appreciated that the determined baseline signal may therefore include the frequency components derived from M seconds of audio recorded at the practice bay and/or an average image frame determined based on N image frames, where each of the N image frames may depict the practice bay without the golf club. The baseline signal may be stored in the memory module 26 of the data acquisition system 2 for subsequent use in detecting the synchronisation signal.
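The two baseline computations described above can be illustrated with a minimal sketch. This is an illustration only, not part of the claimed method: it assumes image frames are equal-length lists of pixel intensities, and it uses a naive DFT (rather than an optimised FFT) purely for self-containedness.

```python
# Hypothetical sketch of step 104: computing a baseline image frame and
# baseline audio frequency components from the first observation signal.
import math

def average_frame(frames):
    """Pixel-wise mean of N image frames -> baseline image frame."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

def dft_magnitudes(samples):
    """Naive DFT magnitude spectrum of an audio buffer (baseline spectrum)."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mags.append(math.hypot(re, im))
    return mags
```

A production implementation would instead use an FFT library and operate on full-resolution frames, but the baseline principle (averaging over the observation period) is the same.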
Once the baseline signal has been determined, the monitored device 4 may be moved into the space observed by the observer system 6 and/or the signal generating unit 20 may be activated such that the observer system 6 observes the synchronisation signal within the observed space. For example, following an initial calibration step, during which the first observation signal is determined for a particular practice bay, the golf club (acting as the monitored device 4 in this example) may be moved into the space observed by the observation sensors 10 (i.e. the practice bay) and the signal generating unit 20 may be activated.
In an example, the method 100 may therefore further include step 106 for commanding the signal generating unit 20 of the device-coupled system 8 to output the synchronisation signal. For example, upon determining the baseline signal, in step 104, the data acquisition system 2 may be configured to transmit a command signal to the signal generating unit 20, via the wireless communication system 14, to cause the signal generating unit 20 to output the synchronisation signal. However, it shall be appreciated that unknown delays exist between the transmission of the command signal and its reception by the device-coupled system 8, e.g. due to signal modulation, demodulation, encoding, decoding, and retransmissions. Hence, there is an unknown delay between the time of transmission and the activation of the signal generating unit 20. Accordingly, upon outputting the synchronisation signal, the device-coupled system 8 may be configured to store, reset or initiate a time recording at the first timing device 16, providing an accurate timestamp for reference points, such as the start time, of the synchronisation signal. It shall be appreciated that the command signal may also define a particular synchronisation signal pattern for the signal generating unit 20 or otherwise provide an on/off trigger to the signal generating unit 20. In examples, the communicated synchronisation signal pattern could therefore be used for subsequent detection purposes, for example by creating a unique signal pattern associated with the monitored device that distinguishes that device from other devices in the vicinity, as shall be described in more detail.
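The device-side behaviour in step 106 can be sketched as follows. This is an illustrative sketch only; the class and method names are hypothetical and not drawn from the disclosure. The key point it demonstrates is that the timestamp is recorded locally at the instant of emission, not at command transmission, so the unknown radio latency does not corrupt the reference point.

```python
# Hypothetical device-side handling of the step 106 command signal.

class SignalGeneratingUnit:
    def __init__(self, clock):
        self.clock = clock            # first timing device 16 (callable -> seconds)
        self.emit_timestamps = []     # reference points for later synchronisation

    def on_sync_command(self, pattern):
        # Record the local clock reading at the instant emission starts; the
        # wireless delay between command transmission and reception is thereby
        # excluded from the reference timestamp.
        self.emit_timestamps.append(self.clock())
        return pattern                # the pattern would drive the LED/buzzer here
```

The recorded timestamps would then be transmitted to the data acquisition system 2 alongside the sensor data.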
In other examples, it shall be appreciated that the signal generating unit 20 could output synchronisation signals continuously or periodically, where the synchronisation signals have a random or time-varying form. In that case, once the baseline signal has been determined, the monitored device 4 may be moved into the observed space such that the synchronisation signal may be observed by the observation sensor(s) 10. It remains important, though, that associated timing data is recorded by the device-coupled system 8, providing an accurate timestamp for one or more reference points of the synchronisation signal.
In step 108, the device monitoring system 1 operates the observer system 6 to determine a second observation signal once the synchronisation signal becomes observable within the space observed by the observation sensor(s) 10. In other words, the observer system 6 may be commanded or otherwise continue to observe the space and determines a second observation signal, in step 108, that is distinguished from the first observation signal, determined in step 102, in that the synchronisation signal is now observable by the observation sensor(s) 10. For example, at this point, the golf club may have been moved into the practice bay and the signal generating unit 20 may output the synchronisation signal in a form that is observable by the observation sensor(s) 10 observing the practice bay.
In this context it shall be appreciated that the observation signals may be determined continuously or periodically, for example at a predetermined sampling frequency, or the determination of the second observation signal may be controlled by a command signal from the control module 28, for example. In order to determine the second observation signal, the observer system 6 may, for example, record a series of P image frames, where P is a positive integer and/or record audio for Q seconds, where Q is a positive integer. The integers P and/or Q may be pre-programmed or determined based on one or more user inputs to the data acquisition system 2.
In step 110, the device monitoring system 1 processes the second observation signal to detect the synchronisation signal. In particular, the data acquisition system 2 receives the second observation signal from the observer system 6 and compares the second observation signal to the baseline signal to detect the synchronisation signal.
The methods used for detecting the synchronisation signal will vary in dependence on the type of sensors on the observer system 6 and, in particular, whether the observation signal and baseline signal comprise audio and/or image sensor data. Although one or more methods may be suitable for comparing the second observation signal to the baseline signal, and detecting the synchronisation signal, as shall be appreciated by the skilled person in the art, the following non-limiting examples are provided to give a clear indication of at least one method for detecting the synchronisation signal based on audio sensor data and, separately, based on image sensor data from the video recording.
In relation to audio sensor data, the data acquisition system 2 may detect the synchronisation signal by checking whether there are any peak(s) in the frequency components of the second observation signal at the frequency(s) of the audio synchronisation signal. In an example, the method 100 may therefore further comprise sub-steps 112 to 116 for detecting the synchronisation signal, as shown in Figure 3.
In sub-step 112, the data acquisition system 2 performs a spectral analysis on the second observation signal to determine the frequency components for comparison to the baseline signal. For example, the data acquisition system may use one or more spectral analysis techniques that are known in the art, such as a Fast Fourier Transform (FFT), for determining the frequency components in substantially the same manner as described in step 104.
In sub-step 114, the data acquisition system 2 removes or subtracts the baseline signal from the second observation signal. In other words, the data acquisition system 2 may subtract the background frequency components of the baseline signal from the frequency components determined for the second observation signal, leaving one or more remaining frequency components that distinguish the second observation signal from the baseline signal. In sub-step 116, the data acquisition system 2 compares the one or more remaining frequency components to one or more reference frequency components corresponding to the synchronisation signal to determine whether there is a match. In an example, the data acquisition system 2 may, for example, compare the remaining frequency component(s) of the second observation signal to a database of reference synchronisation signals and/or respective frequency components stored in the memory module 26 of the data acquisition system 2.
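Sub-steps 112 to 116 can be sketched as below. This is an illustration only: it assumes the observed and baseline spectra are magnitude lists over the same frequency bins, and that a bin "matches" when the residual after baseline subtraction exceeds a detection threshold — the threshold value and bin indices are hypothetical parameters.

```python
# Hypothetical sketch of sub-steps 114-116: subtract the baseline spectrum and
# check the residual against the reference frequency bins of the sync tone.

def detect_sync_tone(observed_mags, baseline_mags, reference_bins, threshold):
    """Return True if every reference bin shows a residual peak above threshold."""
    residual = [max(o - b, 0.0) for o, b in zip(observed_mags, baseline_mags)]
    return all(residual[k] > threshold for k in reference_bins)

# Example: a strong residual in bin 1 matches a reference tone at that bin.
matched = detect_sync_tone([1.0, 9.0, 1.0], [1.0, 1.0, 1.0], [1], 5.0)
```

In practice the reference bins would come from the database of reference synchronisation signals held in the memory module 26.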
If the data acquisition system 2 does not determine a match between the remaining frequency component(s) of the second observation signal and the reference synchronisation signal, the synchronisation signal is not detected and the device monitoring system 1 may operate the observer system 6 to determine a subsequent observation signal, in step 108, for comparison to the baseline signal, in step 110.
In contrast, if the data acquisition system 2 is able to determine a match between the remaining frequency component(s) of the second observation signal and the reference synchronisation signal, the data acquisition system may validate the detection of the synchronisation signal and the method 100 proceeds to synchronise the sensor data, in step 118.
In relation to image sensor data, the data acquisition system 2 may detect the synchronisation signal by detecting a visible signal, such as a light source of a predefined colour, in one or more image frames of the second observation signal. For this purpose, the data acquisition system 2 may apply one or more image processing techniques to isolate and refine an image portion corresponding to the monitored device 4, or an expected position of the signal generating unit 20 on the device 4, for example to remove noise sources before checking for pixels within that image portion matching the colour output of the synchronisation signal. The method 100 may therefore further comprise sub-steps 212 to 216 for detecting the synchronisation signal, as shown in Figure 4.
In sub-step 212, the data acquisition system 2 subtracts or removes the contents of the baseline signal, i.e. the average image frame, determined in step 102, from each image frame of the second observation signal, determined in step 108. In sub-step 214, the data acquisition system 2 may be configured to detect the monitored device 4 in the remaining image frame. For this purpose, the data acquisition system 2 may use one or more computer vision techniques, such as object recognition, to identify the device 4 in the remaining image frame.
One or more computer vision techniques that are known in the art may be used for this purpose, which shall not be described in detail here to avoid obscuring the disclosure. Nonetheless, it shall be appreciated that such computer vision techniques may involve the use of a neural network, such as a convolutional neural network, trained to detect the device of interest, such as a golf club.
For this purpose, a training dataset can be generated by collecting images of the device of interest, such as the golf club (e.g. generic golf clubs or a particular make and model), with a variety of backgrounds that are representative of the physical environments in which the device monitoring system 1 may be deployed. The training data should incorporate a range of lighting intensity and light source angles, to train a generalizable model. The neural network may be a YOLO architecture as is well known in the art, although various other architectures are envisaged, such as R-CNN and Fast R-CNN. Further information concerning the use of a YOLO architecture, for example, can be found in "You only look once: Unified, real-time object detection" (2016) by Redmon, J., Divvala, S., Girshick, R. and Farhadi, A., as published in Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 779-788).
As is conventional, the training dataset may be split into three parts corresponding to a training, validation and test dataset. The skilled person shall appreciate that optimal hyperparameters may be found by optimizing recall, precision, or another performance metric such as F-score. Object detection may initially find the entire golf club, and a separate object detection model may then be used to find the club head or the grip (in which the signal generating unit 20 is embedded) in a constrained part of this image. Based on these models, the expected coordinates of the signal generating unit 20 can be estimated, based on the orientation of the device (such as a golf club) relative to the plane of the observer system 6.
If the monitored device 4 is not detected in the remaining image portion, the data acquisition system 2 may return to determining one or more further image frames, in step 108. However, upon detecting the monitored device 4 in the remaining image portion, in sub-step 214, the parts of the remaining image portion that are considered background, and therefore not belonging to the device, are effectively removed from the image portion to produce a background-filtered image frame. The combination of baseline frame removal and background removal helps to remove noise sources (i.e. extraneous light and/or audio sources) that could otherwise introduce inaccuracies into the synchronisation process.
In sub-step 216, the data acquisition system 2 proceeds to process the background-filtered image frame to detect the synchronisation signal. To give an example, the data acquisition system 2 may determine the number of pixels of a prescribed colour (corresponding to the synchronisation signal) in the background-filtered image frame, and compare the determined number of pixels to a threshold (e.g. K, where K is a positive integer) for detecting the synchronisation signal.
If the number of pixels does not exceed the threshold, the data acquisition system 2 may return to determining one or more further image frames, in step 108.
However, when the number of pixels exceeds the threshold amount, the data acquisition system 2 detects the synchronisation signal and the method 100 proceeds to synchronise the sensor data, in step 118.
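A minimal sketch of the baseline-frame subtraction and pixel-count threshold of sub-steps 212 and 216 might look as follows. This is an illustrative Python/NumPy sketch: the change threshold, colour tolerance and K value are assumptions, and the object-detection filtering of sub-step 214 is omitted for brevity.

```python
import numpy as np

def sync_signal_visible(frame, baseline_frame, ref_rgb, color_tol=30, k=50):
    """Subtract the baseline (average) frame, then count pixels close to the
    reference signal colour; the signal is detected once the count exceeds K."""
    # Sub-step 212: remove the baseline image frame
    residual = np.abs(frame.astype(np.int16) - baseline_frame.astype(np.int16))
    # Keep only pixels that changed noticeably versus the baseline
    changed = residual.sum(axis=-1) > 30
    # Sub-step 216: count changed pixels matching the prescribed colour
    dist = np.abs(frame.astype(np.int16) - np.asarray(ref_rgb, dtype=np.int16))
    matches = (dist.max(axis=-1) <= color_tol) & changed
    return int(matches.sum()) > k
```

In a fuller implementation, the `changed` mask would additionally be intersected with the device mask produced by the computer-vision step of sub-step 214, so that only pixels on the monitored device 4 are counted.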
Returning to Figure 2, in step 118, the data acquisition system 2 proceeds to synchronise the sensor data received from the device-coupled system 8 and the observer system 6. The data acquisition system 2 is configured to achieve the synchronisation by matching timing data, such as timestamps, received from the first and second timing devices 16, 18, each being associated with a reference point, such as a start and detection point, of the synchronisation signal as output from the signal generating unit 20 and detected in the second observation signal. For example, the sensor data may be synchronised by matching a timestamp associated with the start of the output of the synchronisation signal from the first timing device 16, with a timestamp from the second timing device 18 associated with the first point of detection of the synchronisation signal in the audio/image sensor data of the second observation signal. In other examples, different reference points may be used. Synchronising the sensor data in this manner ensures that the various sources of sensor data are matched to a common time frame, rather than local time frames associated with any one system 6, 8 in isolation.
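The timestamp matching of step 118 amounts to computing a single clock offset between the two timing devices and re-basing one set of timestamps onto the other. A minimal sketch, with assumed variable names that are not part of the disclosure:

```python
def synchronise(device_samples, t_emit_device, t_detect_observer):
    """Shift device-clock timestamps onto the observer clock, using the
    emit/detect pair of the synchronisation signal as the common reference.

    device_samples: list of (timestamp_on_device_clock, value) tuples
    t_emit_device: start of synchronisation signal output (first timing device)
    t_detect_observer: first detection in the observation signal (second timing device)
    """
    # Observer clock minus device clock at the moment of the synchronisation signal
    offset = t_detect_observer - t_emit_device
    return [(t + offset, value) for t, value in device_samples]
```

In practice the offset would absorb both the skew between the timing devices 16, 18 and any fixed detection latency; more elaborate schemes could average the offset over several synchronisation events.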
In this manner, the device monitoring system 1 provides for improved data acquisition relating to the monitored device 4 and the use/operation of the device 4. It is expected that the method 100 will therefore provide for more accurate analysis of the device 4 and greater insights into causal relations between the device parameters and the audio/image data.
For example, in relation to the analysis of the golfer’s swing, the matched sensor data could be used to gain enhanced insights into the interaction between body position and golf club motion for each swing phase, ultimately leading to better feedback for the golfer.
In another example, the device monitoring system 1 may be used to monitor a pneumatic drill, where the device-coupled sensor(s) 12 may include an accelerometer embedded into the device and the observation sensor(s) 10 may include a video camera recording a work area. In this scenario, the accelerometer may offer insights into the angle of orientation of the drill to determine if this differs from guidelines. Furthermore, it may be advantageous to align the timestamp of the accelerometer sensor data with the image sensor data, to enable automated analysis of events occurring in the work area (i.e. in the vicinity of the drilling) that may be associated with a change in the usage pattern of the drill. For example, a co-worker may approach the drill user and attract their attention; this may distract the user and result in a dangerous and unsuitable angle of orientation of the drill. The synchronisation of these timestamps can therefore lead to targeted initiatives to improve workplace safety.
It is noted, though, that the steps of the method 100 are provided only as a non-limiting example of the disclosure, and many modifications may be made to the above-described examples without departing from the scope of the appended claims.
In the above example, it shall be appreciated that the sensor data synchronisation can be performed based on the detection of the synchronisation signal in the image sensor data, the audio sensor data, or both. Where both are used, the resulting timing data may be cross-checked and the synchronisation considered successful (in step 118) only if both agree; otherwise, the device monitoring system 1 may be configured to repeat the synchronisation procedure 100.
In other examples, the data acquisition system 2 may be further configured to determine a signal pattern of the detected synchronisation signal, such as an on/off sequence of a light source, and to compare the determined pattern to a database of one or more reference synchronisation signal patterns to check whether a match exists. In this manner, the comparison may be used to provide a more robust verification of the detected synchronisation signal. For example, the data acquisition system 2 may be configured to synchronise the sensor data received from the device-coupled system 8 and the observer system 6 in dependence on determining a match between the detected synchronisation signal and one of the reference synchronisation signal patterns.
For this purpose, the data acquisition system 2 may store one or more reference synchronisation signal patterns, e.g. in the memory module 26, for comparison to the detected synchronisation signal. The reference synchronisation signals may be preprogrammed or predetermined, for example during a pairing process.
For example, having detected the synchronisation signal, in step 110, the data acquisition system 2 may proceed to determine a pattern of the detected synchronisation signal, for example by recording a start time of the detected signal and timing data associated with changes in the synchronisation signal, such as an on/off pattern of a light source or changes in frequency of an audio tone. The data acquisition system 2 may then compare the detected synchronisation signal to a database of one or more reference synchronisation signal patterns and check whether a match exists. For this purpose, the data acquisition system 2 may use one or more known signal comparison techniques that shall not be described here to avoid obscuring the disclosure. If a match is determined, the data acquisition system 2 may proceed to synchronise the sensor data, in step 118. If no match is detected, the data acquisition system 2 may return to acquiring further observation signals, in step 108, and may save the detected synchronisation pattern to a memory.
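A minimal sketch of such a pattern comparison, assuming the detected signal has been reduced to a sequence of on/off interval durations and the reference patterns are stored as an ID-to-sequence mapping (the tolerance value and data layout are illustrative assumptions):

```python
def match_pattern(detected_intervals, reference_patterns, tol=0.05):
    """Compare a detected on/off interval sequence (in seconds) against a
    database of reference synchronisation patterns; return the matching
    pattern ID, or None if no reference pattern matches within tolerance."""
    for pattern_id, ref in reference_patterns.items():
        if len(ref) == len(detected_intervals) and all(
            abs(d - r) <= tol for d, r in zip(detected_intervals, ref)
        ):
            return pattern_id
    return None
```

Because each reference pattern is keyed by an identifier, the same lookup also supports the device-identification variant described below, in which the matched pattern ID indexes a device record.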
In further examples, the data acquisition system 2 may also store a device ID and/or data record associated with each of the one or more reference synchronisation signal patterns such that the monitored device 4 can be identified by the detected synchronisation signal. The data acquisition system 2 may also be configured to update the data record associated with the matched reference synchronisation signal pattern using the synchronised sensor data. For example, each data record may store historic sensor data associated with the monitored device 4 from which the synchronisation signal is detected and the data acquisition system 2 may update the data record once the sensor data has been synchronised in step 118.

Claims
1. A method of monitoring a device using a monitoring system comprising a data acquisition system, an observer system and a device-coupled system, the data acquisition system being connected to the observer system and wirelessly connected to the device-coupled system, the observer system comprising one or more observation sensors for observing a space and generating observation signals comprising audio and/or image sensor data relating to the observed space for transmission to the data acquisition system, the device-coupled system comprising: one or more device-coupled sensors for measuring respective device parameters and generating sensor data for wireless transmission to the data acquisition system; and a signal generating unit for outputting an observable synchronisation signal, the method comprising: receiving a first observation signal from the observer system when the synchronisation signal is not observable by the one or more observation sensors; determining a baseline signal for the observed space based on the first observation signal; receiving a second observation signal from the observer system when the synchronisation signal is observable by the one or more observation sensors; comparing the second observation signal to the baseline signal to detect the synchronisation signal output from the monitored device in the observed space; and synchronising sensor data received from the device-coupled system and the observer system by matching timing data associated with the output and detection of the synchronisation signal.
2. A method according to claim 1, wherein the one or more observation sensors include a sound sensor configured to generate an observation signal comprising audio sensor data; and wherein determining the baseline signal comprises determining a baseline noise signal based on the first observation signal.
3. A method according to claim 2, wherein determining the baseline noise signal comprises determining frequency components of the first observation signal, and detecting the synchronisation signal comprises: determining frequency components of the second observation signal; and removing the frequency components of the baseline noise signal; optionally, wherein the frequency components are determined using one or more spectral analysis algorithms, optionally, wherein the one or more spectral analysis algorithms include a Fast Fourier Transform (FFT).
4. A method according to any preceding claim, wherein the one or more observation sensors include a camera configured to generate an observation signal comprising image sensor data; and wherein determining the baseline signal comprises determining a baseline image frame.
5. A method according to claim 4, wherein the baseline image frame is determined as an average of N image frames received from the camera, where N is a positive integer.
6. A method according to claim 5, wherein detecting the synchronisation signal comprises removing the baseline image frame from an image frame of the second observation signal.
7. A method according to claim 6, wherein detecting the synchronisation signal comprises identifying the monitored device, or a part thereof, in a remaining image frame using one or more image processing techniques, optionally, wherein: the one or more image processing techniques include applying a neural network, and/or wherein identifying the monitored device further comprises filtering a background area from the remaining image frame to produce a background-filtered image frame.
8. A method according to claim 7, wherein the signal generating unit comprises a light source for emitting the observable synchronisation signal in a reference light colour; and wherein detecting the synchronisation signal further comprises: determining a number of pixels of the reference light colour in the background-filtered image frame.
9. A method according to any preceding claim, wherein the device-coupled system includes a first timing device for acquiring timing data associated with the one or more device-coupled sensors and the observer system includes a second timing device for acquiring timing data associated with the audio and/or image sensor data.
10. A method according to claim 9, further comprising commanding the signal generating unit to output the synchronisation signal, wherein the device-coupled system records timing data associated with the output synchronisation signal.
11. A method according to any preceding claim, further comprising commanding the observer system to determine each of the first and second observation signals.
12. A method according to any preceding claim, wherein the data acquisition system includes a memory storing one or more reference synchronisation signal patterns; wherein the signal generating unit is configured to generate at least one of the one or more reference synchronisation signal patterns; and wherein the method further comprises matching the detected synchronisation signal to one of the reference synchronisation signal patterns.
13. A method according to claim 12, wherein the sensor data received from the device-coupled system and the observer system is synchronised in dependence on matching the detected synchronisation signal to one of the reference synchronisation signal patterns.
14. A monitoring system comprising: an observer system; a system that is attachable to, or embeddable in, a device so as to form a device-coupled system, in use; and a data acquisition system connectable to the observer system and wirelessly connectable to the device-coupled system, wherein the observer system comprises one or more observation sensors for observing a space and generating observation signals comprising audio and/or image sensor data relating to the observed space for transmission to the data acquisition system, in use, the device-coupled system comprises: one or more device-coupled sensors for measuring respective device parameters and generating sensor data for wireless transmission to the data acquisition system, in use; and a signal generating unit for outputting an observable synchronisation signal; and the data acquisition system is configured to: receive a first observation signal from the observer system when the synchronisation signal is not observable by the one or more observation sensors; determine a baseline signal for the observed space based on the first observation signal; receive a second observation signal from the observer system when the synchronisation signal is observable by the one or more observation sensors; compare the second observation signal to the baseline signal to detect the synchronisation signal output from the monitored device in the observed space; and synchronise sensor data received from the device-coupled system and the observer system by matching timing data associated with the output and detection of the synchronisation signal.
15. A monitoring system according to claim 14, further comprising the monitored device into which the system is embedded, or to which the system is attached in use, to form the device-coupled system.
PCT/EP2022/075318 2022-09-12 2022-09-12 A method of monitoring a device and synchronising sensor data WO2024056153A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2022/075318 WO2024056153A1 (en) 2022-09-12 2022-09-12 A method of monitoring a device and synchronising sensor data


Publications (1)

Publication Number Publication Date
WO2024056153A1 true WO2024056153A1 (en) 2024-03-21

Family

ID=83558220



Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9211439B1 (en) * 2010-10-05 2015-12-15 Swingbyte, Inc. Three dimensional golf swing analyzer
EP3742737A2 (en) * 2019-05-24 2020-11-25 Sony Interactive Entertainment Inc. Image acquisition system and method
US20210104264A1 (en) * 2010-08-26 2021-04-08 Blast Motion Inc. Multi-sensor event correlation system
GB2589080A (en) * 2019-11-08 2021-05-26 Ethersec Ind Ltd Surveillance system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Redmon, J., Divvala, S., Girshick, R. and Farhadi, A.: "You only look once: Unified, real-time object detection", Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 779-788


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22783475

Country of ref document: EP

Kind code of ref document: A1