
SE2350404A1 - A robotic work tool adapted for monitoring using detected sounds - Google Patents

A robotic work tool adapted for monitoring using detected sounds

Info

Publication number
SE2350404A1
SE2350404A1 SE2350404A SE2350404A SE2350404A1 SE 2350404 A1 SE2350404 A1 SE 2350404A1 SE 2350404 A SE2350404 A SE 2350404A SE 2350404 A SE2350404 A SE 2350404A SE 2350404 A1 SE2350404 A1 SE 2350404A1
Authority
SE
Sweden
Prior art keywords
sound
work tool
robotic work
user
detected
Prior art date
Application number
SE2350404A
Inventor
Andreas Jönsson
Original Assignee
Husqvarna Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Husqvarna Ab filed Critical Husqvarna Ab
Priority to SE2350404A priority Critical patent/SE2350404A1/en
Priority to PCT/SE2024/050180 priority patent/WO2024210783A1/en
Publication of SE2350404A1 publication Critical patent/SE2350404A1/en

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D34/00 Mowers; Mowing apparatus of harvesters
    • A01D34/006 Control or measuring arrangements
    • A01D34/008 Control or measuring arrangements for automated or remotely controlled operation
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D34/00 Mowers; Mowing apparatus of harvesters
    • A01D34/01 Mowers; Mowing apparatus of harvesters characterised by features relating to the type of cutting apparatus
    • A01D34/412 Mowers; Mowing apparatus of harvesters characterised by features relating to the type of cutting apparatus having rotating cutters
    • A01D34/42 Mowers; Mowing apparatus of harvesters characterised by features relating to the type of cutting apparatus having rotating cutters having cutters rotating about a horizontal axis, e.g. cutting-cylinders
    • A01D34/62 Other details
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 Control system inputs
    • G05D1/24 Arrangements for determining position or orientation
    • G05D1/243 Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 Control system inputs
    • G05D1/24 Arrangements for determining position or orientation
    • G05D1/247 Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60 Intended control result
    • G05D1/656 Interaction with payloads or external entities
    • G05D1/689 Pointing payloads towards fixed or moving targets
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/16 Actuation by interference with mechanical vibrations in air or other fluid
    • G08B13/1654 Actuation by interference with mechanical vibrations in air or other fluid using passive vibration detection systems
    • G08B13/1672 Actuation by interference with mechanical vibrations in air or other fluid using passive vibration detection systems using sonic detecting means, e.g. a microphone operating in the audio frequency range
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639 Details of the system layout
    • G08B13/19647 Systems specially adapted for intrusion detection in or around a vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19697 Arrangements wherein non-video detectors generate an alarm themselves
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D2101/00 Lawn-mowers
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00 Specific applications of the controlled vehicles
    • G05D2105/15 Specific applications of the controlled vehicles for harvesting, sowing or mowing in agriculture or forestry
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00 Specific applications of the controlled vehicles
    • G05D2105/80 Specific applications of the controlled vehicles for information gathering, e.g. for academic research
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00 Specific applications of the controlled vehicles
    • G05D2105/80 Specific applications of the controlled vehicles for information gathering, e.g. for academic research
    • G05D2105/85 Specific applications of the controlled vehicles for information gathering, e.g. for academic research for patrolling or reconnaissance for police, security or military applications
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2107/00 Specific environments of the controlled vehicles
    • G05D2107/20 Land use
    • G05D2107/23 Gardens or lawns
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2109/00 Types of controlled vehicles
    • G05D2109/10 Land vehicles
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2111/00 Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
    • G05D2111/20 Acoustic signals, e.g. ultrasonic signals
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2111/00 Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
    • G05D2111/30 Radio signals
    • G05D2111/36 Radio signals generated or reflected by cables or wires carrying current, e.g. boundary wires or leaky feeder cables

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Multimedia (AREA)
  • Harvester Elements (AREA)

Abstract

The present disclosure relates to a method for a robotic work tool control unit (110) that is used in a robotic work tool (100). The method comprises determining (S100) if a microphone arrangement (170, 171), comprised in the robotic work tool (100), has detected a sound (210), and if that is the case, the method comprises determining (S200) if the detected sound requires a user to be contacted. If that is the case, the method comprises contacting (S300) a user; and using (S400) the detected sound (210) for determining a direction (D) towards the detected sound (210).

Description

TITLE
A robotic work tool adapted for monitoring using detected sounds

TECHNICAL FIELD
The present disclosure relates to a robotic work tool, such as a robotic lawn mower, that is adapted for monitoring using detected sounds.
BACKGROUND
Robotic work tools such as for example robotic lawn mowers are becoming increasingly popular. In a typical deployment, a work area such as a garden is enclosed by a boundary wire with the purpose of keeping the robotic lawn mower inside the work area.
Alternatively, or as a supplement, the robotic lawn mower can be equipped with a navigation system that is adapted for satellite navigation by means of GPS (Global Positioning System) or some other Global Navigation Satellite System (GNSS), for example using Real Time Kinematic (RTK). In addition to this, the navigation system is adapted for navigation by means of a local base station that can be housed in a charging station and provide a navigation signal that further increases the navigation accuracy.
The robotic lawn mower is adapted to cut grass on a user's lawn automatically, can be charged automatically without user intervention, and does not need to be manually managed after the initial setup.
When not used for cutting grass, the robotic lawn mower is normally positioned at a charging station for charging rechargeable batteries that form a power source for operation of the robotic lawn mower. When sufficiently charged, the robotic lawn mower can be used for other duties if grass cutting is not required, for example patrolling and monitoring of the work area or another area that constitutes a patrol area.
This is for example disclosed in WO 2016097897A1, where a robotic vehicle (e.g., a mower or watering device) is used for monitoring a parcel, including determining whether monitored data is indicative of a qualifying event and selectively initiating an alarm or notification function in response to the data indicating occurrence of a trigger event. To accomplish this, the robotic vehicle comprises a monitoring module that can be configured to receive position information from a positioning module and image or video data from a camera. It is, however, desired to provide a robotic work tool, such as a robotic lawn mower, that is adapted to perform a more efficient monitoring, for example inside, or also outside, a patrol area, being adapted to detect abnormal events.
SUMMARY
The object of the present disclosure is to provide a method for a robotic work tool control unit that is used in a robotic work tool. The method comprises determining if a microphone arrangement, comprised in the robotic work tool, has detected a sound, and if that is the case, the method comprises determining if the detected sound requires a user to be contacted. If that is the case, the method comprises contacting a user, and using the detected sound for determining a direction towards the detected sound. In this manner, if deemed necessary, a user is contacted when a sound is detected and a direction towards the sound is determined. This means that only certain sounds trigger user contact, which provides a high degree of security and reliability.
According to some aspects, the robotic work tool control unit is used in a robotic lawn mower. This means that sounds detected outdoors can be investigated.
According to some aspects, the method further comprises determining if the detected sound requires investigation. If that is the case, the method comprises controlling the robotic work tool to direct at least one sensor device, comprised in the robotic work tool and adapted for acquiring sensor images, in the determined direction, and controlling the sensor device to acquire sensor images.
This means that sensor images of the sound source, for example a burglar, can be obtained.
According to some aspects, each sensor device (172) is constituted by one of a camera device; a radar device; a LIDAR device; and an ultrasonic device.
According to some aspects, at least one sensor device is constituted by an infrared camera device.
This means that many types of sensor devices can be used and even combined.
According to some aspects, the method further comprises relaying the acquired sensor images to a user. This means that pictures of an object, an animal or a person such as a burglar can be relayed and displayed to a user, for example via a user terminal such as a smartphone.
According to some aspects, the method further comprises presenting how a detected object moves to a user via a user terminal. This means that when at least one sensor device is in the form of a radar, LIDAR and/or an ultrasonic device, corresponding detections can be used to track how one or more objects move, and display the movement on the user terminal.
According to some aspects, the method further comprises analyzing the acquired sensor images and determining if the acquired sensor images relate to an object that requires further action. In this way, an object can be classified as unimportant such that unnecessary actions are avoided. On the other hand, if deemed necessary, suitable actions can be taken.
According to some aspects, analyzing comprises comparing the acquired sensor images with predefined sensor images which relate to one or more objects which do not require further action. According to some further aspects, analyzing comprises comparing the acquired sensor images which relate to one or more objects with predefined sensor images which do require further action. In this way, an efficient and reliable analysis of the acquired sensor images can be performed.
According to some aspects, the method comprises storing the detected sound and classifying the stored sound in dependence of the type of object that is associated with the sound. This means that certain sounds are associated with objects which do require further action, and certain other sounds are associated with objects which do not require further action. This enables an even more reliable determination regarding whether further action is required.
According to some aspects, the method comprises determining that a detected sound requires further action when the detected sound has been classified to relate to one or more objects which require further action. This means that certain sounds are associated with objects which do require further action.
According to some aspects, the stored sounds are deleted after a certain time or when a certain operation or set of operations have been executed. In this manner, memory management can be enhanced.
According to some aspects, the method comprises determining that a detected sound requires investigation when the detected sound has been classified to relate to a predetermined set of sounds and/or sound characteristics, and/or if a user has instructed the control unit that investigation is required, by using a user terminal. According to some further aspects, the method comprises determining that a detected sound requires a user to be contacted when the detected sound has been classified to relate to a predetermined set of sounds and/or sound characteristics.
This means that not all sounds trigger an investigation, or even a user to be contacted. Only certain sounds, for example sounds associated with persons moving, breaking glass etc. trigger a user to be contacted and/or an investigation to be performed.
According to some aspects, the predetermined set of sounds and/or sound characteristics relates to at least one of
- sounds of a certain time duration,
- sounds of certain frequency intervals, and
- sounds with certain patterns regarding volume and frequency intervals.
This means that, for example, abrupt sounds, sounds that relate to a person walking, and breaking glass can be distinguished.
According to some aspects, the method comprises controlling the robotic work tool to move along a certain patrol path within a certain pre-defined patrol area.
According to some aspects, the pre-defined patrol area is defined by using at least one of
- a boundary or guide wire,
- satellite navigation,
- a local radio base station that is used for providing a navigation signal (e.g. UWB, Ultra Wideband),
- VSLAM (Visual Simultaneous Localization And Mapping),
- a permanent magnet,
- a physical barrier, and
- an edge detecting sensor, such as a camera or radar detecting a grass edge.
In this way, a larger area can be covered and surveilled in a controlled manner.
According to some aspects, the method comprises controlling the robotic work tool to move along a certain patrol path at certain times that are predetermined, user-induced or determined stochastically. According to some aspects, the patrol path is pre-determined, and/or determined or changed stochastically within the patrol area before or during movement along the patrol path. In this way, many types of varied control paths can be obtained. This also provides flexibility and unpredictability for the patrolling.
According to some aspects, the method comprises running at least one functionality that generates a sound and determining that a detected sound requires investigation when the detected sound has been classified to relate to abnormal running conditions for said functionality. In this manner, the control unit can control the robotic lawn mower to perform self-diagnostic measures, for example running different motors and turning in different directions. In this manner, for example, worn bearings can be detected by means of sound analysis.
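The patent does not specify how such a self-diagnostic sound check would be implemented; the sketch below is one illustrative possibility, assuming a recorded baseline of the functionality's normal sound and a simple band-energy comparison. All function names, band limits and the tolerance value are assumptions, not taken from the disclosure.

```python
# Minimal self-diagnostic sketch (an assumption, not the patent's method): run a
# sound-generating functionality, record it via the microphone arrangement, and
# compare coarse band energies against a stored baseline recording.
import numpy as np

def band_energies(samples, rate, bands=((20, 200), (200, 2000), (2000, 8000))):
    """Return the signal energy in a few coarse frequency bands."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])

def sounds_abnormal(recorded, baseline, rate, tolerance=3.0):
    """Flag the functionality as abnormal if any band deviates too far from baseline."""
    current = band_energies(np.asarray(recorded, dtype=float), rate)
    reference = band_energies(np.asarray(baseline, dtype=float), rate)
    ratio = current / np.maximum(reference, 1e-12)
    return bool(np.any((ratio > tolerance) | (ratio < 1.0 / tolerance)))
```

A detected deviation, for example a strong new high-frequency component, could then be treated as a sound classified to relate to abnormal running conditions in the sense described above.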
The present disclosure also relates to robotic work tools and robotic work tool systems that are associated with above advantages.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure will now be described in more detail with reference to the appended drawings, where:
Figure 1A shows a perspective side view of a robotic lawn mower;
Figure 1B shows a schematic overview of the robotic lawn mower;
Figure 2 schematically illustrates a robotic lawn mower system;
Figure 3 shows a schematic view of a control unit;
Figure 4 shows a computer program product; and
Figure 5 shows a flowchart for methods according to the present disclosure.
DETAILED DESCRIPTION
Aspects of the present disclosure will now be described more fully hereinafter with reference to the accompanying drawings. The different devices, systems, computer programs and methods disclosed herein can, however, be realized in many different forms and should not be construed as being limited to the aspects set forth herein. Like numbers in the drawings refer to like elements throughout.
The terminology used herein is for describing aspects of the disclosure only and is not intended to limit the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In the following, a robotic lawn mower will be described, but this should not be regarded as limiting, and it will be understood that any type of robotic work tool is possible for implementing the present disclosure.
Figure 1A shows a perspective view of a non-limiting example of a robotic lawn mower 100 and Figure 1B shows a schematic overview of the robotic lawn mower 100. The robotic lawn mower 100 is adapted for a forward travelling direction F, has a body 140 and a plurality of wheels 130; in this example the robotic lawnmower 100 has four wheels 130, two front wheels and two rear wheels. The robotic lawn mower 100 comprises a control unit 110 and at least one electric motor 150, where at least some of the wheels 130 are drivably connected to at least one electric motor 150. It should be noted that even if the description herein is focused on electric motors, combustion engines may alternatively be used in combination with an electric motor arrangement. The robotic lawn mower 100 may be a multi-chassis type or a mono-chassis type. A multi-chassis type comprises more than one body part that are movable with respect to one another. A mono-chassis type comprises only one main body part. In this example embodiment, the robotic lawnmower 100 is of a mono-chassis type, having a main body part 140. The main body part 140 substantially houses all components of the robotic lawnmower 100.
The robotic lawnmower 100 also comprises a grass cutting device 160, such as a rotating blade 160 driven by a cutter motor 165. The robotic lawnmower 100 also has at least one rechargeable electric power source such as a battery 155 for providing power to the electric motor arrangement 150 and/or the cutter motor 165.
With reference also to Figure 2, according to some aspects, the battery 155 is arranged to be charged by means of received charging current from a charging station 215, received through charging skids 156 or other suitable charging connectors. Inductive charging, i.e. charging without galvanic contact, is also conceivable. The battery is generally constituted by a rechargeable electric power source 155 that comprises one or more batteries that can be separately arranged or be arranged in an integrated manner to form a combined battery. Other types of power sources and charging methods are of course conceivable, for example a fuel cell arrangement and contactless charging. Contactless charging can for example include inductive charging or charging via high-frequency signals.
For a robotic lawn mower, this means that sounds detected outdoors can be investigated. For indoor robotic work tools, sounds detected indoors can be investigated.
The control unit 110 is adapted to control the operation of the robotic lawn mower 100.
According to some aspects, the robotic lawnmower 100 may further comprise at least one navigation system 175. In one embodiment, the navigation system 175 comprises one or more sensors for deduced reckoning. Examples of sensors for deduced reckoning are odometers, accelerometers, gyroscopes, and compasses, to mention a few. In one embodiment, the navigation system 175 comprises a beacon navigation sensor and/or a satellite navigation sensor. The beacon navigation sensor may be a Radio Frequency receiver, such as an Ultra Wide Band (UWB) receiver or sensor, configured to receive signals from a Radio Frequency beacon, such as a UWB beacon or other type of local base station 214 that can be housed in the charging station 215 or at any other suitable location and provide a navigation signal that further increases the navigation accuracy.
The satellite navigation sensor may be a GPS (Global Positioning System) device or other Global Navigation Satellite System (GNSS) device, according to some aspects for example using Real Time Kinematic (RTK).
The robotic lawn mower 100 thus comprises a navigation system 175 that according to some aspects is adapted for satellite navigation and/or navigation by means of one or more local beacons in the form of one or more local base stations 214. The control unit 110 is according to some further aspects adapted to receive position data from the navigation system 175 and instructions from a user terminal 280, said instructions comprising directions for movement of the robotic lawn mower 100.
The robotic lawn mower 100 also comprises at least one sensor device 172 adapted to acquire sensor images and a microphone arrangement 171A, 171B adapted to detect sound 210, 220, 230. In this example, at least one sensor device is in the form of a camera device 172 that is adapted to provide images of the environment in front of the camera device 172.
The control unit 110 is adapted to control the operation of the robotic lawn mower 100 and the control unit 110 is adapted to determine if the microphone arrangement 171A, 171B has detected a sound 210. If that is the case, the control unit 110 is adapted to determine if the detected sound requires a user to be contacted, and if that is the case, the control unit 110 is adapted to contact a user by means of a communication interface 113 comprised in the control unit 110, and to determine a direction D towards the detected sound 210 by means of the detected sound 210. In this manner, if deemed necessary, a user is contacted when a sound is detected, and a direction towards the sound is determined. This means that only certain sounds trigger user contact, which provides a high degree of security and reliability.
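The disclosure leaves open how the direction D is computed from the detected sound; one common approach, consistent with the at least two spatially separated microphones of claim 23, is to estimate the time difference of arrival between channels. The sketch below is a hedged illustration of that idea; the microphone spacing, sample rate and far-field assumption are illustrative values, not taken from the patent.

```python
# Hedged sketch of direction-of-arrival estimation from two spatially separated
# microphones: cross-correlate the channels to find the inter-channel delay and
# convert it to a bearing under a far-field assumption.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degrees C

def direction_from_stereo(left, right, sample_rate, mic_spacing=0.20):
    """Return the estimated source angle (radians) relative to the microphone baseline normal."""
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)    # delay in samples between channels
    tau = lag / sample_rate                          # delay in seconds
    # Far-field model: tau = (d / c) * sin(theta); clip to keep arcsin in range.
    sin_theta = np.clip(tau * SPEED_OF_SOUND / mic_spacing, -1.0, 1.0)
    return float(np.arcsin(sin_theta))
```

A two-microphone estimate of this kind is ambiguous front/back; more microphones or a short rotation of the mower would be one way to resolve that, but the patent does not prescribe either.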
According to some aspects, the control unit 110 is adapted to determine if the detected sound requires investigation; and if that is the case, the control unit 110 is adapted to control the robotic lawn mower 100 to direct 120 at least one sensor device 172 in the determined direction (D), and to control the sensor device 172 to acquire sensor images, for example camera images. This can for example be performed by turning the robotic lawn mower, as illustrated in Figure 2, which is necessary in the case of a fixed sensor device 172, or by turning the sensor device 172 in case the sensor device 172 in itself is movable. A combination is of course conceivable to obtain a high degree of efficiency.
This means that sensor images of the sound source 211, in Figure 2 exemplified by a burglar 211, can be obtained.
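As a complement to the direction estimate, the turning of the robotic lawn mower towards the determined direction D could be realized in many ways; the following is a minimal sketch of one assumed option, a proportional heading controller for a differential-drive chassis with a fixed sensor device. The gain and rate limit are invented example values.

```python
# Illustrative aiming step (an assumption, not the patent's control law): rotate the
# mower in place until the fixed sensor device faces the determined direction D.
import math

def turn_towards(current_heading, target_direction, gain=1.5, max_rate=1.0):
    """Return (left_wheel_rate, right_wheel_rate) for a differential-drive turn in place."""
    # Wrap the heading error to [-pi, pi] so the mower always takes the short way round.
    error = math.atan2(math.sin(target_direction - current_heading),
                       math.cos(target_direction - current_heading))
    rate = max(-max_rate, min(max_rate, gain * error))  # clamped angular rate, rad/s
    # Opposite wheel rates rotate the chassis without translating it.
    return -rate, rate
```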
As mentioned above, the robotic lawn mower 100 comprises at least one sensor device, and each sensor device 172 is according to some aspects constituted by one of a camera device, a radar device, a LIDAR device, and an ultrasonic device. According to some aspects, at least one sensor device is constituted by an infrared camera device.
This means that many types of sensor devices can be used and even combined.
According to some aspects, the acquired sensor images are relayed to a user, which for example means that pictures of the burglar 211 can be relayed and displayed to a user, for example via a user terminal 280 such as a smartphone. The sensor images may for example be relayed to a user via the communication interface 113.
According to some aspects, the control unit 110 is adapted to present how a detected object moves to a user via the user terminal 280. This means that when at least one sensor device is in the form of a radar, LIDAR and/or an ultrasonic device, radar, LIDAR and/or ultrasonic detections can be used to track how one or more objects move, and display the movement on the user terminal 280. The user terminal may use an application or other software that displays the area where the robotic lawn mower 100 is deployed, such as a map, and moving objects may be shown on that map.
According to some aspects, the control unit 110 is adapted to analyze the acquired sensor images and determine if the acquired sensor images relate to an object 211 that requires further action. In this way, an object can be classified as unimportant such that unnecessary actions are avoided. On the other hand, if deemed necessary, suitable actions can be taken.
According to some aspects, for that purpose, the control unit 110 is adapted to compare the acquired sensor images with predefined sensor images which relate to one or more objects which do not require further action, for example sensor images that relate to animals 221, wind and passing vehicles 231.
According to some further aspects, also for that purpose, the control unit 110 is adapted to compare the acquired sensor images which relate to one or more objects 211 with predefined sensor images which do require further action, for example sensor images of persons 211, and possibly unknown persons. In this way, an efficient and reliable analysis of the acquired sensor images can be performed.
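The patent does not state how the comparison with predefined sensor images is carried out; the sketch below assumes images are first reduced to feature vectors by some unspecified extractor, and that the acquired image is matched against "no action" references (animals, passing vehicles) and "action" references (persons) by cosine similarity. The margin value and all names are illustrative assumptions.

```python
# Sketch of the image-comparison step under stated assumptions: feature vectors
# stand in for the acquired and predefined sensor images.
import numpy as np

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def requires_further_action(image_vec, no_action_refs, action_refs, margin=0.05):
    """True if the acquired image matches the 'action' references better than the 'no action' ones."""
    best_no_action = max(cosine(image_vec, r) for r in no_action_refs)
    best_action = max(cosine(image_vec, r) for r in action_refs)
    return best_action > best_no_action + margin
```

In practice the reference sets could be populated with images of the user's own pets or regularly passing vehicles on the "no action" side, which is in line with the examples given above, but the choice of feature extractor and threshold is left open by the disclosure.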
According to some aspects, the control unit 110 is adapted to store the detected sound 210, 220, 230 and classify the stored sound in dependence of the type of object 211, 221, 231 that is associated with the sound 210, 220, 230. This means that certain sounds 210 are associated with objects 211 which do require further action, and certain other sounds 220, 230 are associated with objects 221, 231 which do not require further action. This enables an even more reliable determination regarding whether further action is required.
According to some aspects, the control unit 110 is adapted to determine that a detected sound 210 requires further action when the detected sound has been classified to relate to one or more objects 211 which require further action. This means that certain sounds 210 are associated with objects 211 which do require further action.
According to some aspects, the control unit 110 is adapted to delete the stored sounds 210, 220, 230 after a certain time or when a certain operation or set of operations have been executed. This provides increased flexibility.
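As a minimal illustration of the storage, classification and timed deletion described above, the following sketch keeps each detected sound together with a classification label and a timestamp, and purges records older than a retention period. The record layout and the 24-hour retention period are assumptions made for illustration only.

```python
# Hedged sketch of sound storage with classification and time-based deletion.
import time
from dataclasses import dataclass, field

@dataclass
class SoundRecord:
    samples: list             # raw audio samples of the detected sound
    label: str                # e.g. "person", "animal", "vehicle"
    stored_at: float = field(default_factory=time.time)

class SoundStore:
    def __init__(self, retention_s=24 * 3600):
        self.retention_s = retention_s
        self.records = []

    def add(self, samples, label):
        """Store a detected sound together with its classification."""
        self.records.append(SoundRecord(list(samples), label))

    def purge(self, now=None):
        """Delete stored sounds older than the retention period."""
        now = time.time() if now is None else now
        self.records = [r for r in self.records if now - r.stored_at < self.retention_s]
```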
According to some aspects, the control unit 110 is adapted to determine that a detected sound 210 requires investigation when the detected sound has been classified to relate to a predetermined set of sounds and/or sound characteristics, and/or if a user has instructed the control unit 110 that investigation is required, by using a user terminal 280. According to some aspects, the control unit 110 is adapted to determine that a detected sound 210 requires a user to be contacted when the detected sound has been classified to relate to a predetermined set of sounds and/or sound characteristics.
This means that not all sounds 220, 230 trigger an investigation, or even a user to be contacted. Only certain sounds, for example sounds associated with persons moving, breaking glass etc. trigger a user to be contacted and/or an investigation to be performed.
According to some aspects, the predetermined set of sounds and/or sound characteristics relates to at least one of
- sounds of a certain time duration,
- sounds of certain frequency intervals, and
- sounds with certain patterns regarding volume and frequency intervals.
This means that, for example, abrupt sounds, sounds that relate to a person walking, and breaking glass can be distinguished.
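One way such a predetermined set of sound characteristics could be encoded is as simple rules on duration and frequency content, as sketched below. The rule names and numeric values are invented examples (a short high-frequency burst standing in for breaking glass, a longer low-frequency pattern for footsteps) and are not taken from the patent.

```python
# Hedged sketch of matching a detected sound against a predetermined set of
# duration/frequency-interval characteristics.
import numpy as np

RULES = [
    {"name": "glass_break", "max_duration_s": 0.5, "band_hz": (3000, 8000), "min_band_fraction": 0.5},
    {"name": "footsteps",   "max_duration_s": 2.0, "band_hz": (50, 500),    "min_band_fraction": 0.6},
]

def matches_any_rule(samples, rate):
    """Return the name of the first matching characteristic rule, or None."""
    samples = np.asarray(samples, dtype=float)
    duration = len(samples) / rate
    energy = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    total = energy.sum() + 1e-12
    for rule in RULES:
        lo, hi = rule["band_hz"]
        band_fraction = energy[(freqs >= lo) & (freqs < hi)].sum() / total
        if duration <= rule["max_duration_s"] and band_fraction >= rule["min_band_fraction"]:
            return rule["name"]
    return None
```

A match could then be used as the trigger for contacting the user and/or starting an investigation, in line with the behaviour described above.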
According to some aspects, the control unit 110 is adapted to control the robotic lawn mower 100 to move along a certain patrol path 240A, 240B within a certain pre-defined patrol area 241. The pre-defined patrol area is for example defined by using at least one of a boundary or guide wire 250, satellite navigation and a local radio base station 214 that is used for providing a navigation signal (e.g. UWB, Ultra Wideband). Other examples are VSLAM (Visual Simultaneous Localization And Mapping), a permanent magnet, a physical barrier, and an edge detecting sensor, such as a camera or radar detecting a grass edge. In this way, a larger area can be covered and surveilled in a controlled manner.
According to some aspects, assuming that the robotic lawn mower 100 is adapted to track its position within the patrol area 241, a user can steer the robotic lawn mower 100 by means of the user terminal 280 such that a path such as the patrol path 240A, 240B is recorded as the robotic lawn mower 100 moves. Alternatively, the path can be created by a user in an app interface comprised in the user terminal 280, e.g. via a touch screen.
According to some further aspects, if the robotic lawn mower 100 cannot track its position within the patrol area 241, it can e.g. be instructed via a human-machine interface (HMI) to follow a boundary and/or a guide wire 250, or a permanent magnet, or a physical barrier, or an edge of for example a lawn, using an edge detector sensor. It may even be a patrol area 241 that consists of a hybrid system that is both boundary-wire free, using a virtual boundary, and uses a boundary wire 250. Moreover, according to some even further aspects, the patrol path 240A, 240B should be varied periodically or stochastically, preferably within a predetermined range, in order to avoid that wheel tracks form on the ground.
According to some aspects, the control unit 110 is adapted to control the robotic lawn mower 100 to move along a certain patrol path 240A, 240B at certain times that are predetermined, user-induced or determined stochastically. This provides flexibility and unpredictability for the patrolling.
According to some aspects, the patrol path 240A, 240B is pre-determined, and/or determined or changed stochastically within the patrol area 241 before or during movement along the patrol path 240A, 240B. In this way, many types of varied control paths can be obtained. This also provides flexibility and unpredictability for the patrolling.
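The following is a minimal sketch of one assumed way to vary the patrol path stochastically within a predetermined range, so that repeated patrols do not form wheel tracks. The rectangular patrol-area bound and the maximum offset are simplifying assumptions; the patent itself allows any of the area-definition techniques listed above.

```python
# Hedged sketch: jitter the waypoints of a nominal patrol path by a bounded random
# offset while keeping every point inside a (simplified, rectangular) patrol area.
import random

def perturb_path(waypoints, max_offset_m=0.3, area=((0.0, 0.0), (30.0, 20.0))):
    """Return a new waypoint list, each point jittered but clamped to the patrol area."""
    (xmin, ymin), (xmax, ymax) = area
    varied = []
    for x, y in waypoints:
        nx = min(max(x + random.uniform(-max_offset_m, max_offset_m), xmin), xmax)
        ny = min(max(y + random.uniform(-max_offset_m, max_offset_m), ymin), ymax)
        varied.append((nx, ny))
    return varied

# Example: vary a simple back-and-forth patrol before each run.
nominal = [(1.0, 1.0), (29.0, 1.0), (29.0, 19.0), (1.0, 19.0)]
tonight = perturb_path(nominal)
```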
According to some aspects, the control unit 110 is adapted to control the robotic lawn mower 100 to run at least one functionality that generates a sound and to determine that a detected sound requires investigation when the detected sound has been classified to relate to abnormal running conditions for said functionality. In this manner, the control unit 110 can control the robotic lawn mower 100 to perform self-diagnostic measures, for example running different motors and turning in different directions. In this manner, for example, worn bearings can be detected by means of sound analysis.
With reference to Figure 5, the present disclosure relates to a method for a robotic work tool control unit 110 that is used in a robotic work tool 100. In the following, the robotic work tool is exemplified by a robotic lawn mower 100, although any suitable robotic work tool is conceivable. The method comprises determining S100 if a microphone arrangement 170, 171, comprised in the robotic lawn mower 100, has detected a sound 210, and if that is the case, the method comprises determining S200 if the detected sound requires a user to be contacted. If that is the case, the method comprises contacting S300 a user, and using S400 the detected sound 210 for determining a direction D towards the detected sound 210.
According to some aspects, the method further comprises determining S500 if the detected sound requires investigation. If that is the case, the method comprises controlling S600 the robotic lawn mower 100 to direct 120 at least one sensor device 172, comprised in the robotic lawn mower 100 and adapted for acquiring sensor images, in the determined direction D, and controlling S700 the sensor device 172 to acquire sensor images.
According to some aspects, the method further comprises relaying S800 the acquired sensor images to a user. This can be made directly or via a remote server 260 such as an Internet server.
According to some aspects, the method further comprises presenting how a detected object moves to a user via a user terminal.
According to some aspects, the method further comprises analyzing S900 the acquired sensor images and determining if the acquired sensor images relate to an object 211 that requires further action.
According to some aspects, analyzing S900 comprises comparing S910 the acquired sensor images with predefined sensor images which relate to one or more objects 221, 231 which do not require further action.
According to some aspects, analyzing S900 comprises comparing S920 the acquired sensor images which relate to one or more objects 211 with predefined sensor images which do require further action.
According to some aspects, the method comprises storing the detected sound 210, 220, 230 and classifying the stored sound in dependence of the type of object 211, 221, 231 that is associated with the sound 210, 220, 230.
According to some aspects, the method comprises determining that a detected sound 210 requires further action when the detected sound has been classified to relate to one or more objects 211 which require further action.
According to some aspects, the stored sounds 210, 220, 230 are deleted after a certain time or when a certain operation or set of operations have been executed.
According to some aspects, the method comprises determining that a detected sound 210 requires investigation when the detected sound has been classified to relate to a predetermined set of sounds and/or sound characteristics, and/or if a user has instructed the control unit 110 that investigation is required, by using a user terminal 280.
According to some aspects, the method comprises determining that a detected sound 210 requires a user to be contacted when the detected sound has been classified to relate to a predetermined set of sounds and/or sound characteristics.
According to some aspects, the method comprises controlling the robotic lawn mower 100 to move along a certain patrol path 240A, 240B within a certain pre-defined patrol area 241.
According to some aspects, the pre-defined patrol area is defined by using at least one of
- a boundary wire 250,
- satellite navigation, and
- a local radio base station 214 that is used for providing a navigation signal.
According to some aspects, the method comprises controlling the robotic lawn mower 100 to move along a certain patrol path 240A, 240B at certain times that are predetermined, user-induced or determined stochastically.
According to some aspects, the patrol path 240A, 240B is pre-determined, and/or determined or changed stochastically within the patrol area 241 before or during movement along the patrol path 240A, 240B.
According to some aspects, the method comprises running at least one functionality that generates a sound and determining that a detected sound requires investigation when the detected sound has been classified to relate to abnormal running conditions for said functionality.
In Figure 3, the components of the control unit 110 according to embodiments of the discussions herein are schematically illustrated in terms of a number of functional units. Processing circuitry 111 is provided using any combination of one or more of a suitable central processing unit CPU, multiprocessor, microcontroller, digital signal processor DSP, etc., capable of executing software instructions stored in a computer program product, e.g. in the form of a storage medium 112. The processing circuitry 111 may further be provided as at least one application specific integrated circuit ASIC, or field programmable gate array FPGA. The processing circuitry thus comprises a plurality of digital logic components.
Particularly, the processing circuitry 111 is configured to cause the control unit 110 to perform a set of operations, or steps, to control the operation of the robotic lawn mower 100 including, but not being limited to, controlling the radar transceivers 170, processing measurement results received via the radar transceivers 170, and the propulsion of the robotic lawn mower 100. For example, the storage medium 112 may store the set of operations, and the processing circuitry 111 may be configured to retrieve the set of operations from the storage medium 112 to cause the control unit 110 to perform the set of operations. The set of operations may be provided as a set of executable instructions. Thus, the processing circuitry 111 is thereby arranged to execute at least parts of the methods as herein disclosed.
The storage medium 112 may also comprise persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid state memory or even remotely mounted memory.
According to some aspects, the control unit 110 further comprises an interface 113 for communications with at least one external device such as the user terminal 280. As such, the interface 113 may comprise one or more transmitters and receivers, comprising analogue and digital components and a suitable number of ports for wireline communication. The communication interface 113 can be adapted for communication with other devices, such as the remote server 260, the charging station 215, and/or other robotic working tools. Examples of such wireless communication technologies are Bluetooth®, WiFi® (IEEE 802.11b), Global System for Mobile communications (GSM) and LTE (Long Term Evolution), to name a few.
Figure 4 shows a computer program product 400 comprising computer executable instructions 410 stored on media 420 to execute any of the methods disclosed herein.
According to some aspects, similar control units may be present in at least one of the remote server 260 and the user terminal 280, where one or more of the control units are adapted to perform, together, some or all of the methods discussed herein.
With reference to Figure 2, the present disclosure also relates to a robotic work tool system 270 comprising a robotic work tool 100 as described herein, where the robotic work tool system 270 further comprises a user terminal 280 that is arranged for wireless communication.
According to some aspects, the robotic work tool system 270 further comprises a remote server 260 that is adapted for wireless communication with the robotic work tool 100 and the user terminal 280. According to some aspects, the remote server can be a part of a cloud service 263 and be adapted to communicate 262 via a communications system 261.
According to some aspects, the robotic work tool system 270 is a robotic lawn mower system 270.
The present disclosure is not limited to the examples described above, but may vary freely within the scope of the appended claims. It should be noted that in Figure 2, the robotic lawn mower 100 is patrolling 240A, 240B when detecting a sound. Of course, the robotic lawn mower 100 can detect a sound that triggers one or more of the actions described herein under other circumstances, for example during normal operations such as lawn mowing, moving to the charging station 215 or during charging in the charging station. This means that the robotic lawn mower 100 is always ready to detect sounds and to take proper actions. It is possible that a user is interested in knowing if one or more animals move on the premises, and then sounds which have been classified to relate to one or more objects which require further action include sounds 220 from moving or sounding animals 221. Furthermore, predefined sensor images which relate to one or more objects which do require further action may include sensor images, such as camera pictures, that show different types of animals 221 of interest.
Other sounds can be included as sounds which require further action, or at least require a user to be contacted, such as for example sounds of thunderstorms, heavy wind, moving doors or gates, etc.
According to some aspects, the method comprises leaving a certain pre-defined patrol area 241 to a predetermined extent E when moving in the determined direction D.

Claims (25)

1. A method for a robotic work tool control unit (110) that is used in a robotic work tool (100), where the method comprises: determining (S100) if a microphone arrangement (170, 171), comprised in the robotic work tool (100), has detected a sound (210); and if that is the case, the method comprises determining (S200) if the detected sound requires a user to be contacted; and if that is the case, the method comprises contacting (S300) a user; and using (S400) the detected sound (210) for determining a direction (D) towards the detected sound (210).
2. The method according to claim 1, wherein the robotic work tool control unit (110) is used in a robotic lawn mower (100).
3. The method according to any one of the claims 1 or 2, further comprising determining (S500) if the detected sound requires investigation; and if that is the case, the method comprises controlling (S600) the robotic work tool (100) to direct (120) at least one sensor device (172), comprised in the robotic work tool (100) and adapted for acquiring sensor images, in the determined direction (D); and controlling (S700) the sensor device (172) to acquire sensor images.
4. The method according to claim 3, wherein each sensor device (172) is constituted by one of a camera device, a radar device, a LIDAR device, and an ultrasonic device.
5. The method according to claim 4, wherein at least one sensor device is constituted by an infrared camera device.
6. The method according to any one of the claims 3-5, further comprising relaying (S800) the acquired sensor images to a user.
7. The method according to any one of the claims 3-6, further comprising presenting how a detected object moves to a user via a user terminal.
8. The method according to any one of the claims 3-7, further comprising analyzing (S900) the acquired sensor images and determining if the acquired sensor images relate to an object (211) that requires further action.
9. The method according to claim 8, wherein analyzing (S900) comprises comparing (S910) the acquired sensor images with predefined sensor images which relate to one or more objects (221, 231) which do not require further action.
10. The method according to any one of the claims 8 or 9, wherein analyzing (S900) comprises comparing (S920) the acquired sensor images which relate to one or more objects (211) with predefined sensor images which do require further action.
11. The method according to any one of the claims 9 or 10, wherein the method comprises storing the detected sound (210, 220, 230) and classifying the stored sound in dependence of the type of object (211, 221, 231) that is associated with the sound (210, 220, 230).
12. The method according to claim 11, wherein the method comprises determining that a detected sound (210) requires further action when the detected sound has been classified to relate to one or more objects (211) which require further action.
13. The method according to any one of the claims 11 or 12, wherein the stored sounds (210, 220, 230) are deleted after a certain time or when a certain operation or set of operations have been executed.
14. The method according to any one of the claims 3-13, wherein the method comprises determining that a detected sound (210) requires investigation when the detected sound has been classified to relate to a predetermined set of sounds and/or sound characteristics, and/or if a user has instructed the control unit (110) that investigation is required, by using a user terminal (280).
15. The method according to any one of the previous claims, wherein the method comprises determining that a detected sound (210) requires a user to be contacted when the detected sound has been classified to relate to a predetermined set of sounds and/or sound characteristics.
16. The method according to any one of the claims 14 or 15, wherein the predetermined set of sounds and/or sound characteristics relates to at least one of
- sounds of a certain time duration,
- sounds of certain frequency intervals, and
- sounds with certain patterns regarding volume and frequency intervals.
17. The method according to any one of the previous claims, wherein the method comprises controlling the robotic work tool (100) to move along a certain patrol path (240A, 240B) within a certain pre-defined patrol area (241).
18. The method according to claim 17, wherein the pre-defined patrol area is defined by using at least one of
- a boundary- or guide wire (250),
- satellite navigation,
- a local radio base station (214) that is used for providing a navigation signal,
- VSLAM, Visual simultaneous localization and mapping,
- a permanent magnet,
- a physical barrier, and
- an edge detecting sensor, such as a camera or radar detecting a grass edge.
19. The method according to any one of the claims 17 or 18, wherein the method comprises controlling the robotic work tool (100) to move along a certain patrol path (240A, 240B) at certain times that are predetermined, user-induced or determined stochastically.
20. The method according to any one of the claims 17-19, wherein the patrol path (240A, 240B) is pre-determined, and/or determined or changed stochastically within the patrol area (241) before or during movement along the patrol path (240A, 240B).
21. The method according to any one of the previous claims, wherein the method comprises running at least one functionality that generates a sound and determining that a detected sound requires investigation when the detected sound has been classified to relate to abnormal running conditions for said functionality.
22. A robotic work tool (100) comprising a navigation system (175), at least one sensor device (172) adapted to acquire sensor images, a microphone arrangement (171A, 171B) adapted to detect sound (210, 220, 230), and a control unit (110) adapted to control the operation of the robotic work tool (100), wherein the control unit (110) is adapted to - determine if the microphone arrangement (171A, 171B) has detected a sound (210); and if that is the case, the control unit (110) is adapted to - determine if the detected sound requires a user to be contacted, and if that is the case, the control unit (110) is adapted to - contact a user by means of a communication interface (113) comprised in the control unit (110), and to - determine a direction (D) towards the detected sound (210) by means of the detected sound (210).
23. The robotic work tool (100) according to claim 22, wherein the microphone arrangement (171A, 171B) comprises at least two spatially separated microphones, and where the control unit (110) is adapted to perform the method according to any one of the claims 2-21.
24. A robotic work tool system (270) comprising a robotic work tool (100) according to any one of the claims 22 or 23, wherein the robotic work tool system (220) further comprises a user terminal (280) that is arranged for wireless communication.
25. The robotic work tool system (270) according to claim 24, further comprising a remote server (260) that is adapted for wireless communication with the robotic work tool (100) and the user terminal (280).
SE2350404A 2023-04-05 2023-04-05 A robotic work tool adapted for monitoring using detected sounds SE2350404A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE2350404A SE2350404A1 (en) 2023-04-05 2023-04-05 A robotic work tool adapted for monitoring using detected sounds
PCT/SE2024/050180 WO2024210783A1 (en) 2023-04-05 2024-02-26 A robotic work tool adapted for monitoring using detected sounds

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE2350404A SE2350404A1 (en) 2023-04-05 2023-04-05 A robotic work tool adapted for monitoring using detected sounds

Publications (1)

Publication Number Publication Date
SE2350404A1 true SE2350404A1 (en) 2024-10-06

Family

ID=90361523

Family Applications (1)

Application Number Title Priority Date Filing Date
SE2350404A SE2350404A1 (en) 2023-04-05 2023-04-05 A robotic work tool adapted for monitoring using detected sounds

Country Status (2)

Country Link
SE (1) SE2350404A1 (en)
WO (1) WO2024210783A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2839769A2 (en) * 2013-08-23 2015-02-25 LG Electronics Inc. Robot cleaner and method for controlling the same
WO2016097897A1 (en) * 2014-12-18 2016-06-23 Husqvarna Ab Robotic patrol vehicle
EP3298874A1 (en) * 2016-09-22 2018-03-28 Honda Research Institute Europe GmbH Robotic gardening device and method for controlling the same
WO2020197768A1 (en) * 2019-03-25 2020-10-01 The Toro Company Autonomous working machine with computer vision-based monitoring and security system
US20220355481A1 (en) * 2019-07-05 2022-11-10 Lg Electronics Inc. Moving robot and method of controlling the same

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ATE166170T1 (en) * 1991-07-10 1998-05-15 Samsung Electronics Co Ltd MOVABLE MONITORING DEVICE
KR101356165B1 (en) * 2012-03-09 2014-01-24 엘지전자 주식회사 Robot cleaner and controlling method of the same
DE102015111392A1 (en) * 2015-07-14 2017-01-19 Vorwerk & Co. Interholding Gmbh Method for operating a surface treatment device
IT202100019139A1 (en) * 2021-07-20 2023-01-20 Luca Trombetta MOBILE ANTI-THEFT

Also Published As

Publication number Publication date
WO2024210783A1 (en) 2024-10-10

Similar Documents

Publication Publication Date Title
US9563204B2 (en) Mower with object detection system
US8031085B1 (en) Context-based sound generation
US11294398B2 (en) Personal security robotic vehicle
US8340438B2 (en) Automated tagging for landmark identification
EP2169507B1 (en) Distributed knowledge base method for vehicular localization and work-site management
US8224500B2 (en) Distributed knowledge base program for vehicular localization and work-site management
EP2169505B1 (en) Distributed knowledge base for vehicular localization and work-site management
CN109564437A (en) For the dangerous system and method using unmanned vehicle monitoring to user
EP3919238B1 (en) Mobile robot and control method therefor
JP2018503194A (en) Method and system for scheduling unmanned aircraft, unmanned aircraft
WO2016097897A1 (en) Robotic patrol vehicle
US11467273B2 (en) Sensors for determining object location
EP3761136B1 (en) Control device, mobile body, and program
EP4017248A1 (en) Improved operation for a robotic work tool
TW201944790A (en) Tracking stolen robotic vehicles
EP3695699B1 (en) Robotic vehicle for boundaries determination
WO2023274339A1 (en) Self-propelled working system
SE2350404A1 (en) A robotic work tool adapted for monitoring using detected sounds
CN111971207B (en) Method of retrieving a robotic vehicle, robotic vehicle and processing device