WO2016143641A1 - Posture Detection Device and Posture Detection Method
- Publication number: WO2016143641A1 (PCT application PCT/JP2016/056496)
- Authority: WIPO (PCT)
- Prior art keywords: head, posture, unit, image, predetermined
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1116—Determining posture transitions
- A61B5/1117—Fall detection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique using image analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6889—Rooms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/04—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using a single signalling line, e.g. in a closed loop
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/08—Elderly
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0407—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
- G08B21/043—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting an emergency event, e.g. a fall
Definitions
- The present invention relates to a posture detection device and a posture detection method for detecting the posture of a monitoring target.
- Japan is an aging society; more specifically, owing to the rise in living standards that accompanied post-war high economic growth, improvements in the sanitary environment, advances in medical care, and the like, it has become a super-aging society in which the aging rate, the ratio of the population aged 65 or over to the total population, exceeds 21%.
- At one point the total population was about 126.5 million, while the elderly population aged 65 or over was about 25.56 million; the total population is projected to fall to about 124.11 million, while the elderly population is projected to grow to about 34.56 million.
- In such a super-aging society, the number of people who require nursing or care because of sickness, injury, old age, and the like is expected to be far greater than in a normal society that is not an aging society.
- Patent Document 1 discloses a fall detection system as one of such devices.
- The fall detection system disclosed in Patent Document 1 includes a distance image sensor that detects a distance value for each pixel in a predetermined detection area, and a fall detection device that sets a rectangular parallelepiped based on the outer shape of the person detected by the distance image sensor and detects a fall of the person based on the aspect ratio of the rectangular parallelepiped.
- The distance image sensor scans a laser beam over a two-dimensional region with a two-dimensional scanner and receives the laser light reflected by objects, thereby acquiring a distance value for each pixel. Other examples of the distance image sensor include sensors capable of acquiring three-dimensional information, such as a stereo camera or a sensor combining an LED with a CMOS imager.
- Meanwhile, in the technique disclosed in Patent Document 1, the fall detection device sets a rectangular parallelepiped based on the outer shape of the person detected by the distance image sensor and detects a fall based on the aspect ratio of that rectangular parallelepiped. For this reason, when part of the body, for example a foot, is shielded from the distance image sensor by furniture such as a desk or a chair, the rectangular parallelepiped is set inaccurately and the fall detection device may erroneously detect a fall. To eliminate such shielding, a method of detecting the distance value of each pixel in the detection area from a plurality of angles using a plurality of distance image sensors is conceivable, but using a plurality of distance image sensors increases cost.
- The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a posture detection device and a posture detection method capable of determining the posture of a monitoring target, such as a fall (falling over or falling from, for example, a bed), more accurately with a simpler configuration.
- In the posture detection device and the posture detection method according to the present invention, an image of a predetermined detection area is acquired by an image acquisition unit, a head is extracted from the acquired image of the detection area, a predetermined parameter of the extracted head is obtained, and whether the posture is a predetermined posture is determined based on the obtained parameter. Because they use a predetermined parameter relating to the head, which is difficult to shield even from a single image acquisition unit, the posture detection device and the posture detection method according to the present invention can determine the posture of the monitoring target more accurately with a simpler configuration.
- FIG. 1 is a block diagram illustrating the configuration of the posture detection device according to the embodiment.
- FIG. 2 is a diagram explaining the installation state of the image acquisition unit in the posture detection device.
- The posture detection device according to the embodiment acquires an image of a detection area and, based on the acquired image, determines whether a monitoring target (a monitored person, a person being watched over, or a subject) is in a predetermined posture set in advance.
- Such a posture detection device D includes, for example, an image acquisition unit 1 and a control processing unit 2 that includes a head extraction unit 22 and a posture determination unit 23, as shown in the figures. In the present embodiment, a storage unit 3, an input unit 4, an output unit 5, an interface unit (IF unit) 6, and a communication interface unit (communication IF unit) 7 are further provided.
- The image acquisition unit 1 is a device that is connected to the control processing unit 2 and, under the control of the control processing unit 2, acquires an image of a predetermined detection area. The predetermined detection area is, for example, a space where the monitoring target is normally located or is expected to be located.
- For example, the image acquisition unit 1 may be a communication interface, such as a data communication card or a network card, that receives via a network a communication signal carrying an image of the detection area from a web camera. In this case, the image acquisition unit 1 may be the communication IF unit 7 itself; that is, the communication IF unit 7 can double as the image acquisition unit 1.
- Alternatively, the image acquisition unit 1 may be a digital camera connected to the control processing unit 2 via a cable. Such a digital camera includes, for example, an imaging optical system that forms an optical image of the detection area on a predetermined imaging surface, an image sensor whose light-receiving surface is aligned with that imaging surface and which converts the optical image of the detection area into an electrical signal, and an image processing unit that generates image data of the detection area from the output of the image sensor. A digital camera with a communication function further includes a communication interface unit that is connected to the image processing unit and transmits and receives communication signals to and from the posture detection device D via a network.
- Such a digital camera (including one with a communication function) is arranged with its photographing direction appropriately aligned with the detection area. For example, as shown in FIG. 2, the digital camera is installed at a central position on the ceiling CE of the room RM in which the monitoring target OJ is located, with the photographing direction (the optical axis direction of the imaging optical system) aligned with the vertical direction (the normal direction of the horizontal ceiling surface), so that the monitoring target OJ is not hidden from the digital camera.
- The digital camera may be a visible-light camera, or it may be an infrared camera combined with an infrared projector that emits near-infrared light so that images can be captured even in darkness at night.
- The input unit 4 is a device, for example a keyboard or a mouse, that is connected to the control processing unit 2 and inputs to the posture detection device D various commands, such as a command instructing the start of monitoring, and various data necessary for monitoring, such as the name of the monitoring target.
- The output unit 5 is a device that is connected to the control processing unit 2 and that, under the control of the control processing unit 2, outputs the commands and data input from the input unit 4 and the determination results of the posture detection device D (for example, that the monitoring target has fallen). It is, for example, a display device such as a CRT display, an LCD, or an organic EL display, or a printing device such as a printer.
- A touch panel may be configured from the input unit 4 and the output unit 5. In this case, the input unit 4 is a position input device that detects and inputs an operation position, for example of the resistive-film type or the capacitive type, and the output unit 5 is a display device.
- The position input device is provided on the display surface of the display device; one or more candidates for input content are displayed on the display device, and when the user touches the display position of the content to be input, that position is detected by the position input device and the content displayed there is input to the posture detection device D as the user's operation input. Such a touch panel provides a posture detection device D that is easy for the user to operate.
- The IF unit 6 is a circuit that is connected to the control processing unit 2 and exchanges data with external devices under the control of the control processing unit 2; it is, for example, an interface circuit conforming to RS-232C, a serial communication standard.
- The communication IF unit 7 is a communication device that is connected to the control processing unit 2 and, under the control of the control processing unit 2, communicates with a communication terminal device TA via a network such as a LAN, a telephone network, or a data communication network, by wire or wirelessly.
- The communication IF unit 7 generates, in accordance with the communication protocol used on the network, a communication signal containing the data to be transferred that is input from the control processing unit 2, and transmits the generated communication signal to the communication terminal device TA via the network. The communication IF unit 7 also receives communication signals from other devices such as the communication terminal device TA via the network, extracts the data from the received signals, converts the extracted data into a format that the control processing unit 2 can process, and outputs it to the control processing unit 2.
- The storage unit 3 is a circuit that is connected to the control processing unit 2 and stores various predetermined programs and various predetermined data under the control of the control processing unit 2. The various predetermined programs include, for example, control processing programs such as a posture detection program for detecting a predetermined posture of the monitoring target from the image of the detection area. The various predetermined data include, for example, a threshold th used to determine whether the posture is the predetermined posture.
- The storage unit 3 includes, for example, a ROM (Read Only Memory), which is a nonvolatile storage element, and an EEPROM (Electrically Erasable Programmable Read Only Memory), which is a rewritable nonvolatile storage element. The storage unit 3 also includes a RAM (Random Access Memory) that serves as the so-called working memory of the CPU (Central Processing Unit) and stores data generated during execution of the predetermined programs, and it may include a relatively large-capacity hard disk.
- The control processing unit 2 is a circuit for controlling each unit of the posture detection device D according to its function and for detecting a predetermined posture of the monitoring target. The control processing unit 2 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits.
- By executing the control processing program, the control processing unit 2 is functionally configured to include a control unit 21, a head extraction unit 22, a posture determination unit 23, and a final determination unit 24, and the posture determination unit 23 is functionally configured to include a parameter calculation unit 231 and a provisional determination unit 232.
- The control unit 21 controls each unit of the posture detection device D according to its function and governs the overall control of the posture detection device D.
- The head extraction unit 22 extracts a head (an image area representing the head in the image; a head image) from the image of the detection area acquired by the image acquisition unit 1. A known image processing technique is used to extract the head. For example, the shape of the head is assumed to be elliptical, and a so-called generalized Hough transform is applied to the image of the detection area, thereby extracting elliptical shapes in the image, that is, the head.
- Such an image processing technique is disclosed, for example, in Makoto Murakami, "Research on Feature Representation and Region Extraction in Human Head Recognition," Waseda University, March 2003.
- Alternatively, the head may be extracted by template matching using a head shape prepared in advance as a template, such as an ellipse or a circle approximating the head outline, or by fitting a closed curve such as a so-called Snake (active contour). These methods may be used in combination with color information, such as skin color or black (hair) color, and with motion information that judges whether an object is a person based on the presence or absence of movement.
- For example, the region of the image on which the extraction processing is performed may be limited, using the color information, the motion information, and the like, to a region where the head is highly likely to be present. The head extraction unit 22 notifies the posture determination unit 23 of the extracted head (the head image region), as illustrated by the sketch below.
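- As an illustration of this kind of head extraction, the sketch below approximates the head as a circle and uses OpenCV's circle Hough transform as a stand-in for the generalized (elliptical) Hough transform described above; the function name and all parameter values are illustrative assumptions, not values from the patent.

```python
# A minimal sketch of head extraction, approximating the head as a
# circle (cv2.HoughCircles stands in for the generalized Hough
# transform). All parameter values are illustrative assumptions.
import cv2

def extract_head(image_bgr):
    """Return (x, y, radius) of the strongest head-like circle, or None."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)            # suppress noise before voting
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
        param1=100, param2=30,                # edge / accumulator thresholds
        minRadius=10, maxRadius=80)           # plausible head radii [pixel]
    if circles is None:
        return None
    x, y, r = circles[0][0]                   # strongest candidate first
    return float(x), float(y), float(r)
```

In practice, the color and motion cues mentioned above would be used to reject circles that are unlikely to be heads.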
- The posture determination unit 23 obtains a predetermined parameter of the head extracted by the head extraction unit 22 and determines, based on the obtained parameter, whether the posture is a predetermined posture. More specifically, the posture determination unit 23 determines whether the posture is the predetermined posture based on whether the predetermined parameter of the head extracted by the head extraction unit 22 is equal to or greater than a predetermined threshold th. For this purpose, the posture determination unit 23 functionally includes a parameter calculation unit 231 and a provisional determination unit 232, and the parameter calculation unit 231 obtains the predetermined parameter of the head extracted by the head extraction unit 22.
- As the predetermined parameter, any appropriate parameter from which the posture of the monitoring target can be determined can be used. For example, when determining whether a fall has occurred, the height of the head can be used as the parameter, because the height of the head differs between a fallen posture and other postures such as standing and sitting.
- When the image acquisition unit 1 is a camera looking down from the ceiling, the size of the head on the image (for example, the length of the short side of the image region containing the head) depends on the height of the head: at the same position on the floor plane, the higher the head, the larger it appears on the image. In this case, therefore, the size of the head can also be used as the parameter; that is, the height of the head can be estimated from the size of the head, and the posture of the monitoring target, such as standing, sitting, or fallen, can be determined based on the estimated height.
- The provisional determination unit 232 determines whether the posture is the predetermined posture based on whether the predetermined head parameter obtained by the parameter calculation unit 231 is equal to or greater than the predetermined threshold th. This makes it possible to determine the predetermined posture simply by comparing the parameter with the threshold th. More specifically, for example, when the height of the head is used as the parameter to determine whether the monitoring target has fallen, a head height that can distinguish a fallen posture from other postures such as standing and sitting is set in advance as the predetermined threshold (first threshold; fall-determination head height threshold) th1. For example, the height of the bed BT may be set as the threshold th1.
- Similarly, when determining a standing posture, a head height that can distinguish the standing posture is set in advance as a predetermined threshold (threshold 2-1; standing-determination head height threshold) th21, and when distinguishing a sitting posture from a fallen posture, a head height that can distinguish them is set in advance as a predetermined threshold (threshold 2-2; sitting/fall-determination head height threshold) th22. When the size of the head is used instead of the height of the head, the thresholds th1, th21, and th22 are set in advance in the same way, with head size in place of head height. These thresholds th1, th21, and th22 may be set appropriately by preparing a plurality of samples in advance and applying statistical processing.
- Preferably, the thresholds th1 and th22 are set based on the standing height (body height). By setting the thresholds th1 and th22 lower than the height of the sitting position, which depends on the standing height, the posture detection device D can determine whether the posture of the monitoring target is a fallen posture.
- Alternatively, the thresholds th1 and th22 are preferably set based on the height of the sitting position. By setting the thresholds th1 and th22 lower than the height of the sitting position, the posture detection device D can likewise determine whether the posture of the monitoring target is a fallen posture.
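- As a concrete illustration of this threshold comparison, the sketch below classifies a posture from an estimated head height using three thresholds corresponding to th21, th22, and th1; the numeric values (in meters) and the function name are illustrative assumptions, not values from the patent.

```python
# A minimal sketch of the provisional threshold determination described
# above. The threshold values (in meters) are illustrative assumptions.
TH1 = 0.40    # th1: below this head height, a fall is suspected
TH21 = 1.20   # th21: at or above this head height, standing
TH22 = 0.60   # th22: boundary between sitting and fallen

def classify_posture(head_height_m):
    """Classify a posture from an estimated head height in meters."""
    if head_height_m >= TH21:
        return "standing"
    if head_height_m >= TH22:
        return "sitting"
    if head_height_m < TH1:
        return "fallen"
    return "uncertain"    # between th1 and th22: leave to later frames

print(classify_posture(0.25))   # -> "fallen"
```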
- The provisional determination unit 232 then notifies the final determination unit 24 of this determination result as the determination result of the posture determination unit 23.
- In the present embodiment, the image acquisition unit 1 acquires a plurality of images of the detection area at mutually different times, the head extraction unit 22 extracts a head from each of the plurality of images acquired by the image acquisition unit 1, and the posture determination unit 23 determines, for each of the plurality of images of the detection area, whether the posture is the predetermined posture based on the predetermined parameter of the head extracted by the head extraction unit 22.
- The final determination unit 24 finally determines whether the posture is the predetermined posture based on a plurality of determination results produced by the posture determination unit 23. For example, when the determination results produced by the posture determination unit 23 indicate the predetermined posture a predetermined number of times in succession (that is, continuously for a predetermined fixed time), the final determination unit 24 finally determines that the posture is the predetermined posture and notifies the control unit 21 accordingly. When the control unit 21 receives from the final determination unit 24 the notification that the posture of the monitoring target has finally been determined to be the predetermined posture, it outputs information indicating this.
- FIG. 3 is a flowchart illustrating the operation of the posture detection apparatus according to the embodiment.
- When the posture detection device D starts operating, the control processing unit 2 initializes each necessary unit and executes the control processing program. By executing this program, the control unit 21, the head extraction unit 22, the posture determination unit 23, and the final determination unit 24 are functionally configured in the control processing unit 2, and the parameter calculation unit 231 and the provisional determination unit 232 are functionally configured in the posture determination unit 23.
- First, an image of the detection area is acquired by the image acquisition unit 1, and the acquired image of the detection area is passed from the image acquisition unit 1 to the control processing unit 2 (S1).
- Next, the head (the region of the image showing the head) is extracted by the head extraction unit 22 of the control processing unit 2, and the extracted head is notified to the posture determination unit 23 of the control processing unit 2 (S2).
- Next, a predetermined parameter of the head extracted by the head extraction unit 22, for example the size of the head, is obtained by the parameter calculation unit 231 of the posture determination unit 23, and the obtained parameter (in this example, the size of the head) is notified from the parameter calculation unit 231 to the provisional determination unit 232 of the posture determination unit 23 (S3).
- Next, the provisional determination unit 232 determines whether the posture is the predetermined posture (S4). More specifically, in one example, the provisional determination unit 232 determines whether the monitoring target has fallen by determining whether the size of the head obtained by the parameter calculation unit 231 is equal to or greater than the fall-determination threshold th1. If the size of the head is equal to or greater than the threshold th1, the provisional determination unit 232 determines that the monitoring target has not fallen, that is, is not in the predetermined posture (No), notifies the final determination unit 24 of a determination result indicating that the posture is not the predetermined posture, and process S6 is executed.
- On the other hand, if the size of the head is less than the threshold th1, the provisional determination unit 232 determines that the monitoring target has fallen, that is, is in the predetermined posture (Yes), notifies the final determination unit 24 of a determination result indicating that the posture is the predetermined posture, and process S5 is executed.
- In process S5, having received a determination result indicating that the posture is the predetermined posture, the final determination unit 24 counts up the counter CT (CT ← CT + 1) and executes process S7. In process S6, having received a determination result indicating that the posture is not the predetermined posture, the final determination unit 24 clears the counter CT (CT ← 0) and executes process S7. Note that if the provisional determination unit 232 makes even a single erroneous determination, the counter CT is cleared in process S6; to mitigate this, instead of clearing the counter CT in process S6, the final determination unit 24 may count the counter CT down (CT ← CT − 1).
- In process S7, the final determination unit 24 determines whether the counter CT exceeds a preset specified number of times. The specified number of times is the number of consecutive determination results from the provisional determination unit 232 indicating the predetermined posture that is required for the final determination of the predetermined posture; it is set to an appropriate value such as 5 or 10.
- If the counter CT exceeds the specified number of times, the final determination unit 24 finally determines that the posture of the monitoring target is the predetermined posture and notifies the control unit 21 accordingly (S8). Upon receiving this notification, the control unit 21 outputs information indicating that the posture of the monitoring target has finally been determined to be the predetermined posture (S9). For example, the control unit 21 outputs this information to the output unit 5.
- The control unit 21 also transmits a communication signal (posture notification signal) containing the information that the posture of the monitoring target has finally been determined to be the predetermined posture to the communication terminal device TA via the communication IF unit 7.
- Upon receiving the posture notification signal, the communication terminal device TA displays on its display device (a liquid crystal display, an organic EL display, or the like) the information indicating that the posture of the monitoring target has finally been determined to be the predetermined posture.
- If the counter CT does not exceed the specified number of times, or after the output in process S9, the current determination process ends and the next determination process is executed; that is, the processes described above are executed again from process S1.
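- The sketch below puts processes S1 to S9 together into one loop, reusing the extract_head() sketch above; notify_fall(), the camera source, and the numeric values are illustrative assumptions, not details from the patent.

```python
# A minimal sketch of the determination loop (S1-S9). Assumes the
# extract_head() sketch above; thresholds and counts are illustrative.
import cv2

SPECIFIED_COUNT = 5    # consecutive positive results required (e.g., 5 or 10)
TH1_PX = 51            # fall-determination head-size threshold [pixel]

def notify_fall():
    print("posture notification: fall detected")   # stand-in for S9 output

def monitoring_loop(camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    ct = 0                                    # counter CT
    while True:
        ok, frame = cap.read()                # S1: acquire image
        if not ok:
            break
        head = extract_head(frame)            # S2: extract head
        if head is None:
            continue
        head_size_px = 2 * head[2]            # S3: parameter (head size)
        if head_size_px < TH1_PX:             # S4: provisional determination
            ct += 1                           # S5: count up
        else:
            ct = 0                            # S6: clear (or ct -= 1)
        if ct > SPECIFIED_COUNT:              # S7: exceeded specified count?
            notify_fall()                     # S8/S9: final determination, output
            ct = 0
```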
- As described above, the posture detection device D and the posture detection method implemented therein in the present embodiment acquire an image of the detection area with the image acquisition unit 1, extract a head (an image area representing the head in the image; a head image) from the image of the detection area with the head extraction unit 22, and determine, with the posture determination unit 23, a predetermined posture of the monitoring target (monitored person, person being watched over, or subject) based on a predetermined parameter of the head. Therefore, with the simpler configuration of using a single image acquisition unit 1, and by using a predetermined parameter relating to the head, which is difficult to shield, they can more accurately determine the posture of the monitoring target, such as a fall.
- In addition, the posture detection device D and the posture detection method implemented in this embodiment can be realized even with hardware having relatively low information processing capability.
- Since the final determination unit 24 makes the final determination of whether the posture is the predetermined posture based on a plurality of determination results from the posture determination unit 23, the posture detection device D and the posture detection method implemented therein can determine the posture of the monitoring target more accurately.
- Further, in the posture detection device D and the posture detection method implemented in this embodiment, when the image acquisition unit 1 is a camera disposed on the ceiling CE, the monitoring target OJ appearing in the image of the detection area is less likely to be shielded by furniture and other fixtures placed in the room RM, so the posture of the monitoring target OJ can be determined more accurately.
- In the above-described embodiment, the thresholds th1, th21, and th22 are set by statistical processing of a plurality of samples, so the posture detection device D is configured as a general-purpose device. However, a first threshold setting unit 26 that sets the thresholds th1, th21, and th22 for each subject may be further provided in the control processing unit 2 (first modification).
- In this case, the user inputs the thresholds th1, th21, and th22 corresponding to the monitoring target from the input unit 4, and the first threshold setting unit 26 receives the thresholds th1, th21, and th22 corresponding to the monitoring target from the input unit 4, stores them in the storage unit 3 as the thresholds th1, th21, and th22, and thereby sets them.
- The provisional determination unit 232 of the posture determination unit 23 then determines whether the posture is the predetermined posture using the thresholds th1, th21, and th22 stored in the storage unit 3 according to the monitoring target.
- The thresholds th1, th21, and th22 themselves may be input from the input unit 4, or the standing height (body height) or the sitting height of the monitoring target may be input instead. In the latter case, the first threshold setting unit 26 obtains the thresholds th1, th21, and th22 from the standing height (or sitting height) of the monitoring target received by the input unit 4 (converts it into the thresholds th1, th21, and th22), stores them in the storage unit 3, and thereby sets them. Because such a posture detection device D further includes the first threshold setting unit 26 and can set the thresholds th1, th21, and th22 according to the monitoring target, it can be customized (optimized) for each person being monitored and can determine the posture of the monitoring target even more accurately.
- Alternatively, since the image acquisition unit 1 acquires a plurality of images of the detection area at mutually different times, the control processing unit 2 may further include, as indicated by the broken line in FIG. 1, a second threshold setting unit 27 that sets the thresholds th1, th21, and th22 based on the plurality of images acquired by the image acquisition unit 1 (second modification).
- More specifically, the image acquisition unit 1 acquires a plurality of images of the detection area at mutually different times, so that the actual behavior of the monitoring target in the detection area is captured. The second threshold setting unit 27 obtains the predetermined head parameter from each of the plurality of images, computes an average value or a minimum value of the parameters after removing outliers (noise), obtains the thresholds th1, th21, and th22 from the computed value (converts it into the thresholds th1, th21, and th22), stores them in the storage unit 3, and thereby sets them.
- Because the second threshold setting unit 27 sets the thresholds th1, th21, and th22 based on a plurality of images of the detection area at different times, the thresholds can be set automatically for each subject. In particular, even when a subject's standing or walking posture differs from that of a healthy person, for example because the back is bent, the thresholds th1, th21, and th22 are set with such personal circumstances automatically taken into account.
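- The sketch below illustrates one way such automatic, per-subject threshold setting could work, assuming head heights (in meters) have already been estimated from a series of images; the outlier rule and the scaling factors are illustrative assumptions, not values from the patent.

```python
# A minimal sketch of per-subject threshold setting from observed head
# heights (second modification). The outlier rule and scale factors
# are illustrative assumptions.
import numpy as np

def set_thresholds(observed_head_heights_m):
    """Derive (th1, th21, th22) from head heights observed while standing."""
    h = np.asarray(observed_head_heights_m, dtype=float)
    med = np.median(h)
    h = h[np.abs(h - med) <= 0.3]   # drop outliers (noise) far from the median
    standing = h.min()              # minimum observed standing head height
    th21 = 0.9 * standing           # at or above: standing
    th22 = 0.5 * standing           # boundary between sitting and fallen
    th1 = 0.3 * standing            # below: fallen
    return th1, th21, th22

print(set_thresholds([1.62, 1.60, 1.65, 0.30, 1.61]))  # 0.30 is dropped as noise
```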
- Further, as indicated by the broken line in FIG. 1, the control processing unit 2 may further include a threshold correction unit 28 that corrects the thresholds th1, th21, and th22, whether set in advance or set by the first or second threshold setting unit 26, 27 (third and fourth modifications).
- FIG. 4 is a diagram showing a fall determination table in the third modification. FIG. 5 is a diagram explaining the relationship between the image of the detection area and the determination areas in the third modification. FIG. 6 is a diagram explaining the relationship between the image of the detection area and the determination areas for each threshold in the fourth modification.
- When the angle of view of the digital camera is relatively narrow, or in the area around the optical axis in the image, the size of the head on the image is substantially proportional to the height of the head, so a predetermined posture of the monitoring target can be determined from the size of the head. For example, where the height of the head is C (m), the height of the ceiling CE is H (m), and the size of the head on the image is Sh, Sh increases as the distance (H − C) from the camera to the head decreases. Sh may be calculated from the specifications of the digital camera and its mounting position, or it may be measured.
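- As a worked illustration of this geometry, the sketch below estimates the head height C from the head size on the image under a simple pinhole model for a ceiling camera looking straight down; the focal length, ceiling height, and assumed real head width are illustrative assumptions, not values from the patent.

```python
# A minimal sketch of estimating head height C [m] from head size on
# the image, under a pinhole model for a camera looking straight down.
# All numeric values are illustrative assumptions.
F_PX = 800.0          # focal length in pixels (from camera calibration)
H_M = 2.4             # ceiling (camera) height H in meters
HEAD_WIDTH_M = 0.16   # assumed real head width in meters

def estimate_head_height(head_size_px):
    """Invert Sh = F_PX * HEAD_WIDTH_M / (H - C) to recover C."""
    camera_to_head = F_PX * HEAD_WIDTH_M / head_size_px
    return H_M - camera_to_head

print(round(estimate_head_height(80), 2))   # -> 0.8 (meters above the floor)
```

Over a limited range of C this mapping is approximately linear, consistent with the near-proportionality stated above.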
- Away from the optical axis, however, this proportional relationship breaks down. The threshold correction unit 28 therefore corrects the thresholds th1, th21, and th22 used by the provisional determination unit 232 according to the position of the head on the image (the position at which the head appears), so as to eliminate the deviation from the proportional relationship between the size of the head and the height of the head. The aberration of the imaging optical system may also be taken into account in this correction.
- For this correction, a functional expression representing the relationship between the position of the head on the image and the correction value may be stored in the storage unit 3 and used by the provisional determination unit 232, or a table such as the one shown in FIG. 4 may be stored in the storage unit 3 and used by the provisional determination unit 232.
- In the example of FIGS. 4 and 5, the image is divided, according to the position of the head on the image, into first to fourth determination areas AR0 to AR3, and a different threshold th is set for each of the areas AR0 to AR3.
- The first determination area AR0 is the area within a circle of a predetermined first radius centered on the optical axis, in which the size of the head is approximately proportional to the height of the head. For the first determination area AR0, for example, the fall threshold th1 is set such that when the size of the head calculated by the parameter calculation unit 231 (the length of the short side of the image region containing the head) is 51 pixels or more, the posture of the monitoring target is determined not to be a fall (○), and when it is less than 51 pixels, the posture of the monitoring target is determined to be a fall (×).
- The second determination area AR1 is concentric with the first determination area AR0 and is the area beyond the first determination area AR0 but within a circle of a predetermined second radius (> first radius) centered on the optical axis. For the second determination area AR1, the fall threshold th1 is set such that when the size of the head calculated by the parameter calculation unit 231 is 46 pixels or more, the posture of the monitoring target is determined not to be a fall (○), and when it is less than 46 pixels, the posture of the monitoring target is determined to be a fall (×).
- The second and third determination areas AR1 and AR2 are areas in which the size of the head and the height of the head are not proportional; in this example, to allow more accurate correction, they are divided into two regions according to the degree of deviation from the proportional relationship.
- The fourth determination area AR3, which is the area of the image beyond the third determination area AR2, is excluded from determination (an area where determination is impossible), and no fall threshold th1 is set for the fourth determination area AR3.
- Because the threshold th is set to a different value for each determination area AR in this way, the determination can take into account how the relationship between the size and the height of the head changes with position on the image. It also becomes possible to make determinations that take into account specific areas where a bed or the like is present.
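- The sketch below shows how such a position-dependent table could be looked up at determination time; the 51- and 46-pixel thresholds follow the example above, while the radii, the image center, and the AR2 value are illustrative assumptions.

```python
# A minimal sketch of position-dependent fall thresholds (FIG. 4/5
# style). The 51/46-pixel values follow the example above; the radii,
# center, and AR2 value are illustrative assumptions.
import math

CENTER = (320, 240)      # assumed optical-axis position on the image
AREAS = [                # (outer radius [pixel], fall threshold th1 [pixel])
    (100, 51),           # AR0: near-proportional region
    (200, 46),           # AR1
    (300, 40),           # AR2 (illustrative value)
    (math.inf, None),    # AR3: outside determination
]

def fall_threshold(head_x, head_y):
    """Return th1 for a head position, or None if determination is impossible."""
    r = math.hypot(head_x - CENTER[0], head_y - CENTER[1])
    for outer_radius, th1 in AREAS:
        if r <= outer_radius:
            return th1
    return None

def is_fall(head_x, head_y, head_size_px):
    th1 = fall_threshold(head_x, head_y)
    if th1 is None:
        return None                  # area AR3: determination impossible
    return head_size_px < th1        # smaller head -> lower head -> fall
```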
- In the above description, the digital camera is installed at the center of the ceiling CE with its shooting direction coinciding with the vertical direction, but the detection area may also be shot obliquely (tilted shooting). In that case, the table may be created by appropriately changing the shape of each determination area according to the shooting conditions (camera characteristics) and appropriately setting the threshold of each determination area.
- For example, as shown in FIG. 6, when the digital camera is installed in an upper corner of the room RM with its shooting direction obliquely downward, the first determination area AR0 is the region within a semicircle of a predetermined first radius centered on the point on the floor FL corresponding to the position directly below the center of the optical axis, and the second determination area AR1 is concentric with the first determination area AR0 and is the region beyond the first determination area AR0 but within a semicircle of a predetermined larger radius centered on that point on the floor FL.
- The third determination area AR2 is the region beyond the second determination area AR1 but within a semicircle of a still larger radius centered on that point on the floor FL; this region includes wall surfaces such as the back wall and the positions of the ceiling surface CE and of the right and left wall surfaces connected to the back wall. The fourth determination area AR3 is the region of the image beyond the third determination area AR2.
- The threshold th1 is set appropriately for each of the first to third determination areas AR0 to AR2 in consideration of the oblique shooting conditions, while the fourth determination area AR3 is excluded from determination (an area that cannot be determined), and no fall threshold th1 is set for the fourth determination area AR3.
- Each threshold th1 for the first to third determination areas AR0 to AR2 is set as follows, for example. First, a head model of statistically standard size is prepared in advance. For each of the determination areas AR0 to AR2, the size (number of pixels) that this head model of known size would have on the image captured by the digital camera when in a fallen posture is obtained, and the obtained on-image size (number of pixels) is set as the threshold th1.
- In the above description, the size of the head is used as an example, but the same applies to the height of the head. Further, in the above description, the breakdown of the proportional relationship between the size of the head and the height of the head is eliminated by having the threshold correction unit 28 correct the thresholds th1, th21, and th22; instead, however, the image of the detection area acquired by the image acquisition unit 1, the head (head image) extracted by the head extraction unit 22, or the head parameter calculated by the parameter calculation unit 231 may be corrected so as to eliminate the breakdown of that proportional relationship.
- The parameter may further include the position of the head (fifth modification). That is, in one example, the posture determination unit 23 obtains the size and the position of the head extracted by the head extraction unit 22 and determines whether the posture is the predetermined posture based on the obtained head size and position. In another example, the posture determination unit 23 obtains the height and the position of the head extracted by the head extraction unit 22 and determines whether the posture is the predetermined posture based on the obtained head height and position.
- When the posture determination unit 23 determines whether the posture is the predetermined posture, the predetermined posture may be impossible at some positions of the monitoring target and, conversely, highly likely at others. For example, when the posture determination unit 23 determines whether the monitoring target has fallen, if the monitoring target is located on the bed, then even if the determination using the threshold th1 indicates a fall, it is highly likely that the monitoring target has not fallen but is simply lying on the bed; conversely, if the monitoring target is located on the floor, a fall is highly likely.
- The position of the monitoring target can be estimated from the position of the head. Therefore, as described above, by determining whether the posture is the predetermined posture with the position of the head, that is, the position of the monitoring target, also taken into consideration in addition to the size or the height of the head, the posture determination unit 23 can determine the posture of the monitoring target more accurately.
- FIG. 7 is a diagram explaining the relationship between the image of the detection area and the determination areas for fall determination in the fifth modification. More specifically, as shown in FIG. 7, when a bed BT is placed in the room RM in the detection area, the area AD2 on the image corresponding to the bed BT is set as an area excluded from determination, while the area AD1 on the image corresponding to the floor FL is set as an area subject to determination, and these settings are stored in the storage unit 3.
- Before determining whether the posture is the predetermined posture using the size or the height of the head (or after that determination), the posture determination unit 23 refers to the storage unit 3 and checks whether the position of the head is in an area excluded from determination. Alternatively, the area AD2 on the image corresponding to the bed BT may be incorporated into the third determination area AR2 in the table shown in FIG. 4.
- In one example, the posture determination unit 23 determines whether the posture is a fall, as the predetermined posture, according to whether the position of the head extracted by the head extraction unit 22 is on the floor. When the position of the head is on the floor, a fall is highly likely; therefore, because the posture determination unit 23 makes the fall determination according to whether the head position is on the floor, such a posture detection device D can determine a fall more accurately.
- In another example, the posture determination unit 23 determines whether the posture is a fall, as the predetermined posture, according to whether the position of the head extracted by the head extraction unit 22 is on the bed. When the position of the head is on the bed, it is highly likely that the monitoring target has not fallen but is lying on the bed. Therefore, because the posture determination unit 23 makes the fall determination according to whether the head position is on the bed, such a posture detection device D can determine the posture of the monitoring target more accurately; in other words, lying on the bed can be recognized.
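- The sketch below illustrates how the head position could refine a threshold-based result in this way; the region rectangles and the return labels are illustrative assumptions, not values from the patent.

```python
# A minimal sketch of position-aware fall determination (fifth
# modification). The region rectangles are illustrative assumptions.
FLOOR_AD1 = (0, 0, 640, 480)      # area AD1: floor, as (x, y, w, h)
BED_AD2 = (400, 100, 180, 260)    # area AD2: bed

def in_region(x, y, region):
    rx, ry, rw, rh = region
    return rx <= x < rx + rw and ry <= y < ry + rh

def determine_with_position(head_x, head_y, provisional_fall):
    """Refine a threshold-based fall result using the head position."""
    if in_region(head_x, head_y, BED_AD2):
        return "lying on bed"         # low head but on the bed: not a fall
    if provisional_fall and in_region(head_x, head_y, FLOOR_AD1):
        return "fall"                 # low head on the floor: fall likely
    return "no fall"
```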
- The parameter may further include the orientation of the head (sixth modification).
- That is, in one example, the posture determination unit 23 obtains the size and the orientation of the head extracted by the head extraction unit 22 and determines whether the posture is the predetermined posture based on the obtained head size and orientation. In another example, the posture determination unit 23 obtains the height and the orientation of the head and makes the determination based on the obtained head height and orientation. In further examples, the posture determination unit 23 obtains the size (or the height), the position, and the orientation of the head and makes the determination based on all of them.
- Here, the orientation of the head can be expressed by the angle that the midline of the head (the line connecting the center position between both eyes and the lower jaw) forms with the vertical direction. When this angle is 0 degrees, the head is upright and the face is directed in the horizontal direction; a head lying on its side is a state in which the midline of the head forms an angle of about 90 degrees with the vertical direction. The orientation parameter therefore refers to the angle that the midline of the head, and thus the direction of the face, forms with the vertical direction.
- When the posture determination unit 23 determines whether the posture is the predetermined posture, the predetermined posture may be impossible for some orientations of the head of the monitoring target and, conversely, highly likely for others. For example, when the posture determination unit 23 determines whether the monitoring target has fallen, if the orientation of the head, that is, the orientation of the face determined from the orientation of the head, is front (the horizontal direction), the monitoring target is likely to be crouching rather than fallen; conversely, if the orientation of the face is sideways or upward, the monitoring target is likely to have fallen.
- Therefore, by determining whether the posture is the predetermined posture with the orientation of the head (that is, the orientation of the face) also taken into consideration, the posture determination unit 23 can determine the posture of the monitoring target more accurately.
- A known image processing technique is used to extract the orientation of the head. For example, the parameter calculation unit 231 performs template matching using head contour shapes prepared in advance as templates, or template matching using face shapes composed of facial feature points such as the eyes and the mouth prepared in advance as templates, or applies Haar-like features focusing on the facial feature points, thereby extracting the orientation of the face and obtaining the orientation of the head. The head orientation may also be obtained by the head extraction unit 22 instead of the parameter calculation unit 231.
- The posture determination unit 23 then determines whether the posture is the predetermined posture using a parameter that includes the orientation of the head. In one example, when the size of the head obtained by the parameter calculation unit 231 is not equal to or greater than the fall-determination threshold th1, the posture determination unit 23 determines that the monitoring target has not fallen if the head (face) is facing front (the horizontal direction), and determines that the monitoring target has fallen if the head is oriented sideways or upward.
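- The sketch below combines the head-size threshold with a crude orientation check, using OpenCV's stock Haar cascades as a stand-in for the Haar-like orientation extraction described above. With a ceiling camera looking straight down, a visible face implies the face is turned upward or sideways (fall likely), while a crouching person shows only the top of the head; this interpretation and all numeric values are illustrative assumptions.

```python
# A minimal sketch combining head size with face orientation (sixth
# modification). Stock Haar cascades stand in for the Haar-like
# extraction; the interpretation rules are illustrative assumptions.
import cv2

FRONTAL = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
PROFILE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_profileface.xml")
TH1_PX = 51     # fall-determination head-size threshold [pixel]

def determine_with_orientation(gray_image, head_size_px):
    """Refine the size-based result using the detected face orientation."""
    if head_size_px >= TH1_PX:
        return "no fall"              # head is still high enough
    faces = list(FRONTAL.detectMultiScale(gray_image, 1.1, 4)) + \
            list(PROFILE.detectMultiScale(gray_image, 1.1, 4))
    # A face visible from directly above -> face up or sideways -> fall.
    return "fall" if faces else "no fall (possibly crouching)"
```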
- As indicated by the broken line in FIG. 1, the posture detection device D may further include a trunk extraction unit 25 that extracts, from the image of the detection area acquired by the image acquisition unit 1, the trunk corresponding to the head extracted by the head extraction unit 22, and the parameter may further include the positional relationship between the head and the trunk.
- FIG. 8 is a diagram for explaining the positional relationship between the head and the trunk in the sixth modification.
- FIG. 8A shows a state in which the monitoring target is lying down, and FIG. 8B shows a state in which the monitoring target is crouching rather than lying down.
- As shown in FIG. 8A, if the longitudinal direction of the trunk BD and the longitudinal direction of the head HD coincide, or if the head HD is located at one end of the trunk BD, it can be determined that the body is lying down; as shown in FIG. 8B, if the head HD is positioned at the center of the trunk BD, it can be determined that the monitoring target is crouching.
- A known image processing technique is used to extract the trunk BD. For example, the trunk BD is obtained by the parameter calculation unit 231 by template matching using the outline shape of the trunk BD prepared in advance as a template; the trunk BD template may also include the contour shape of the legs. Alternatively, the trunk BD may be obtained by moving-body extraction using, for example, a background subtraction method, in which a background image is obtained and stored in advance and a moving object is extracted as the trunk BD from the difference between the acquired image and the background image.
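- As an illustration of the background subtraction described above, the sketch below extracts the largest moving blob as a trunk candidate; the binarization threshold and the minimum-area value are illustrative assumptions.

```python
# A minimal sketch of trunk extraction by background subtraction.
# The binarization threshold and minimum area are illustrative.
import cv2
import numpy as np

def extract_trunk(background_gray, frame_gray, min_area=500):
    """Return the bounding box (x, y, w, h) of the largest moving blob."""
    diff = cv2.absdiff(frame_gray, background_gray)      # difference image
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                            np.ones((5, 5), np.uint8))   # remove speckle
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not blobs:
        return None
    return cv2.boundingRect(max(blobs, key=cv2.contourArea))
```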
- In another modification, the image acquisition unit 1 acquires a plurality of images of the detection area at mutually different times, and the head extraction unit 22 extracts a head from each of the plurality of images acquired by the image acquisition unit 1. The posture determination unit 23 may then obtain the moving speed of the head as the parameter, based on the plurality of heads extracted by the head extraction unit 22, and determine whether the posture is the predetermined posture based on the obtained moving speed.
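- The sketch below computes the head's moving speed between two frames in pixels per second (converting to real-world units would require the camera geometry discussed earlier); the speed threshold and the rule that a high speed indicates a fall are illustrative assumptions.

```python
# A minimal sketch of using head moving speed as the parameter. The
# speed threshold and its interpretation are illustrative assumptions.
import math

SPEED_THRESHOLD = 300.0    # illustrative value [pixel/s]

def head_speed(prev_xy, curr_xy, dt_seconds):
    """Speed of the head between two frames, in pixels per second."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    return math.hypot(dx, dy) / dt_seconds

def is_fall_by_speed(prev_xy, curr_xy, dt_seconds):
    return head_speed(prev_xy, curr_xy, dt_seconds) > SPEED_THRESHOLD
```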
- A posture detection device according to one aspect includes an image acquisition unit that acquires an image of a predetermined detection area, a head extraction unit that extracts a head from the image of the detection area acquired by the image acquisition unit, and a posture determination unit that obtains a predetermined parameter of the head extracted by the head extraction unit and determines, based on the obtained parameter, whether the posture is a predetermined posture.
- In such a posture detection device, an image of the detection area is acquired by the image acquisition unit, a head (the area of the image showing the head; a head image) is extracted from the image of the detection area by the head extraction unit, and the posture determination unit determines a predetermined posture of the monitoring target (monitored person, person being watched over, or subject) based on a predetermined parameter of the head. Therefore, with the simpler configuration of using a single image acquisition unit, and by using a predetermined parameter relating to the head, which is difficult to shield, the posture detection device can more accurately determine the posture of the monitoring target, such as a fall.
- In another aspect, the parameter is the size of the head on the image. Such a posture detection device can estimate the height of the head from the size of the head and, based on the estimated head height, determine the posture of the monitoring target, such as standing, sitting, or fallen.
- In another aspect, the parameter is the height of the head. Since such a posture detection device uses the height of the head as the parameter, the posture of the monitoring target, such as standing, sitting, or fallen, can be determined based on the obtained head height.
- In another aspect, the parameter further includes the position of the head. Since such a posture detection device uses the position of the head, in addition to the size or the height of the head, to determine the posture, the posture of the monitoring target can be determined more accurately.
- In another aspect, the parameter further includes the orientation of the head. If the orientation of the head, that is, the orientation of the face determined from the orientation of the head, is front (the horizontal direction), the monitoring target is likely to be crouching rather than fallen; if it is sideways or upward, the monitoring target is likely to have fallen. Since such a posture detection device uses the orientation of the head (that is, the orientation of the face), in addition to the size or the height of the head, to determine the posture, the posture of the monitoring target can be determined more accurately.
- a trunk extraction unit is further provided that extracts, from the image of the detection area acquired by the image acquisition unit, the trunk corresponding to the head extracted by the head extraction unit, and the parameter further includes the positional relationship between the head and the trunk.
- the head orientation may be difficult to determine from the extracted head alone. In that case, whether the body is lying down can be determined by referring to the positional relationship between the head and the trunk (body): if the head is located at one end of the trunk, it can be judged that the body is lying down.
- the posture detection apparatus further includes a trunk extraction unit that extracts a trunk (the region of the image showing the trunk (body), i.e. an image of the trunk) from the image of the detection area, and uses the positional relationship between the head and the trunk, in addition to the size or height of the head, to determine the posture, so the posture of the monitoring target can be determined more accurately.
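One plausible realization of the head-at-one-end-of-the-trunk test compares the head centre with the ends of the trunk's longer bounding-box side; the box representation and margin are assumptions, not taken from the patent:

```python
def head_at_trunk_end(head_box, trunk_box, margin=0.25):
    """Return True if the head centre lies near one end of the trunk's longer
    side, suggesting a lying (recumbent) body. Boxes are (x, y, w, h) tuples."""
    hx = head_box[0] + head_box[2] / 2.0
    hy = head_box[1] + head_box[3] / 2.0
    x, y, w, h = trunk_box
    if w >= h:  # trunk elongated horizontally on the image
        return hx < x + margin * w or hx > x + (1.0 - margin) * w
    return hy < y + margin * h or hy > y + (1.0 - margin) * h
```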
- the posture determination unit determines whether the posture is the predetermined posture according to whether the predetermined parameter of the head extracted by the head extraction unit is greater than or equal to a predetermined threshold.
- such a posture detection device can determine whether the posture is the predetermined posture simply by checking whether the parameter is greater than or equal to the threshold value.
- the threshold is set based on the height of the standing position.
- the height of the sitting position depends on the height of the standing position, that is, on the subject's stature. By setting the threshold, based on the standing height (stature), to a value lower than the sitting head height, the posture detection device can determine whether the posture of the monitoring target is a fall.
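As a sketch, the threshold could be derived from the subject's stature so that it falls below the expected sitting head height; the 0.5 factor is an assumed value, not from the patent:

```python
def fall_threshold_from_stature(standing_height_m, factor=0.5):
    """Derive a head-height threshold below the expected sitting head height.
    A head detected below this threshold suggests a fall (factor is assumed)."""
    return standing_height_m * factor
```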
- the threshold is set based on the height of the sitting position.
- by setting the threshold, based on the height of the sitting position, to a value lower than the sitting head height, the posture detection device can determine whether the posture of the monitoring target is a fall.
- the above-described posture detection device further includes a first threshold setting unit that sets the threshold for each subject.
- a general-purpose posture detection device can be configured by setting the threshold through statistical processing over many samples, but the threshold can also be customized (optimized) for the monitoring target. Since the posture detection device further includes a first threshold setting unit, the threshold can be set per monitoring target, so the device can be customized for each person to be monitored and the posture of the monitoring target can be determined more accurately.
- the image acquisition unit acquires a plurality of images of the detection area at different times, and a second threshold setting unit is further provided that sets the threshold based on the plurality of images acquired by the image acquisition unit.
- since the second threshold setting unit sets the threshold based on a plurality of images of the detection area taken at different times, the threshold can be set automatically for each subject, with such individual circumstances taken into account automatically.
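A sketch of such automatic setting: collect the head height over many frames, take a robust estimate of the standing head height, and derive the threshold from it (the percentile and factor are assumptions):

```python
import numpy as np

def auto_threshold(observed_head_heights_m, percentile=95, factor=0.5):
    """Estimate the subject's standing head height as a high percentile of the
    observed head heights, then derive the fall threshold from that estimate."""
    standing = np.percentile(observed_head_heights_m, percentile)
    return standing * factor
```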
- the above-described posture detection device further includes a threshold correction unit that corrects the threshold.
- when the detection area is imaged with a wide-angle lens or a tilted camera, the size of the head on the image is not proportional to the actual height of the head. Since the posture detection device further includes a threshold correction unit that corrects the threshold, the threshold can be corrected appropriately for the imaging conditions, and the posture of the monitoring target can be determined more accurately.
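One possible correction scales the threshold with the head's radial distance from the image centre, compensating for the off-axis behaviour of a wide-angle ceiling camera; the linear model and coefficient here are assumptions:

```python
import math

def correct_threshold(base_threshold, head_xy, image_center_xy, coeff=0.0005):
    """Grow the threshold with radial distance from the optical axis to offset
    wide-angle distortion (a linear correction model is assumed here)."""
    r = math.dist(head_xy, image_center_xy)
    return base_threshold * (1.0 + coeff * r)
```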
- the threshold value is set to a different value for each of a plurality of determination areas obtained by dividing the detection area into a plurality of areas.
- since the threshold is set to a different value for each of the plurality of determination areas, the determination can take into account how the relationship between head size and head height changes with position on the image. It also becomes possible to make determinations that account for specific areas, such as where a bed is located.
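Per-area thresholds can be held in a simple grid lookup, as in this sketch; the grid shape, cell size, and values (e.g. a higher threshold over a bed area) are invented for illustration:

```python
class AreaThresholds:
    """Look up a threshold per determination area (a grid of cells over the image)."""

    def __init__(self, grid, cell_w, cell_h):
        self.grid = grid      # 2-D list of thresholds, one per determination area
        self.cell_w = cell_w  # cell width in pixels
        self.cell_h = cell_h  # cell height in pixels

    def threshold_at(self, x, y):
        row = min(int(y // self.cell_h), len(self.grid) - 1)
        col = min(int(x // self.cell_w), len(self.grid[0]) - 1)
        return self.grid[row][col]

# Example: a 640x480 image split 2x2, with the bed assumed in the top-left area.
thresholds = AreaThresholds([[0.8, 0.4], [0.4, 0.4]], cell_w=320, cell_h=240)
```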
- the posture determination unit determines whether the predetermined posture is a fall according to whether the position of the head extracted by the head extraction unit is on the floor.
- since the posture determination unit decides according to whether the head position is on the floor, it can judge whether the predetermined posture is a fall.
- the posture determination unit determines whether the predetermined posture is a fall according to whether the position of the head extracted by the head extraction unit is on a bed.
- since the posture determination unit decides according to whether the head position is on the bed, it can judge whether the predetermined posture is a fall; in other words, lying (recumbency) on the bed can be determined.
- the image acquisition unit acquires a plurality of images of the detection area at different times, the head extraction unit extracts a head from each of the images acquired by the image acquisition unit, and the posture determination unit obtains the moving speed of the head based on the plurality of extracted heads and determines whether the posture is the predetermined posture based on the obtained moving speed.
- a head that moves relatively fast is likely to indicate a fall. Since the posture detection apparatus uses the moving speed of the head as the parameter, it can determine a fall as the predetermined posture of the monitoring target.
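A sketch of the moving-speed parameter: track the head centre across consecutive frames and flag a fall when the speed exceeds a limit (the 1.5 m/s limit is an assumed value):

```python
def head_speed(p_prev, p_curr, dt):
    """Speed of the head centre between two frames (positions in metres, dt in seconds)."""
    dx = p_curr[0] - p_prev[0]
    dy = p_curr[1] - p_prev[1]
    return (dx * dx + dy * dy) ** 0.5 / dt

def is_fall_by_speed(p_prev, p_curr, dt, speed_limit=1.5):
    """Assumed rule: a head moving faster than speed_limit suggests a fall."""
    return head_speed(p_prev, p_curr, dt) > speed_limit
```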
- the image acquisition unit acquires a plurality of images of the detection area at different times, the head extraction unit extracts a head from each acquired image, and the posture determination unit determines, for each of the plurality of images, whether the posture is the predetermined posture based on the predetermined parameter of the extracted head. A final determination unit is further provided that finally determines whether the posture is the predetermined posture based on the plurality of determination results produced by the posture determination unit.
- since the final determination unit makes the final decision based on a plurality of determination results from the posture determination unit, a more accurate judgement can be made.
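One natural final-determination rule is a majority vote over the last N per-frame judgements, as in this sketch; the window size and vote fraction are assumptions:

```python
from collections import deque

class FinalJudge:
    """Confirm the predetermined posture only when most of the last `window`
    per-frame determinations agree (a majority-vote rule is assumed)."""

    def __init__(self, window=10, ratio=0.7):
        self.results = deque(maxlen=window)
        self.ratio = ratio

    def update(self, frame_is_predetermined_posture: bool) -> bool:
        self.results.append(frame_is_predetermined_posture)
        full = len(self.results) == self.results.maxlen
        return full and sum(self.results) / len(self.results) >= self.ratio
```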
- the image acquisition unit is a camera disposed on the ceiling that images the detection area.
- in such a posture detection device, because the camera serving as the image acquisition unit is mounted on the ceiling, the monitoring target in the image of the detection area is less likely to be shielded by furniture or fixtures placed in the room, so the posture can be determined more accurately.
- the posture detection method includes an image acquisition step of acquiring an image of a predetermined detection area, a head extraction step of extracting a head from the image of the detection area acquired in the image acquisition step, and a posture determination step of determining whether the posture is a predetermined posture based on a predetermined parameter of the head extracted in the head extraction step.
- in such a posture detection method, an image of the detection area is acquired in the image acquisition step using an image acquisition unit, a head is extracted from the image of the detection area in the head extraction step, and the predetermined posture of the monitoring target is determined in the posture determination step based on the predetermined parameter of the head. The posture detection method therefore has a simple configuration using a single image acquisition unit and, by using a predetermined parameter of the head, can more accurately determine the posture of the monitoring target, such as a fall. A posture detection device and a posture detection method that detect the posture are thus provided.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Surgery (AREA)
- Pathology (AREA)
- Medical Informatics (AREA)
- Animal Behavior & Ethology (AREA)
- Heart & Thoracic Surgery (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Physiology (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
- Alarm Systems (AREA)
- Emergency Alarm Devices (AREA)
Abstract
Description
Claims (19)
- 1. A posture detection device comprising: an image acquisition unit that acquires an image of a predetermined detection area; a head extraction unit that extracts a head from the image of the detection area acquired by the image acquisition unit; and a posture determination unit that obtains a predetermined parameter of the head extracted by the head extraction unit and determines, based on the obtained parameter, whether or not a posture is a predetermined posture.
- 2. The posture detection device according to claim 1, wherein the parameter is the size of the head on the image.
- 3. The posture detection device according to claim 1, wherein the parameter is the height of the head.
- 4. The posture detection device according to claim 2 or 3, wherein the parameter further includes the position of the head.
- 5. The posture detection device according to any one of claims 2 to 4, wherein the parameter further includes the orientation of the head.
- 6. The posture detection device according to any one of claims 2 to 4, further comprising a trunk extraction unit that extracts, from the image of the detection area acquired by the image acquisition unit, a trunk corresponding to the head extracted by the head extraction unit, wherein the parameter further includes the positional relationship between the head and the trunk.
- 7. The posture detection device according to any one of claims 1 to 6, wherein the posture determination unit determines whether or not the posture is the predetermined posture according to whether or not the predetermined parameter of the head extracted by the head extraction unit is greater than or equal to a predetermined threshold.
- 8. The posture detection device according to claim 7, wherein the threshold is set based on the height of the standing position.
- 9. The posture detection device according to claim 7, wherein the threshold is set based on the height of the sitting position.
- 10. The posture detection device according to any one of claims 7 to 9, further comprising a first threshold setting unit that sets the threshold for each subject.
- 11. The posture detection device according to any one of claims 7 to 9, wherein the image acquisition unit acquires a plurality of images of the detection area at mutually different times, the device further comprising a second threshold setting unit that sets the threshold based on the plurality of images acquired by the image acquisition unit.
- 12. The posture detection device according to any one of claims 7 to 11, further comprising a threshold correction unit that corrects the threshold.
- 13. The posture detection device according to claim 12, wherein the threshold is set to a different value for each of a plurality of determination areas obtained by dividing the detection area.
- 14. The posture detection device according to claim 4, wherein the posture determination unit determines whether or not the predetermined posture is a fall according to whether or not the position of the head extracted by the head extraction unit is on the floor.
- 15. The posture detection device according to claim 4, wherein the posture determination unit determines whether or not the predetermined posture is a fall according to whether or not the position of the head extracted by the head extraction unit is on a bed.
- 16. The posture detection device according to claim 1, wherein the image acquisition unit acquires a plurality of images of the detection area at mutually different times, the head extraction unit extracts a head from each of the plurality of images of the detection area acquired by the image acquisition unit, and the posture determination unit obtains the moving speed of the head as the parameter based on the plurality of heads extracted by the head extraction unit and determines whether or not the posture is the predetermined posture based on the obtained moving speed of the head.
- 17. The posture detection device according to any one of claims 1 to 16, wherein the image acquisition unit acquires a plurality of images of the detection area at mutually different times, the head extraction unit extracts a head from each of the plurality of images of the detection area acquired by the image acquisition unit, and the posture determination unit determines, for each of the plurality of images, whether or not the posture is a predetermined posture based on the predetermined parameter of the head extracted by the head extraction unit, the device further comprising a final determination unit that finally determines whether or not the posture is the predetermined posture based on the plurality of determination results determined by the posture determination unit.
- 18. The posture detection device according to any one of claims 1 to 17, wherein the image acquisition unit is a camera, disposed on a ceiling, that images the detection area.
- 19. A posture detection method comprising: an image acquisition step of acquiring an image of a predetermined detection area; a head extraction step of extracting a head from the image of the detection area acquired in the image acquisition step; and a posture determination step of determining whether or not a posture is a predetermined posture based on a predetermined parameter of the head extracted in the head extraction step.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/555,869 US20180174320A1 (en) | 2015-03-06 | 2016-03-02 | Posture Detection Device and Posture Detection Method |
CN201680013336.9A CN107408308A (zh) | 2015-03-06 | 2016-03-02 | 姿势检测装置以及姿势检测方法 |
JP2017505014A JP6720961B2 (ja) | 2015-03-06 | 2016-03-02 | 姿勢検知装置および姿勢検知方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-044627 | 2015-03-06 | ||
JP2015044627 | 2015-03-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016143641A1 true WO2016143641A1 (ja) | 2016-09-15 |
Family
ID=56879554
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/056496 WO2016143641A1 (ja) | 2015-03-06 | 2016-03-02 | 姿勢検知装置および姿勢検知方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20180174320A1 (ja) |
JP (1) | JP6720961B2 (ja) |
CN (1) | CN107408308A (ja) |
WO (1) | WO2016143641A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109033919A (zh) * | 2017-06-08 | 2018-12-18 | 富泰华精密电子(郑州)有限公司 | 岗位监测装置、方法和存储设备 |
CN109963539A (zh) * | 2017-03-02 | 2019-07-02 | 欧姆龙株式会社 | 看护辅助系统及其控制方法、以及程序 |
JP2020017107A (ja) * | 2018-07-26 | 2020-01-30 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP2020123239A (ja) * | 2019-01-31 | 2020-08-13 | コニカミノルタ株式会社 | 姿勢推定装置、行動推定装置、姿勢推定プログラム、および姿勢推定方法 |
WO2021033597A1 (ja) * | 2019-08-20 | 2021-02-25 | コニカミノルタ株式会社 | 画像処理システム、画像処理プログラム、および画像処理方法 |
JP2021033379A (ja) * | 2019-08-15 | 2021-03-01 | コニカミノルタ株式会社 | 画像処理システム、画像処理プログラム、および画像処理方法 |
FR3136094A1 (fr) * | 2022-05-25 | 2023-12-01 | Inetum | Procédé de détection de chute par analyse d’images |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11000078B2 (en) * | 2015-12-28 | 2021-05-11 | Xin Jin | Personal airbag device for preventing bodily injury |
IL255249A0 (en) * | 2017-10-24 | 2017-12-31 | Pointgrab Ltd | Method and system for identifying a person in an image based on location in the image |
CN108090458B (zh) * | 2017-12-29 | 2020-02-14 | 南京阿凡达机器人科技有限公司 | 人体跌倒检测方法和装置 |
CN110136381B (zh) * | 2018-02-07 | 2023-04-07 | 中国石油化工股份有限公司 | 一种钻井作业现场人员站立监测预警系统 |
CN108806190A (zh) * | 2018-06-29 | 2018-11-13 | 张洪平 | 一种隐匿式雷达跌倒报警方法 |
DE202018104996U1 (de) * | 2018-08-31 | 2019-12-04 | Tridonic Gmbh & Co Kg | Leuchtensystem zur Überwachung einer Sitzhaltung einer Person |
JP7271915B2 (ja) * | 2018-11-22 | 2023-05-12 | コニカミノルタ株式会社 | 画像処理プログラムおよび画像処理装置 |
CN109814714B (zh) * | 2019-01-21 | 2020-11-20 | 北京诺亦腾科技有限公司 | 运动传感器的安装姿态确定方法、装置以及存储介质 |
CN110290349B (zh) * | 2019-06-17 | 2022-03-08 | 苏州佳世达电通有限公司 | 灯具及侦测使用者的坐姿状态的方法 |
CN110443147B (zh) * | 2019-07-10 | 2022-03-18 | 广州市讯码通讯科技有限公司 | 一种坐姿识别方法、系统和存储介质 |
CN111345928B (zh) * | 2020-03-09 | 2022-02-25 | 腾讯科技(深圳)有限公司 | 头部姿势监测方法及装置、存储介质、电子设备 |
CN112446302B (zh) * | 2020-11-05 | 2023-09-19 | 杭州易现先进科技有限公司 | 一种人体姿态检测方法、系统、电子设备和存储介质 |
CN112446360B (zh) * | 2020-12-15 | 2024-12-17 | 作业帮教育科技(北京)有限公司 | 目标行为检测方法、装置及电子设备 |
CN112782664B (zh) * | 2021-02-22 | 2023-12-12 | 四川八维九章科技有限公司 | 一种基于毫米波雷达的卫生间跌倒检测方法 |
CN113132636B (zh) * | 2021-04-16 | 2024-04-12 | 上海天跃科技股份有限公司 | 一种具有人体形态检测的智能监控系统 |
US11837006B2 (en) * | 2021-06-30 | 2023-12-05 | Ubtech North America Research And Development Center Corp | Human posture determination method and mobile machine using the same |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000253382A (ja) * | 1999-02-25 | 2000-09-14 | Matsushita Electric Works Ltd | 転倒検知装置 |
JP2006177086A (ja) * | 2004-12-24 | 2006-07-06 | Matsushita Electric Ind Co Ltd | 入退室管理装置 |
JP2011141732A (ja) * | 2010-01-07 | 2011-07-21 | Nikon Corp | 画像判定装置 |
JP2014236896A (ja) * | 2013-06-10 | 2014-12-18 | Nkワークス株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9047751B2 (en) * | 2010-01-07 | 2015-06-02 | Nikon Corporation | Image determining device to determine the state of a subject |
WO2013011644A1 (ja) * | 2011-07-15 | 2013-01-24 | パナソニック株式会社 | 姿勢推定装置、姿勢推定方法、および姿勢推定プログラム |
CN102722715A (zh) * | 2012-05-21 | 2012-10-10 | 华南理工大学 | 一种基于人体姿势状态判决的跌倒检测方法 |
CN103577792A (zh) * | 2012-07-26 | 2014-02-12 | 北京三星通信技术研究有限公司 | 用于估计人体姿势的设备和方法 |
KR102013705B1 (ko) * | 2013-08-16 | 2019-08-23 | 한국전자통신연구원 | 승마시뮬레이터에서 사용자 자세 인식 장치 및 방법 |
2016
- 2016-03-02 WO PCT/JP2016/056496 patent/WO2016143641A1/ja active Application Filing
- 2016-03-02 US US15/555,869 patent/US20180174320A1/en not_active Abandoned
- 2016-03-02 CN CN201680013336.9A patent/CN107408308A/zh active Pending
- 2016-03-02 JP JP2017505014A patent/JP6720961B2/ja active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000253382A (ja) * | 1999-02-25 | 2000-09-14 | Matsushita Electric Works Ltd | 転倒検知装置 |
JP2006177086A (ja) * | 2004-12-24 | 2006-07-06 | Matsushita Electric Ind Co Ltd | 入退室管理装置 |
JP2011141732A (ja) * | 2010-01-07 | 2011-07-21 | Nikon Corp | 画像判定装置 |
JP2014236896A (ja) * | 2013-06-10 | 2014-12-18 | Nkワークス株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109963539B (zh) * | 2017-03-02 | 2021-12-28 | 欧姆龙株式会社 | 看护辅助系统及其控制方法、以及计算机可读取的记录介质 |
CN109963539A (zh) * | 2017-03-02 | 2019-07-02 | 欧姆龙株式会社 | 看护辅助系统及其控制方法、以及程序 |
US10786183B2 (en) | 2017-03-02 | 2020-09-29 | Omron Corporation | Monitoring assistance system, control method thereof, and program |
CN109033919A (zh) * | 2017-06-08 | 2018-12-18 | 富泰华精密电子(郑州)有限公司 | 岗位监测装置、方法和存储设备 |
JP2020017107A (ja) * | 2018-07-26 | 2020-01-30 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP7283037B2 (ja) | 2018-07-26 | 2023-05-30 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、およびプログラム |
US11574504B2 (en) | 2018-07-26 | 2023-02-07 | Sony Corporation | Information processing apparatus, information processing method, and program |
JP7196645B2 (ja) | 2019-01-31 | 2022-12-27 | コニカミノルタ株式会社 | 姿勢推定装置、行動推定装置、姿勢推定プログラム、および姿勢推定方法 |
JP2020123239A (ja) * | 2019-01-31 | 2020-08-13 | コニカミノルタ株式会社 | 姿勢推定装置、行動推定装置、姿勢推定プログラム、および姿勢推定方法 |
JP2021033379A (ja) * | 2019-08-15 | 2021-03-01 | コニカミノルタ株式会社 | 画像処理システム、画像処理プログラム、および画像処理方法 |
JP7500929B2 (ja) | 2019-08-15 | 2024-06-18 | コニカミノルタ株式会社 | 画像処理システム、画像処理プログラム、および画像処理方法 |
JPWO2021033597A1 (ja) * | 2019-08-20 | 2021-02-25 | ||
WO2021033597A1 (ja) * | 2019-08-20 | 2021-02-25 | コニカミノルタ株式会社 | 画像処理システム、画像処理プログラム、および画像処理方法 |
JP7388440B2 (ja) | 2019-08-20 | 2023-11-29 | コニカミノルタ株式会社 | 画像処理システム、画像処理プログラム、および画像処理方法 |
FR3136094A1 (fr) * | 2022-05-25 | 2023-12-01 | Inetum | Procédé de détection de chute par analyse d’images |
Also Published As
Publication number | Publication date |
---|---|
JPWO2016143641A1 (ja) | 2017-12-21 |
JP6720961B2 (ja) | 2020-07-08 |
CN107408308A (zh) | 2017-11-28 |
US20180174320A1 (en) | 2018-06-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016143641A1 (ja) | 姿勢検知装置および姿勢検知方法 | |
US10786183B2 (en) | Monitoring assistance system, control method thereof, and program | |
JP6150207B2 (ja) | 監視システム | |
US20180300538A1 (en) | Image processing system, image processing apparatus, image processing method, and image processing program | |
JP6984712B2 (ja) | 被監視者監視システムおよび被監視者監視システムのプログラム | |
JP6720909B2 (ja) | 行動検知装置、該方法および該プログラム、ならびに、被監視者監視装置 | |
JP6292283B2 (ja) | 行動検知装置および行動検知方法ならびに被監視者監視装置 | |
JP6822328B2 (ja) | 見守り支援システム及びその制御方法 | |
US10762761B2 (en) | Monitoring assistance system, control method thereof, and program | |
JP6870465B2 (ja) | 被監視者監視装置および該方法ならびに被監視者監視システム | |
EP2763116A1 (en) | Fall detection system and method for detecting a fall of a monitored person | |
JP6791731B2 (ja) | 姿勢判定装置及び通報システム | |
WO2017025546A1 (en) | Occupancy detection | |
WO2020241057A1 (ja) | 画像処理システム、画像処理プログラム、および画像処理方法 | |
WO2020008995A1 (ja) | 画像認識プログラム、画像認識装置、学習プログラム、および学習装置 | |
JP2021033379A (ja) | 画像処理システム、画像処理プログラム、および画像処理方法 | |
JP2022072765A (ja) | ベッド領域抽出装置、ベッド領域抽出方法、ベッド領域抽出プログラムおよび見守り支援システム | |
WO2021033597A1 (ja) | 画像処理システム、画像処理プログラム、および画像処理方法 | |
JPWO2016199506A1 (ja) | 対象物検出装置および対象物検出方法ならびに被監視者監視装置 | |
JP2021065617A (ja) | 画像処理装置および画像処理プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16761611; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2017505014; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 15555869; Country of ref document: US |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16761611; Country of ref document: EP; Kind code of ref document: A1 |