
JP5799219B2 - Object detection device - Google Patents

Object detection device

Info

Publication number
JP5799219B2
JP5799219B2 (application JP2011059699A)
Authority
JP
Japan
Prior art keywords
detection
region
background
area
update
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2011059699A
Other languages
Japanese (ja)
Other versions
JP2012194121A (en)
Inventor
健一郎 野坂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Intellectual Property Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co Ltd filed Critical Panasonic Intellectual Property Management Co Ltd
Priority to JP2011059699A priority Critical patent/JP5799219B2/en
Priority to US14/001,573 priority patent/US9189685B2/en
Priority to CN201280011379.5A priority patent/CN103415788B/en
Priority to EP12757894.6A priority patent/EP2672294A1/en
Priority to PCT/IB2012/000476 priority patent/WO2012123808A1/en
Priority to KR1020137022497A priority patent/KR20140007402A/en
Publication of JP2012194121A publication Critical patent/JP2012194121A/en
Application granted granted Critical
Publication of JP5799219B2 publication Critical patent/JP5799219B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • G01S3/783Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/174Segmentation; Edge detection involving the use of two or more images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V8/00Prospecting or detecting by optical means
    • G01V8/10Detecting, e.g. by using light barriers
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V9/00Prospecting or detecting by methods not provided for in groups G01V1/00 - G01V8/00
    • G01V9/005Prospecting or detecting by methods not provided for in groups G01V1/00 - G01V8/00 by thermal methods, e.g. after generation of heat by chemical reactions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20224Image subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geophysics (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Radiation Pyrometers (AREA)
  • Image Processing (AREA)

Description

The present invention relates to an object detection device that detects a target object by obtaining the difference between a current thermal image and a background image.

Conventionally, object detection devices have been provided that acquire a thermal image of a detection range from a sensor device and detect a heat source, such as a person, as a target object from the thermal image. A known detection method for such devices is the background subtraction method: a thermal image of the detection range captured when no target object is present is recorded in advance as a background image, and the difference between the current thermal image acquired from the sensor device and the background image is detected. The background subtraction method has the advantages of simple computation and good detection accuracy in a stable environment.

However, if an object detection device using the background subtraction method keeps using a background image that was acquired only once, erroneous detection may occur when the temperature within the detection range that does not depend on the target object (hereinafter, the "environmental temperature") changes. The background image therefore needs to be updated.

To address this, an apparatus has been proposed that has a function of updating a reference image (background image) with the original image (acquired image) when no object is detected (see, for example, Patent Document 1). Another proposed apparatus captures an input image at a time separated by a predetermined interval from the time when the previous reference image (background image) was captured, computes the average of the input image and the reference image, and uses the result as the next reference image (see, for example, Patent Document 2). In the apparatus of Patent Document 2, however, the averaging process is not executed when a target object is detected in the input image.

Furthermore, as an apparatus for identifying a human body in an image, an apparatus has been proposed that computes the difference between a background image and the current image, binarizes it with a predetermined threshold, and, for pixels at or below the threshold, rewrites the background image with the current image (see, for example, Patent Document 3). The apparatus of Patent Document 3 also has a function of rewriting the background image with the current image for pixels that are at or above the threshold (that is, pixels detected by the binarization) but are judged not to belong to a human body.

Patent Document 1: JP-A-62-240823
Patent Document 2: JP-A-10-308939
Patent Document 3: JP-A-5-118916

However, in the configurations of Patent Documents 1 and 2, the background image is not updated for an image in which a target object is detected. The background image is therefore not updated while the target object remains within the detection range, so a change in the environmental temperature cannot be followed and erroneous detection may occur.

Even in an apparatus such as that of Patent Document 3, which updates the background image for the regions of an image in which the target object (person) is not detected, the background image is not updated for the region in which the target object is detected, so a change in the environmental temperature cannot be followed there and erroneous detection may occur. For example, if the environmental temperature falls while a person stays in the same position, the person's temperature also falls with the environmental temperature, but the background image of the region occupied by the person remains at the higher temperature. Eventually the difference between the person's temperature and the background image falls below the threshold, and the object detection device may erroneously determine that no person is present (a missed detection). Conversely, if the environmental temperature rises while a person stays in the same position, the background image of the region occupied by the person remains lower than the actual environmental temperature, and even after the person leaves, the object detection device may erroneously determine that a person is still present in that region.

The present invention has been made in view of the above, and an object thereof is to provide an object detection device that is less prone to erroneous detection because it follows changes in the environmental temperature.

An object detection device according to the present invention comprises: an acquisition unit that acquires acquisition information representing a temperature distribution within a detection range; a storage unit that stores background information representing the temperature distribution within the detection range when no target object is present in the detection range; a detection unit that detects the presence or absence of the target object in the detection range based on a change of the acquisition information with respect to the background information; and an update unit that repeatedly updates the background information stored in the storage unit. The update unit divides the background information into a detection region including the region in which the target object has been detected by the detection unit and a non-detection region consisting of the remaining region. For the non-detection region, the update unit executes a first background update process that updates the background information based on the acquisition information; for the detection region, it executes a second background update process that updates the background information using a correction value obtained from the amount of temperature change caused by the first background update process in an extraction region consisting of at least a part of the non-detection region.

In this object detection device, the update unit preferably uses, as the extraction region, a part of the non-detection region that is in contact with the detection region, and obtains the correction value from the amount of temperature change in that extraction region.

In this object detection device, it is more preferable that the update unit switches, according to the degree of variation in the amount of temperature change in the non-detection region caused by the first background update process, between a first mode, in which a part of the non-detection region of the background information is used as the extraction region and the correction value is obtained from the amount of temperature change in that extraction region, and a second mode, in which the whole of the non-detection region of the background information is used as the extraction region and the correction value is obtained from the amount of temperature change in that extraction region.

According to the present invention, for the detection region, the update unit executes the second background update process, which updates the background information using the correction value obtained from the amount of temperature change in the extraction region consisting of at least a part of the non-detection region. The device therefore follows changes in the environmental temperature and is less prone to erroneous detection.

FIG. 1 is a schematic diagram showing the configuration of an object detection device according to Embodiment 1.
FIG. 2 is a flowchart showing the operation of the object detection device according to Embodiment 1.
FIGS. 3 to 5 are explanatory diagrams showing the operation of the object detection device according to Embodiment 1.
FIGS. 6 to 8 are explanatory diagrams showing the operation of an object detection device according to another configuration of Embodiment 1.
FIG. 9 is an explanatory diagram showing the operation of the object detection device according to Embodiment 2.

As shown in FIG. 1, the object detection device 1 of this embodiment includes an acquisition unit 11 that acquires information from a sensor device 2, a storage unit 12 that stores information, a detection unit 13 that performs computations such as background subtraction to detect the presence or absence of a target object 3, and an output unit 14 that outputs the detection result. Here the target object 3 is a person, but the object detection device 1 of this embodiment can detect any target object that acts as a heat source, for example a moving automobile.

The sensor device 2 has sensor elements (not shown) that are sensitive to infrared radiation, such as thermopile elements, arranged two-dimensionally in a matrix, and continuously outputs information representing the temperature distribution within its detection range (sensing range) as a two-dimensional thermal image (temperature distribution image). Each sensor element corresponds to one pixel of the thermal image; in the following, the Celsius temperature of the area seen by each sensor element is treated as the pixel value of the corresponding pixel of the thermal image. In the example of FIG. 1, the object detection device 1 is separate from the sensor device 2, but the configuration is not limited to this; the sensor device 2 may be integrated into the object detection device 1.

The object detection device 1 of this embodiment is a computer, and the functions of the acquisition unit 11, the detection unit 13, and an update unit 15 described later are realized by causing the computer to execute a predetermined program. The output unit 14 is a device, such as a monitor or a printer, that presents the detection result of the detection unit 13 to the user.

The object detection device 1 acquires, with the acquisition unit 11, acquisition information representing the temperature distribution within the detection range from the sensor device 2, and the detection unit 13 detects the presence or absence of the target object 3 in the detection range based on the change of the acquisition information with respect to the background information stored in the storage unit 12. That is, the object detection device 1 records in advance in the storage unit 12, as a background image (background information), a thermal image representing the temperature distribution of the detection range at a time when the target object 3 is not present. It then periodically acquires from the sensor device 2 an acquired image (acquisition information), which is a thermal image representing the current temperature distribution of the detection range, and the detection unit 13 detects the target object 3 by the background subtraction method, that is, by detecting the difference between the acquired image and the background image.

However, if a background image acquired only once were used unchanged, the object detection device 1 could produce erroneous detections when the temperature within the detection range that does not depend on the target object 3 (hereinafter, the "environmental temperature") changes, for example with the time of day or with heating and cooling.

The object detection device 1 of this embodiment therefore includes an update unit 15 that repeatedly updates the background image stored in the storage unit 12. Here, the update unit 15 updates the background image every time the detection unit 13 performs the detection process that compares the acquired image with the background image, but this is not a limitation; for example, the background image may be updated once every several detection processes, or at predetermined times.

The operation of the object detection device 1 is described below with reference to the flowchart of FIG. 2, focusing on the detection of the target object 3 by the detection unit 13 and the update of the background image by the update unit 15. FIGS. 3 to 5 illustrate the processing of the object detection device 1 using 5×5-pixel thermal images as an example. In the following, the pixel value of each pixel of an image is identified by the coordinate position (x, y) of the pixel and the time t.

First, the object detection device 1 acquires, with the acquisition unit 11, a thermal image output from the sensor device 2 in a state in which the target object 3 is not present, and stores the pixel values of this thermal image in the storage unit 12 in advance as the initial values of the background data Tb(x, y, t) (S1). The background data Tb(x, y, t) are the pixel values of the pixels constituting the background image Tb.

The object detection device 1 then acquires the current temperature distribution data T(x, y, t) from the sensor device 2 with the acquisition unit 11 and stores them in the storage unit 12 (S2). The temperature distribution data T(x, y, t) are the pixel values of the pixels constituting the acquired image T.

As illustrated in FIG. 3, the detection unit 13 calculates the difference between the background data Tb(x, y, t) stored in the storage unit 12 and the temperature distribution data T(x, y, t) as the difference data Td(x, y, t) (S3); that is, the difference data Td(x, y, t) are expressed by Equation 1. The difference data Td(x, y, t) are the pixel values of the pixels constituting the difference image Td. The detection unit 13 performs the calculation of Equation 1 pixel by pixel for all pixels of the acquired image T.

(Equation 1)  Td(x, y, t) = T(x, y, t) - Tb(x, y, t)
The detection unit 13 then binarizes the difference data Td(x, y, t) with a predetermined threshold (S4). As illustrated in FIG. 3, pixels whose difference value is equal to or greater than the threshold are set as "detection pixels" in which the target object 3 has been detected (S5), and pixels whose difference value is smaller than the threshold are set as "non-detection pixels" in which the target object 3 has not been detected (S6), thereby generating a detection image I. In the examples of FIGS. 3 to 5, the hatched portion represents the detection region D1, the set of detection pixels, that is, the region in which the target object 3 has been detected, and the remaining portion represents the non-detection region D0, the set of non-detection pixels, that is, the region in which the target object 3 has not been detected. The pixel value I(x, y, t) of each pixel of the detection image I is expressed by Equation 2 using the threshold C.

(Equation 2)  I(x, y, t) = 1 if Td(x, y, t) >= C; I(x, y, t) = 0 otherwise

That is, a pixel (x, y) with I(x, y, t) = 1 is a detection pixel, and a pixel (x, y) with I(x, y, t) = 0 is a non-detection pixel. The detection image I is output from the output unit 14 as the detection result of the detection unit 13.
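As an illustration of steps S3 to S6, the per-pixel operations can be written as simple array arithmetic. The following is a minimal sketch, not the patent's implementation; the function name, the use of NumPy, and the example threshold value are assumptions.

```python
import numpy as np

def detect(T, Tb, C=2.0):
    """Background subtraction (S3) and binarization (S4-S6).

    T  : acquired image, 2-D array of temperatures in degrees Celsius
    Tb : background image of the same shape
    C  : detection threshold (the value 2.0 is an arbitrary example)

    Returns the difference image Td and the binary detection image I,
    where I == 1 marks detection pixels (detection region D1) and
    I == 0 marks non-detection pixels (non-detection region D0).
    """
    Td = T - Tb                      # Equation 1
    I = (Td >= C).astype(np.uint8)   # Equation 2: 1 where Td >= C, else 0
    return Td, I
```

The detection image I produced here is what the output unit 14 reports and what the update steps below operate on.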

Next, the object detection device 1 performs, with the update unit 15, the first background update process, which updates the background data Tb(x, y, t) for the non-detection region D0 consisting of the set of non-detection pixels (S7). The update unit 15 performs the first background update process by correcting the background data Tb(x, y, t) stored in the storage unit 12 based on the latest temperature distribution data T(x, y, t) for all pixels of the non-detection region D0. Specifically, as illustrated in FIG. 4, the update unit 15 calculates a weighted average of the background data Tb(x, y, t) stored in the storage unit 12 and the current temperature distribution data T(x, y, t). That is, the new background data Tb(x, y, t+1) of the non-detection region D0 are expressed by Equation 3 using the weighting parameter α. In the example of FIG. 4, α = 0.1. The update unit 15 performs the calculation of Equation 3 pixel by pixel for all pixels of the non-detection region D0.

(Equation 3)  Tb(x, y, t+1) = α·T(x, y, t) + (1 - α)·Tb(x, y, t)
The update unit 15 uses the new background data Tb(x, y, t+1) of the non-detection region D0 obtained by the first background update process to obtain the correction value used in the second background update process described later. The correction value is obtained from the amount of temperature change produced by the first background update process in an extraction region consisting of at least a part of the non-detection region D0 of the background image Tb; in this embodiment, the whole of the non-detection region D0 is used as the extraction region. The update unit 15 first obtains the update value ΔTb(x, y, t) expressed by Equation 4, pixel by pixel, for all pixels of the non-detection region D0, as illustrated in FIG. 5(a) (S8). The update value ΔTb(x, y, t) represents the amount of change of the pixel value of the background image Tb produced by the first background update process.

(Equation 4)  ΔTb(x, y, t) = Tb(x, y, t+1) - Tb(x, y, t)
Subsequently, the update unit 15 obtains, as the correction value, a representative value of the update values ΔTb(x, y, t) over all pixels of the non-detection region D0, which is the extraction region (S9). In this embodiment, the representative value used as the correction value is the average value ΔTav expressed by Equation 5; the representative value is not limited to the average, however, and may be, for example, the median or the mode. In Equation 5, the number of elements (pixels) belonging to the non-detection region D0 is written as |D0|.

(Equation 5)  ΔTav = (1 / |D0|) · Σ_{(x, y) ∈ D0} ΔTb(x, y, t)
In the example of FIG. 5, the non-detection region D0 consists of 19 pixels, 12 of which have the update value ΔTb(x, y, t) = 0.1, so the average value ΔTav is 1.2/19 ≈ 0.06.
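Steps S7 to S9 can be sketched in the same style: a weighted-average update restricted to D0, followed by the mean update amount over the extraction region (the whole of D0 in this embodiment). The helper name is hypothetical, and α = 0.1 is taken from the example of FIG. 4.

```python
import numpy as np

def first_update_and_correction(T, Tb, I, alpha=0.1):
    """First background update (S7) on D0 and correction value (S8, S9)."""
    D0 = (I == 0)                                         # non-detection region
    Tb_new = Tb.copy()
    Tb_new[D0] = alpha * T[D0] + (1.0 - alpha) * Tb[D0]   # Equation 3
    dTb = Tb_new - Tb                                     # Equation 4 (zero outside D0)
    dT_av = dTb[D0].mean() if D0.any() else 0.0           # Equation 5
    return Tb_new, dTb, dT_av
```

Evaluated on the 5×5 example of FIG. 5 (19 pixels in D0, 12 of them changed by 0.1), dT_av reproduces the value of about 0.06 given above.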

The object detection device 1 then performs, with the update unit 15, the second background update process, which updates the background data Tb(x, y, t) for the detection region D1 consisting of the set of detection pixels (S10). The update unit 15 performs the second background update process by correcting the background data Tb(x, y, t) stored in the storage unit 12, for all pixels of the detection region D1, using the correction value (average value ΔTav) obtained in steps S8 and S9. Specifically, as illustrated in FIG. 5(b), the update unit 15 adds the correction value to the current background data Tb(x, y, t) stored in the storage unit 12. That is, the new background data Tb(x, y, t+1) of the detection region D1 are expressed by Equation 6. The update unit 15 performs the calculation of Equation 6 pixel by pixel for all pixels of the detection region D1.

(Equation 6)  Tb(x, y, t+1) = Tb(x, y, t) + ΔTav
In short, for the background data Tb(x, y, t), the update unit 15 performs the first background update process of Equation 3 for the non-detection region D0 and the second background update process of Equation 6 for the detection region D1. The update unit 15 rewrites the background image Tb in the storage unit 12 with the new background data Tb(x, y, t+1) obtained in this way. As a result, the storage unit 12 stores, as the background image Tb, a background image Tb(t+1) consisting of new background data Tb(x, y, t+1) whose values have been updated for all pixels (all pixels of both the non-detection region D0 and the detection region D1). If no target object 3 is present within the detection range of the sensor device 2, the difference image Td contains no detection region D1 and the whole image is the non-detection region D0, so the entire background image Tb is updated by the first background update process alone.
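The second background update (S10) then adds the correction value to the detection-region pixels, and the result is written back as the background for the next frame. A sketch continuing the previous one; it reuses the hypothetical first_update_and_correction helper defined above.

```python
def update_background(T, Tb, I, alpha=0.1):
    """One full update cycle: Equation 3 on D0, then Equation 6 on D1."""
    Tb_new, _dTb, dT_av = first_update_and_correction(T, Tb, I, alpha)
    D1 = (I == 1)
    Tb_new[D1] = Tb[D1] + dT_av      # Equation 6
    return Tb_new                    # stored in the storage unit as Tb(t+1)
```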

Thereafter, the object detection device 1 returns to step S2 and repeatedly executes steps S2 to S10 using the updated background image Tb.

According to the object detection device 1 of this embodiment described above, even when the environmental temperature changes while the target object 3 remains at the same location, the update unit 15 can update the background image Tb for the detection region D1 in which the target object 3 has been detected, following the change in the environmental temperature. Therefore, even when the target object 3 remains at the same location for a long time, the background image Tb is correctly updated in response to the change in the environmental temperature, and the object detection device 1 can correctly perform subsequent detections of the target object 3 using the updated background image Tb.

In short, the update unit 15 updates the background image Tb not only in the non-detection region D0, where the target object 3 was not detected, but also in the detection region D1, where the target object 3 was detected, so the entire background image Tb can follow changes in the environmental temperature. For example, even if the environmental temperature falls while a person remains at the same position, the background image Tb of the region occupied by the person is also updated according to the change, so the object detection device 1 avoids erroneously determining that no person is present (a missed detection). Likewise, if the environmental temperature rises while a person remains at the same position, the object detection device 1 avoids erroneously determining, after the person leaves the region, that a person is still present there.

As a result, the object detection device 1 of this embodiment has the advantage of being less prone to erroneous detection, by following changes in the environmental temperature, than a configuration in which the detection region D1 of the background image Tb is not updated.

In the embodiment above, the detection region D1 is exactly the set of pixels whose difference data are at or above the threshold (detection pixels), but this is not a limitation; it is desirable to enlarge the detection region D1 appropriately. That is, the detection region D1 only needs to include at least the entire region in which the target object 3 was detected by the detection unit 13, and is not limited to that region itself. Similarly, the non-detection region D0 is the region other than the detection region D1 and is not limited to the region in which the target object 3 was not detected. Enlarging the detection region D1 resolves the following problem.

When the pixel resolution of the sensor device 2 is coarse, a pixel in which a part occupied by the target object 3 and a part not occupied by it are mixed outputs an averaged, lower pixel value (temperature), so its difference data may not reach the threshold and the pixel may be included in the non-detection region D0. Because the target object 3 is actually present in this pixel, its value, though below the threshold, is higher than the true environmental temperature. In the embodiment above, the update unit 15 would then update the background data of this pixel to a temperature higher than the true environmental temperature, which can cause erroneous detection.

If the detection region D1 is enlarged, pixels in which parts with and without the target object 3 are mixed are also included in the detection region D1, so the background data are not updated with the values of these pixels. This prevents the update unit 15 from updating the background data to a temperature higher than the true environmental temperature.

Specifically, the enlargement of the detection region D1 is a dilation applied to the pixels with pixel value I(x, y, t) = 1 of the detection image I generated in steps S4 to S6. As illustrated in FIG. 6, the detection region D1 of the difference image Td is thereby expanded beyond the set of pixels whose difference data Td(x, y, t) are at or above the threshold (detection pixels) to include their surroundings. The dilation used here is a standard image-processing operation that expands the boundary of a connected block of pixels having the same value outward by one pixel. By performing this dilation, pixels in which parts with and without the target object 3 are mixed can be included in the detection region D1.

After enlarging the detection region D1, the object detection device 1 executes the first and second background update processes (steps S7 to S10) with the region other than the enlarged detection region D1 as the non-detection region D0, as illustrated in FIGS. 7 and 8. In the examples of FIGS. 6 to 8, the hatched portion represents the detection region D1 and the remaining portion represents the non-detection region D0.
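Because the enlargement described above is an ordinary binary dilation, any image-processing library can provide it. The sketch below uses SciPy for illustration; the choice of a 4-connected structuring element is an assumption, not something the patent specifies.

```python
import numpy as np
from scipy import ndimage

def expand_detection_region(I, n=1):
    """Dilate the binary detection image I by n pixels before forming D1/D0."""
    structure = ndimage.generate_binary_structure(2, 1)   # 4-connectivity (assumed)
    D1 = ndimage.binary_dilation(I.astype(bool), structure=structure, iterations=n)
    D0 = ~D1
    return D1, D0
```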

(Embodiment 2)
The object detection device 1 of this embodiment differs from the object detection device 1 of Embodiment 1 in how the extraction region used to obtain the correction value for the second background update process is set. In the following, components common to Embodiment 1 are given the same reference numerals and their description is omitted as appropriate.

In the object detection device 1 of Embodiment 1, the whole of the non-detection region D0 is the extraction region, and the update unit 15 uses as the correction value for the second background update process a representative value of the update values ΔTb(x, y, t) over all pixels of D0. When the detection range of the sensor device 2 is wide and the change in the environmental temperature is uneven within it, this can cause the following problem. For example, when a heat source other than the target object 3, such as a stove, exists at a position away from the target object (person) 3, the temperature around that heat source, which has nothing to do with the true environmental temperature of the detection region D1, can influence the correction value for the second background update process. The background data of the detection region D1 may then be updated to a temperature higher than the true environmental temperature.

In this embodiment, by contrast, the update unit 15 uses as the extraction region only the part of the non-detection region D0 that is in contact with the detection region D1. That is, the update unit 15 calculates, as the correction value, a representative value (for example, the average value ΔTav) of the update values ΔTb(x, y, t) of the pixels of D0 adjacent to the detection region D1, and performs the second background update process with this correction value. Specifically, the update unit 15 applies the dilation to the detection image I n times (n = 1, 2, ...), takes the difference between the dilated region and the detection region D1 before dilation as the extraction region, and obtains the representative value (correction value) of ΔTb(x, y, t) only over the pixels of this extraction region. Here n is a parameter that determines the width of the extraction region and may be set according to the detection range and pixel resolution of the sensor device 2.
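A sketch of this extraction-region construction, again using SciPy's dilation for illustration: dilate the detection region n times and keep only the band gained by the dilation. The function names and the default n are assumptions.

```python
import numpy as np
from scipy import ndimage

def ring_extraction_region(I, n=2):
    """Embodiment 2 extraction region: an n-pixel band of D0 around D1."""
    D1 = I.astype(bool)
    dilated = ndimage.binary_dilation(D1, iterations=n)
    return dilated & ~D1                     # pixels of D0 adjacent to D1

def correction_from_ring(dTb, I, n=2):
    """Correction value: mean first-update change over the band only."""
    ring = ring_extraction_region(I, n)
    return dTb[ring].mean() if ring.any() else 0.0
```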

According to the object detection device 1 of this embodiment, only the part of the non-detection region D0 in contact with the detection region D1 is the extraction region, so an appropriate correction value for the second background update process is calculated even when the change in the environmental temperature is uneven within the detection range. That is, the second background update process using this correction value updates the background data of the detection region D1, where the target object 3 was detected, using the change in the environmental temperature in the vicinity of the target object 3, so the background data are corrected to a more plausible value. Therefore, even when the change in the environmental temperature is uneven within the detection range, the background data of the detection region D1 are prevented from being updated to a temperature higher than the true environmental temperature in the second background update process. This configuration is particularly useful when the change in the environmental temperature tends to be uneven within the detection range, for example when the detection range of the sensor device 2 is wide.

Further, as illustrated in FIG. 9, when there are several mutually connected blocks of detection pixels, the update unit 15 may temporarily generate, for each block, a detection image Im (m = 1, 2, ...) containing only that block, and set an extraction region individually for each detection image Im. That is, the update unit 15 generates as many detection images Im as there are blocks of detection pixels, applies the dilation n times (twice in FIG. 9) to each detection image Im individually, and takes the difference between the dilated region and the pre-dilation detection regions D11, D12 as the extraction regions D13, D14. In FIG. 9, the detection regions D11, D12, which are the blocks of detection pixels, and the extraction regions D13, D14, over which the correction values are calculated, are distinguished by the direction of the hatching.

In this case, the update unit 15 calculates a correction value for each of the extraction regions D13 and D14 and updates the background data of the corresponding detection regions D11 and D12 with the respective correction values. In the example of FIG. 9, the correction value calculated from the extraction region D13 of the detection image I1 is used to update the background data of the detection region D11 of I1, and the correction value calculated from the extraction region D14 of the detection image I2 is used to update the background data of the detection region D12 of I2.
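When several separate blocks of detection pixels are present, the same construction can be repeated per connected component, as described above. A sketch using SciPy's labelling; the connectivity, n, and helper names are assumptions.

```python
import numpy as np
from scipy import ndimage

def per_blob_second_update(Tb_old, Tb_new, dTb, I, n=2):
    """Equation 6 blob by blob: each detection blob gets its own band correction.

    Tb_old : background before this cycle
    Tb_new : background after the first update on D0 (modified in place for D1)
    dTb    : per-pixel change from the first update
    """
    labels, num = ndimage.label(I)               # blobs D11, D12, ... of detection pixels
    for m in range(1, num + 1):
        blob = (labels == m)
        ring = ndimage.binary_dilation(blob, iterations=n) & ~blob & (I == 0)
        dT_av = dTb[ring].mean() if ring.any() else 0.0
        Tb_new[blob] = Tb_old[blob] + dT_av      # correction from this blob's own band
    return Tb_new
```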

In this embodiment, the extraction region is determined simply by the dilation operation of common image-processing practice, but depending on the placement of the sensor device 2, a pixel adjacent to a detection pixel may not correspond to the vicinity of the target object 3 in real space. When such a placement is known in advance, for example when the sensor device 2 is directed obliquely downward, the update unit 15 may use as the extraction region only the pixels surrounding the foot region, that is, the part of the detection pixels at the contact between the target object 3 and the floor surface.

The other configurations and functions are the same as those of Embodiment 1.

(Embodiment 3)
The object detection device 1 of this embodiment differs from the object detection device 1 of Embodiment 2 in how the extraction region used to obtain the correction value for the second background update process is set. In the following, components common to Embodiment 2 are given the same reference numerals and their description is omitted as appropriate.

In this embodiment, the update unit 15 can switch between two operation modes: a first mode, in which only the part of the non-detection region D0 in contact with the detection region D1 is the extraction region, and a second mode, in which the whole of the non-detection region D0 is the extraction region. The update unit 15 switches between the first and second modes according to the degree of variation in the amount of temperature change of the non-detection region D0 produced by the first background update process.

Specifically, the update unit 15 judges whether the update values ΔTb(x, y, t) calculated in the first background update process are roughly uniform over all pixels of the non-detection region D0; if they are not uniform it operates in the first mode, and if they are uniform it operates in the second mode. For example, the update unit 15 compares the difference between the maximum and minimum of ΔTb(x, y, t) in D0 with a predetermined threshold, and judges the values to be roughly uniform if the difference is below the threshold and non-uniform if it is at or above the threshold. Alternatively, the update unit 15 may, for example, compute the standard deviation of ΔTb(x, y, t) and judge uniformity by whether this value falls within a predetermined threshold range.
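A sketch of this mode selection, using the max-minus-min spread mentioned above; the threshold value, n, and helper name are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def choose_extraction_region(dTb, I, spread_threshold=0.05, n=2):
    """Embodiment 3: select the extraction region from the uniformity of dTb on D0."""
    D0 = (I == 0)
    spread = dTb[D0].max() - dTb[D0].min()        # degree of variation on D0
    if spread >= spread_threshold:                # not uniform -> first mode
        D1 = I.astype(bool)
        return ndimage.binary_dilation(D1, iterations=n) & ~D1
    return D0                                     # roughly uniform -> second mode
```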

According to the object detection device 1 of this embodiment, the extraction region over which the correction value is calculated changes according to the degree of variation in the amount of temperature change of the non-detection region D0, so the update unit 15 can calculate the correction value by the most suitable method.

That is, the first mode, in which a part of the region in contact with the detection region D1 is the extraction region, is effective when the change in the environmental temperature is uneven within the detection range, but it may cause scatter in the correction value because fewer pixels enter the calculation, and it increases the computational load because of the dilation. The second mode, in which the whole of the non-detection region D0 is the extraction region, is not suited to an uneven change in the environmental temperature within the detection range, but it has the advantage of avoiding both the scatter caused by the reduced number of pixels and the extra computational load of the dilation. The update unit 15 of this embodiment switches between these two modes according to the degree of variation of the temperature change in the non-detection region D0, so the correction value can always be calculated by the most suitable method.

The region that the update unit 15 uses as the extraction region in the first mode is not limited to the part of the non-detection region D0 in contact with the detection region D1; it may be any part of the non-detection region D0.

The other configurations and functions are the same as those of Embodiment 2.

Description of Symbols
1 object detection device
2 sensor device
3 target object
11 acquisition unit
12 storage unit
13 detection unit
15 update unit

Claims (3)

1. An object detection device comprising: an acquisition unit configured to acquire acquisition information representing a temperature distribution within a detection range; a storage unit configured to store background information representing the temperature distribution within the detection range when no target object is present in the detection range; a detection unit configured to detect the presence or absence of the target object in the detection range based on a change of the acquisition information with respect to the background information; and an update unit configured to repeatedly update the background information stored in the storage unit, wherein the update unit divides the background information into a detection region including a region in which the target object has been detected by the detection unit and a non-detection region consisting of the region other than the detection region, and wherein, for the non-detection region, the update unit executes a first background update process of updating the background information based on the acquisition information, and, for the detection region, the update unit executes a second background update process of updating the background information using a correction value obtained from the amount of temperature change caused by the first background update process in an extraction region consisting of at least a part of the non-detection region.

2. The object detection device according to claim 1, wherein the update unit uses, as the extraction region, a part of the non-detection region that is in contact with the detection region, and obtains the correction value from the amount of temperature change in that extraction region.

3. The object detection device according to claim 1 or 2, wherein the update unit switches, according to the degree of variation in the amount of temperature change in the non-detection region caused by the first background update process, between a first mode in which a part of the non-detection region of the background information is used as the extraction region and the correction value is obtained from the amount of temperature change in that extraction region, and a second mode in which the whole of the non-detection region of the background information is used as the extraction region and the correction value is obtained from the amount of temperature change in that extraction region.
JP2011059699A 2011-03-17 2011-03-17 Object detection device Active JP5799219B2 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2011059699A JP5799219B2 (en) 2011-03-17 2011-03-17 Object detection device
US14/001,573 US9189685B2 (en) 2011-03-17 2012-03-14 Object detection device
CN201280011379.5A CN103415788B (en) 2011-03-17 2012-03-14 Object detection device
EP12757894.6A EP2672294A1 (en) 2011-03-17 2012-03-14 Object detection device
PCT/IB2012/000476 WO2012123808A1 (en) 2011-03-17 2012-03-14 Object detection device
KR1020137022497A KR20140007402A (en) 2011-03-17 2012-03-14 Object detection device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2011059699A JP5799219B2 (en) 2011-03-17 2011-03-17 Object detection device

Publications (2)

Publication Number Publication Date
JP2012194121A JP2012194121A (en) 2012-10-11
JP5799219B2 true JP5799219B2 (en) 2015-10-21

Family

ID=46830097

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011059699A Active JP5799219B2 (en) 2011-03-17 2011-03-17 Object detection device

Country Status (6)

Country Link
US (1) US9189685B2 (en)
EP (1) EP2672294A1 (en)
JP (1) JP5799219B2 (en)
KR (1) KR20140007402A (en)
CN (1) CN103415788B (en)
WO (1) WO2012123808A1 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103575323B (en) * 2012-07-30 2016-03-30 日电(中国)有限公司 Take detection method and device
CN105339742B (en) * 2014-02-17 2019-04-30 松下电器产业株式会社 air conditioner
JP6376804B2 (en) 2014-04-01 2018-08-22 キヤノン株式会社 Image forming apparatus, image forming apparatus control method, and program
JP2015216482A (en) * 2014-05-09 2015-12-03 キヤノン株式会社 Imaging control method and imaging apparatus
JP6415178B2 (en) 2014-08-19 2018-10-31 キヤノン株式会社 Printing apparatus and data updating method
JP6374757B2 (en) * 2014-10-21 2018-08-15 アズビル株式会社 Human detection system and method
TWI554747B (en) 2014-12-04 2016-10-21 台達電子工業股份有限公司 Human detection system and human detection method
CN105423494B (en) * 2015-12-11 2018-05-01 四川长虹电器股份有限公司 A kind of bearing calibration and air-conditioning equipment
CN107233082B (en) * 2016-03-29 2020-04-21 广州斯摩莱信息科技有限公司 Infrared thermal imaging detection system
JP6654091B2 (en) * 2016-04-19 2020-02-26 アズビル株式会社 Monitoring device, monitoring method, and program
CN112105964B (en) * 2018-05-16 2024-07-23 松下知识产权经营株式会社 Sensor system, air conditioning system, object detection method, and recording medium
KR20190138325A (en) * 2018-06-05 2019-12-13 휴렛-팩커드 디벨롭먼트 컴퍼니, 엘.피. Image forming apparatus to detect user and method for controlling thereof
US11332654B2 (en) 2020-02-14 2022-05-17 Halliburton Energy Services, Inc. Well bore spacer and efficiency fluids comprising geopolymers
US11162015B2 (en) 2020-02-14 2021-11-02 Halliburton Energy Services, Inc. Geopolymer formulations for mitigating losses
US11242479B2 (en) * 2020-02-14 2022-02-08 Halliburton Energy Services, Inc. Geopolymer cement for use in subterranean operations
JP7418652B2 (en) * 2021-02-19 2024-01-19 三菱電機株式会社 Human detection device, electrical equipment, human detection method, and human detection system
CN113743222B (en) * 2021-08-04 2025-03-21 赵华 Body temperature measurement method, device, electronic device and readable storage medium

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62240823A (en) 1986-04-14 1987-10-21 Matsushita Electric Works Ltd Image sensor for monitor control
JPH05118916A (en) 1991-10-25 1993-05-14 Matsushita Electric Ind Co Ltd Human body recognition apparatus
JP3453870B2 (en) * 1994-09-21 2003-10-06 松下電器産業株式会社 Image processing device and applied equipment using the image processing device
DE69616191T2 (en) 1995-07-19 2002-03-14 Matsushita Electric Industrial Co., Ltd. Movement pattern recognition device for determining the movement of people and for counting people passing by
JPH0962822A (en) * 1995-08-28 1997-03-07 Matsushita Electric Ind Co Ltd Human body movement detection device and detection device for number of passing peoples
JPH0933662A (en) * 1995-07-21 1997-02-07 Murata Mfg Co Ltd Detecting equipment of object generating infrared rays
JPH10308939A (en) 1997-05-08 1998-11-17 Nec Corp Infrared monitoring system
JP2002148354A (en) * 2000-11-07 2002-05-22 Matsushita Electric Ind Co Ltd Human body detector
US7415164B2 (en) * 2005-01-05 2008-08-19 Mitsubishi Electric Research Laboratories, Inc. Modeling scenes in videos using spectral similarity
JP5103767B2 (en) * 2006-03-27 2012-12-19 日産自動車株式会社 Temperature detection device
US7693331B2 (en) * 2006-08-30 2010-04-06 Mitsubishi Electric Research Laboratories, Inc. Object segmentation using visible and infrared images
CN101169891A (en) * 2006-11-30 2008-04-30 中国科学院长春光学精密机械与物理研究所 Intelligent anti-theft monitor
US8300890B1 (en) * 2007-01-29 2012-10-30 Intellivision Technologies Corporation Person/object image and screening
JP4894002B2 (en) * 2007-03-12 2012-03-07 サクサ株式会社 Moving body detection device and moving body detection system
US8446468B1 (en) * 2007-06-19 2013-05-21 University Of Southern California Moving object detection using a mobile infrared camera
CN101702035B (en) * 2009-02-19 2012-10-03 黄程云 Digital quasi-static passive human body detector
US8934020B2 (en) * 2011-12-22 2015-01-13 Pelco, Inc. Integrated video quantization
JP5948983B2 (en) * 2012-03-09 2016-07-06 オムロン株式会社 Image processing apparatus, image processing method, and image processing program

Also Published As

Publication number Publication date
JP2012194121A (en) 2012-10-11
US20130329959A1 (en) 2013-12-12
WO2012123808A1 (en) 2012-09-20
CN103415788A (en) 2013-11-27
CN103415788B (en) 2017-02-15
EP2672294A1 (en) 2013-12-11
KR20140007402A (en) 2014-01-17
US9189685B2 (en) 2015-11-17

Similar Documents

Publication Publication Date Title
JP5799219B2 (en) Object detection device
JP6554169B2 (en) Object recognition device and object recognition system
JP5576937B2 (en) Vehicle periphery monitoring device
JP2018181333A5 (en)
JP5756709B2 (en) Height estimation device, height estimation method, and height estimation program
JP5938631B2 (en) Object detection apparatus and object detection method
JP2013074461A5 (en)
US8942478B2 (en) Information processing apparatus, processing method therefor, and non-transitory computer-readable storage medium
KR101712136B1 (en) Method and apparatus for detecting a fainting situation of an object by using thermal image camera
JP6543790B2 (en) Signal processing device, input device, signal processing method, and program
TWI556132B (en) Optical pointing system
JP2021052238A (en) Deposit detection device and deposit detection method
JPH11224389A (en) Flame detection method, fire detection method and fire detection device
JP4533836B2 (en) Fluctuating region detection apparatus and method
JP2012226595A (en) Gesture recognition device
JP6011173B2 (en) Pupil detection device and pupil detection method
JP2010286995A (en) Image processing system for vehicle
JP2018107665A (en) Imaging apparatus, dirt detection system, and dirt detection method
JPWO2020261568A5 (en)
US11069046B2 (en) Efficient smoke detection based on video data processing
JP6527183B2 (en) Leftover object detection device
CN112424849B (en) Information processing apparatus, information processing method, and recording medium
JP2016003964A (en) Tire state evaluation system and tire state evaluation method
US11039096B2 (en) Image processing device, image processing method and storage medium
JP4828265B2 (en) Image sensor

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20140210

A711 Notification of change in applicant

Free format text: JAPANESE INTERMEDIATE CODE: A711

Effective date: 20141008

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20150303

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20150327

R151 Written notification of patent or utility model registration

Ref document number: 5799219

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151