
CN113942905B - Elevator user detection system - Google Patents

Elevator user detection system

Info

Publication number
CN113942905B
CN113942905B (application CN202110556537.6A)
Authority
CN
China
Prior art keywords
camera
user
exposure
car
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110556537.6A
Other languages
Chinese (zh)
Other versions
CN113942905A (en)
Inventor
木村纱由美
田村聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Elevator and Building Systems Corp
Original Assignee
Toshiba Elevator Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Elevator Co Ltd filed Critical Toshiba Elevator Co Ltd
Publication of CN113942905A publication Critical patent/CN113942905A/en
Application granted granted Critical
Publication of CN113942905B publication Critical patent/CN113942905B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66B ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B5/00 Applications of checking, fault-correcting, or safety devices in elevators
    • B66B5/0006 Monitoring devices or performance analysers
    • B66B5/0012 Devices monitoring the users of the elevator system
    • B66B5/0037 Performance analysers
    • B66B13/00 Doors, gates, or other apparatus controlling access to, or exit from, cages or lift well landings
    • B66B13/02 Door or gate operation
    • B66B13/06 Door or gate operation of sliding doors
    • B66B13/08 Door or gate operation of sliding doors guided for horizontal movement
    • B66B13/14 Control systems or devices
    • B66B13/143 Control systems or devices electrical
    • B66B13/146 Control systems or devices electrical method or algorithm for controlling doors
    • B66B13/24 Safety devices in passenger lifts, not otherwise provided for, for preventing trapping of passengers
    • B66B13/26 Safety devices in passenger lifts, not otherwise provided for, for preventing trapping of passengers between closing doors

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Indicating And Signalling Devices For Elevators (AREA)
  • Elevator Door Apparatuses (AREA)

Abstract

The invention provides an elevator user detection system that can effectively suppress users going undetected and false detection of shadows according to the hall environment. The elevator user detection system according to one embodiment includes: a door sensor provided at the entrance of the car; a detection processing unit that detects a user located in the hall from a captured image of the camera; an exposure adjustment unit that determines, based on a combination of the detection result of the door sensor and the detection result of the camera, whether a user has gone undetected or a shadow reflected in the captured image has been falsely detected, and adjusts the exposure value of the camera based on the determination result; and a door opening/closing control unit that performs door opening/closing control of the door of the car based on the detection result of the camera.

Description

Elevator user detection system
The present application is based on and claims priority from Japanese Patent Application No. 2020-121499 (filed July 15, 2020). This application is hereby incorporated by reference in its entirety.
Technical Field
Embodiments of the present invention relate to a user detection system for an elevator.
Background
In general, when an elevator car arrives at a hall and opens its door, the door closes after a predetermined time elapses. Because a user entering the car from the hall cannot tell when the door will close, the user may collide with the closing door. To avoid such collisions while boarding, there are systems that detect users near the car from a captured image of a camera and reflect the detection result in the door opening/closing control.
Disclosure of Invention
In the above system, the presence or absence of a user is determined from a change in image brightness caused by the user's motion. However, if a shadow of the user or the like is reflected in the image, the motion of the shadow may be falsely detected as a user. One way to make shadows harder to detect is to adjust the exposure of the camera.
However, depending on the brightness of the hall floor serving as the background, the user may then go undetected. For example, when the exposure value is raised to suppress false detection of shadows, if the hall floor is bright, even a user who should be detected may be overexposed and go undetected.
The object of the invention is to provide an elevator user detection system that can effectively suppress users going undetected and false detection of shadows according to the hall environment.
The elevator user detection system according to one embodiment includes a camera that is provided on the car and captures images of the vicinity of the entrance of the car and the hall. The user detection system further comprises: a door sensor provided at the entrance of the car; a detection processing unit that detects a user located in the hall from a captured image of the camera; an exposure adjustment unit that determines, based on a combination of the detection result of the door sensor and the detection result of the camera, whether the user has gone undetected or a shadow reflected in the captured image has been falsely detected, and adjusts the exposure value of the camera based on the determination result; and a door opening/closing control unit that performs door opening/closing control of the door of the car based on the detection result of the camera.
According to the elevator user detection system configured as above, users going undetected and false detection of shadows can be effectively suppressed according to the hall environment.
Drawings
Fig. 1 is a diagram showing a configuration of a user detection system of an elevator according to an embodiment.
Fig. 2 is a diagram showing the detection range of the camera and the detection range of the door sensor according to the same embodiment.
Fig. 3 is a view showing a configuration of a surrounding portion of an entrance in the car according to the same embodiment.
Fig. 4 is a view showing an example of an image captured by the camera according to the embodiment.
Fig. 5 is a flowchart showing a user detection process when the door is opened in the user detection system according to the embodiment.
Fig. 6 is a diagram for explaining a coordinate system in real space according to the embodiment.
Fig. 7 is a diagram showing a state in which a captured image is divided in block units according to the same embodiment.
Fig. 8 is a view showing an example of a captured image when the floor of the hall is bright according to the same embodiment.
Fig. 9 is a view showing an example of a captured image when the floor of the hall is dark according to the same embodiment.
Fig. 10 is a flowchart showing the processing operation of undetected suppression/false detection suppression in the bright exposure mode of the user detection system.
Fig. 11 is a flowchart showing the processing operation of undetected suppression/false detection suppression in the bright exposure mode of the user detection system.
Fig. 12 is a flowchart showing the processing operation of undetected suppression/false detection suppression in the dark exposure mode of the user detection system.
Fig. 13 is a flowchart showing the processing operation of undetected suppression/false detection suppression in the dark exposure mode of the user detection system.
Detailed Description
The following describes specific embodiments with reference to the drawings.
The disclosure is merely an example, and the invention is not limited to the following embodiments. Modifications that would readily occur to a person skilled in the art are of course included within the scope of the disclosure. To make the description clearer, the dimensions, shapes, and the like of the respective portions may be shown schematically in the drawings and may differ from the actual implementation. In the drawings, corresponding elements are denoted by the same reference numerals, and detailed description thereof may be omitted.
Fig. 1 is a diagram showing the configuration of a user detection system of an elevator according to an embodiment. Here, one car is taken as an example, but the same configuration applies when there are a plurality of cars.
A camera 12 is provided above the entrance of the car 11. Specifically, the camera 12 is installed in the door lintel plate 11a, which covers the upper part of the entrance of the car 11, with its lens portion directed straight down or inclined by a predetermined angle toward the hall 15 or toward the inside of the car 11.
The camera 12 is a small-sized monitoring camera such as an in-vehicle camera, has a wide-angle lens or a fisheye lens, and can continuously capture images of a plurality of frames (for example, 30 frames/second) per second. The camera 12 is activated when, for example, the car 11 reaches the hall 15 of each floor, and photographs the vicinity of the car door 13 and the hall 15. The camera 12 may be always in operation when the car 11 is running.
The imaging range at this time is adjusted to L1+L2 (L1 > L2). L1 is an imaging range on the hall side, and has a predetermined distance from the car door 13 toward the hall 15. L2 is an imaging range on the car side, and has a predetermined distance from the car door 13 toward the car rear surface. The ranges L1 and L2 are ranges in the depth direction, and the ranges in the width direction (direction orthogonal to the depth direction) are at least larger than the lateral width of the car 11.
In each hall 15, a hall door 14 is provided so as to be openable and closable at the arrival opening of the car 11. The hall door 14 engages with the car door 13 and performs opening and closing operations when the car 11 arrives. The power source (door motor) is on the car 11 side, and the hall door 14 merely opens and closes following the car door 13. In the following description, it is assumed that the hall door 14 is open when the car door 13 is open and closed when the car door 13 is closed.
Each image (video) continuously captured by the camera 12 is analyzed and processed in real time by the image processing device 20. In fig. 1, the image processing device 20 is taken out of the car 11 for convenience, but in practice, the image processing device 20 is housed in the door lintel plate 11a together with the camera 12.
A door sensor 10, which is a sensor for detecting a user different from the camera 12, is provided at the entrance/exit of the car 11. The door sensor 10 is constituted by, for example, a photoelectric sensor having a plurality of optical axes, and optically detects a user passing through an entrance of the car 11.
Fig. 2 shows a comparison of the detection range of the camera 12 with the detection range of the door sensor 10. The detection range of the camera 12 extends widely toward the hall 15, so a user heading from the hall 15 toward the car 11 can be detected early. In contrast, the detection range of the door sensor 10 covers only the entrance of the car 11, so a user can be detected only when passing through the entrance of the car 11. However, since the door sensor 10 is not affected by the environment of the hall 15 (the color of the floor, the brightness of the lighting, or the like), its detection accuracy is high. As described later, in the present embodiment, the detection result of the door sensor 10 is used for the exposure adjustment of the camera 12.
The image processing apparatus 20 includes a storage unit 21 and a detection unit 22. The storage unit 21 is constituted by a storage device such as a RAM. The storage section 21 has a buffer area for sequentially storing images captured by the camera 12 and temporarily storing data necessary for processing by the detection section 22. The storage unit 21 may store an image after the preprocessing of the captured image, in which the processing such as distortion correction, enlargement and reduction, and partial cropping is performed.
The detection unit 22 is constituted by, for example, a microprocessor, and detects a user located near the car door 13 using a captured image of the camera 12. When the detection unit 22 is functionally divided, it is composed of a detection region setting unit 22a, a detection processing unit 22b, an exposure adjustment unit 22c, and a process switching unit 22 d. In addition, these may be realized by software, or may be realized by hardware such as an Integrated Circuit (IC), or may be realized by using a combination of software and hardware.
The detection area setting unit 22a sets at least one detection area for detecting a user on the captured image obtained from the camera 12. In the present embodiment, a detection area E1 for detecting a user located in the hall 15 is set. Specifically, the detection area setting unit 22a sets a detection area E1 (see fig. 4) extending a predetermined distance L3 from the entrance of the car 11 toward the hall 15 and including the sills 18 and 47.
The detection processing unit 22b detects a user or an object present in the hall 15 using the image within the detection area E1 set by the detection area setting unit 22a. Here, "object" includes moving bodies such as a user's clothing, luggage, or wheelchair. In the following description, "detecting a user" also includes detecting an "object".
The exposure adjustment unit 22c determines, based on a combination of the detection result of the camera 12 obtained by the detection processing unit 22b and the detection result of the door sensor 10, whether a user has gone undetected or a shadow reflected in the captured image has been falsely detected, and adjusts the exposure value of the camera 12 based on the determination result. The process switching unit 22d switches between the processing used when the bright exposure mode is set and the processing used when the dark exposure mode is set. A part or all of the functions of the image processing device 20 may instead be provided in the elevator control device 30.
The elevator control device 30 is configured by a computer having a CPU, ROM, RAM, and the like. The elevator control device 30 performs operation control of the car 11. The elevator control device 30 further includes a door opening/closing control unit 31.
The door opening/closing control unit 31 controls the opening/closing of the doors 13 of the car when the car 11 reaches the hall 15. Specifically, the door opening/closing control unit 31 opens the car door 13 when the car 11 reaches the hall 15, and closes the door after a predetermined time elapses. However, when the detection processing unit 22b detects the user during the door closing operation of the car door 13, the door opening/closing control unit 31 prohibits the door closing operation of the car door 13, and the car door 13 is re-opened in the fully open direction and maintained in the door open state.
Fig. 3 is a view showing a configuration of the entrance peripheral portion in the car 11.
A car door 13 is provided to open and close the entrance of the car 11. In the example of fig. 3, a two-panel center-opening car door 13 is shown, and the two door panels 13a and 13b constituting the car door 13 open and close in opposite directions along the face width direction (horizontal direction). The "face width" direction is the same as the width direction of the entrance of the car 11.
Front posts 41a and 41b are provided on both sides of the entrance of the car 11 and, together with the door lintel plate 11a, surround the entrance of the car 11. A "front post" is also called an entrance column or entrance frame, and a door box for accommodating the car door 13 is generally provided on its rear side. In the example of fig. 3, when the car door 13 opens, one door panel 13a is accommodated in a door box 42a provided on the rear side of the front post 41a, and the other door panel 13b is accommodated in a door box 42b provided on the rear side of the front post 41b.
One or both of the front posts 41a and 41b are provided with a display 43, an operation panel 45 having destination floor buttons 44, and a speaker 46. In the example of fig. 3, a speaker 46 is provided on the front post 41a, and a display 43 and an operation panel 45 are provided on the front post 41b. Here, a camera 12 having a wide-angle lens is provided in the central portion of the door lintel plate 11a at the upper part of the entrance of the car 11.
In the present embodiment, a pair of door sensors 10 is provided on the front posts 41a and 41b. The door sensor 10 is constituted by, for example, a transmissive photoelectric sensor including projectors 10a and light receivers 10b. In the example of fig. 3, the projectors 10a are arranged in a row at predetermined intervals on the inner side surface 41a-1 of one front post 41a, and the light receivers 10b are arranged in a row at predetermined intervals on the inner side surface 41b-1 of the other front post 41b. The door sensor 10 detects the passage of a user by the interruption of an optical axis between a projector 10a and a light receiver 10b. The door sensor 10 may be active only from when the door of the car 11 opens until it closes at each floor, or may be always active.
Fig. 4 is a diagram showing an example of a captured image of the camera 12. The hall 15 is located at the upper side, and the car 11 is located at the lower side. In the figure, 16 denotes the floor of the hall 15, and 19 denotes the floor of the car 11. E1 represents a detection region.
The car door 13 has 2 door panels 13a, 13b that move in opposite directions to each other on the car sill 47. Similarly, the hall door 14 has 2 door panels 14a and 14b that move in opposite directions relative to each other on the hall sill 18. The door panels 14a, 14b of the hoistway door 14 move in the door opening and closing direction together with the door panels 13a, 13b of the car door 13.
The camera 12 is provided at an upper portion of an entrance of the car 11. Therefore, when the car 11 opens in the hall 15, as shown in fig. 1, the predetermined range (L1) on the hall side and the predetermined range (L2) in the car are photographed. A detection area E1 for detecting a user of the car 11 is set in a predetermined area (L1) on the hall side.
In real space, the detection area E1 extends a distance L3 from the center of the entrance (face width) toward the hall (L3 is equal to or smaller than the hall-side imaging range L1). The width W1 of the detection area E1 at full opening is set to be equal to or larger than the width W0 of the entrance (face width). As shown by hatching in fig. 3, the detection area E1 includes the sills 18 and 47 and is set so as to eliminate the blind spots of the door pockets 17a and 17b. The size of the detection area E1 in the lateral direction (X-axis direction) may be changed in accordance with the opening/closing operation of the car door 13, and the size of the detection area E1 in the longitudinal direction (Y-axis direction) may also be changed in accordance with the opening/closing operation of the car door 13.
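The geometry of E1 can be expressed as a simple containment test. The following is a minimal sketch, assuming foot positions expressed in the X/Y coordinate system of fig. 6; the function name and the default dimensions for W0, W1, and L3 are illustrative assumptions, not values from the patent.

```python
def in_detection_area_e1(x_m, y_m, face_width_w0=0.9, depth_l3=2.0, width_w1=None):
    """Check whether a foot position (x_m, y_m) lies inside detection area E1.

    x_m: lateral position measured from the centre of the doorway (X axis), in metres.
    y_m: distance from the doorway toward the hall (Y axis), in metres.
    W1 defaults to the face width W0; the patent only requires W1 >= W0.
    """
    width_w1 = face_width_w0 if width_w1 is None else width_w1
    return 0.0 <= y_m <= depth_l3 and abs(x_m) <= width_w1 / 2.0
```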
Hereinafter, the operation of the present system will be described in terms of (a) user detection processing and (b) undetected-suppression/false-detection-suppression processing.
(a) User detection process
Fig. 5 is a flowchart showing a user detection process at the time of opening the door in the present system.
First, as an initial setting, a detection area setting process is performed by a detection area setting unit 22a of a detection unit 22 provided in the image processing apparatus 20 (step S10). For example, when the camera 12 is set or when the setting position of the camera 12 is adjusted, the detection area setting process is performed as follows.
That is, the detection area setting unit 22a sets a detection area E1 extending the distance L3 from the entrance toward the hall 15 in the state where the door of the car 11 is fully open. As shown in fig. 3, the detection area E1 includes the sills 18 and 47 and is set so as to eliminate the blind spots of the door pockets 17a and 17b. In this state, the detection area E1 has a lateral dimension (X-axis direction) W1 that is equal to or greater than the lateral width W0 of the doorway (face width).
Here, when the car 11 reaches the hall 15 at an arbitrary floor (yes in step S11), the elevator control device 30 opens the car door 13 and waits for the user to ride on the car 11 (step S12).
At this time, a predetermined range (L1) on the hall side and a predetermined range (L2) in the car are photographed at a predetermined frame rate (for example, 30 frames/second) by a camera 12 provided at the upper part of the entrance of the car 11. The image processing device 20 acquires images captured by the camera 12 in time series, and executes the following user detection processing in real time (step S14) while sequentially storing the images in the storage unit 21 (step S13). In addition, distortion correction, enlargement and reduction, partial cropping of an image, and the like may be performed as preprocessing of a captured image.
The user detection process is performed by the detection processing unit 22b of the detection unit 22 provided in the image processing apparatus 20. The detection processing unit 22b extracts images in the detection area E1 from a plurality of captured images obtained by the camera 12 in time series, and detects the presence or absence of a user or an object based on the images.
Specifically, as shown in fig. 6, the camera 12 captures images in a coordinate system in which the direction horizontal to the car door 13 provided at the entrance of the car 11 is the X-axis, the direction from the center of the car door 13 toward the hall 15 (the direction perpendicular to the car door 13) is the Y-axis, and the height direction of the car 11 is the Z-axis. In each image captured by the camera 12, the portion within the detection area E1 is compared in block units to detect movement from the center of the car door 13 toward the hall 15, that is, the position of the user's feet in the Y-axis direction.
Fig. 7 shows an example in which a captured image is divided into a matrix of blocks of a predetermined size. Each lattice-shaped region obtained by dividing the original image into squares with side length Wblock is called a "block". In the example of fig. 7, the blocks have the same vertical and horizontal lengths, but they may differ. The blocks may have a uniform size throughout the image, or a non-uniform size, for example becoming shorter in the vertical direction (Y-axis direction) toward the upper part of the image.
The detection processing unit 22b sequentially reads out the images stored in the storage unit 21 in time series order, and calculates the average luminance value of the images for each block. At this time, the average luminance value of each block calculated when the first image is input is held as an initial value in a 1 st buffer area, not shown, in the storage unit 21.
When the 2 nd and subsequent images are obtained, the detection processing section 22b compares the average luminance value of each block of the current image with the average luminance value of each block of the previous image stored in the above-mentioned 1 st buffer. As a result, when a block having a luminance difference equal to or greater than the predetermined threshold value exists in the current image, the detection processing unit 22b determines that the block is a moving block. When determining the presence or absence of motion for the current image, the detection processing section 22b holds the average luminance value of each block of the image in the above-described 1 st buffer for comparison with the next image. In the same manner as described above, the detection processing unit 22b repeatedly determines whether or not motion is present while comparing the luminance values of the respective images in units of blocks in time series.
The detection processing unit 22b checks whether or not there is a moving block in the image in the detection area E1. As a result, if there is a moving block in the image in the detection area E1, the detection processing unit 22b determines that there is a user or an object in the detection area E1.
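The block-wise comparison described above can be sketched as follows, assuming grayscale frames held as NumPy arrays; the function names, the block size, and the luminance-difference threshold are illustrative and not taken from the patent.

```python
import numpy as np

def block_average_luminance(frame, block_size=16):
    """Average luminance per block (frame: 2-D grayscale array)."""
    h, w = frame.shape
    bh, bw = h // block_size, w // block_size
    # Crop to a whole number of blocks, then average within each block.
    cropped = frame[:bh * block_size, :bw * block_size].astype(np.float32)
    return cropped.reshape(bh, block_size, bw, block_size).mean(axis=(1, 3))

def user_present(prev_frame, curr_frame, detection_mask, block_size=16, threshold=10.0):
    """Return True if any block inside the detection area E1 shows motion.

    detection_mask: boolean array on the same block grid, marking blocks in E1.
    threshold: minimum per-block luminance difference treated as motion.
    """
    prev_avg = block_average_luminance(prev_frame, block_size)
    curr_avg = block_average_luminance(curr_frame, block_size)
    motion = np.abs(curr_avg - prev_avg) >= threshold
    return bool(np.any(motion & detection_mask))
```

In the actual flow of fig. 5, the previous block averages would be held in the 1st buffer of the storage unit 21 and updated after every comparison.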
In this way, when a user or an object is detected in the detection area E1 while the car door 13 is open (yes in step S15), a user detection signal is output from the image processing device 20 to the elevator control device 30. Upon receiving the user detection signal, the door opening/closing control unit 31 of the elevator control device 30 prohibits the door closing operation of the car door 13 and maintains the door open state (step S16).
Specifically, when the car door 13 is in the fully opened state, the door opening/closing control unit 31 starts the door opening time counting operation, and closes the door at a time point when a predetermined time T (for example, 1 minute) is counted. When a user is detected during this period and a user detection signal is sent, the door opening/closing control unit 31 stops the counting operation and clears the count value. Thereby, during the above-described time T, the door opening state of the car door 13 is maintained.
When a new user is detected during this period, the count value is cleared again, and the door opening state of the car door 13 is maintained during the time T. However, if the user arrives a plurality of times during the time T, the situation in which the car door 13 is not closed at all times continues, and therefore, it is preferable to set an allowable time Tx (for example, 3 minutes) in advance, and to forcibly close the car door 13 when the allowable time Tx has elapsed. After the counting operation for the time T is completed, the door opening/closing control unit 31 closes the car door 13 to start the car 11 toward the destination floor (step S17).
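The door-open timing described above (hold time T, allowable time Tx) can be sketched as a small timer object; the class name and the concrete values T = 60 s and Tx = 180 s are illustrative assumptions taken from the examples in the text.

```python
import time

class DoorOpenTimer:
    """Keeps the door open for T seconds after the last detection,
    but never longer than the allowable time Tx in total."""

    def __init__(self, hold_time_t=60.0, allowable_time_tx=180.0):
        self.hold_time_t = hold_time_t
        self.allowable_time_tx = allowable_time_tx
        self.fully_open_at = time.monotonic()     # start of the fully open state
        self.last_detection_at = self.fully_open_at

    def on_user_detected(self):
        # Clearing the count corresponds to restarting the T-second countdown.
        self.last_detection_at = time.monotonic()

    def should_close(self):
        now = time.monotonic()
        if now - self.fully_open_at >= self.allowable_time_tx:
            return True   # forced closing after the allowable time Tx
        return now - self.last_detection_at >= self.hold_time_t
```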
The flow of fig. 5 has been described for the door-open state, but the same applies during door closing: when a user or an object is detected in the detection area E1 between the start of door closing and full closing (during the closing operation), the closing operation is temporarily interrupted.
(b) Undetected inhibition/false detection inhibition
As described above, the user detection processing detects a user's movement from brightness changes of the image within the detection area E1. However, depending on the light from the lighting equipment, sunlight, or the like, the shadow of a user or of the doors may be reflected in the captured image; the motion of such a shadow appears as a brightness change in the image and may be falsely detected as a user.
For example, as shown in fig. 8, when the floor 16 of the hall 15 is bright, the shadow S1 of the user P1 appears strongly in the captured image, so the possibility of false detection increases. On the other hand, as shown in fig. 9, when the floor 16 of the hall 15 is dark, the shadow S2 may still be falsely detected, and in addition the contrast between the user P2 and the floor 16 decreases, so the user P2 may not be detected correctly.
In general, false detection of shadows can be suppressed by raising the threshold for brightness change, and users going undetected can be suppressed by lowering it. However, false-detection suppression and undetected-suppression are in a trade-off relationship, so the threshold can only be set to prioritize one of them.
In the present embodiment, undetected-suppression and false-detection-suppression are therefore realized not by threshold setting but by exposure adjustment (bright exposure/dark exposure).
"exposure adjustment" refers to adjusting the exposure time of camera 12. The "exposure time" is a time during which the imaging element included in the camera 12 is exposed through the lens, and corresponds to a shutter open time during shooting. The longer the exposure time, the brighter the resulting image. The bright shooting with prolonged exposure time is called "bright exposure". The shortening of the exposure time for dark photographing is called "dark exposure".
In addition to exposure adjustment, the brightness of the captured image can also be changed by adjusting the "gain". The "gain" is a coefficient for increasing or decreasing the output value of the camera 12. Raising the gain raises the output value of the camera 12, producing a brighter image; lowering the gain lowers the output value, producing a darker image. Either or both of the exposure time and the gain may be adjusted. However, raising the gain also amplifies the noise contained in the image, so it is preferable to adjust the exposure time in consideration of image quality. If the exposure time is too long, moving objects appear blurred, so it is preferable to limit the exposure time so that it does not exceed a predetermined value.
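As a rough illustration of adjusting the exposure time first and falling back to gain only at the limits, the following sketch uses invented parameter names and limit values; it is not the actual control interface of the camera 12.

```python
def adjust_exposure(exposure_ms, gain, brighter, *,
                    exposure_step=2.0, max_exposure_ms=30.0,
                    min_exposure_ms=1.0, min_gain=1.0, max_gain=4.0):
    """Adjust exposure time first, then gain, within fixed limits.

    brighter=True  -> lengthen exposure (bright exposure direction)
    brighter=False -> shorten exposure (dark exposure direction)
    The exposure time is capped so moving objects do not blur too much,
    and the gain is kept low to limit noise amplification.
    """
    if brighter:
        if exposure_ms + exposure_step <= max_exposure_ms:
            exposure_ms += exposure_step
        else:
            gain = min(max_gain, gain * 1.25)
    else:
        if exposure_ms - exposure_step >= min_exposure_ms:
            exposure_ms -= exposure_step
        else:
            gain = max(min_gain, gain * 0.8)
    return exposure_ms, gain
```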
When false detection of shadows is to be suppressed by exposure adjustment, an exposure value at which shadows are difficult to detect is set. Specifically, bright exposure may be used so that shadows are overexposed, or dark exposure may be used so that shadows are underexposed. However, since false-detection suppression and undetected-suppression are in a trade-off relationship, bright exposure or dark exposure may leave the user undetectable depending on the environment of the hall 15.
For example, if bright exposure is set and the floor 16 of the hall 15 is bright, a user wearing white clothing may produce little change in image brightness and may not be detected correctly. Conversely, if dark exposure is set and the floor 16 of the hall 15 is dark, a user wearing black clothing may go undetected.
Therefore, in the present embodiment, the exposure value of the camera 12 is adjusted using the door sensor 10 provided at the entrance of the car 11. Specifically, whether a user has gone undetected or a shadow reflected in the captured image has been falsely detected is determined based on a combination of the detection result of the door sensor 10 and the detection result of the camera 12, and the exposure value of the camera 12 is adjusted based on the determination result.
The "detection result of the door sensor 10" is a result of detecting the presence or absence of a user by cutting off the optical axis between the light projector 10a and the light receiver 10b provided on both sides of the entrance of the car 11. As shown in fig. 5, the "detection result of the camera 12" is a result of detecting the presence or absence of a user in the detection area E1 of the hall 15 based on a change in brightness of the captured image of the camera 12.
Specific processing operations are described below, separately for (b-1) the case where the bright exposure mode is set as the initial setting and (b-2) the case where the dark exposure mode is set as the initial setting.
(b-1) case where the bright Exposure mode is set
Fig. 10 and fig. 11 are flowcharts showing the processing operation of undetected suppression and false detection suppression in the bright exposure mode of the present system. The manager of the elevator operates a mode switch (not shown) provided in the image processing device 20 to set the bright exposure mode (step S101).
First, as an initial setting, exposure control = A, floor count 1 = 0, and floor count 2 = 0 are set (step S102). The "exposure control" takes the A value or the B value as the exposure value of the camera 12; the A value is brighter than usual and the B value is darker than usual. "Floor count 1" indicates the number of undetected floors. "Floor count 2" indicates the number of falsely detected floors.
When the car 11 reaches an arbitrary floor (yes in step S103), the car door 13 opens (step S104). At this time, the exposure adjustment unit 22c adjusts the exposure value of the camera 12 to the A value and photographs the vicinity of the entrance of the car 11 and the hall 15 (step S105).
As described in fig. 5, the image captured by the camera 12 is supplied to the detection processing section 22b. The presence or absence of a user is detected from the captured image, and the detection result is reflected on the opening and closing control of the car door 13. On the other hand, the presence or absence of the passage of the user is detected by a door sensor 10 provided at the entrance/exit of the car 11. The detection result of the door sensor 10 may be provided to the elevator control device 30 to be reflected on the opening/closing control of the car door 13. In this case, even if the camera 12 does not detect the user, if the door sensor 10 detects the user, the door closing operation of the car door 13 is prohibited, and the car door 13 is re-opened in the fully open direction and maintained in the open state.
The detection result of the door sensor 10 and the detection result of the camera 12 are supplied to the exposure adjustment unit 22c (step S106). When the car door 13 has closed, the exposure adjustment unit 22c determines undetected/false detection for the current floor as follows, based on a combination of the detection result of the door sensor 10 and the detection result of the camera 12.
That is, when the door sensor 10 detected a user but the camera 12 did not (yes in step S108), the exposure adjustment unit 22c increments the undetected floor count 1 by 1 (step S109). As described above, the door sensor 10 is not affected by the environment of the hall 15. Therefore, when the door sensor 10 detected a user and the camera 12 did not, the detection result of the camera 12 is in error.
For example, if the exposure is set to the A value when the hall 15 is in a bright environment, not only the shadow but also the user may be overexposed and go undetected. If the state of step S108 continues, the exposure value of the camera 12 does not match the environment of the hall 15. Therefore, when floor count 1 exceeds the preset number of times N (yes in step S110), the exposure adjustment unit 22c determines that users are going undetected and changes the exposure value of the camera 12 from the A value to the B value (step S111). At this time, floor count 1 is reset to 0.
The B value is lower than the A value, i.e., darker than usual. By lowering the exposure value of the camera 12 from the A value to the B value, the user can be detected even when, for example, the hall 15 is bright. However, since the exposure value of the camera 12 has been lowered to the B value, shadows may now be falsely detected.
Therefore, after adjusting the exposure value to the B value, the exposure adjustment unit 22c compares the detection result of the door sensor 10 with the detection result of the camera 12 every time the car 11 stops at a floor. If there is a floor at which the door sensor 10 detected nothing but the camera 12 detected something (yes in step S112), the exposure adjustment unit 22c increments the falsely detected floor count 2 by 1 (step S113). The same processing is repeated at each floor, and if the state of step S112 continues, it is considered that shadows are being falsely detected. Therefore, when floor count 2 exceeds the preset number of times N (yes in step S114), the exposure adjustment unit 22c determines that shadows are being falsely detected and returns the exposure value of the camera 12 to the A value (step S115). At this time, floor count 2 is reset to 0. Here the same number of times N as in step S110 is used, but a different number may be used.
If both the door sensor 10 and the camera 12 detected a user (yes in step S116), the exposure value of the camera 12 is suited to the environment of the hall 15, and the user detection processing based on the camera 12 is functioning correctly. In this case, floor count 1 and floor count 2 are reset to 0, and imaging by the camera 12 continues at the current exposure value (A value or B value).
If neither the door sensor 10 nor the camera 12 detected anything (no in step S116), imaging by the camera 12 likewise continues at the current exposure value (A value or B value). However, since no undetected/false-detection judgment can be made in this case, floor counts 1 and 2 are not cleared.
Thus, even though the exposure value (A value) that suppresses shadows in the bright exposure mode is set initially, when it is determined that the hall environment is likely to leave users undetected, the exposure is adjusted in the direction that suppresses users going undetected (A value → B value). When it is then determined that shadows are likely to be falsely detected, the exposure is adjusted in the direction in which shadows are not detected (B value → A value). In other words, the exposure value of the camera 12 is adjusted appropriately according to the hall environment, suppressing both users going undetected and false detection of shadows. As a result, the user can be accurately detected from the captured image of the camera 12 in the hall 15 at each floor, and the result can be reflected in the opening/closing control of the car door 13.
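A compact sketch of the per-floor switching logic of figs. 10 and 11 follows, assuming two exposure values (A and B) and an illustrative N = 3; the class and attribute names are invented for this example, and the same object also covers the dark exposure mode described next.

```python
A_VALUE = "A"  # exposure value brighter than usual
B_VALUE = "B"  # exposure value darker than usual

class ExposureAdjuster:
    """Per-floor exposure switching from the combined detection results."""

    def __init__(self, initial=A_VALUE, n_floors=3):
        self.exposure = initial          # current exposure value of the camera
        self.n_floors = n_floors         # N: consecutive floors before switching
        self.undetected_floors = 0       # floor count 1
        self.false_detect_floors = 0     # floor count 2

    def _other(self):
        return B_VALUE if self.exposure == A_VALUE else A_VALUE

    def on_door_closed(self, door_sensor_detected, camera_detected):
        """Call once per floor after the car door has closed."""
        if door_sensor_detected and not camera_detected:
            # A user passed the doorway but the image-based detection missed it.
            self.undetected_floors += 1
            if self.undetected_floors > self.n_floors:
                self.exposure = self._other()      # e.g. A value -> B value
                self.undetected_floors = 0
        elif camera_detected and not door_sensor_detected:
            # The camera reacted although nobody passed: likely a shadow.
            self.false_detect_floors += 1
            if self.false_detect_floors > self.n_floors:
                self.exposure = self._other()      # e.g. B value -> back to A value
                self.false_detect_floors = 0
        elif door_sensor_detected and camera_detected:
            # Both agree: the current exposure suits the hall environment.
            self.undetected_floors = 0
            self.false_detect_floors = 0
        # If neither detected anything, no judgment can be made and the
        # counts are left unchanged.
        return self.exposure
```

Calling on_door_closed() once per floor with the two detection results mirrors steps S108 to S116: the counts accumulate only while the two sensors disagree and are cleared as soon as they agree.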
(b-2) case where the dark Exposure mode is set
In (b-1) above, it was assumed that shadows are overexposed by bright exposure, but the same applies when shadows are underexposed by dark exposure. In this case, the B value is the initial setting, and the exposure is adjusted toward the A value according to the environment of the hall 15.
Fig. 12 and 13 are flowcharts showing the processing operation of undetected suppression and false detection suppression in the dark exposure mode of the present system. The manager of the elevator operates a mode switch, not shown, provided in the image processing device 20 to set a dark exposure mode (step S201).
First, as an initial setting, exposure control = B, floor count 1 = 0, and floor count 2 = 0 are set (step S202). As described above, the "exposure control" takes the A value or the B value as the exposure value of the camera 12; the A value is brighter than usual and the B value is darker than usual. "Floor count 1" indicates the number of undetected floors. "Floor count 2" indicates the number of falsely detected floors.
In the dark exposure mode, when the car 11 stops at any floor and opens its door, the exposure value of the camera 12 is adjusted to the B value, and the vicinity of the entrance of the car 11 and the hall 15 are photographed (steps S203 to S205). The image captured by the camera 12 is supplied to the detection processing unit 22b. The presence or absence of a user is detected from the captured image, and the detection result is reflected in the opening/closing control of the car door 13. Meanwhile, the passage of a user is detected by the door sensor 10 provided at the entrance of the car 11.
The exposure adjustment unit 22c compares the detection result of the door sensor 10 with the detection result of the camera 12. When the door sensor 10 detected a user but the camera 12 did not (yes in step S208), the exposure adjustment unit 22c increments the undetected floor count 1 by 1 (step S209).
For example, if the exposure is set to the B value when the hall 15 is in a dark environment, not only the shadow but also the user may be underexposed and go undetected. If the state of step S208 continues, the exposure value of the camera 12 does not match the environment of the hall 15. Therefore, when floor count 1 exceeds the preset number of times N (yes in step S210), the exposure adjustment unit 22c determines that users are going undetected and changes the exposure value of the camera 12 from the B value to the A value (step S211). At this time, floor count 1 is reset to 0.
The A value is higher than the B value, i.e., brighter than usual. By raising the exposure value of the camera 12 from the B value to the A value, the user can be detected even when, for example, the hall 15 is dark. However, since the exposure value of the camera 12 has been raised to the A value, shadows may now be falsely detected.
Therefore, after adjusting the exposure value to the A value, the exposure adjustment unit 22c compares the detection result of the door sensor 10 with the detection result of the camera 12 every time the car 11 stops at a floor. If there is a floor at which the door sensor 10 detected nothing but the camera 12 detected something (yes in step S212), the exposure adjustment unit 22c increments the falsely detected floor count 2 by 1 (step S213). The same processing is repeated at each floor, and if the state of step S212 continues, it is considered that shadows are being falsely detected. Therefore, when floor count 2 exceeds the preset number of times N (yes in step S214), the exposure adjustment unit 22c determines that shadows are being falsely detected and returns the exposure value of the camera 12 to the B value (step S215). At this time, floor count 2 is reset to 0. Here the same number of times N as in step S210 is used, but a different number may be used.
If both the door sensor 10 and the camera 12 detected a user (yes in step S216), the exposure value of the camera 12 is suited to the environment of the hall 15, and the user detection processing based on the camera 12 is functioning correctly. In this case, floor count 1 and floor count 2 are reset to 0, and imaging by the camera 12 continues at the current exposure value (A value or B value). If neither the door sensor 10 nor the camera 12 detected anything (no in step S216), imaging by the camera 12 likewise continues at the current exposure value (A value or B value). However, since no undetected/false-detection judgment can be made in this case, floor counts 1 and 2 are not cleared.
In this way, even though the exposure value (B value) that suppresses shadows in the dark exposure mode is set initially, the exposure value of the camera 12 can be adjusted appropriately according to the hall environment, suppressing both users going undetected and false detection of shadows. As a result, the user can be accurately detected from the captured image of the camera 12 in the hall 15 at each floor, and the result can be reflected in the opening/closing control of the car door 13.
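Using the illustrative ExposureAdjuster sketch from section (b-1) above, the dark exposure mode differs only in its initial value; the switching logic itself is unchanged.

```python
# Dark exposure mode: start from the darker B value (reusing the illustrative
# ExposureAdjuster sketch shown in section (b-1)).
adjuster = ExposureAdjuster(initial=B_VALUE, n_floors=3)

# Called once per floor after the car door closes; here the door sensor saw a
# user that the camera missed, so the undetected floor count is incremented.
exposure = adjuster.on_door_closed(door_sensor_detected=True, camera_detected=False)
print(exposure)  # still "B" until the undetected count exceeds N
```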
(other embodiments)
In the above embodiment, the exposure value of the camera 12 is adjusted to either the A value or the B value according to the hall environment, but the exposure may be adjusted in more than two stages.
For example, four exposure values A, B, C, and D are predetermined, where the A value is the brightest and the D value is the darkest (A value > B value > C value > D value). In the bright exposure mode of (b-1), the exposure value of the camera 12 is initially set to the A value; if the state of step S108 continues N times, it is determined that users are going undetected, and the exposure value of the camera 12 is lowered step by step from the A value toward the B value, the C value, and so on. If, for example, the state of step S112 continues N times after the exposure value of the camera 12 has been lowered to the D value, it is determined that shadows are being falsely detected, and the exposure value of the camera 12 is raised step by step from the D value toward the C value, the B value, and so on. In the end, the state of step S116, i.e., the state in which the detection result of the door sensor 10 matches the detection result of the camera 12, gives the optimum value.
In the dark exposure mode of (b-2), the exposure value of the camera 12 is initially set to the D value; if the state of step S208 continues N times, it is determined that users are going undetected, and the exposure value is raised step by step from the D value toward the C value, the B value, and so on. If, for example, the state of step S212 continues N times after the exposure value of the camera 12 has been raised to the A value, it is determined that shadows are being falsely detected, and the exposure value of the camera 12 is lowered step by step from the A value toward the B value, the C value, and so on. In the end, the state of step S216, i.e., the state in which the detection result of the door sensor 10 matches the detection result of the camera 12, gives the optimum value.
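Stepping through more than two exposure values can be sketched with an ordered list; the level names and helper functions below are illustrative and not part of the patent.

```python
# Illustrative ordered exposure levels, brightest first (A > B > C > D).
EXPOSURE_LEVELS = ["A", "B", "C", "D"]

def step_darker(current):
    """One step toward dark exposure, clamped at the darkest level."""
    i = EXPOSURE_LEVELS.index(current)
    return EXPOSURE_LEVELS[min(i + 1, len(EXPOSURE_LEVELS) - 1)]

def step_brighter(current):
    """One step toward bright exposure, clamped at the brightest level."""
    i = EXPOSURE_LEVELS.index(current)
    return EXPOSURE_LEVELS[max(i - 1, 0)]

# Bright exposure mode: undetected users push the value darker (A -> B -> ...),
# false shadow detections push it back brighter; the dark exposure mode is the
# mirror image, starting from "D".
```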
In both the bright exposure mode and the dark exposure mode, the door sensor 10 is used to determine the timing of exposure adjustment, and the exposure value is adjusted appropriately according to the hall environment at that time. Therefore, for example, a maintenance person does not need to check the brightness of the hall 15 and perform exposure adjustment manually, and no processing is needed to determine the brightness of the hall 15 from the luminance values of the captured image in order to set the exposure value.
According to at least one embodiment described above, it is possible to provide an elevator user detection system that can effectively suppress users going undetected and false detection of shadows according to the hall environment.
In addition, although various embodiments of the present invention have been described, these embodiments are merely examples and are not intended to limit the scope of the present invention. These novel embodiments may be implemented in various other ways, and various omissions, substitutions, and changes may be made without departing from the scope of the invention. These embodiments and modifications thereof are included in the scope and gist of the present invention, and are included in the invention described in the claims and their equivalents.

Claims (8)

1. A user detection system for an elevator is provided with:
a camera that is provided on the car and captures images of the vicinity of the entrance of the car and the hall,
the elevator user detection system is characterized by comprising:
a door sensor provided at an entrance of the car and optically detecting a user passing through the entrance of the car;
a detection processing unit that detects a user located in the hall from a captured image of the camera;
an exposure adjustment unit that determines, based on a combination of a detection result of the door sensor and a detection result of the camera, whether the user has gone undetected or a shadow reflected in the captured image has been falsely detected, and adjusts an exposure value of the camera, based on the determination result, in a direction that suppresses the user going undetected or the false detection of the shadow reflected in the captured image; and
a door opening/closing control unit that performs door opening/closing control of the door of the car based on the detection result of the camera.
2. The elevator user detection system of claim 1, wherein,
when the door sensor has detected the user and the camera has not, the exposure adjustment unit adjusts the exposure value of the camera in a direction that suppresses the user going undetected.
3. The elevator user detection system of claim 1, wherein,
when the door sensor has not detected the user and the camera has, the exposure adjustment unit adjusts the exposure value of the camera in a direction that suppresses false detection of a shadow reflected in the captured image.
4. The elevator user detection system of claim 2, wherein,
the exposure adjustment unit compares the detection result of the camera with the detection result of the door sensor each time the car stops at a floor and opens its door, and adjusts the exposure value of the camera in a direction that suppresses the user going undetected when the state in which the door sensor detects the user and the camera does not continues for a predetermined number of floors.
5. The elevator user detection system of claim 3, wherein,
the exposure adjustment unit compares the detection result of the camera with the detection result of the door sensor each time the car stops at a floor and opens its door, and adjusts the exposure value of the camera in a direction that suppresses false detection of a shadow reflected in the captured image when the state in which the door sensor does not detect the user and the camera does continues for a predetermined number of floors.
6. The elevator user detection system of claim 1, wherein,
the exposure adjustment unit has a bright exposure mode in which the exposure value of the camera is initially set high so that false detection of shadows reflected in the captured image is suppressed by overexposure; and
a dark exposure mode in which the exposure value of the camera is initially set low so that false detection of shadows reflected in the captured image is suppressed by underexposure.
7. The elevator user detection system of claim 6, wherein,
the exposure device is provided with a process switching unit for switching between the bright exposure mode process and the dark exposure mode process.
8. The elevator user detection system of claim 1, wherein,
the door sensor is a photosensor having a plurality of optical axes.
CN202110556537.6A 2020-07-15 2021-05-21 Elevator user detection system Active CN113942905B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-121499 2020-07-15
JP2020121499A JP6968943B1 (en) 2020-07-15 2020-07-15 Elevator user detection system

Publications (2)

Publication Number Publication Date
CN113942905A CN113942905A (en) 2022-01-18
CN113942905B true CN113942905B (en) 2023-06-09

Family

ID=78605589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110556537.6A Active CN113942905B (en) 2020-07-15 2021-05-21 Elevator user detection system

Country Status (2)

Country Link
JP (1) JP6968943B1 (en)
CN (1) CN113942905B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7566858B2 (en) 2022-12-15 2024-10-15 東芝エレベータ株式会社 Elevator occupant detection system and exposure control method
CN115893143A (en) * 2022-12-16 2023-04-04 日立楼宇技术(广州)有限公司 A detection system, a lifting device control method, a lifting device and a storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002274257A (en) * 2001-03-19 2002-09-25 Nissan Motor Co Ltd Monitoring device for vehicle
CA2759527A1 (en) * 2011-11-28 2013-05-28 Technical Standards And Safety Authority System and method for maintaining, inspecting and assessing the reliability of elevator and other people moving devices
EP3192763A1 (en) * 2016-01-13 2017-07-19 Toshiba Elevator Kabushiki Kaisha Elevator system
CN106966275A (en) * 2016-01-13 2017-07-21 东芝电梯株式会社 Elevator device
CN106966276A (en) * 2016-01-13 2017-07-21 东芝电梯株式会社 The seating detecting system of elevator
JP6693627B1 (en) * 2019-05-16 2020-05-13 東芝エレベータ株式会社 Image processing device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7992687B2 (en) * 2006-03-20 2011-08-09 Mitsubishi Electric Corporation Device for elevator door control based on a detected object
JP2017149547A (en) * 2016-02-25 2017-08-31 キヤノン株式会社 Imaging device
JP6270925B2 (en) * 2016-07-07 2018-01-31 東芝エレベータ株式会社 Elevator system
EP3281904B1 (en) * 2016-08-09 2020-03-25 Otis Elevator Company Control systems and methods for elevators
JP6377797B1 (en) * 2017-03-24 2018-08-22 東芝エレベータ株式会社 Elevator boarding detection system
JP6480046B1 (en) * 2018-02-13 2019-03-06 東芝エレベータ株式会社 Failure diagnosis system and failure diagnosis method for optical axis sensor installed in elevator door

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002274257A (en) * 2001-03-19 2002-09-25 Nissan Motor Co Ltd Monitoring device for vehicle
CA2759527A1 (en) * 2011-11-28 2013-05-28 Technical Standards And Safety Authority System and method for maintaining, inspecting and assessing the reliability of elevator and other people moving devices
EP3192763A1 (en) * 2016-01-13 2017-07-19 Toshiba Elevator Kabushiki Kaisha Elevator system
CN106966274A (en) * 2016-01-13 2017-07-21 东芝电梯株式会社 Elevator device
CN106966275A (en) * 2016-01-13 2017-07-21 东芝电梯株式会社 Elevator device
CN106966276A (en) * 2016-01-13 2017-07-21 东芝电梯株式会社 The seating detecting system of elevator
JP6693627B1 (en) * 2019-05-16 2020-05-13 東芝エレベータ株式会社 Image processing device

Also Published As

Publication number Publication date
JP2022018411A (en) 2022-01-27
CN113942905A (en) 2022-01-18
JP6968943B1 (en) 2021-11-24

Similar Documents

Publication Publication Date Title
JP6377797B1 (en) Elevator boarding detection system
CN113428752B (en) User detection system for elevator
CN113942905B (en) Elevator user detection system
CN112340577B (en) User detection system for elevator
CN110294391B (en) User detection system
CN113428751B (en) User detection system of elevator
CN115108425B (en) Elevator user detection system
CN112429609B (en) User detection system for elevator
CN112441490B (en) User detection system for elevator
CN113428750B (en) User detection system for elevator
CN112340581B (en) User detection system for elevator
JP7183457B2 (en) Elevator user detection system
JP6716741B1 (en) Elevator user detection system
JP7566858B2 (en) Elevator occupant detection system and exposure control method
CN112441497B (en) User detection system for elevator
JP6729980B1 (en) Elevator user detection system
CN111453588B (en) Elevator system
CN113911868B (en) Elevator user detection system
CN115703608A (en) User detection system of elevator
HK40014401B (en) User detection system
HK40014401A (en) User detection system
HK40032802A (en) Elevator system
HK40032802B (en) Elevator system
HK1252953A1 (en) Elevator system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant