
WO2022045049A1 - Information processing device, information processing method, and program - Google Patents


Info

Publication number
WO2022045049A1
Authority
WO
WIPO (PCT)
Prior art keywords
heat map
detected
information processing
server
value
Prior art date
Application number
PCT/JP2021/030747
Other languages
French (fr)
Japanese (ja)
Inventor
和伸 太田
正晃 上坂
成治 服部
祐太 坪井
翔太 佐々木
Original Assignee
Arithmer株式会社
Priority date
Filing date
Publication date
Application filed by Arithmer株式会社 filed Critical Arithmer株式会社
Publication of WO2022045049A1 publication Critical patent/WO2022045049A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/12Hotels or restaurants

Definitions

  • The present invention relates to an information processing device, an information processing method, and a program.
  • From the viewpoint of preventing food poisoning, methods of supporting hygiene management at restaurants that serve food have been proposed (for example, Patent Document 1).
  • The program acquires a moving image of a table on which food is placed, captured from above; detects a first object and a second object from the table area corresponding to the table in the moving image; generates, based on the detection result of the first object, a heat map in which a value is set for each region within the table area; and, when the second object is detected, causes a computer to execute processing that changes the heat map value of the region where the second object was detected.
  • FIG. 1 is an explanatory diagram showing a configuration example of the hygiene management system. FIG. 2 is a block diagram showing a configuration example of the server. FIGS. 3A and 3B are explanatory diagrams showing an outline of Embodiment 1. FIG. 4 is an explanatory diagram showing an example of a heat map display screen. FIG. 5 is a flowchart showing an example of a processing procedure executed by the server. FIG. 6 is an explanatory diagram showing an outline of Embodiment 2. FIG. 7 is a flowchart showing an example of the processing procedure executed by the server according to Embodiment 2. FIG. 8 is an explanatory diagram showing an outline of Embodiment 3.
  • FIG. 9 is a flowchart showing an example of the processing procedure executed by the server according to Embodiment 3.
  • FIG. 10 is an explanatory diagram showing an outline of Embodiment 4.
  • FIG. 1 is an explanatory diagram showing a configuration example of a hygiene management system.
  • In the present embodiment, a hygiene management system that generates a heat map representing the hygiene state of a table from a moving image of the table on which food (objects) is placed will be described.
  • The hygiene management system includes an information processing device 1, a terminal 2, and a camera 3. The devices are connected for communication via a network N.
  • The heat map referred to here expresses numerical data for individual regions of an image as continuously varying values, rendered, for example, as continuously changing colors and shades.
  • The information processing device 1 is capable of various kinds of information processing and of transmitting and receiving information; it is, for example, a server computer or a personal computer.
  • In the present embodiment, the information processing device 1 is assumed to be a server computer and is referred to below as the server 1 for brevity.
  • As described later, the server 1 generates a heat map representing the hygiene state of the table from a moving image of the table, on which food is placed, captured from above.
  • The table is one used when food is served in buffet or smorgasbord style at a restaurant or the like, and is used by an unspecified number of customers (persons).
  • The server 1 detects, as first objects, customers' hands and fingers or utensils such as tongs used to take food from the moving image of the table, and generates a heat map representing the cleanliness of each area on the table (see FIG. 3A). The server 1 also detects, as second objects, objects related to cleaning the table, such as a cleaning tool (a duster or the like) used by an employee who cleans the table or gloves worn by the employee, and changes (resets) the heat map value of the area where the second object was detected (see FIG. 3B).
  • The server 1 may be a cloud server connected for communication with the terminal 2 and other devices via the Internet or the like, or a local server connected via a LAN (Local Area Network) or the like.
  • The terminal 2 is an information processing terminal used by employees, for example a tablet terminal, personal computer, or smartphone.
  • The server 1 outputs the generated heat map to the terminal 2 for display, presenting it to employees.
  • The camera 3 is an imaging device that captures the table on which food is placed.
  • For example, the camera 3 is installed directly above the table as shown in FIG. 1 and continuously images the table from directly above.
  • The installation position of the camera 3 is not limited to directly above the table; it may, for example, be installed diagonally above the table, and it suffices that it is installed at a position higher than the table's placement surface.
  • The server 1 acquires the moving image from the camera 3 and generates the heat map.
  • FIG. 2 is a block diagram showing a configuration example of the server 1.
  • The server 1 includes a control unit 11, a main storage unit 12, a communication unit 13, and an auxiliary storage unit 14.
  • The control unit 11 has one or more arithmetic processing units such as CPUs (Central Processing Units), MPUs (Micro-Processing Units), or GPUs (Graphics Processing Units), and performs various information processing and control processing by reading and executing a program P stored in the auxiliary storage unit 14.
  • The main storage unit 12 is a temporary storage area such as SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), or flash memory, and temporarily stores data necessary for the control unit 11 to execute arithmetic processing.
  • The communication unit 13 is a communication module for processing related to communication, and transmits and receives information to and from external devices.
  • The auxiliary storage unit 14 is a non-volatile storage area such as a large-capacity memory or hard disk, and stores the program P and other data necessary for the control unit 11 to execute processing.
  • The server 1 may be a multi-computer composed of a plurality of computers, or a virtual machine constructed virtually by software.
  • The server 1 may also be virtualized using container technology.
  • In the present embodiment, the server 1 is not limited to the above configuration and may include, for example, an input unit that accepts operation input and a display unit that displays images. The server 1 may also include a reading unit that reads a portable storage medium 1a such as a CD (Compact Disk)-ROM or DVD (Digital Versatile Disc)-ROM, and may read and execute the program P from the portable storage medium 1a. Alternatively, the server 1 may read the program P from a semiconductor memory 1b.
  • FIG. 3 is an explanatory diagram showing an outline of the first embodiment.
  • FIGS. 3A and 3B each conceptually show the table as seen from the camera 3's viewpoint (directly above). An outline of the present embodiment will be described with reference to FIG. 3.
  • As described above, the server 1 acquires from the camera 3 a moving image of the table, on which objects are placed, captured from above.
  • The table is, for example, a table installed in a restaurant (store) on which food is laid out when it is served in buffet or smorgasbord style.
  • The server 1 acquires the moving image from the camera 3 installed, for example, on the ceiling of the store, and generates the heat map.
  • In the present embodiment, a table used by an unspecified number of customers, as in buffet or smorgasbord service, is described as an example, but individual tables that customers use for their meals may also be targeted.
  • The target table is also not limited to tables used by customers; it may be a table used by employees, such as a workbench used in a kitchen.
  • Further, although the present embodiment uses a restaurant as an example, a table (fixture or the like) on which products are displayed at a retail store, general store, or the like may also be targeted. That is, the objects placed on the table are not limited to food and may be other objects.
  • The server 1 identifies the table area corresponding to the table from the moving image of the table, and divides the table area into a plurality of grids in a lattice pattern.
  • The unit (shape) into which the table area is divided is not limited to lattice-like grids and may be any unit of division suitable for generating the heat map.
  • The server 1 detects specific objects from the table area.
  • The objects to be detected include a first object and a second object.
  • The first object is, for example, a customer's hand or finger, or a utensil such as tongs used when taking food; it is any object other than the second object, that is, an object that comes into contact with the table for a purpose other than cleaning work.
  • The second object is, for example, a cleaning tool (a duster or the like) used by an employee during cleaning, or gloves worn by the employee during cleaning; it is an object related to the work of cleaning the table.
  • FIG. 3A illustrates how a heat map is generated based on the detection result of the first object.
  • For convenience of illustration, portions displayed in color on the heat map are shown by hatching.
  • In FIGS. 3A and 3B, the objects (food) on the table are not shown.
  • When the first object is detected from the table area, the server 1 generates a heat map in which a value is set for each region within the table area, based on the detection result of the first object.
  • For example, the server 1 generates a heat map in which the value of the region where the first object is detected is set according to how long the first object is detected there.
  • The "region where the first object is detected" refers to, for example, the grids that the first object overlaps and the grids around them; alternatively, a circular area within a certain distance of the detected first object's center of gravity may be targeted. The same applies to the second object described later.
  • The server 1 sets a larger heat map value for a region (grid) the longer the first object is detected there. For example, so that the heat map value becomes larger the closer a cell is to the position where the first object is detected, the server 1 sets a large increment for the grids the first object overlaps and a small increment for the surrounding grids. The server 1 sets each grid's increment according to the detection time and generates the heat map.
  • In the above description, the heat map value is set according to the time for which the first object is detected, but the present embodiment is not limited to this.
  • For example, the server 1 may set the heat map value according to the number of times the first object is detected, in addition to or instead of the detection time.
  • The detection time and detection count of the first object are merely examples of setting criteria; the heat map value may be set according to other criteria (for example, the type of the first object, or the presence or absence of contact as in Embodiment 2 described later). In short, it suffices that the server 1 can generate a heat map based on the detection result of the first object. A rough sketch of such a grid update follows below.
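  • As a rough illustration, a grid update along these lines could be implemented as follows. This is a minimal sketch, not the patented implementation: the function names, the 8-neighbor falloff, and the increment values are all assumptions made for the example.

```python
import numpy as np

def update_heatmap(heatmap: np.ndarray,
                   detected_cells: list[tuple[int, int]],
                   dt: float,
                   peak_inc: float = 1.0,
                   neighbor_inc: float = 0.3) -> None:
    """Raise heat map values around cells where a first object was detected.

    heatmap        -- 2D grid of hygiene values covering the table area
    detected_cells -- grid cells the first object overlaps in this frame
    dt             -- time (seconds) covered by this detection step
    """
    rows, cols = heatmap.shape
    for r, c in detected_cells:
        heatmap[r, c] += peak_inc * dt  # large increment at the detection position
        # Smaller increment for the 8 surrounding cells so values taper off
        # with distance from the detected position.
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if (dr, dc) == (0, 0):
                    continue
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    heatmap[rr, cc] += neighbor_inc * dt
```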
  • In the above description, the heat map is generated from the moving image alone, but the present embodiment is not limited to this.
  • For example, the server 1 may generate the heat map based also on environmental values such as the temperature and humidity around the table.
  • In this case, sensors such as a temperature sensor and a humidity sensor are installed near the table, and the server 1 acquires environmental values such as temperature and humidity together with the moving image.
  • The server 1 then weights the heat map values based on the acquired environmental values. For example, when the temperature, humidity, or the like is equal to or above a certain value, the server 1 enlarges the increment applied to the heat map value when the first object is detected.
  • Alternatively, the server 1 may uniformly increase the values of all grids in the table area regardless of whether the first object is detected. In this way, the heat map may be generated in consideration of the environment in which the food is placed; a sketch of such weighting follows below.
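  • One way to fold this environmental weighting into the grid update above is sketched here; the thresholds and weighting factor are placeholder assumptions, not values from the source.

```python
def environment_weight(temperature_c: float,
                       humidity_pct: float,
                       temp_threshold: float = 25.0,
                       humidity_threshold: float = 70.0,
                       factor: float = 1.5) -> float:
    """Return a multiplier for heat map increments based on ambient conditions.

    When temperature or humidity is at or above its threshold, detections
    are weighted more heavily, on the assumption that hygiene degrades
    faster in warm or humid conditions.
    """
    if temperature_c >= temp_threshold or humidity_pct >= humidity_threshold:
        return factor
    return 1.0

# Usage with the earlier sketch:
#   heatmap[r, c] += peak_inc * dt * environment_weight(temp, humidity)
```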
  • FIG. 3B illustrates how the value of the heat map is changed when the second object is detected.
  • When the second object is detected from the table area, the server 1 changes the heat map value of the region where the second object was detected. Specifically, the server 1 resets the heat map value of that region. Here, "resetting the heat map value" means returning it to the initial value (for example, "0") it had before the first object was detected.
  • That is, the server 1 resets the heat map value of the region where the second object is detected, in other words the region the second object has passed over through a wiping motion or the like, and erases the color display of that region.
  • The server 1 may also recognize the motion of the second object from the moving image and reset the heat map value according to the recognized motion. Specifically, the server 1 recognizes motions such as the second object moving at or above a certain speed or moving repeatedly, and, when such a motion is recognized, resets the heat map value of the region the second object has passed over. This makes it possible to exclude cases where, for example, a cleaning tool (a duster or the like) has merely been set down, and to reset the heat map value appropriately.
  • When a region whose value has not been reset remains in the table area, the server 1 may notify the terminal 2 that an uncleaned area remains. This makes it possible to prevent spots from being left unwiped and to support the cleaning work even more effectively.
  • In the above description, the heat map value is reset when the second object is detected, but instead of fully restoring the initial value it had before the first object was detected, a fixed amount may simply be subtracted. That is, when the second object is detected, the server 1 only needs to change the heat map value so that it becomes the same as, or closer to, the state (value) before the first object was detected; a full reset is not mandatory. A sketch of both variants follows below.
  • As described above, the server 1 generates a heat map according to the detection result of the first object and, when the second object is detected, resets the heat map value of the region where the second object was detected. This makes it possible to visualize areas that require cleaning and to appropriately support hygiene management.
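  • The reset-or-subtract behavior could be sketched as follows; the `full_reset` flag, the decrement amount, and the initial value are assumptions for illustration only.

```python
def clean_region(heatmap, cleaned_cells, full_reset=True,
                 decrement=0.5, initial=0.0):
    """Lower heat map values where the second object (cleaning tool) passed.

    full_reset=True  -- restore the initial value, as if the area were never touched
    full_reset=False -- subtract a fixed amount, moving the value closer to initial
    """
    for r, c in cleaned_cells:
        if full_reset:
            heatmap[r, c] = initial
        else:
            heatmap[r, c] = max(initial, heatmap[r, c] - decrement)
```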
  • FIG. 4 is an explanatory diagram showing an example of a heat map display screen.
  • FIG. 4 illustrates an example of a display screen when the terminal 2 displays the above heat map.
  • For example, the terminal 2 receives the heat map output from the server 1 and displays this screen.
  • The screen includes a table selection field 41 and a heat map 42.
  • The terminal 2 displays the heat map 42 corresponding to the table selected in the table selection field 41.
  • The terminal 2 displays an image in which the heat map 42 of the selected table is superimposed on the moving image captured by the camera 3.
  • The server 1 successively generates the heat map from the moving image, superimposes it on the image being captured, and causes the terminal 2 to display it.
  • The terminal 2 also displays a table information field 43 on the screen.
  • The table information field 43 is a list showing information on each object (food) placed on the table.
  • For example, the terminal 2 displays the name, position, and the like of each food item placed on the table, along with each item's placement time and whether it needs to be replaced.
  • The server 1 detects the first and second objects from the table area and also detects (recognizes) the food placed on the table. The server 1 then counts the placement time of each food item and determines whether it should be replaced. For example, the server 1 compares the placement time with a predetermined threshold and, if the time is equal to or greater than the threshold, determines that the food should be replaced. The server 1 may, for example, vary the threshold according to the type of food detected (whether or not it is perishable, etc.).
  • In addition to food, the server 1 may also target tableware (dishes) set out on the table and utensils such as tongs used when taking food.
  • In the above description, the necessity of replacement is determined from the placement time alone, but the present embodiment is not limited to this; the heat map value may also be taken into account.
  • For example, the server 1 sets a lower threshold the larger the heat map value of the area where the food is placed, and a higher threshold the smaller that value, and compares the placement time against it. By referring not only to the placement time but also to the heat map value in this way, the necessity of replacement can be determined more suitably (see the sketch below).
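  • The threshold comparison might be sketched as follows; the base threshold and the rule for shrinking it as the heat map value grows are assumed for illustration, since the source does not specify them.

```python
def needs_replacement(placement_time_min: float,
                      heat_value: float,
                      base_threshold_min: float = 60.0,
                      heat_scale: float = 0.5) -> bool:
    """Decide whether a food item should be replaced.

    The effective threshold shrinks as the heat map value of the area
    where the food sits grows, so items in frequently touched areas are
    judged ready for replacement sooner.
    """
    threshold = base_threshold_min / (1.0 + heat_scale * heat_value)
    return placement_time_min >= threshold
```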
  • The terminal 2 displays, in the table information field 43, each food item's placement time counted by the server 1 and whether it needs replacement. For example, the terminal 2 lists the placement times of the food items and highlights those determined to need replacement. In FIG. 4, for convenience, the highlighted state is shown in bold. This makes it possible to support not only the hygiene management of the table but also the replacement of food.
  • FIG. 5 is a flowchart showing an example of a processing procedure executed by the server 1.
  • The processing executed by the server 1 will be described with reference to FIG. 5.
  • The control unit 11 of the server 1 acquires, from the camera 3, a moving image of the table on which objects are placed, captured from above (step S11).
  • The control unit 11 identifies the table area corresponding to the table from the moving image, and detects the specific objects and the objects placed on the table from the table area (step S12).
  • The objects to be detected include the first object and the second object.
  • The first object is, for example, a customer's hand or finger, or a utensil such as tongs used when taking food; it is any object other than the second object, that is, an object that comes into contact with the table for a purpose other than cleaning.
  • The second object is, for example, a cleaning tool or gloves used by an employee during cleaning; it is an object related to the work of cleaning the table.
  • The object placed on the table is, for example, food, which needs to be replaced periodically.
  • The control unit 11 determines whether the first object is detected from the moving image (step S13). When it is determined that the first object has been detected (S13: YES), the control unit 11 generates a heat map representing the hygiene state of the table based on the detection result of the first object (step S14). Specifically, the control unit 11 generates a heat map in which the value of the region where the first object is detected is set according to the time, number of times, and so on for which the first object is detected. The control unit 11 may generate the heat map by referring to the environmental values (temperature, humidity, etc.) around the table in addition to the detection result of the first object.
  • After executing step S14, or when NO in step S13, the control unit 11 determines whether the second object is detected from the moving image (step S15). When it is determined that the second object has been detected (S15: YES), the control unit 11 resets the heat map value of the region where the second object was detected (step S16).
  • After executing step S16, or when NO in step S15, the control unit 11 determines whether a predetermined time has elapsed (step S17). When it is determined that the predetermined time has not elapsed (S17: NO), the control unit 11 repeats steps S11 to S16 until it does. When it is determined that the predetermined time has elapsed (S17: YES), the control unit 11 counts the placement time of each object (food) detected from the moving image in step S12 (step S18). Based on the counted placement time, the control unit 11 determines whether the objects placed on the table should be replaced (step S19). In response to a request from the terminal 2, the control unit 11 outputs the heat map of the table, the placement time of each object on the table, the determination results regarding replacement, and so on (step S20), and returns the process to step S11.
  • As described above, according to Embodiment 1, store employees can be supported more appropriately because the placement time is managed and the necessity of replacing objects (food) is determined at the same time as the heat map is generated.
  • Also, according to Embodiment 1, by generating the heat map based on the environmental values around the table, a heat map that takes into account the environment in which the objects (food) are placed can be presented.
  • FIG. 6 is an explanatory diagram showing an outline of the second embodiment.
  • FIG. 6 illustrates, as seen from the side of the table, a situation in which the first object (a customer's hand, tongs, or the like) is approaching the table.
  • An outline of the present embodiment will be described with reference to FIG. 6.
  • As in Embodiment 1, the server 1 detects the first object (and the second object) from the table area and generates a heat map. In doing so, the server 1 estimates the depth (distance) from the camera 3 to the first object (and the table) and determines, based on the estimated depth, whether the first object has touched the table.
  • In the present embodiment, the camera 3 is configured as a depth camera equipped with a depth sensor.
  • The server 1 acquires from the camera 3 a range image (moving image) in which a depth is attached to each pixel, and refers to the depth of the pixels corresponding to the position where the first object is detected.
  • The depth camera is merely one example of a depth estimation means, and the present embodiment is not limited to it.
  • For example, a plurality of cameras 3 (a stereo camera) may be installed, and the depth of the first object may be estimated from the parallax between the cameras 3.
  • Alternatively, the server 1 may prepare a machine learning model (for example, a neural network) trained to estimate the depth of the first object when given the moving image as input, and estimate the depth using that model.
  • In short, the depth estimation means is not limited to a depth camera.
  • The server 1 estimates the depth from the camera 3 (the imaging point of the moving image) to the first object and to the table, respectively.
  • The depth of the table may be a fixed value rather than being estimated from the image.
  • The server 1 compares the depth of the first object with the depth of the table and determines whether the first object has touched the table (see the sketch below).
  • Here, "contact with the table" includes not only contact with the table itself but also contact with an object (food) placed on the table. It can also cover not only cases where the first object actually touches the table but also cases where the first object is sufficiently close to it (where the difference between the depth of the first object and the depth of the table is at or below a certain value).
  • The server 1 generates the heat map according to this determination result. That is, when it determines that the first object has made contact, the server 1 increases the value of the region where the contact occurred according to the detection time (contact time), the number of times, and so on, and generates the heat map.
  • For the second object, the server 1 performs the same processing: it estimates the depth from the camera 3 to the second object and determines whether the second object has touched the table. When it determines that contact has occurred, the server 1 resets the heat map value of the region where the second object made contact.
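  • Assuming per-pixel depth from a depth camera and a fixed closeness tolerance, the contact test could be as simple as the following sketch; the tolerance value is an assumption, not a figure from the source.

```python
def touched_table(object_depth_m: float,
                  table_depth_m: float,
                  tolerance_m: float = 0.03) -> bool:
    """Return True when the object is at, or within tolerance of, the table surface.

    Depths are distances from the camera, so an object resting on the table
    has a depth close to the table's depth; a clearly smaller depth means
    the object is still above the surface.
    """
    return abs(table_depth_m - object_depth_m) <= tolerance_m
```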
  • FIG. 7 is a flowchart showing an example of a processing procedure executed by the server 1 according to the second embodiment.
  • When the first object is detected from the moving image, the control unit 11 of the server 1 estimates the depth of the first object and determines, based on the estimated depth, whether the first object has touched the table (step S201).
  • When it is determined that the first object has made contact (S201: YES), the control unit 11 generates a heat map in which the value of the region where the first object made contact is increased (step S202). After executing step S202, or when NO in step S13 or S201, the control unit 11 moves the process to step S15.
  • When the second object is detected, the control unit 11 estimates the depth of the second object and determines, based on the estimated depth, whether the second object has touched the table (step S203). When it is determined that the second object has made contact (S203: YES), the control unit 11 resets the heat map value of the region where the second object made contact (step S204). After executing step S204, or when NO in step S15 or S203, the control unit 11 moves the process to step S17.
  • As described above, according to Embodiment 2, the heat map can be generated more suitably by determining whether the first object and/or the second object has actually touched the table.
  • FIG. 8 is an explanatory diagram showing an outline of the third embodiment.
  • FIG. 8 illustrates, as seen from the side of the table, a situation in which customers (persons) are present around the table. An outline of the present embodiment will be described with reference to FIG. 8.
  • In the present embodiment, a thermography sensor 4 that measures a thermal image showing the temperature distribution around the table is installed.
  • The server 1 acquires the moving image from the camera 3 and also acquires the thermal image from the thermography sensor 4.
  • When generating the heat map, the server 1 identifies peripheral-person information about the people present around the table and refers to it in generating the heat map.
  • Peripheral person information includes, for example, the density and behavior of people around the table, as well as the body temperature of the surrounding people.
  • For example, the server 1 recognizes the people present around the table from the moving image and identifies the density (number) of those people. The server 1 also identifies the behavior of the surrounding people from the moving image.
  • The behavior to be identified is, for example, conversation, coughing, sneezing, and the like. Further, the server 1 identifies, from the thermal image, the body temperature of the people recognized in the moving image.
  • The server 1 generates the heat map based on the peripheral-person information identified above. For example, when the density of people around the table is at or above a certain value, the server 1 weights the heat map values by enlarging the increment for the region where the first object is detected. Alternatively, when the server 1 identifies behavior such as conversation by a surrounding person, it sets larger heat map values for the table area within a certain distance of that person. Alternatively, when a surrounding person's body temperature is at or above a certain value, the server 1 sets larger heat map values for the table area within a certain distance of that person.
  • The above processing is only an example; it suffices that the heat map can be generated based on the peripheral-person information. One illustrative weighting scheme is sketched below.
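  • A possible way to combine the peripheral-person factors into a single increment multiplier is sketched here; every threshold and factor is an assumption chosen for the example, not a value from the source.

```python
def person_weight(density: int,
                  behaviors: set[str],
                  max_body_temp_c: float,
                  density_threshold: int = 5,
                  fever_threshold_c: float = 37.5) -> float:
    """Return a multiplier for heat map increments based on nearby people."""
    weight = 1.0
    if density >= density_threshold:                      # crowded table
        weight *= 1.5
    if behaviors & {"conversation", "cough", "sneeze"}:   # risky behavior seen
        weight *= 1.5
    if max_body_temp_c >= fever_threshold_c:              # possible fever nearby
        weight *= 2.0
    return weight
```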
  • FIG. 9 is a flowchart showing an example of a processing procedure executed by the server 1 according to the third embodiment.
  • In the present embodiment, the server 1 executes the following processing.
  • The control unit 11 of the server 1 acquires a thermal image showing the temperature distribution around the table (step S301).
  • The control unit 11 identifies peripheral-person information about the people (customers) present around the table based on the moving image acquired in step S11 and/or the thermal image acquired in step S301 (step S302). Specifically, as described above, the control unit 11 identifies the density of people around the table, their behavior (such as the presence or absence of conversation), and, from the thermal image, the body temperature of the surrounding people.
  • The control unit 11 then moves the process to step S13.
  • When it is determined that the first object has been detected from the moving image (S13: YES), the control unit 11 generates a heat map based on the detection result of the first object and the peripheral-person information (step S303). Specifically, the control unit 11 sets the heat map value of each region within the table area according to the density, behavior, body temperature, and so on of the people around the table, in addition to the time, number of times, and so on for which the first object is detected. The control unit 11 then moves the process to step S15.
  • As described above, by recognizing the people around the table from the moving image, identifying peripheral-person information about them, and referring to it when generating the heat map, the hygiene state of the table can be managed more suitably.
  • Further, by installing the thermography sensor 4 and identifying the body temperature of the surrounding people, a person who may be suffering from an illness or the like can be identified and the heat map can be generated more suitably.
  • FIG. 10 is an explanatory diagram showing an outline of the fourth embodiment.
  • FIG. 10 illustrates how the cleanliness of a store is posted on a review site. An outline of the present embodiment will be described with reference to FIG. 10.
  • As described above, the server 1 generates heat maps representing the hygiene state of the tables in the store where objects (food) are placed.
  • In the present embodiment, a plurality of cameras 3, 3, 3, ... are installed corresponding to the plurality of tables installed in the store, and a heat map is generated for each table.
  • The server 1 calculates the cleanliness of the store as a whole based on the heat map data generated for each table.
  • Specifically, the server 1 calculates the store's cleanliness based on the update history of each table's heat map.
  • The specific method of calculating the cleanliness is not particularly limited, but, for example, the server 1 calculates it according to the area (number of grids) of the regions whose heat map values are at or above a threshold. Specifically, the server 1 totals, over units such as the past week or month, the area of the regions whose heat map values are at or above the threshold, divides the total by the area of the entire table, and thereby calculates, for each table, the proportion (area ratio) of the table area at or above the threshold. The server 1 then calculates the cleanliness by taking the average, median, or the like of the ratios across all tables.
  • The cleanliness may be expressed numerically, or it may be expressed in multiple grades of evaluation such as "A rank", "B rank", "C rank", and so on.
  • Alternatively, the server 1 may calculate the cleanliness based on the number of times the heat map has been reset by detection of the second object. Specifically, the server 1 counts the number of resets for each table over units such as the past week or month, and calculates the cleanliness by taking the average, median, or the like of the reset counts across all tables. As described above, the method of calculating the cleanliness is not particularly limited; one possible area-ratio calculation is sketched below.
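  • The area-ratio approach could be sketched as follows; the threshold, the rank cutoffs, and the use of a simple mean are illustrative assumptions.

```python
import numpy as np

def store_cleanliness(table_heatmaps: list[np.ndarray],
                      threshold: float = 5.0) -> str:
    """Aggregate per-table heat maps into a store-level cleanliness rank.

    For each table, compute the fraction of grid cells whose value is at
    or above the threshold, then average across tables and map the result
    to a letter rank.
    """
    ratios = [float((hm >= threshold).mean()) for hm in table_heatmaps]
    dirty_ratio = float(np.mean(ratios))
    if dirty_ratio < 0.1:
        return "A rank"
    if dirty_ratio < 0.3:
        return "B rank"
    return "C rank"
```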
  • The server 1 outputs the calculated cleanliness of the store to a Web server (not shown) that manages the review site, so that it is displayed on the review site. For example, as shown in FIG. 10, the cleanliness of each store using this system is posted on the review site as "A rank", "B rank", and so on. This makes it possible to present to users of the review site data that objectively evaluates each store's hygiene management efforts.
  • FIG. 11 is a flowchart showing the procedure of the cleanliness calculation process. Based on FIG. 11, the processing contents executed by the server 1 in the present embodiment will be described.
  • The control unit 11 of the server 1 acquires the heat map data (update history) for each of the plurality of tables installed in the store (step S401).
  • The control unit 11 calculates the cleanliness of the store based on the acquired heat map data (step S402). For example, the control unit 11 calculates an index value for each table according to the area of the table region whose heat map values are at or above the threshold, the number of times the heat map values have been reset, and so on, and calculates the cleanliness of the store as a whole by taking the average, median, or the like across the tables.
  • The control unit 11 outputs the calculated cleanliness to the external Web server (step S403) and ends the series of processes.
  • As described above, according to Embodiment 4, the cleanliness of a store can be objectively evaluated based on the heat map of each table.
  • FIG. 12 is a schematic diagram showing the configuration of the control unit 11 according to the fifth embodiment.
  • The control unit 11 according to the present embodiment has the same functions as in Embodiment 1.
  • The control unit 11 functions as an acquisition unit 11a, a detection unit 11b, a generation unit 11c, and a change unit 11d by reading and executing the program P stored in the auxiliary storage unit 14.
  • The acquisition unit 11a acquires from the camera 3 a moving image of the table, on which food is placed, captured from above.
  • The detection unit 11b detects the first object and the second object from the table area corresponding to the table in the moving image. Specifically, the detection unit 11b detects, from the table area, a second object related to the work of cleaning the table and a first object other than the second object. Here, the detection unit 11b may detect the first object and/or the second object based on the motion, over a certain period of time, of an object captured in the moving image.
  • The generation unit 11c generates a heat map in which a value is set for each region within the table area, based on the detection result of the first object. Specifically, the generation unit 11c generates a heat map in which the value of the region where the first object is detected is set according to the time or the number of times for which the first object is detected.
  • When the second object is detected, the change unit 11d changes the heat map value of the region where the second object was detected. Specifically, the change unit 11d resets, or subtracts from, the value of the region where the second object was detected.
  • The control unit 11 may further function as a depth estimation unit 11e.
  • The depth estimation unit 11e estimates the depth from the imaging point where the moving image is captured to the first object or the second object, and determines, based on the estimated depth, whether the first object or the second object has touched the table.
  • In this case, the generation unit 11c generates or changes the heat map value according to the determination result.
  • The control unit 11 may further function as a food determination unit 11f.
  • The food determination unit 11f detects the food placed on the table from the moving image and determines, based on the food's placement time, whether the food should be replaced.
  • The control unit 11 may further function as a recognition unit 11g.
  • The recognition unit 11g recognizes the people present around the table from the moving image and identifies peripheral-person information about the recognized people.
  • In this case, the generation unit 11c generates the heat map based on the peripheral-person information and the detection result of the first object.
  • The recognition unit 11g may also acquire a thermal image showing the temperature distribution around the table and identify the people's body temperature based on the thermal image. In this case, the control unit 11 generates the heat map according to the people's body temperature.
  • The control unit 11 may further function as a cleanliness calculation unit 11h.
  • The cleanliness calculation unit 11h calculates the cleanliness of the store based on the heat maps of the plurality of tables, as described in Embodiment 4.
  • The control unit 11 may also acquire the environmental values around the table; in this case, the generation unit 11c generates the heat map based on the environmental values and the detection result of the first object.
  • FIG. 13 is a flowchart showing an example of a processing procedure executed by the server 1.
  • In step S12, the control unit 11 of the server 1 detects objects captured in the moving image.
  • When the control unit 11 detects what appears to be the first object from the moving image, it does not immediately determine that the first object has been detected; instead, it determines that an object that is a candidate for the first object has been detected (S501).
  • The control unit 11 stores in the main storage unit 12 time-series data of the first-object motion, indicating the motion of the candidate object over a certain period of time.
  • The control unit 11 analyzes the time-series data of the first-object motion by comparing it with predetermined motions set in advance (S502).
  • When the control unit 11 determines that the time-series data of the first-object motion is unique to the first object, it regards the candidate as the first object and determines that the first object has been detected (S503).
  • When the control unit 11 determines that the first object has been detected (S503: YES), it generates a heat map representing the hygiene state of the table (S504).
  • Specifically, the control unit 11 generates a heat map in which the value of the region where the first object is detected is set according to the time, number of times, and so on counted from when the candidate for the first object was detected.
  • Similarly, the control unit 11 determines whether the second object is detected from the moving image.
  • When the control unit 11 detects what appears to be the second object from the moving image, it does not immediately determine that the second object has been detected; instead, it determines that an object that is a candidate for the second object has been detected (S505).
  • The control unit 11 stores in the main storage unit 12 time-series data of the second-object motion, indicating the motion of the candidate object over a certain period of time.
  • The control unit 11 analyzes the time-series data of the second-object motion by comparing it with predetermined motions set in advance (S506).
  • When the control unit 11 determines that the time-series data of the second-object motion is unique to the second object, it regards the candidate as the second object and determines that the second object has been detected (S507).
  • When the control unit 11 determines that the second object has been detected, it changes (resets, etc.) the heat map value of each region within the table area according to the time, number of times, and so on counted from when the candidate for the second object was detected (S508).
  • As described above, the server 1 according to Embodiment 5 detects the first object and/or the second object based on the motion, over a certain period of time, of an object captured in the moving image. This improves detection accuracy.
  • With a method that identifies objects based only on features obtained from shape and color, without using time-series data, objects whose shape changes dynamically may be missed.
  • In the present embodiment, by contrast, objects are identified using time-series data, so such detection omissions can be avoided. As a result, object detection accuracy can be improved.
  • For example, when the server 1 identifies the motion of wiping the table with a duster based on the shape of the hand, the color of the cloth under the hand, time-series data of the wiping hand's position, and so on, false detections can be avoided compared with methods that do not use time-series data. As a result, object detection accuracy can be improved.
  • Furthermore, even when the first object and the second object consist of the same physical object, they can be distinguished from each other.
  • With a method that identifies objects based on feature values obtained from shape and color, a first object that is a hand touching the table and a second object that is a hand wiping the table may be confused during detection. This is because, since the first object and the second object both consist of the same object, a "hand", there may be no difference between their feature values obtained from shape and color.
  • In the server 1 according to Embodiment 5, objects are identified using time-series data, so even a first object and a second object consisting of the same object can be distinguished and detected by their motion. A sketch of such motion-based disambiguation follows below.
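  • As a rough sketch of the motion-based disambiguation, a tracked candidate trajectory might be classified as follows. The speed and direction-reversal heuristics are assumptions chosen for the example; the source only requires that candidate motion be compared against predetermined motions.

```python
import numpy as np

def classify_candidate(positions: np.ndarray, fps: float,
                       wipe_speed: float = 0.2,
                       min_reversals: int = 2) -> str:
    """Classify a tracked hand/tool trajectory as a first or second object.

    positions -- (N, 2) array of the candidate's center position per frame
    A sustained, repeatedly reversing sweep suggests wiping (second object);
    anything else is treated as a first object (e.g. a hand reaching for food).
    """
    if len(positions) < 3:
        return "first object"  # too little history to recognize wiping
    velocities = np.diff(positions, axis=0) * fps          # units per second
    speeds = np.linalg.norm(velocities, axis=1)
    # Count direction reversals along the dominant axis of movement,
    # which captures the back-and-forth pattern of wiping.
    axis = int(np.argmax(np.ptp(positions, axis=0)))
    signs = np.sign(np.diff(positions[:, axis]))
    reversals = int(np.sum(signs[1:] * signs[:-1] < 0))
    if speeds.mean() >= wipe_speed and reversals >= min_reversals:
        return "second object"   # wiping motion
    return "first object"
```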

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • General Business, Economics & Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Primary Health Care (AREA)
  • Game Theory and Decision Science (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

[Problem] To provide an information processing device, etc. that make it possible to suitably support sanitary management. [Solution] A program of an information processing device according to the present invention acquires a video capturing, from above, a table on which a food product is placed, detects a first target object and a second target object from a table area in the video, the table area corresponding to the table, generates, on the basis of the detection result of the first target object, a heat map in which values are set for individual areas in the table area, and, when the second target object has been detected, causes a computer to execute processing for changing the value of the heat map for an area in which the second target object has been detected.

Description

Information processing device, information processing method, and program
The present invention relates to an information processing device, an information processing method, and a program.
From the viewpoint of preventing food poisoning and the like, methods of supporting hygiene management at restaurants that serve food have been proposed (for example, Patent Document 1).
Patent Document 1: Japanese Unexamined Patent Publication No. 2010-287166
In recent years, with the spread of viral infectious diseases, hygiene management has become increasingly important at restaurants and similar establishments. For example, restaurants are required to keep served items such as food, tableware, and in-store equipment clean, but it can be difficult to judge whether these are actually being kept clean.
A program according to one aspect acquires a moving image of a table on which food is placed, captured from above; detects a first object and a second object from the table area corresponding to the table in the moving image; generates, based on the detection result of the first object, a heat map in which a value is set for each region within the table area; and, when the second object is detected, causes a computer to execute processing that changes the heat map value of the region where the second object was detected.
In one aspect, hygiene management can be suitably supported.
FIG. 1 is an explanatory diagram showing a configuration example of the hygiene management system. FIG. 2 is a block diagram showing a configuration example of the server. FIGS. 3A and 3B are explanatory diagrams showing an outline of Embodiment 1. FIG. 4 is an explanatory diagram showing an example of a heat map display screen. FIG. 5 is a flowchart showing an example of a processing procedure executed by the server. FIG. 6 is an explanatory diagram showing an outline of Embodiment 2. FIG. 7 is a flowchart showing an example of the processing procedure executed by the server according to Embodiment 2. FIG. 8 is an explanatory diagram showing an outline of Embodiment 3. FIG. 9 is a flowchart showing an example of the processing procedure executed by the server according to Embodiment 3. FIG. 10 is an explanatory diagram showing an outline of Embodiment 4. FIG. 11 is a flowchart showing the procedure of the cleanliness calculation process. FIG. 12 is a schematic diagram showing the configuration of the control unit 11 according to Embodiment 5. FIG. 13 is a flowchart showing an example of the processing procedure executed by the server according to Embodiment 5.
Hereinafter, the present invention will be described in detail with reference to the drawings showing its embodiments.
(Embodiment 1)
FIG. 1 is an explanatory diagram showing a configuration example of the hygiene management system. In the present embodiment, a hygiene management system that generates a heat map representing the hygiene state of a table from a moving image of the table, on which food (objects) is placed, will be described. The hygiene management system includes an information processing device 1, a terminal 2, and a camera 3. The devices are connected for communication via a network N. The heat map referred to here expresses numerical data for individual regions of an image as continuously varying values, rendered, for example, as continuously changing colors and shades.
The information processing device 1 is capable of various kinds of information processing and of transmitting and receiving information; it is, for example, a server computer or a personal computer. In the present embodiment, the information processing device 1 is assumed to be a server computer and is referred to below as the server 1 for brevity. As described later, the server 1 generates a heat map representing the hygiene state of the table from a moving image of the table, on which food is placed, captured from above. The table is one used when food is served in buffet or smorgasbord style at a restaurant or the like, and is used by an unspecified number of customers (persons). The server 1 detects, as first objects, customers' hands and fingers or utensils such as tongs used to take food from the moving image of the table, and generates a heat map representing the cleanliness of each area on the table (see FIG. 3A). The server 1 also detects, as second objects, objects related to cleaning the table, such as a cleaning tool (a duster or the like) used by an employee who cleans the table or gloves worn by the employee, and changes (resets) the heat map value of the area where the second object was detected (see FIG. 3B).
The server 1 may be a cloud server connected for communication with the terminal 2 and other devices via the Internet or the like, or a local server connected via a LAN (Local Area Network) or the like.
The terminal 2 is an information processing terminal used by employees, for example a tablet terminal, personal computer, or smartphone. The server 1 outputs the generated heat map to the terminal 2 for display, presenting it to employees.
The camera 3 is an imaging device that captures the table on which food is placed. For example, the camera 3 is installed directly above the table as shown in FIG. 1 and continuously images the table from directly above. The installation position of the camera 3 is not limited to directly above the table; it may, for example, be installed diagonally above the table, and it suffices that it is installed at a position higher than the table's placement surface. The server 1 acquires the moving image from the camera 3 and generates the heat map.
FIG. 2 is a block diagram showing a configuration example of the server 1. The server 1 includes a control unit 11, a main storage unit 12, a communication unit 13, and an auxiliary storage unit 14.
The control unit 11 has one or more arithmetic processing units such as CPUs (Central Processing Units), MPUs (Micro-Processing Units), or GPUs (Graphics Processing Units), and performs various information processing and control processing by reading and executing a program P stored in the auxiliary storage unit 14. The main storage unit 12 is a temporary storage area such as SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), or flash memory, and temporarily stores data necessary for the control unit 11 to execute arithmetic processing. The communication unit 13 is a communication module for processing related to communication, and transmits and receives information to and from external devices. The auxiliary storage unit 14 is a non-volatile storage area such as a large-capacity memory or hard disk, and stores the program P and other data necessary for the control unit 11 to execute processing.
The server 1 may be a multi-computer composed of a plurality of computers, or a virtual machine constructed virtually by software. The server 1 may also be virtualized using container technology.
In the present embodiment, the server 1 is not limited to the above configuration and may include, for example, an input unit that accepts operation input and a display unit that displays images. The server 1 may also include a reading unit that reads a portable storage medium 1a such as a CD (Compact Disk)-ROM or DVD (Digital Versatile Disc)-ROM, and may read and execute the program P from the portable storage medium 1a. Alternatively, the server 1 may read the program P from a semiconductor memory 1b.
FIG. 3 is an explanatory diagram showing an outline of the first embodiment. FIGS. 3A and 3B each conceptually illustrate the table as seen from the viewpoint of the camera 3 (directly above). An outline of the present embodiment will be described with reference to FIG. 3.
As described above, the server 1 acquires from the camera 3 a moving image of the table on which objects are placed, captured from above. The table is, for example, a table installed in a restaurant (store) on which food is placed when food is served buffet-style or smorgasbord-style. The server 1 acquires the moving image from the camera 3 installed, for example, on the ceiling of the store, and generates a heat map.
In this embodiment, a table used by an unspecified number of customers, as in a buffet, is described as an example, but individual tables used by customers for meals may also be targeted. Further, the target table is not limited to tables used by customers; it may be a table used by employees, such as a workbench used in a kitchen.
Further, although a restaurant is described as an example in the present embodiment, a table (fixture or the like) on which products are displayed at, for example, a retail store or a general store may also be targeted. That is, the objects placed on the table are not limited to food and may be other objects.
The server 1 identifies the table area corresponding to the table from the moving image of the table. The server 1 then divides the table area into a plurality of grid cells. The division unit (shape) of the table area is not limited to a lattice of grid cells and may be any division unit suitable for generating the heat map.
The server 1 detects specific objects from the table area. The objects include a first object and a second object. The first object is, for example, a customer's hand or finger, or an implement such as tongs used to pick up food; in other words, an object other than the second object that contacts the table for a purpose other than cleaning work. The second object is, for example, a cleaning tool (such as a duster) used by an employee during cleaning, or a glove worn by an employee during cleaning, and is an object related to the cleaning work of the table.
FIG. 3A illustrates how a heat map is generated based on the detection result of the first object. In FIG. 3A, for convenience of illustration, the portions displayed in color on the heat map are shown by hatching. In FIGS. 3A and 3B, the objects (food) placed on the table are not shown. When the first object is detected from the table area, the server 1 generates a heat map in which the value of each region in the table area is set based on the detection result of the first object.
Specifically, the server 1 generates a heat map in which the value of the region where the first object was detected is set according to the time for which the first object was detected. The "region where the first object was detected" refers, for example, to the grid cells overlapped by the first object and the surrounding grid cells, but a circular region within a certain distance of the centroid of the detected first object may be used instead. The same applies to the second object described later.
The server 1 sets the heat map value of the region (grid cells) where the first object was detected larger the longer the first object was detected. For example, so that the heat map value becomes larger the closer a cell is to the position where the first object was detected, the server 1 sets a large increment for the cells overlapped by the first object and a small increment for the surrounding cells. The server 1 sets the increment of each cell according to the detection time and generates the heat map.
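For reference, the per-cell update described above could be sketched as follows. This is a minimal illustration only, not the disclosed implementation; the grid representation, the inverse-distance weighting, and all names and parameters (update_heatmap, base_rate, radius) are assumptions introduced here.

```python
import numpy as np

def update_heatmap(heatmap: np.ndarray, detect_cell: tuple[int, int],
                   dt: float, base_rate: float = 1.0, radius: int = 1) -> None:
    """Increase heat map values around a cell where the first object was seen.

    heatmap     -- 2D array of per-cell contamination values (the table grid)
    detect_cell -- (row, col) of the grid cell overlapped by the first object
    dt          -- time (seconds) the object was detected in this frame batch
    """
    r0, c0 = detect_cell
    rows, cols = heatmap.shape
    for r in range(max(0, r0 - radius), min(rows, r0 + radius + 1)):
        for c in range(max(0, c0 - radius), min(cols, c0 + radius + 1)):
            # Larger increment at the detected cell, smaller for neighbours,
            # and proportional to how long the object was detected.
            dist = max(abs(r - r0), abs(c - c0))
            heatmap[r, c] += base_rate * dt / (1 + dist)
```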
Although the heat map value is set according to the time for which the first object was detected in the above description, the present embodiment is not limited to this. For example, the server 1 may set the heat map value according to the number of times the first object was detected, in addition to or instead of the detection time. The detection time and number of detections of the first object are examples of setting criteria, and the heat map value may be set according to other criteria (for example, the type of the first object, or the presence or absence of contact as in the second embodiment described later). In this way, the server 1 need only be able to generate the heat map based on the detection result of the first object.
Further, in the present embodiment, the heat map is generated only from the moving image, but the present embodiment is not limited to this. For example, the server 1 may generate the heat map based on environmental values such as the temperature and humidity around the table. For example, sensors such as a temperature sensor and a humidity sensor are installed near the table, and the server 1 acquires environmental values such as temperature and humidity together with the moving image. The server 1 then weights the heat map values based on the acquired environmental values. For example, when the temperature, humidity, or the like is equal to or higher than a certain value, the server 1 sets a larger increment for the heat map values resulting from detection of the first object. Alternatively, the server 1 may, for example, uniformly increase the values of all grid cells in the table area regardless of whether the first object is detected. In this way, the heat map may be generated in consideration of the environment in which the food is placed.
FIG. 3B illustrates how the heat map values are changed when the second object is detected. When the second object is detected, the server 1 changes the heat map values of the region where the second object was detected. Specifically, the server 1 resets the heat map values of that region. "Resetting the heat map value" means setting it to the initial value (for example, "0") before the first object was detected. The server 1 resets the heat map values of the region where the second object was detected, that is, the region the second object passed through in a wiping motion or the like, and erases the color display of that region.
For example, the server 1 may recognize the motion of the second object from the moving image and reset the heat map values according to the recognized motion. Specifically, the server 1 recognizes a motion of moving the second object at a certain speed or higher, a motion of moving the second object back and forth, and the like, and when such a motion is recognized, resets the heat map values of the region the second object passed through. This makes it possible to reset the heat map values appropriately while excluding, for example, cases where a cleaning tool (duster or the like) has merely been set down.
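A minimal sketch of this motion-gated reset follows. The centroid-tracking representation, the speed threshold, and all names (reset_if_wiping, min_speed) are assumptions for illustration, not details fixed by the specification.

```python
import numpy as np

def reset_if_wiping(heatmap: np.ndarray,
                    track: list[tuple[float, tuple[int, int]]],
                    min_speed: float = 0.3) -> None:
    """Reset cells the second object passed through, but only when it is
    actually moving (i.e. wiping), not merely resting on the table.

    track -- time-stamped centroid cells [(t, (row, col)), ...] of the
             second object over recent frames
    """
    for (t0, c0), (t1, c1) in zip(track, track[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        # Cell-space speed; a duster just set down yields ~0 and is ignored.
        speed = np.hypot(c1[0] - c0[0], c1[1] - c0[1]) / dt
        if speed >= min_speed:
            heatmap[c1[0], c1[1]] = 0.0  # back to the pre-detection value
```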
Further, for example, when the heat map values are reset, if a region whose values have not been reset remains in the table area, the server 1 may notify the terminal 2 that there is an uncleaned region. This makes it possible to prevent areas from being left unwiped and to support the cleaning work more suitably.
Further, in the present embodiment, the heat map values are reset when the second object is detected, but instead of completely resetting them to the initial values before the first object was detected, a fixed amount may be subtracted from the values. That is, when the second object is detected, the server 1 need only change the heat map values so that they match or approach the state (values) before detection of the first object; the configuration of resetting the heat map values is not essential.
As described above, the server 1 generates the heat map according to the detection result of the first object, and when the second object is detected, resets the heat map values of the region where the second object was detected. This makes it possible to visualize the regions requiring cleaning and to suitably support hygiene management.
FIG. 4 is an explanatory diagram showing an example of a heat map display screen. FIG. 4 illustrates an example of the screen on which the terminal 2 displays the heat map described above. The terminal 2 receives the heat map output from the server 1 and displays this screen. The screen includes a table selection field 41 and a heat map 42. The terminal 2 displays the heat map 42 corresponding to the table selected in the table selection field 41. Specifically, the terminal 2 displays an image in which the heat map 42 of the selected table is superimposed on the moving image being captured by the camera 3. The server 1 sequentially generates the heat map from the moving image, superimposes it on the moving image being captured, and causes the terminal 2 to display it.
The terminal 2 also displays a table information field 43 on the screen. The table information field 43 is a list showing information on each object (food) placed on the table. In the table information field 43, the terminal 2 displays the name, position, and the like of each food placed on the table, as well as the placement time of each food and whether replacement is required.
In the present embodiment, the server 1 detects the first object and the second object from the table area, and also detects (recognizes) the food placed on the table. The server 1 then counts the placement time of each food and determines whether the food should be replaced. For example, the server 1 compares the placement time with a predetermined threshold and determines that the food should be replaced when the placement time is equal to or greater than the threshold. For example, the server 1 may change the threshold according to the type of the detected food (whether it is fresh food, etc.).
Although food is mentioned above as the object for which the necessity of replacement is determined, the present embodiment is not limited to this. For example, the server 1 may target tableware prepared on the table, such as serving plates and the tongs used to pick up food.
Further, although the necessity of replacement is determined above from the placement time alone, the present embodiment is not limited to this, and the necessity of replacement may be determined in consideration of the heat map values as well. For example, the server 1 sets the threshold lower the larger the heat map value of the region where the food is placed, sets it higher the smaller that value, and compares the threshold with the food's placement time. By referring not only to the placement time but also to the heat map values in this way, the necessity of replacement can be determined more suitably.
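The combined time-and-heat decision could look like the following sketch; the normalisation of the local heat value, the scaling of the threshold, and the names (needs_replacement, base_threshold, heat_scale) are illustrative assumptions.

```python
def needs_replacement(placed_seconds: float, local_heat: float,
                      base_threshold: float = 3600.0,
                      heat_scale: float = 0.5) -> bool:
    """Decide whether a food item should be replaced.

    The time threshold shrinks as the heat map value of the region where
    the food sits grows, so food in 'dirtier' regions is flagged sooner.
    """
    # local_heat is assumed normalised to [0, 1]; heat_scale controls how
    # strongly a hot region shortens the allowed placement time.
    threshold = base_threshold * (1.0 - heat_scale * min(local_heat, 1.0))
    return placed_seconds >= threshold
```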
The terminal 2 displays, in the table information field 43, the placement time of each food counted by the server 1 and whether replacement is required. For example, the terminal 2 displays a list of the placement times of the foods and highlights foods determined to need replacement. In FIG. 4, for convenience, highlighting is illustrated in bold. This makes it possible to support not only the hygiene management of the table but also the replacement of food.
FIG. 5 is a flowchart showing an example of a processing procedure executed by the server 1. The processing executed by the server 1 will be described with reference to FIG. 5.
The control unit 11 of the server 1 acquires from the camera 3 a moving image of the table on which objects are placed, captured from above (step S11). The control unit 11 identifies the table area corresponding to the table from the moving image, and detects the specific objects and the objects placed on the table from the table area (step S12). As described above, the objects to detect include the first object and the second object. The first object is, for example, a customer's hand or finger, or an implement such as tongs used to pick up food, and is an object other than the second object, that is, an object that contacts the table for a purpose other than cleaning. The second object is, for example, a cleaning tool or gloves used by an employee during cleaning, and is an object related to the cleaning work of the table. The objects placed on the table are, for example, food, and are objects that need to be replaced periodically.
The control unit 11 determines whether the first object has been detected from the moving image (step S13). When determining that the first object has been detected (S13: YES), the control unit 11 generates a heat map representing the hygiene state of the table based on the detection result of the first object (step S14). Specifically, the control unit 11 generates a heat map in which the value of the region where the first object was detected is set according to the time, number of times, and the like for which the first object was detected. The control unit 11 may generate the heat map by referring to the environmental values around the table (temperature, humidity, etc.) in addition to the detection result of the first object.
After executing the process of step S14, or in the case of NO in step S13, the control unit 11 determines whether the second object has been detected from the moving image (step S15). When determining that the second object has been detected (S15: YES), the control unit 11 resets the heat map values of the region where the second object was detected (step S16).
After executing the process of step S16, or in the case of NO in step S15, the control unit 11 determines whether a predetermined time has elapsed (step S17). When determining that the predetermined time has not elapsed (S17: NO), the control unit 11 repeats the processes of steps S11 to S16 until the predetermined time elapses. When determining that the predetermined time has elapsed (S17: YES), the control unit 11 counts the placement time of the objects (food) detected from the moving image in step S12 (step S18). Based on the counted placement times, the control unit 11 determines whether the objects placed on the table should be replaced (step S19). In response to a request from the terminal 2, the control unit 11 outputs the heat map of the table, the placement time of each object placed on the table, the determination results on the necessity of replacement, and the like (step S20), and returns the process to step S11.
In the flowchart of FIG. 5, the processes of steps S18 to S20 (counting the placement times) are executed consecutively after the processes of steps S11 to S16 (generating the heat map), but the processes of steps S11 to S16 and the processes of steps S18 to S20 may be executed independently of each other.
As described above, according to the first embodiment, by generating and presenting a heat map representing the hygiene state of the table, hygiene management of the table on which objects (food) are placed can be suitably performed.
Further, according to the first embodiment, by managing the placement times simultaneously with heat map generation and determining the necessity of replacing objects (food), store employees can be supported more suitably.
Further, according to the first embodiment, by generating the heat map based on the environmental values around the table, a heat map that also takes into account the environment in which the objects (food) are placed can be presented.
(Embodiment 2)
In this embodiment, a mode of determining whether the first object and the second object have contacted the table by estimating the depths of the first object and the second object will be described. In the following description, content overlapping with the first embodiment is given the same reference numerals, and its description is omitted.
FIG. 6 is an explanatory diagram showing an outline of the second embodiment. FIG. 6 illustrates the first object (a customer's hand, tongs, or the like) about to contact the table, as seen from directly beside the table. An outline of the present embodiment will be described with reference to FIG. 6.
As in the first embodiment, the server 1 detects the first object (and the second object) from the table area and generates the heat map. In this case, the server 1 estimates the depth (distance) from the camera 3 to the first object (and to the table), and determines whether the first object has contacted the table based on the estimated depth.
For example, in the present embodiment, the camera 3 is configured as a depth camera equipped with a depth sensor. The server 1 acquires from the camera 3 a range image (moving image) in which a depth is attached to each pixel, and refers to the depth of the pixels corresponding to the position where the first object was detected.
The depth camera is one example of a depth estimation means, and the present embodiment is not limited to this. For example, a plurality of cameras 3 (a stereo camera) may be installed, and the depth of the first object may be estimated from the parallax between the cameras 3. Alternatively, the server 1 may prepare a machine learning model (for example, a neural network) trained to estimate the depth of the first object from an input moving image, and estimate the depth using that model. In this way, the depth estimation means is not limited to a depth camera.
For example, the server 1 estimates the depths from the camera 3 (the point from which the moving image is captured) to the first object and to the table. The depth of the table need not be estimated from the image and may be a fixed value. The server 1 compares the depth of the first object with the depth of the table and determines whether the first object has contacted the table. "Contacting the table" includes not only contact with the table itself but also contact with an object (food) placed on the table. It may also include not only cases where the first object is actually in contact with the table but also cases where the first object is sufficiently close to the table (where the difference between the depth of the first object and the depth of the table is equal to or less than a certain value).
The server 1 generates the heat map according to the above determination result. That is, when determining that the first object has made contact, the server 1 increases the value of the region contacted by the first object according to the detection time (contact time), number of times, and so on, and generates the heat map.
The server 1 performs similar processing when the second object is detected from the moving image. That is, the server 1 estimates the depth from the camera 3 to the second object and determines whether the second object is in contact with the table. When determining that it is in contact with the table, the server 1 resets the heat map values of the region contacted by the second object.
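The depth comparison could be sketched as below. Reading per-pixel depth from a range image and allowing a closeness tolerance are described in this embodiment; the names (is_touching, tolerance_mm) and the median-based aggregation over the detected pixels are assumptions for illustration.

```python
import numpy as np

def is_touching(depth_image: np.ndarray, object_mask: np.ndarray,
                table_depth_mm: float, tolerance_mm: float = 20.0) -> bool:
    """Judge contact from a range image in which each pixel holds a depth (mm).

    object_mask    -- boolean mask of pixels where the object was detected
    table_depth_mm -- depth from the camera to the table surface (may be fixed)
    """
    if not object_mask.any():
        return False
    # Use the median depth over the detected pixels to be robust to noise.
    object_depth = float(np.median(depth_image[object_mask]))
    # Seen from the overhead camera, the object is "touching" when it is at,
    # or within tolerance of, the table surface.
    return table_depth_mm - object_depth <= tolerance_mm
```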
FIG. 7 is a flowchart showing an example of a processing procedure executed by the server 1 according to the second embodiment.
When determining that the first object has been detected (S13: YES), the control unit 11 of the server 1 estimates the depth of the first object and determines, based on the estimated depth, whether the first object has contacted the table (step S201). When determining that the first object has made contact (S201: YES), the control unit 11 generates a heat map in which the value of the region contacted by the first object is increased (step S202). After executing the process of step S202, or in the case of NO in step S13 or S201, the control unit 11 shifts the process to step S15.
When determining that the second object has been detected (S15: YES), the control unit 11 estimates the depth of the second object and determines, based on the estimated depth, whether the second object has contacted the table (step S203). When determining that the second object has made contact (S203: YES), the control unit 11 resets the heat map values of the region contacted by the second object (step S204). After executing the process of step S204, or in the case of NO in step S15 or S203, the control unit 11 shifts the process to step S17.
As described above, according to the second embodiment, the heat map can be generated more suitably by determining whether the first object and/or the second object is in contact with the table.
(Embodiment 3)
In this embodiment, a mode of generating the heat map based on information about people present around the table will be described.
FIG. 8 is an explanatory diagram showing an outline of the third embodiment. FIG. 8 illustrates a case where customers (people) are present around the table, as seen from directly beside the table. An outline of the present embodiment will be described with reference to FIG. 8.
In the present embodiment, not only the camera 3 but also a thermography sensor 4 is installed above the table. The thermography sensor 4 is a sensor that measures a thermal image showing the temperature distribution around the table. The server 1 acquires the moving image from the camera 3 and acquires the thermal image from the thermography sensor 4.
When generating the heat map, the server 1 identifies peripheral person information about the people present around the table and refers to it when generating the heat map. The peripheral person information includes, for example, the density and behavior of the people present around the table, as well as their body temperatures. The server 1 recognizes the people present around the table from the moving image and identifies the density (number) of the surrounding people. The server 1 also identifies the behavior of the surrounding people from the moving image. The behaviors to identify are, for example, conversation, coughing, and sneezing. Furthermore, the server 1 identifies the body temperatures of the people recognized from the moving image using the thermal image.
The server 1 generates the heat map based on the peripheral person information identified above. For example, when the density of people present around the table is equal to or higher than a certain value, the server 1 weights the heat map values and sets a larger increment for regions where the first object is detected. Alternatively, when identifying behavior such as conversation by a surrounding person, the server 1 sets larger heat map values for the table regions within a certain distance of that person. Alternatively, when the body temperature of a surrounding person is equal to or higher than a certain value, the server 1 sets larger heat map values for the table regions within a certain distance of that person. The above processing is an example; it suffices that the heat map can be generated based on the peripheral person information.
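One way to fold this information into the per-cell increment is sketched below; the multiplicative weighting scheme and every name and constant (person_weight, the density cutoff, the 37.5 degree fever threshold) are assumptions made for this example, not values taken from the disclosure.

```python
def person_weight(density: float, talking_nearby: bool,
                  max_body_temp_c: float) -> float:
    """Return a multiplier applied to heat map increments for a table region,
    based on the surrounding-person information for that region."""
    weight = 1.0
    if density >= 0.5:           # crowded area around the table
        weight *= 1.5
    if talking_nearby:           # conversation detected within range
        weight *= 1.3
    if max_body_temp_c >= 37.5:  # a nearby person may be feverish
        weight *= 2.0
    return weight

# Example: an update_heatmap-style increment (as in the earlier sketch) is
# scaled by person_weight(...) before being added to the affected cells.
```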
FIG. 9 is a flowchart showing an example of a processing procedure executed by the server 1 according to the third embodiment. After executing the process of step S12, the server 1 executes the following processes.
The control unit 11 of the server 1 acquires a thermal image showing the temperature distribution around the table (step S301). Based on the moving image acquired in step S11 and/or the thermal image acquired in step S301, the control unit 11 identifies peripheral person information about the people (customers) present around the table (step S302). Specifically, as described above, the control unit 11 identifies the density and behavior (presence or absence of conversation, etc.) of the people around the table, and also identifies their body temperatures from the thermal image. The control unit 11 shifts the process to step S13.
When determining that the first object has been detected from the moving image (S13: YES), the control unit 11 generates the heat map based on the detection result of the first object and the peripheral person information (step S303). Specifically, the control unit 11 sets the heat map value of each region in the table area according to the density, behavior, body temperature, and the like of the people around the table, in addition to the time, number of times, and the like for which the first object was detected. The control unit 11 shifts the process to step S15.
As described above, according to the third embodiment, by recognizing the people around the table from the moving image, identifying peripheral person information about those people, and referring to it when generating the heat map, the hygiene state of the table can be managed more suitably.
Further, according to the third embodiment, by installing the thermography sensor 4 and identifying the body temperatures of surrounding people, people who may be suffering from an illness or the like can be identified, and the heat map can be generated more suitably.
(Embodiment 4)
In this embodiment, a mode of calculating the cleanliness of a store from the heat maps of the tables in the store and outputting it to an external destination, such as a website that posts store reviews (hereinafter referred to as a "review site"), will be described.
FIG. 10 is an explanatory diagram showing an outline of the fourth embodiment. FIG. 10 illustrates how the cleanliness of stores is posted on the review site. An outline of the present embodiment will be described with reference to FIG. 10.
As described in the first embodiment, the server 1 generates, for a table in a store on which objects (food) are placed, a heat map representing the hygiene state of that table. In the present embodiment, it is assumed that a plurality of cameras 3, 3, 3, ... are installed corresponding to the plurality of tables installed in the store, and that a heat map is generated for each table. The server 1 calculates the cleanliness of the entire store based on the heat map data generated for each table.
For example, the server 1 calculates the cleanliness of the store based on the update history of the heat map of each table. The specific method of calculating the cleanliness is not particularly limited, but, for example, the server 1 calculates the cleanliness according to the area (number of grid cells) of the regions whose heat map values are equal to or greater than a threshold. Specifically, the server 1 calculates, in units of the past week, the past month, or the like, the cumulative area of the regions whose heat map values are equal to or greater than the threshold, and divides the calculated cumulative area by the area of the entire table to obtain, for each table, the ratio (area proportion) of the table area at or above the threshold. The server 1 then calculates the cleanliness by taking the mean, median, or the like of the ratios of all the tables. The cleanliness may be expressed numerically, or may be expressed as a multi-level rating such as "rank A", "rank B", "rank C", and so on.
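A minimal sketch of this area-ratio calculation follows, under the assumption that the update history is available as per-table heat map snapshots; the names (store_cleanliness, snapshots) and the numeric [0, 1] output form are illustrative choices, not taken from the disclosure.

```python
import numpy as np

def store_cleanliness(snapshots: dict[str, list[np.ndarray]],
                      threshold: float) -> float:
    """Cleanliness of a store from heat map histories, one entry per table.

    snapshots -- {table_id: [heatmap at t0, heatmap at t1, ...]} over, say,
                 the past week; each heatmap is a 2D array of cell values.
    Returns a score in [0, 1]; 1.0 means no cell ever reached the threshold.
    """
    ratios = []
    for history in snapshots.values():
        # Cumulative over-threshold cell count across the period, divided by
        # the total cell count observed over the same period.
        over = sum(int((h >= threshold).sum()) for h in history)
        total = sum(h.size for h in history)
        ratios.append(over / total if total else 0.0)
    # Average across tables; the median could be used instead.
    return 1.0 - float(np.mean(ratios)) if ratios else 1.0
```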
The above calculation method is an example, and the present embodiment is not limited to it. For example, the server 1 may calculate the cleanliness based on the number of times the heat map has been reset by detecting the second object. Specifically, the server 1 counts the number of resets for each table in units of the past week, the past month, or the like, and calculates the cleanliness by taking the mean, median, or the like of the reset counts of all the tables. In this way, the method of calculating the cleanliness is not particularly limited.
The server 1 outputs the calculated cleanliness of the store to a Web server (not shown) that manages the review site, and has it displayed within the review site. For example, as shown in FIG. 10, on the review site the cleanliness of each store using this system is posted as "rank A", "rank B", and so on. This makes it possible to present the review site's users with data that objectively evaluates each store's hygiene management efforts.
FIG. 11 is a flowchart showing the procedure of the cleanliness calculation process. The processing executed by the server 1 in the present embodiment will be described with reference to FIG. 11.
The control unit 11 of the server 1 acquires the heat map data (update history) of each of the plurality of tables installed in the store (step S401). The control unit 11 calculates the cleanliness of the store based on the acquired heat map data (step S402). For example, the control unit 11 calculates an index value for each table according to the area of the table regions whose heat map values are equal to or greater than the threshold, the number of times the heat map values have been reset, and so on, and calculates the cleanliness of the entire store by taking the mean, median, or the like over the tables. The control unit 11 outputs the calculated cleanliness to the external Web server (step S403) and ends the series of processes.
As described above, according to the fourth embodiment, the degree of cleanliness of a store can be objectively evaluated based on the heat maps of the tables.
(Embodiment 5)
In this embodiment, a mode of recognizing an object appearing in the moving image as a specific target object based on its motion over a certain period of time will be described.
FIG. 12 is a schematic diagram showing the configuration of the control unit 11 according to the fifth embodiment. The control unit 11 has the same functions as in the first embodiment. Here, the control unit 11 functions as an acquisition unit 11a, a detection unit 11b, a generation unit 11c, and a change unit 11d by reading and executing the program P stored in the auxiliary storage unit 14.
The acquisition unit 11a acquires from the camera 3 a moving image of the table on which food is placed, captured from above.
The detection unit 11b detects the first object and the second object from the table area corresponding to the table in the moving image. Specifically, the detection unit 11b detects, from the table area, the second object related to the cleaning work of the table and the first object, which is an object other than the second object. Here, the detection unit 11b may detect the first object and/or the second object based on the motion, over a certain period of time, of an object appearing in the moving image.
The generation unit 11c generates a heat map in which the value of each region in the table area is set based on the detection result of the first object. Specifically, the generation unit 11c generates a heat map in which the value of the region where the first object was detected is set according to the time or number of times for which the first object was detected.
When the second object is detected, the change unit 11d changes the heat map values of the region where the second object was detected. Specifically, when the second object is detected, the change unit 11d resets or subtracts from the values of the region where the second object was detected.
The control unit 11 may further function as a depth estimation unit 11e. The depth estimation unit 11e estimates the depth from the point where the moving image is captured to the first object or the second object. The depth estimation unit 11e also determines, based on the estimated depth, whether the first object or the second object has contacted the table. When the control unit 11 functions as the depth estimation unit 11e, the generation unit 11c generates or changes the heat map values according to the determination result.
The control unit 11 may further function as a food determination unit 11f. The food determination unit 11f detects the food placed on the table from the moving image and determines, based on the food's placement time, whether the food should be replaced.
The control unit 11 may further function as a recognition unit 11g. The recognition unit 11g recognizes people present around the table from the moving image. The recognition unit 11g also identifies peripheral person information about the recognized people. When the control unit 11 functions as the recognition unit 11g, the generation unit 11c generates the heat map based on the peripheral person information and the detection result of the first object. The recognition unit 11g may acquire a thermal image showing the temperature distribution around the table and identify a person's body temperature based on the thermal image. In this case, the control unit 11 generates the heat map according to the person's body temperature.
The control unit 11 may further function as a cleanliness calculation unit 11h. The cleanliness calculation unit 11h calculates the cleanliness of the store based on the heat maps of the plurality of tables, as described in the fourth embodiment.
The control unit 11 may also acquire the environmental values around the table. In this case, the generation unit 11c generates the heat map based on the environmental values and the detection result of the first object.
FIG. 13 is a flowchart showing an example of a processing procedure executed by the server 1.
The control unit 11 of the server 1 detects objects appearing in the moving image in step S12. Here, when detecting the first object from the moving image, the control unit 11 does not immediately determine that the first object has been detected, but determines that a candidate object for the first object has been detected (S501). The control unit 11 then stores in the main storage unit 12 time-series data of a first target motion indicating the motion of the candidate object for the first object over a certain period of time. Next, the control unit 11 analyzes the time-series data of the first target motion by comparing it with a predetermined motion set in advance (S502). When determining that the time-series data of the first target motion is unique to the first object, the control unit 11 regards the candidate for the first object as the first object and treats the first object as detected (S503). When determining that the first object has been detected (S503: YES), the control unit 11 generates a heat map representing the hygiene state of the table (S504). Here, the control unit 11 generates a heat map in which the value of the region where the first object was detected is set according to the time elapsed since the candidate for the first object was detected, the number of detections, and so on.
After executing the process of step S504, or in the case of NO in step S503, the control unit 11 determines whether the second object has been detected from the moving image. Here, when detecting the second object from the moving image, the control unit 11 does not immediately determine that the second object has been detected, but determines that a candidate object for the second object has been detected (S505). The control unit 11 then stores in the main storage unit 12 time-series data of a second target motion indicating the motion of the candidate object for the second object over a certain period of time. Next, the control unit 11 analyzes the time-series data of the second target motion by comparing it with a predetermined motion set in advance (S506). When determining that the time-series data of the second target motion is unique to the second object, the control unit 11 regards the candidate for the second object as the second object and treats the second object as detected (S507). When determining that the second object has been detected, the control unit 11 changes (resets, etc.) the heat map values of the regions in the table area according to the time elapsed since the candidate for the second object was detected, the number of detections, and so on (S508).
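The candidate-then-confirm flow could be sketched as below. The feature choice (a per-frame centroid track), the simple distance-to-template comparison, and all names (confirm_target, templates, max_dist) are assumptions for illustration and do not reproduce the claimed method.

```python
import numpy as np

def confirm_target(track: np.ndarray, templates: dict[str, np.ndarray],
                   max_dist: float = 5.0) -> str | None:
    """Confirm a candidate as the first or second object from its motion.

    track     -- (T, 2) array of the candidate's centroid positions over a
                 fixed window of T frames (the stored time-series data)
    templates -- {"first": (T, 2) array, "second": (T, 2) array} of
                 predetermined reference motions
    Returns the matching label, or None if the motion fits neither target.
    """
    # Compare motion shapes, not absolute positions, by removing the offset.
    rel = track - track[0]
    best_label, best_dist = None, max_dist
    for label, tmpl in templates.items():
        d = float(np.mean(np.linalg.norm(rel - (tmpl - tmpl[0]), axis=1)))
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label
```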
As described above, the server 1 according to the fifth embodiment detects the first object and/or the second object based on the motion, over a certain period of time, of an object appearing in the moving image, so the detection accuracy for target objects can be improved. To elaborate, a method that identifies target objects from features obtained from shape and color, without using time-series data, may fail to detect target objects whose shapes change dynamically. In the server 1 according to the fifth embodiment, even an object whose shape changes dynamically, such as a hand, is identified as a target object using its time-series data, so missed detections can be avoided. As a result, the detection accuracy for target objects can be improved.
Further, for example, with a method that identifies target objects from features obtained from shape and color, attempting to detect a target such as a white duster may produce false detections of objects of similar color (such as white paper napkins). The server 1 according to the fifth embodiment identifies the motion of wiping the table with a duster based on time-series data such as the shape of the hand, the color of the cloth under the hand, and the position of the wiping hand, so false detections can be avoided compared with methods that do not use time-series data. As a result, the detection accuracy for target objects can be improved.
In the fifth embodiment, even if the first object and the second object consist of the same physical object, the two can be distinguished. To elaborate, a method that identifies target objects from features obtained from shape and color may confuse the first object, a hand touching the table, with the second object, a hand wiping the table. Since the first object and the second object both consist of the same physical object, a hand, there may be no difference between the features obtained from their shapes and colors. In contrast, the server 1 according to the fifth embodiment identifies target objects using time-series data, so even a first object and a second object consisting of the same physical object can be detected with their motions distinguished.
The embodiments disclosed here should be considered exemplary in all respects and not restrictive. The scope of the present invention is indicated not by the meaning above but by the scope of the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims. Further, in the present disclosure, various disclosures can be formed by appropriately combining the plurality of components disclosed in the above embodiments. For example, some components may be removed from the entire set of components shown in the embodiments. Furthermore, components may be appropriately combined across different embodiments.
1 Server (information processing device)
11 Control unit
12 Main storage unit
13 Communication unit
14 Auxiliary storage unit
P Program
2 Terminal
3 Camera

Claims (12)

1. An information processing device comprising:
an acquisition unit that acquires a moving image of a table on which food is placed, captured from above;
a detection unit that detects a first object and a second object from a table area corresponding to the table in the moving image;
a generation unit that generates, based on a detection result of the first object, a heat map in which a value of each region in the table area is set; and
a change unit that, when the second object is detected, changes the value of the heat map in a region where the second object was detected.
2. The information processing device according to claim 1, wherein the generation unit generates the heat map in which the value of the region where the first object was detected is set according to the time or number of times for which the first object was detected.
3. The information processing device according to claim 1 or 2, wherein the detection unit detects, from the table area, the second object related to cleaning work on the table and the first object other than the second object, and the change unit, when the second object is detected, resets or subtracts from the value of the region where the second object was detected.
4. The information processing device according to any one of claims 1 to 3, further comprising a depth estimation unit that estimates a depth from an imaging point of the moving image to the first object or the second object and determines, based on the estimated depth, whether the first object or the second object has contacted the table, wherein the generation unit generates or changes the heat map according to the determination result.
5. The information processing device according to any one of claims 1 to 4, further comprising a food determination unit that detects the food placed on the table from the moving image and determines, based on a placement time of the food, whether the food should be replaced.
6. The information processing device according to any one of claims 1 to 5, further comprising a recognition unit that recognizes a person present around the table from the moving image and identifies peripheral person information about the recognized person, wherein the generation unit generates the heat map based on the peripheral person information and the detection result of the first object.
  7.  The information processing device according to claim 6, wherein the recognition unit acquires a thermal image showing the temperature distribution around the table and identifies the body temperature of the person based on the thermal image, and
     the generation unit generates the heat map according to the body temperature of the person.
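Claims 6 and 7 feed person information, here body temperature taken from a thermal image, into heat map generation. A sketch under assumed thresholds (the fever cut-off and weight are illustrative, not claimed values):

```python
import numpy as np

FEVER_THRESHOLD_C = 37.5  # assumed body-temperature cut-off
FEVER_WEIGHT = 2.0        # assumed extra weight for contacts by a feverish person

def contact_increment(body_temp_c):
    """Scale the heat map increment by the recognized person's body temperature."""
    return FEVER_WEIGHT if body_temp_c >= FEVER_THRESHOLD_C else 1.0

values = np.zeros((10, 10))
values[2, 3] += contact_increment(36.6)  # ordinary contact: +1.0
values[2, 3] += contact_increment(38.1)  # feverish contact:  +2.0
print(values[2, 3])  # 3.0
```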
  8.  The information processing device according to any one of claims 1 to 7, wherein the generation unit generates the heat map for each of a plurality of the tables installed in a store, the device further comprising a cleanliness calculation unit that calculates a cleanliness of the store based on the heat maps of the plurality of tables and outputs the cleanliness of the store to the outside.
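The claim leaves the store-level aggregation open; one assumed formula is sketched below, where cleanliness falls linearly with the mean heat map value across all tables.

```python
import numpy as np

def store_cleanliness(table_heat_maps, max_value=10.0):
    """Aggregate per-table heat maps into one store score in [0, 1]."""
    mean_dirtiness = np.mean([hm.mean() for hm in table_heat_maps])
    return float(max(0.0, 1.0 - mean_dirtiness / max_value))

tables = [np.zeros((10, 10)), np.full((10, 10), 4.0)]  # two tables in the store
print(store_cleanliness(tables))  # 0.8 for this sample
```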
  9.  The information processing device according to any one of claims 1 to 8, wherein the acquisition unit acquires an environmental value around the table, and
     the generation unit generates the heat map based on the environmental value and the detection result of the first object.
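Claim 9 does not say how an environmental value (for example temperature or humidity) should shape the heat map; the weighting below is purely an assumed illustration.

```python
def environment_weight(temperature_c, humidity_pct):
    """Assumed rule: warm, humid air favours microbial growth,
    so detected contacts count for more toward the heat map."""
    weight = 1.0
    if temperature_c >= 25.0:
        weight *= 1.5
    if humidity_pct >= 70.0:
        weight *= 1.5
    return weight

print(environment_weight(22.0, 50.0))  # 1.0  -> baseline increment
print(environment_weight(28.0, 80.0))  # 2.25 -> contacts weighted higher
```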
  10.  The information processing device according to any one of claims 1 to 9, wherein the detection unit detects the first object and/or the second object based on the motion, over a certain period of time, of an object shown in the moving image.
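A sketch of the motion-based classification in claim 10, assuming a tracker supplies per-frame centroids; the window length and displacement threshold are illustrative.

```python
WINDOW_FRAMES = 60         # assumed window: ~2 s at 30 fps
MIN_PATH_LENGTH_PX = 40.0  # assumed wiping-motion threshold

def classify_by_motion(centroids):
    """Label a tracked object by its motion over a fixed window.

    A long back-and-forth path reads as a cleaning cloth (second object);
    little motion reads as an ordinary contact (first object).
    """
    window = centroids[-WINDOW_FRAMES:]
    path_length = sum(
        abs(x2 - x1) + abs(y2 - y1)
        for (x1, y1), (x2, y2) in zip(window, window[1:])
    )
    return "second object" if path_length >= MIN_PATH_LENGTH_PX else "first object"

print(classify_by_motion([(0, 0), (10, 0), (0, 0), (10, 0), (0, 0)]))  # second object
print(classify_by_motion([(5, 5), (6, 5), (6, 6)]))                    # first object
```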
  11.  An information processing method in which a computer executes processing of:
     acquiring a moving image of a table on which food is placed, captured from above;
     detecting a first object and a second object from a table area corresponding to the table in the moving image;
     generating, based on a detection result of the first object, a heat map in which a value is set for each region in the table area; and
     when the second object is detected, changing the value of the heat map in the region where the second object was detected.
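Put together, the claimed method is a per-frame loop. The sketch below wires the four steps end to end; `detector` is an assumed stand-in for any real detection model.

```python
import numpy as np

def run_hygiene_monitor(frames, detector, grid=(10, 10)):
    """Illustrative end-to-end loop for the claimed method."""
    heat_map = np.zeros(grid)
    for frame in frames:                             # 1) acquire the moving image
        first_cells, second_cells = detector(frame)  # 2) detect both object types
        for row, col in first_cells:                 # 3) set values from first objects
            heat_map[row, col] += 1.0
        for row, col in second_cells:                # 4) change values where cleaned
            heat_map[row, col] = 0.0
    return heat_map

# Toy detector: contacts at (2, 3) for five frames, then a cleaning pass there.
fake_detector = lambda frame: ([(2, 3)], []) if frame < 5 else ([], [(2, 3)])
print(run_hygiene_monitor(range(10), fake_detector)[2, 3])  # 0.0 after cleaning
```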
  12.  A program for causing a computer to execute the information processing method according to claim 11.

PCT/JP2021/030747 2020-08-24 2021-08-23 Information processing device, information processing method, and program WO2022045049A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-141070 2020-08-24
JP2020141070 2020-08-24

Publications (1)

Publication Number Publication Date
WO2022045049A1

Family

ID=77554521

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/030747 WO2022045049A1 (en) 2020-08-24 2021-08-23 Information processing device, information processing method, and program

Country Status (2)

Country Link
JP (1) JP6931511B1 (en)
WO (1) WO2022045049A1 (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001201057A (en) * 2000-01-18 2001-07-27 Mitsubishi Electric Corp High-frequency heating apparatus
JP2003054311A (en) * 2001-08-09 2003-02-26 Denso Corp Vehicle headlight control device
JP2004252497A (en) * 2002-01-15 2004-09-09 Masanobu Kujirada Method and system for providing dish or drink in restaurant
JP2006020881A (en) * 2004-07-08 2006-01-26 Nec Corp Merchandise management system of circulating conveying path type, method for merchandise management and merchandise management control program which are used for the system
US20110035326A1 (en) * 2008-04-25 2011-02-10 Sholl Jeffrey J System And Method Of Providing Product Quality And Safety
US20110316695A1 (en) * 2009-03-02 2011-12-29 Diversey, Inc. Hygiene monitoring and management system and method
US20140241571A1 (en) * 2013-02-26 2014-08-28 Elwha Llc System and method for contamination monitoring
JP2016194762A (en) * 2015-03-31 2016-11-17 ソニー株式会社 Information processing system, information processing method, and program
JP2018079134A (en) * 2016-11-17 2018-05-24 三菱電機株式会社 Cleaning support device, cleaning support method, and cleaning support system
JP2019153070A (en) * 2018-03-02 2019-09-12 東芝テック株式会社 Information processing apparatus and information processing program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024200911A1 (en) * 2023-03-27 2024-10-03 Ignaz Oy A buffet service monitoring system and method

Also Published As

Publication number Publication date
JP6931511B1 (en) 2021-09-08
JP2022036866A (en) 2022-03-08

Similar Documents

Publication Publication Date Title
JP7448065B2 (en) Store equipment, store systems, store management methods, programs
JP6702464B2 (en) Image display device, image display method, and program
JP6194777B2 (en) Operation determination method, operation determination apparatus, and operation determination program
TWI778030B (en) Store apparatus, store management method and program
EP2905680B1 (en) Information processing apparatus, information processing method, and program
JP2011253344A (en) Purchase behavior analysis device, purchase behavior analysis method and program
JP6665927B2 (en) Behavior analysis device, behavior analysis system, behavior analysis method and program
JP7081310B2 (en) Behavioral analytics device, behavioral analytics system, behavioral analytics method, program and recording medium
JP6648508B2 (en) Purchasing behavior analysis program, purchasing behavior analysis method, and purchasing behavior analysis device
JP6561639B2 (en) Interest level determination device, interest level determination method, and interest level determination program
US20230000302A1 (en) Cleaning area estimation device and method for estimating cleaning area
JP6565639B2 (en) Information display program, information display method, and information display apparatus
WO2022045049A1 (en) Information processing device, information processing method, and program
JP6852293B2 (en) Image processing system, information processing device, information terminal, program
JP2017102564A (en) Display control program, display control method, and display control apparatus
JP5822315B2 (en) Work support system, work support device, work support method, and program
JP7010030B2 (en) In-store monitoring equipment, in-store monitoring methods, and in-store monitoring programs
JP4338147B2 (en) Product interest level measuring device
JP5765015B2 (en) Gaze detection device, gaze detection method, and gaze detection program
JP7087403B2 (en) Gaze analysis program, gaze analysis method and gaze analyzer
KR102565899B1 (en) System for determining mental state according to user's task performance in virtual reality
WO2021181597A1 (en) Recognition degree estimation device, recognition degree estimation method, and recording medium
JP7501773B2 (en) Work type identification device, control method, and program
JP2019162288A (en) Determination value calculation device, method and program, and micro error occurrence determination device, method and program
JP2018060360A (en) Number estimation method, number estimation device, number estimation program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21861475

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21861475

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP