WO2022107330A1 - State determination device, state determination system, state determination method, and recording medium - Google Patents
- Publication number
- WO2022107330A1 (PCT/JP2020/043471)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/809—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
Definitions
- The present invention relates to the processing of information related to roads, and particularly to the determination of the state of a road.
- For local governments and other road administrators, managing the roads under their charge requires considerable labor and cost. Therefore, a device for collecting road information has been proposed (see, for example, Patent Document 1).
- the road measuring device described in Patent Document 1 determines the degree of deterioration of the road using image data of the surface of the road, and outputs a map in which the degree of deterioration and the presented value of the reward are mapped.
- The road measuring device described in Patent Document 1 uses a determination model that takes image data as input to determine deterioration. This determination model is constructed by a machine learning method.
- Images of roads are taken outdoors, in various weather and at various times. Therefore, captured images are obtained under a wide range of shooting conditions.
- the judgment model executes learning before judgment.
- A judgment model learned using images taken in fine weather can appropriately judge images taken in fine weather.
- However, a determination model learned using images taken in fine weather cannot always appropriately judge images taken in cloudy or rainy weather.
- A judgment model learned using images of multiple weather conditions, such as sunny, cloudy, and rainy weather, can judge images taken in each of those weathers with a certain degree of accuracy.
- Patent Document 1 uses one determination model. Therefore, the technique described in Patent Document 1 has a problem that it is difficult to improve the accuracy of determining the state of the road.
- An object of the present invention is to provide a state determination device or the like that solves the above-mentioned problems and improves the accuracy of state determination regarding a road.
- The state determination device according to one embodiment of the present invention includes a plurality of determination models, each trained using teacher data that differs in at least one of the state of a moving body equipped with a photographing device for acquiring images of a road, the state of the road, and the external environment, and an output summing means that sums the outputs of the plurality of determination models for an input road image.
- The state determination system according to one embodiment of the present invention includes the above state determination device; a photographing device that acquires the image to be judged and outputs it to the state determination device, and/or a storage device that stores the image to be judged; and a display device that displays the road state output by the state determination device.
- In the state determination method according to one embodiment of the present invention, a state determination device including a plurality of determination models, each trained using teacher data that differs in at least one of the state of a moving body equipped with a photographing device for acquiring images of a road, the state of the road, and the external environment, sums the outputs of the plurality of determination models for an input road image.
- In the state determination method according to another embodiment of the present invention, the state determination device executes the state determination method described above.
- The photographing device acquires the image to be judged and outputs it to the state determination device, and/or the storage device saves the image to be judged.
- the display device displays the state of the road output by the state determination device.
- The recording medium according to one embodiment of the present invention records a program that causes a computer including a plurality of determination models, each trained using teacher data that differs in at least one of the state of a moving body equipped with a photographing device for acquiring images of a road, the state of the road, and the external environment, to execute a process of summing the outputs of the plurality of determination models for an input road image.
- FIG. 1 is a block diagram showing an example of the configuration of the state determination device according to the first embodiment.
- FIG. 2 is a diagram showing an example of weight.
- FIG. 3 is a flow chart showing an example of the operation of the learning phase of the determination model.
- FIG. 4 is a flow chart showing an example of the operation of the determination phase of the state determination device according to the first embodiment.
- FIG. 5 is a block diagram showing an example of the hardware configuration of the state determination device.
- FIG. 6 is a block diagram showing an example of the configuration of a state determination system including the state determination device according to the first embodiment.
- FIG. 7 is a block diagram showing an example of the configuration of the state determination device according to the second embodiment.
- FIG. 8 is a flow chart showing an example of the operation of the determination phase of the state determination device according to the second embodiment.
- FIG. 9 is a block diagram showing an example of the configuration of the state determination device according to the third embodiment.
- FIG. 10 is a flow chart showing an example of the operation of the learning phase of the weight determination unit according to the third embodiment.
- FIG. 11 is a block diagram showing an example of the configuration of the state determination device according to the fourth embodiment.
- FIG. 12 is a flow chart showing an example of the operation of the learning phase of the weight determination unit according to the fourth embodiment.
- FIG. 13 is a block diagram showing an example of the configuration of a state determination system including the state determination device according to the fourth embodiment.
- FIG. 14 is a block diagram showing an example of the configuration of the state determination device according to the fifth embodiment.
- FIG. 15 is a diagram showing an example of a case where a plurality of drive recorders are used as a photographing device.
- FIG. 16 is a diagram showing an example of display.
- FIG. 17 is a diagram showing an example of display of a plurality of states.
- a “moving body” is a moving body equipped with a photographing device that captures an image of a road.
- the moving body is arbitrary.
- the moving body may be a vehicle (four-wheeled vehicle or two-wheeled vehicle) equipped with a photographing device, or a drone.
- the moving body may be a person who moves while holding the photographing device.
- the "moving body state” is the state or characteristic of the moving body related to the captured image.
- the vehicle model equipped with the photographing device is an example of the characteristics of the moving body.
- a two-wheeled vehicle has a larger change in the inclination of the vehicle body than a four-wheeled vehicle. Therefore, the number of wheels is an example of the characteristics of a moving body.
- When the moving body is moving at high speed, objects in the captured image may become unclear (for example, motion blur may occur).
- the determination using such an image reduces the accuracy of the determination. That is, the speed of the moving body affects the state of the image used for the determination. As described above, the moving speed of the moving body is an example of the state of the moving body.
- the acceleration and vibration of the moving object at the time of shooting affect the image to be shot. Therefore, the acceleration and vibration of the moving body are examples of the state of the moving body.
- “Road” is not limited to roads through which vehicles and people pass, but also includes structures related to roads.
- a “road” may include signs, white lines, guardrails, reflectors, traffic lights, and / or luminaires.
- the "road” is not limited to a road through which vehicles and people pass, but may be a passage through which other objects pass.
- the "road” may be a runway, a taxiway, and an apron through which an airplane travels.
- the "road condition” is the condition of the road to be determined and the condition of the road that affects the determination.
- The road conditions to be determined may be types of road surface deterioration, such as cracks (longitudinal, lateral, or hexagonal), ruts, and/or potholes, as well as deterioration of the road surface seal and fraying around the seal.
- the condition of the road to be judged is not limited to the deterioration of the road surface.
- Deterioration of structures related to the road (for example, blurring of white lines and road markings on the road surface, and/or damage to signs) may also be determined as the condition of the road to be determined.
- condition of the road to be judged is not limited to the deterioration of the road and the structures related to the road.
- White lines and road markings on the road surface are constructed so as to be visible by reflection at night. Therefore, in each embodiment, the reflective state of white lines and road markings may be determined as the road condition to be judged.
- each embodiment may determine the lighting state of the lighting installed on the road, the brightness of the lighting, or the illuminance of the road surface.
- The classification of a road (community road, city road, prefectural road, national road, expressway, etc.) and its number of lanes affect the volume of vehicles traveling on the road surface. Therefore, in each embodiment, the road classification and the number of lanes may be used as road conditions that affect the determination.
- the type of pavement on the road, the shape of the pavement material, and the condition of the road surface affect the captured image. Therefore, in each embodiment, the type of pavement of the road, the shape of the material of the pavement, and the condition of the road surface may be used as the condition of the road that affects the determination.
- Examples of the type of road pavement are asphalt, concrete, stone, brick, and gravel.
- the type of pavement may include a pavement construction method such as drainage pavement.
- Properties of the pavement material that affect the image include grain roughness and/or color.
- the reflection in the image of the manhole differs between a sunny day and a rainy day.
- the degree of wetness of the road surface due to rain changes the state of the captured image.
- rainfall affects the size of the puddle that obscures the condition of the road surface.
- the dryness and wetness of the road surface affects the image.
- Surface processing of the road surface (for example, straight grooves for drainage or circular grooves for slip resistance on slopes) also affects the image. Therefore, in each embodiment, the dryness or wetness of the road surface and/or the surface processing may be used as road conditions that affect the image.
- The “external environment” is the information that affects judgment using road images, excluding the above-mentioned state of the moving body and state of the road.
- The external environment may include shooting conditions, for example, the time of day and the weather at the time the image was taken (sunny, cloudy, rainy, after rain, or snowy).
- the ambient sound at the time of shooting becomes loud when the road is congested.
- Road congestion affects the progress of deterioration.
- Vibration sound is the sound generated by the vibration of the moving body during movement.
- the amount of rainfall affects the image to be taken.
- the sound of rain in rainy weather is proportional to the amount of rainfall to some extent.
- the external environment may include sounds (eg, ambient sounds, vibration sounds, or rain sounds).
- the image quality and size of the image are one of the factors that influence the judgment.
- the frame rate affects the image. Therefore, the external environment may include image specifications such as image quality, size, and frame rate.
- The external environment may also include structures installed around the road (e.g., advertising signboards, signs, trees around the road, and/or mid- to high-rise structures, such as buildings and bridges, installed alongside the road).
- the condition of moving objects, the condition of roads, and the external environment include many things. Therefore, the user or the like may appropriately select information to be used as the state of the moving object, the state of the road, and the external environment in consideration of the determination target.
- the terms related to weather are as follows. However, these terms may differ from the strict meteorological terms.
- The determination may be made with appropriate consideration of operational effectiveness and convenience.
- “Sunny weather” is the weather including “clear weather” with a cloud cover of 1 or less and “sunny” with a cloud cover of 2 or more and 8 or less.
- Cloudy weather is the weather with cloud cover of 9 or more and no precipitation.
- the amount of clouds is the ratio of clouds that cover the sky, and is an integer that is 0 when there are no clouds and 10 when the clouds cover the entire sky.
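The operational weather definitions above can be summarized in a short sketch. This is a hedged illustration only: the function name, the numeric precipitation argument, and the rule that any precipitation counts as rainy weather are assumptions made for clarity, not part of the claimed embodiments.

```python
def classify_weather(cloud_cover: int, precipitation_mm: float) -> str:
    """Classify weather using the operational definitions above.

    cloud_cover: integer 0-10, the fraction of sky covered by clouds
                 (0 = no clouds, 10 = fully overcast).
    precipitation_mm: observed precipitation; treating any positive
                      value as "rainy" is an assumption.
    """
    if not 0 <= cloud_cover <= 10:
        raise ValueError("cloud cover must be an integer from 0 to 10")
    if precipitation_mm > 0:
        return "rainy"      # precipitation present (assumed definition)
    if cloud_cover <= 8:
        return "sunny"      # includes "clear" (<=1) and "sunny" (2-8)
    return "cloudy"         # cloud cover 9 or more with no precipitation
```

Under these definitions, for example, a cloud cover of 6 with no precipitation is classified as sunny weather.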
- FIG. 1 is a block diagram showing an example of the configuration of the state determination device 10 according to the first embodiment.
- the state determination device 10 includes a plurality of determination models 110, an output summing unit 120, and a state determination unit 130.
- the number of determination models 110 included in the state determination device 10 is not particularly limited as long as it is a plurality.
- the determination model 110 includes a model for determining the state of the road using the input image of the road. For example, the determination model 110 outputs the position of deterioration of the road and its score by using the image of the road.
- the determination model 110 is a trained model that has been previously learned using artificial intelligence (AI) or machine learning.
- The determination model 110 is a trained model in which judgment of the road condition has been learned, using a technique such as AI or machine learning, from teacher data including captured road images and correct-answer labels for the road condition.
- the stage of learning a model (for example, the determination model 110) using the teacher data is referred to as a "learning phase”.
- The stage in which a learned model (for example, the determination model 110) determines the state of the road (for example, the position and score of road deterioration) and outputs the determination result is called the "judgment phase".
- the determination model 110 is a trained model in which the determination is learned in the learning phase before operating in the actual determination phase.
- the determination model 110 may have a structure different from that of the other determination model 110.
- at least a part of the determination model 110 may include a neural network having a structure different from that of other determination models 110.
- all determination models 110 may have the same structure.
- In the learning phase, each determination model 110 learns to determine the road condition using teacher data containing images that differ from one another in at least one of the state of the moving body, the state of the road, and the external environment.
- the teacher data may be in different states of the moving body.
- the teacher data may differ from each other in road conditions.
- the teacher data may have different external environments from each other.
- the teacher data may differ from each other in any two of the state of the moving body, the state of the road, and the external environment.
- the teacher data may be different from each other in the state of the moving object, the state of the road, and the external environment.
- The respects in which the images differ may also vary between teacher data sets.
- one teacher data may be different from other teacher data in the state of the moving object, and may be different from other teacher data in the state of the road.
- the determination model 110 is a trained model learned using teacher data different from each other in this way.
- As described above, each determination model 110 learns to judge the condition of the road using teacher data including images that differ in at least one of the state of the moving body, the state of the road, and the external environment.
- For example, in the learning phase, one determination model 110 learns to determine the road condition using teacher data including images with differing moving body states, while another determination model 110 learns using teacher data including images with differing external environments.
- The user prepares, as teacher data, a plurality of image sets that differ from one another in at least one of the state of the moving body, the state of the road, and the external environment. For example, the images prepared by the user as teacher data may be divided into groups that differ in at least one of these respects, and each labeled group may be used as a set of teacher data.
- Some of the images included in one set of teacher data may be the same as images in another set.
- The number of images used as teacher data may be the same for every group, or the number for at least some groups may differ from that of other groups.
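The grouping of images into per-condition teacher data sets described above might be sketched as follows. This is a minimal illustration only: the dictionary-based metadata format, the key names, and the choice of weather (an external-environment attribute) as the grouping key are assumptions.

```python
from collections import defaultdict

def group_teacher_data(images):
    """Split annotated images into teacher-data groups, one per condition.

    `images` is a list of dicts such as
    {"path": "img001.jpg", "weather": "sunny", "label": "crack"}.
    Here only the weather is used as the grouping key, but any
    moving-body or road attribute could serve the same role.
    """
    groups = defaultdict(list)
    for img in images:
        groups[img["weather"]].append(img)
    return dict(groups)
```

Each resulting group would then be used to train one determination model 110, and the groups need not contain equal numbers of images.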
- the state determination device 10 executes the learning phase of the determination model 110 so that each determination model 110 learns using different teacher data.
- the learning timing of each determination model 110 is arbitrary.
- the state determination device 10 may execute the learning of the determination model 110 continuously or in parallel.
- the state determination device 10 may perform discrete learning on at least a part of the determination models 110.
- a device different from the state determination device 10 may execute at least a part of the learning of the determination model 110.
- the state determination device 10 may acquire the learned determination model 110 from another device (not shown) as at least a part of the determination model 110.
- the state determination device 10 does not have to include the function of the learning phase.
- state determination device 10 may execute additional learning for at least a part of the determination models 110.
- The state determination device 10 includes a plurality of determination models 110 trained using teacher data including images that differ in at least one of the state of the moving body, the state of the road, and the external environment.
- In the following, except where the learning phase of the determination models 110 is being explained, the determination models 110 are assumed to have already been trained.
- Each determination model 110 determines the state of the road using the input road image, and outputs the determined road state, specifically the deterioration of the road and its score, to the output summing unit 120.
- the output summing unit 120 acquires outputs from a plurality of determination models 110. Then, the output summing unit 120 sums the outputs of the plurality of determination models 110 by using a predetermined method.
- the output summing unit 120 may store weights for each output of the plurality of determination models 110 in advance, and add up the outputs of the determination model 110 using the stored weights.
- For example, the output summing unit 120 stores a weight for each determination model 110 in advance. Each determination model 110 outputs, for example, the position of road deterioration and its score by using object-detection AI. In this case, the output summing unit 120 multiplies the deterioration positions and scores output by each determination model 110 by that model's weight, adds them, and outputs the result of the summing.
- the output summing unit 120 may store a plurality of weights instead of one weight as the weight for each determination model 110.
- the output summing unit 120 may store a plurality of weights (for example, weights for fine weather, cloudy weather, and rainy weather) corresponding to the state (for example, weather) at the time of shooting the image to be determined.
- the state determination device 10 includes the following three models as the determination model 110.
- Judgment model 110 (hereinafter referred to as “fine weather model”) learned using teacher data in fine weather
- Judgment model 110 learned using teacher data in cloudy weather (hereinafter referred to as “cloudy weather model”)
- Judgment model 110 (hereinafter referred to as “rainy weather model”) learned using teacher data in rainy weather.
- the output summing unit 120 determines the weight using the cloud cover and the precipitation amount.
- FIG. 2 is a diagram showing an example of weight.
- the output summing unit 120 stores the weight shown in FIG. 2 in advance. Then, the output summing unit 120 sums the outputs of the determination model 110 as follows.
- the output summing unit 120 acquires the amount of clouds and the amount of precipitation of the image to be determined. For example, the output summing unit 120 acquires the cloud cover and the precipitation amount from the source of the image to be determined.
- The method by which the output summing unit 120 acquires the cloud cover and the precipitation amount is arbitrary.
- the output summing unit 120 may acquire the shooting position and the shooting date and time of the image, and may acquire the cloud cover and the precipitation amount at the acquired position and date and time from a company or the like that provides the meteorological data.
- the output summing unit 120 may estimate the amount of clouds and the amount of precipitation using the measurement data of the sensors mounted on the moving body (for example, the illuminance sensor and the humidity sensor).
- the output summing unit 120 may apply a predetermined image analysis method to the image to be determined to estimate the amount of clouds and the amount of precipitation.
- Based on the cloud cover and the precipitation amount at the time the image to be determined was taken, the output summing unit 120 selects, from among the weights shown in FIG. 2, the weight to be used for each determination model 110.
- The columns marked "-" in FIG. 2 indicate items that are not considered in the selection of weights. For example, when the cloud cover is 1 to 8, the output summing unit 120 determines the weights without considering the precipitation amount. Further, the output summing unit 120 judges the precipitation amount by comparing it with a predetermined threshold value.
- For example, the output summing unit 120 selects "0.7" as the weight for the sunny weather model, "0.3" as the weight for the cloudy weather model, and "0.0" as the weight for the rainy weather model.
- the output summing unit 120 sums the outputs of the sunny weather model, the cloudy weather model, and the rainy weather model using the weight.
- the output summing unit 120 sums the determination results of the plurality of determination models 110 using a predetermined method.
- The weight "0.0" for the rainy weather model when the cloud cover is 6 means that the output of the rainy weather model is not used in the summing by the output summing unit 120. In this way, the output summing unit 120 need not sum the outputs of some of the determination models 110.
- Alternatively, the output summing unit 120 may calculate, as the summing result, the value obtained by dividing the weighted sum by the number of determination models 110 (that is, a weighted average).
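The weighted summation described above, selecting weights from a FIG. 2-style table by cloud cover and precipitation and applying them to each model's score, might be sketched as follows. Only the 0.7/0.3/0.0 row quoted above comes from the description; the other rows, the function names, and the score format are illustrative assumptions.

```python
def select_weights(cloud_cover: int, precipitation: float) -> dict:
    """Select per-model weights in the manner of FIG. 2.

    Only the sunny 0.7 / cloudy 0.3 / rainy 0.0 row is taken from the
    text; the other rows are placeholder assumptions for illustration.
    """
    if precipitation > 0:
        return {"sunny": 0.0, "cloudy": 0.2, "rainy": 0.8}  # assumed row
    if cloud_cover <= 8:
        return {"sunny": 0.7, "cloudy": 0.3, "rainy": 0.0}  # row from text
    return {"sunny": 0.1, "cloudy": 0.9, "rainy": 0.0}      # assumed row

def sum_outputs(scores: dict, weights: dict) -> float:
    """Weighted sum of each determination model's deterioration score.

    A weight of 0.0 simply excludes that model's output, matching the
    behavior described for the rainy weather model above.
    """
    return sum(weights[model] * scores[model] for model in scores)
```

For example, with model scores of 0.8 (sunny model), 0.6 (cloudy model), and 0.2 (rainy model) under cloud cover 6 with no precipitation, the summed score is 0.7 × 0.8 + 0.3 × 0.6 + 0.0 × 0.2 = 0.74; dividing by the number of models would give the weighted-average variant mentioned above.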
- the output summing unit 120 may use a weight considering the accuracy of the determination model 110.
- the output summing unit 120 sums the outputs (judgment results) of the plurality of determination models 110 by using a predetermined method.
- the method of adding up the outputs is not limited to the above, and is arbitrary.
- the user of the state determination device 10 may select a method of adding up the outputs based on his / her knowledge and the like.
- The output summing unit 120 may output, as the summing result, a position that a determination model 110 has judged to be deteriorated, as a "deteriorated position".
- the output summing unit 120 may use the summing method learned using AI.
- the output summing unit 120 outputs the summing result to the state determination unit 130.
- the state determination unit 130 determines the state of the road in the image to be determined based on the result of the total in the output total unit 120.
- Each determination model 110 determines the state of the road. Therefore, the summing result in the output summing unit 120 is itself a determination of the state of the road, and the user of the state determination device 10 may use that result directly.
- In addition to the summing result itself, the road condition required by the user may be determined by using the summing result.
- For example, when each determination model 110 outputs the position and type of deterioration, the summing result in the output summing unit 120 is also a position and type of deterioration, so the user can grasp the position and type of deterioration.
- the state determination unit 130 may determine, as the state of the road, the degree of deterioration (hereinafter referred to as "deterioration degree") based on the output position and type of deterioration. In this way, the state determination unit 130 determines the state of the road using the summing result of the output summing unit 120.
- the method for determining the condition of the road in the condition determination unit 130 is arbitrary.
- the state determination unit 130 may determine the state of the road by using a general image processing method.
- the state determination unit 130 may include information acquired from the output summing unit 120 (for example, the position and type of deterioration) in the state of the road to be output.
- the state determination unit 130 outputs the state of the road, which is the result of the determination, to a predetermined device. For example, the state determination unit 130 outputs the determination result to a device that displays the state of the road.
- the state determination unit 130 may store the state of the road, which is the result of the determination, in a predetermined storage device.
- the state determination unit 130 may output an image used for the determination in addition to the state of the road.
- the state determination device 10 may determine the state of the road by using a plurality of images (a plurality of still images or moving images) instead of one image.
- FIG. 3 is a flow chart showing an example of the operation of the learning phase of the determination model 110.
- the state determination device 10 acquires in advance a plurality of sets of teacher data that differ in at least one of the state of the moving body equipped with the photographing device that acquires images of the road, the state of the road, and the external environment.
- the state determination device 10 executes the following operations for all the determination models 110.
- the state determination device 10 selects one determination model 110 from the unlearned determination model 110 (step S401).
- the state determination device 10 selects one teacher data from the teacher data not used for learning (step S402).
- the state determination device 10 executes learning of the selected determination model 110 using the selected teacher data (step S403).
- the determination model 110 may use general machine learning as learning.
- the state determination device 10 repeats the above operation until no unlearned determination model 110 remains.
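The learning loop above (steps S401 to S403) can be sketched as follows. The model and teacher-data representations are illustrative assumptions, and `train()` is a placeholder standing in for general machine learning.

```python
# Minimal sketch of the learning phase in FIG. 3 (steps S401 to S403).
# Model dicts and teacher-data strings are illustrative assumptions.

def train(model, teacher_data):
    # placeholder for general machine learning on the selected data
    model["trained_on"] = teacher_data
    model["learned"] = True

# one determination model per condition of the teacher data
models = [{"name": n, "learned": False}
          for n in ("sunny", "cloudy", "rainy")]
teacher_sets = {"sunny": "clear-sky images",
                "cloudy": "overcast images",
                "rainy": "rain images"}

# repeat until no unlearned determination model remains
for model in models:                         # step S401: select a model
    if not model["learned"]:
        data = teacher_sets[model["name"]]   # step S402: select teacher data
        train(model, data)                   # step S403: execute learning

print(all(m["learned"] for m in models))  # True
```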
- FIG. 4 is a flow chart showing an example of the operation of the determination phase of the state determination device 10 according to the first embodiment.
- the determination model 110 has already been learned in the determination phase.
- the state determination device 10 acquires an image of the road to be determined from a predetermined device (step S411). For example, the state determination device 10 acquires an image of a road taken by a photographing device (not shown). Alternatively, the state determination device 10 acquires an image of the road stored by a storage device (not shown).
- each of the determination models 110 determines the state of the road using the image of the road (step S412). For example, the determination model 110 determines the deterioration of the road (for example, the position and type of deterioration), respectively. Then, the determination model 110 outputs the determination result to the output summing unit 120.
- the output summing unit 120 sums the outputs (judgment results) of the determination model 110 (step S413). Then, the output summing unit 120 outputs the summed result to the state determination unit 130.
- the state determination unit 130 determines the state of the road (for example, the degree of deterioration) using the summing result of the output summing unit 120 (step S414).
- the state determination unit 130 outputs the state of the road (for example, the degree of deterioration) to a predetermined device (step S415).
- the state determination unit 130 may include the summing result of the output summing unit 120 (for example, the position and type of deterioration) in the road condition.
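The determination phase (steps S411 to S415) can be sketched end to end as follows. The stand-in models, the equal weights, and the 0.5 threshold are all illustrative assumptions, not values from this specification.

```python
# End-to-end sketch of the determination phase in FIG. 4 (steps S411-S415).
# The models, weights, and threshold are illustrative assumptions.

def model_sunny(image):   # stand-in determination model
    return 0.2 * sum(image) / len(image)

def model_cloudy(image):  # stand-in determination model
    return 0.4 * sum(image) / len(image)

def determine_state(image, models, weights):
    judgments = [m(image) for m in models]                  # step S412
    total = sum(w * j for w, j in zip(weights, judgments))  # step S413
    # step S414: map the summed score to a degree of deterioration
    return "high" if total > 0.5 else "low"

image = [0.9, 0.8, 1.0]   # stand-in for road-image pixel values (S411)
state = determine_state(image, [model_sunny, model_cloudy], [1.0, 1.0])
print(state)              # step S415: output the determined state
```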
- the state determination device 10 can obtain the effect of improving the accuracy of the determination of the state related to the road.
- the state determination device 10 includes a plurality of determination models 110 and an output summing unit 120.
- the determination model 110 is a determination model learned using teacher data that differ in at least one of the state of the moving body equipped with the photographing device that acquires images of the road, the state of the road, and the external environment.
- the output summing unit 120 sums the outputs from the plurality of determination models for the input road image.
- the state determination device 10 adds up the determination results of the plurality of determination models 110 that have executed learning using the teacher data in which the states of the moving objects are different from each other. Therefore, the state determination device 10 can improve the accuracy of determination for the input road image.
- “Sunny” includes cloud cover up to 8.
- “cloudy weather” has a cloud cover of 9 or more.
- the actual cloud cover may be between 8 and 9. In such a case, a judgment corresponding to the actual cloud cover is expected to be obtained by using the judgment results of both the fine weather model and the cloudy weather model, rather than the judgment result of only one of them.
- the output summing unit 120 sums the judgment results of the plurality of determination models 110 by using a predetermined method (for example, weight). Therefore, the result of the summation of the output summarization unit 120 is a determination with higher accuracy than the determination of one model.
- the state determination device 10 includes a state determination unit 130.
- the state determination unit 130 determines and outputs the state of the road based on the summing result of the output summing unit 120. As a result, the state determination unit 130 determines the state of the road by using the result of summing the determinations of the plurality of determination models 110, that is, a result with higher accuracy.
- the state determination device 10 can improve the accuracy of the determination of the state related to the road as compared with the determination using one determination model.
- the accuracy of each of the determination models 110 may differ. In such a case, the method of simply using any one of the plurality of determination models 110 for the determination cannot ensure the accuracy of the determination.
- assume a related device that uses only one of the models, where the models include a sunny weather model, a cloudy weather model, and a rainy weather model, and the related device selects the model to use according to the weather.
- the accuracy of the rainy weather model is lower than the accuracy of the sunny weather model and the cloudy weather model.
- the related device uses a rainy weather model for the rainy weather image. Therefore, the related device cannot ensure the accuracy of the determination in rainy weather.
- the state determination device 10 includes the output summing unit 120. Then, the output summing unit 120 may sum the outputs in consideration of the accuracy of the plurality of determination models 110. Therefore, even if the accuracy of some of the determination models 110 is low, the state determination device 10 can improve the accuracy of the determination by summing the determinations of the other determination models 110.
- Each component of the state determination device 10 may be configured by a hardware circuit.
- each component may be configured by using a plurality of devices connected via a network.
- the state determination device 10 may be configured by using cloud computing.
- a plurality of components may be configured by one piece of hardware.
- the state determination device 10 may be realized as a computer device including a Central Processing Unit (CPU), a Read Only Memory (ROM), and a Random Access Memory (RAM).
- the state determination device 10 may be realized as a computer device including a network interface circuit (Network Interface Circuit (NIC)).
- FIG. 5 is a block diagram showing an example of the hardware configuration of the state determination device 10.
- the state determination device 10 includes a CPU 610, a ROM 620, a RAM 630, a storage device 640, and a NIC 650, and constitutes a computer device.
- the CPU 610 reads the program from the ROM 620 and / or the storage device 640. Then, the CPU 610 controls the RAM 630, the storage device 640, and the NIC 650 based on the read program. Then, the computer including the CPU 610 controls these configurations and realizes each function as the determination model 110, the output summing unit 120, and the state determination unit 130 shown in FIG.
- the CPU 610 may use the RAM 630 or the storage device 640 as a temporary storage medium for the program when realizing each function.
- the CPU 610 may use a recording medium reading device (not shown) to read the program from the recording medium 690, which stores the program in a computer-readable manner.
- the CPU 610 may receive a program from an external device (not shown) via the NIC 650, store the program in the RAM 630 or the storage device 640, and operate based on the stored program.
- ROM 620 stores programs executed by CPU 610 and fixed data.
- the ROM 620 is, for example, a Programmable-ROM (P-ROM) or a flash ROM.
- the RAM 630 temporarily stores the program and data executed by the CPU 610.
- the RAM 630 is, for example, a Dynamic-RAM (D-RAM).
- the storage device 640 stores data and programs stored in the state determination device 10 for a long period of time. Further, the storage device 640 may operate as a temporary storage device of the CPU 610.
- the storage device 640 is, for example, a hard disk device, a magneto-optical disk device, a Solid State Drive (SSD), or a disk array device.
- the ROM 620 and the storage device 640 are non-volatile recording media.
- the RAM 630 is a volatile recording medium. Then, the CPU 610 can operate based on the program stored in the ROM 620, the storage device 640, or the RAM 630. That is, the CPU 610 can operate using a non-volatile recording medium or a volatile recording medium.
- the NIC 650 relays data exchange with external devices (for example, the source of the image to be determined and the output destination device for the road condition) via a network.
- the NIC 650 is, for example, a Local Area Network (LAN) card. Further, the NIC 650 is not limited to wired and may be wireless.
- the state determination device 10 of FIG. 5 configured in this way can obtain the same effect as the state determination device 10 of FIG.
- the reason is that the CPU 610 of the state determination device 10 of FIG. 5 can realize each function of the state determination device 10 of FIG. 1 based on the program.
- FIG. 6 is a block diagram showing an example of the configuration of the state determination system 50 including the state determination device 10 according to the first embodiment.
- the state determination system 50 includes a state determination device 10, a photographing device 20, and / or an image storage device 25, and a display device 30.
- the photographing device 20 is mounted on the moving body to acquire teacher data and / or an image to be determined, and outputs the image to the state determination device 10. Alternatively, the photographing device 20 may acquire the teacher data and / or the image to be determined and store it in the image storage device 25.
- the photographing device 20 is, for example, a camera or a drive recorder.
- the number of photographing devices 20 may be one or a plurality.
- FIG. 15 is a diagram showing an example of a case where a plurality of drive recorders 820 are used as a photographing device.
- FIG. 15 shows a vehicle 810 as an example of a moving body.
- the network 850 is a communication path to which the state determination device 10 is connected.
- the wireless communication path 830 connects the drive recorder 820 and the wireless base station 840.
- the radio base station 840 relays between the network 850 to which the state determination device 10 is connected and the radio communication path 830.
- the vehicle 810 is equipped with a drive recorder 820 and travels on the road.
- the drive recorder 820 is mounted on the vehicle 810 and captures an image of the road on which the mounted vehicle 810 travels. Note that FIG. 15 shows the drive recorder 820 adjacent to the outside of the vehicle 810 in order to make it easy to understand.
- the drive recorder 820 transmits the captured image to the state determination device 10 via the wireless communication path 830, the wireless base station 840, and the network 850.
- the wireless communication path 830, the wireless base station 840, and the network 850 are examples of communication paths between the drive recorder 820 and the state determination device 10.
- the drive recorder 820 may be connected to the state determination device 10 by a device or route different from that shown in FIG.
- the vehicle 810, drive recorder 820, wireless communication path 830, wireless base station 840, and network 850 are not particularly limited.
- the state determination device 10 may be mounted on a moving body. In this case, the state determination device 10 may acquire an image for determination from the photographing device 20 directly or via a communication path in the moving body.
- the image storage device 25 stores teacher data and / or an image to be determined. Then, the image storage device 25 outputs the teacher data and / or the image to be determined to the state determination device 10.
- the image storage device 25 is, for example, a hard disk device, a magneto-optical disk device, an SSD, or a disk array device.
- the image storage device 25 receives the teacher data and / or the image to be determined from the photographing device 20. However, the image storage device 25 may receive the teacher data and / or the image to be determined from a device (not shown) different from the photographing device 20.
- the user may appropriately determine whether the state determination device 10 acquires the teacher data and the image to be determined from the photographing device 20 or the image storage device 25.
- the state determination device 10 may acquire teacher data from the image storage device 25 and acquire the image to be determined from the photographing device 20.
- the state determination device 10 operates as described above and executes the learning phase using the teacher data.
- the state determination device 10 operates as described above, determines the state of the road using the image to be determined as the determination phase, and outputs the determined road condition.
- the display device 30 displays the determination result (road condition) output by the state determination device 10.
- the display device 30 is, for example, a liquid crystal display, an organic electroluminescence display, or an electronic paper.
- the display on the display device 30 is arbitrary.
- the user may display the road condition on the display device 30 as needed.
- FIG. 16 is a diagram showing an example of display.
- FIG. 16 shows the position of deterioration using a rectangular frame in the image of the road to be determined. Further, FIG. 16 highlights (hatched lines) the positions determined by the state determination device 10 as “high degree of deterioration”.
- the display device 30 may collectively display the state of a plurality of roads instead of one as the display of the state of the road determined by the state determination device 10.
- FIG. 17 is a diagram showing an example of display of a plurality of states.
- FIG. 17 indicates, using arrows, the portions of the road in a predetermined area that are determined to be deteriorated. Further, in FIG. 17, portions with a "large degree of deterioration" are highlighted (in black). The direction of each arrow in FIG. 17 is the traveling direction of the vehicle on the road.
- FIG. 7 is a block diagram showing an example of the configuration of the state determination device 11 according to the second embodiment.
- the state determination device 11 includes a plurality of determination models 110, an output totaling unit 121, a state determination unit 130, and a weight determination unit 140.
- the determination model 110 and the state determination unit 130 are the same as the determination model 110 and the state determination unit 130 of the first embodiment. Therefore, these detailed explanations will be omitted.
- the state determination device 11 may be configured by using the computer shown in FIG. Alternatively, the state determination system 50 may include the state determination device 11 instead of the state determination device 10.
- the output summing unit 121 sums the determination results of the determination models 110 using the weights determined by the weight determination unit 140. For example, when the state determined by the determination model 110 is a deterioration position and a score, the output summing unit 121 uses the weights determined by the weight determination unit 140 to sum the deterioration positions and scores output by the determination models 110.
- the output summing unit 121 operates in the same manner as the output summing unit 120 of the first embodiment, except that the weight determined by the weight determining unit 140 is used.
- the weight determination unit 140 determines the weight to be used when the output summing unit 121 sums the determination model 110 using the image to be determined.
- the weight determination unit 140 stores in advance a mechanism for determining weights using an image (for example, an equation for calculating weights or a determinant).
- the mechanism for determining the weight is referred to as a "weight determination formula".
- the weight determination formula is not limited to a simple scalar formula.
- the weight determination formula may be an equation, a vector formula, a determinant, a function, or a combination thereof.
- the weight determination formula may include a function including a conditional statement.
- the operation of the weight determination unit 140 will be described using a simple example.
- Equation (1) is a weight determination equation representing the summation in the output summarization unit 121.
- Y = a_1(p)X_1(p) + a_2(p)X_2(p) + ... + a_n(p)X_n(p) ... (1)
- p is an image to be determined (specifically, for example, the value of the pixel of the image).
- n is the number of determination models 110.
- Y is the result of the summation.
- the data format such as the variable of the equation (1) may be appropriately selected from a scalar, a vector, a matrix, a function, or a combination thereof.
- the coefficients of the functions a_i(p) are examples of the parameters of the weight determination formula.
- the weight determination unit 140 applies the image p to the functions a_i(p) to determine the weights. Further, each determination model 110 uses the image p to calculate its determination result "X_i(p)". Then, the output summing unit 121 calculates the summing result "Y" using the weights (a_i(p)) determined by the weight determination unit 140 and the determination results (X_i(p)) output by the determination models 110.
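Equation (1) can be worked through numerically as follows. The weight functions a_i and model outputs X_i below are invented placeholders for illustration; in the device, a_i(p) would be computed by the weight determination formula and X_i(p) by the determination models 110.

```python
# Numeric sketch of equation (1): Y = a_1(p)X_1(p) + ... + a_n(p)X_n(p).
# The functions below are illustrative placeholders.

def brightness(p):
    # crude image statistic standing in for the pixel values of p
    return sum(p) / len(p)

# a_i(p): image-dependent weights (brighter favors the first model here)
a = [lambda p: brightness(p),
     lambda p: 1.0 - brightness(p)]
# X_i(p): per-model judgment results for image p (constants here)
X = [lambda p: 0.8,
     lambda p: 0.4]

def summed_output(p):
    return sum(a_i(p) * X_i(p) for a_i, X_i in zip(a, X))

p = [0.75, 0.75]                  # toy "image"
print(round(summed_output(p), 3))  # 0.75*0.8 + 0.25*0.4 = 0.7
```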
- the weight determination formula is stored in the state determination device 11 in advance.
- the configuration for storing the weight determination formula is arbitrary.
- the state determination device 11 may store the weight determination formula in a storage unit (not shown).
- the weight determination unit 140 acquires the weight determination formula stored in the storage unit, applies the image to the weight determination formula to determine the weights, and provides the determined weights to the output summing unit 121.
- the weight determination unit 140 may store the weight determination formula.
- the weight determination unit 140 may acquire the weight determination formula from a device that stores the weight determination formula (not shown) when the weight is determined.
- the weight determination unit 140 may use a weight determination formula that has been learned in advance by using a predetermined method such as AI.
- the weight determination unit 140 may use a weight determination formula whose parameters (the coefficients of a_i(p)) are learned using predetermined teacher data for weight learning.
- FIG. 8 is a flow chart showing an example of the operation of the determination phase of the state determination device 11 according to the second embodiment.
- the determination model 110 has already been learned.
- the state determination device 11 acquires an image of the road to be determined from a predetermined device (step S411). For example, the state determination device 11 may acquire an image of the road taken by the photographing device 20. Alternatively, the state determination device 11 may acquire an image of the road stored by the image storage device 25.
- each of the determination models 110 determines the state of the road using the image of the road (step S412). For example, the determination model 110 determines the deterioration of the road (for example, the position and type of deterioration), respectively. Then, the determination model 110 outputs the determination result to the output summing unit 120.
- the weight determination unit 140 determines the weight using the image (step S416).
- the output summing unit 121 sums the outputs (judgment results) of the determination model 110 using the determined weights (step S417). Then, the output summing unit 121 outputs the summed result to the state determination unit 130.
- the state determination unit 130 determines the state of the road using the total result in the output total unit 121 (step S414).
- the state determination unit 130 outputs the state of the road (for example, the degree of deterioration) to a predetermined device (step S415).
- the state determination unit 130 may include the total result (for example, the position and type of deterioration) of the output total unit 121 in the road condition.
- the state determination device 11 according to the second embodiment can obtain the same effect as the state determination device 10 according to the first embodiment.
- state determination device 11 can obtain the effect of further improving the accuracy of determination.
- the weight determination unit 140 determines the weight used by the output summing unit 121 based on the image to be determined. That is, the weight determination unit 140 determines the weight corresponding to the image to be determined.
- the output summing unit 121 sums the outputs of the determination model 110 using the weight corresponding to the image to be determined. That is, the output summing unit 121 can execute a more appropriate summing.
- the state determination device 11 can further improve the accuracy of determining the state of the road.
- FIG. 9 is a block diagram showing an example of the configuration of the state determination device 12 according to the third embodiment.
- the state determination device 12 includes a plurality of determination models 110, an output totaling unit 121, a state determination unit 130, a weight determination unit 141, a loss calculation unit 150, and a parameter correction unit 160.
- the state determination device 12 may be configured by using the computer shown in FIG. Alternatively, the state determination system 50 may include the state determination device 12 instead of the state determination devices 10 and 11.
- the determination model 110 and the state determination unit 130 are the same as the determination model 110 and the state determination unit 130 of the first embodiment and the second embodiment. Further, the output summing unit 121 is the same as the output summing unit 121 of the second embodiment.
- the loss calculation unit 150 and the parameter correction unit 160 operate in the weight learning phase in the weight determination unit 141. Further, the weight determination unit 141 operates in the same manner as the weight determination unit 140 except that the weight is learned in cooperation with the loss calculation unit 150 and the parameter correction unit 160.
- the loss calculation unit 150 calculates the difference (loss) between the data indicating the correct answer for the teacher data for weight learning (for example, the position of deterioration in each image and its score) and the summing result of the output summing unit 121.
- the data indicating the correct answer is referred to as a "correct answer label”.
- the loss calculated by the loss calculation unit 150 is referred to as “total loss” or "first loss”.
- the configuration for saving the correct answer label is arbitrary.
- the loss calculation unit 150 may store the correct answer label corresponding to the teacher data for weight learning in advance.
- the state determination device 12 may store the correct answer label in a storage unit (not shown).
- the loss calculation unit 150 may acquire the correct answer label from the device or configuration for storing the teacher data for weight learning in the loss calculation.
- the parameter correction unit 160 corrects the parameters of the weight determination formula used by the weight determination unit 141 (for example, the coefficients of the functions a_i(p) in equation (1)) based on the loss calculated by the loss calculation unit 150.
- FIG. 10 is a flow chart showing an example of the operation of the learning phase of the weight determination unit 141 according to the third embodiment.
- before the operation of the learning phase, the weight determination unit 141 stores a predetermined weight determination formula (for example, a weight determination formula in which the coefficients of the functions a_i(p) are predetermined values) as the initial values of the weights.
- the state determination device 12 acquires an image as teacher data for weight learning (step S421).
- the state determination device 12 selects one image from the teacher data for weight learning. Then, each of the plurality of determination models 110 determines the state of the road using the selected image (step S422). Then, the determination model 110 outputs the determination result to the output summing unit 121.
- the weight determination unit 141 applies the selected image to the weight determination formula to determine the weight (step S423).
- the output summing unit 121 sums the outputs of the determination model 110 using the weight determined by the weight determining unit 141 (step S424).
- the loss calculation unit 150 calculates the loss (total loss) between the correct label and the total result of the output total unit 121 (step S425).
- the parameter correction unit 160 corrects the parameters of the weight determination formula (for example, the coefficients of the functions a_i(p)) based on the loss (total loss) calculated by the loss calculation unit 150 (step S426).
- the state determination device 12 determines whether or not learning is completed (step S427). Specifically, the state determination device 12 determines whether or not a predetermined end condition is satisfied (for example, the loss value is smaller than a threshold value, the loop has been executed a predetermined number of times, or the teacher data for weight learning is exhausted).
- if the end condition is not satisfied (No in step S427), the state determination device 12 returns to step S422.
- if the end condition is satisfied (Yes in step S427), the state determination device 12 ends the learning phase.
- the state determination device 12 learns the weight (for example, the parameter of the weight determination formula) determined by the weight determination unit 141 using the teacher data for weight learning.
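The weight-learning loop (steps S421 to S427) can be sketched numerically as follows. The fixed model outputs, the correct-answer label, the learning rate, and the simple gradient step are illustrative assumptions standing in for the loss calculation unit 150 and the parameter correction unit 160.

```python
# Minimal numeric sketch of the weight-learning phase (steps S421-S427).
# All values and the gradient step are illustrative assumptions.

model_outputs = [0.9, 0.3]   # X_i(p) for one teacher image (held fixed)
correct_label = 0.6          # correct-answer label for that image
params = [0.8, 0.2]          # initial values of the weight parameters
lr = 0.1                     # step size for the parameter correction

for _ in range(200):
    # steps S423-S424: determine weights and sum the model outputs
    y = sum(w * x for w, x in zip(params, model_outputs))
    # step S425: total loss between summing result and correct label
    loss = (y - correct_label) ** 2
    # step S427: end condition (loss smaller than a threshold)
    if loss < 1e-8:
        break
    # step S426: correct the parameters based on the loss
    grad = [2 * (y - correct_label) * x for x in model_outputs]
    params = [w - lr * g for w, g in zip(params, grad)]

y = sum(w * x for w, x in zip(params, model_outputs))
print(round(y, 3))  # converges toward the correct label 0.6
```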
- the state determination device 12 operates in the same manner as the state determination device 11 of the second embodiment in the determination phase. That is, in the determination phase, the state determination device 12 determines the state of the road by using the determination model 110, the output summing unit 121, the state determination unit 130, and the weight determination unit 141.
- the state determination device 12 acquires an image of the road to be determined from the predetermined device (step S411).
- the determination model 110 outputs the state of the road using the image of the road (step S412).
- the weight determination unit 141 applies a road image to the learned weight determination formula to determine the weight (step S416).
- the output summing unit 121 sums the outputs of the determination model 110 using the determined weights (step S417).
- the state determination unit 130 determines the state of the road using the total result (step S414).
- the state determination unit 130 outputs the determination result (road condition) to a predetermined device (step S415).
- the state determination device 12 according to the third embodiment can obtain the effect of further improving the accuracy of determination in addition to the effects of the first embodiment and the second embodiment.
- the weight determination unit 141 learns weights using teacher data for weight learning in the learning phase. Then, the weight determination unit 141 determines the weight using the learned weight determination formula in the determination phase.
- the output summing unit 121 sums the outputs of the determination model 110 using the determined weights. That is, the output summing unit 121 sums the outputs using the weights learned by using the teacher data for weight learning. Therefore, the output summing unit 121 can execute a more appropriate summing.
- the state determination device 12 can further improve the accuracy of determining the state of the road.
- FIG. 11 is a block diagram showing an example of the configuration of the state determination device 13 according to the fourth embodiment.
- the state determination device 13 includes a plurality of determination models 110, an output summing unit 121, a state determination unit 130, a weight determination unit 142, a loss calculation unit 150, a parameter correction unit 161, and an external environment loss calculation unit 170.
- the state determination device 13 may be configured by using the computer shown in FIG.
- the determination model 110 and the state determination unit 130 are the same as the determination model 110 and the state determination unit 130 of the first embodiment to the third embodiment. Further, the output summing unit 121 is the same as the output summing unit 121 of the second embodiment and the third embodiment.
- the loss calculation unit 150 is the same as the loss calculation unit 150 of the third embodiment.
- the loss calculation unit 150, the parameter correction unit 161 and the external environment loss calculation unit 170 operate in the weight learning phase of the weight determination unit 142. Further, the weight determination unit 142 operates in the same manner as the weight determination unit 141, except for the operation of learning the weight in the learning phase.
- the weight determination unit 142 determines the weight using the image of the teacher data for weight learning, similarly to the weight determination unit 141. Further, the weight determination unit 142 estimates the external environment using an image, and outputs the estimated external environment to the external environment loss calculation unit 170.
- the weight determination unit 142 shares at least a part of the processing between the calculation of the weights and the estimation of the external environment.
- the weight determination unit 142 may share the lower layer of the deep neural network and specialize the upper layer for weight calculation and estimation of the external environment. In this case, the weight determination unit 142 shares a part of the parameter for determining the weight and the parameter for estimating the external environment. In this case, the learning in the weight determination unit 142 is so-called multi-task learning.
- when the weight determination unit 142 uses a deep neural network that is partially shared between the calculation of the weights and the estimation of the external environment, the weights for determination models 110 trained in similar external environments tend to be similar. As a result, the state determination device 13 can obtain the effect of stabilizing the learning in the weight determination unit 142.
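The shared-lower-layer (multi-task) arrangement described above can be sketched as follows, assuming a toy two-model ensemble, a single shared tanh unit, and hypothetical parameter values; this is an illustration of the idea, not the network of the specification:

```python
import math

# Shared lower-layer parameters (common to both tasks)
W_SHARED = [0.5, -0.3]
# Task-specific upper layers: ensemble-weight head and environment head
W_WEIGHT_HEAD = [1.0, -1.0]   # one logit per determination model
W_ENV_HEAD = 0.8              # scalar external-environment estimate

def forward(image_features):
    # Shared lower layer: one tanh unit over the image features
    h = math.tanh(sum(w * x for w, x in zip(W_SHARED, image_features)))
    # Head 1: softmax weights for the two determination models
    logits = [w * h for w in W_WEIGHT_HEAD]
    exps = [math.exp(l) for l in logits]
    weights = [e / sum(exps) for e in exps]
    # Head 2: external-environment estimate (e.g. degree of "raininess")
    env = W_ENV_HEAD * h
    return weights, env

weights, env = forward([1.0, 0.2])
print(weights, env)  # weights sum to 1; env stays within (-0.8, 0.8)
```

Because both heads back-propagate through the same lower layer during learning, images with similar external environments produce similar hidden features, and hence similar ensemble weights.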
- the weight determination unit 142 may use different processes in the calculation of the weight and the estimation of the external environment.
- the external environment loss calculation unit 170 acquires the external environment related to the image of the teacher data for weight learning.
- the acquisition source of the external environment is arbitrary.
- the external environment loss calculation unit 170 may acquire the external environment related to the image of the teacher data for weight learning from the provider of the teacher data for weight learning.
- the external environment loss calculation unit 170 may acquire the external environment from a device (not shown) with reference to the teacher data for weight learning.
- the external environment loss calculation unit 170 may acquire meteorological data such as the weather from a company or the like that provides such data, by referring to the shooting date, time, and place of the images in the teacher data for weight learning.
- the external environment loss calculation unit 170 calculates the difference (loss) between the acquired external environment and the external environment estimated by the weight determination unit 142.
- the loss calculated by the external environment loss calculation unit 170 is referred to as “external environment loss” or “second loss”.
- the external environment loss calculation unit 170 outputs the calculated loss (external environment loss) to the parameter correction unit 161.
- based on the loss (total loss) calculated by the loss calculation unit 150 and the loss (external environment loss) calculated by the external environment loss calculation unit 170, the parameter correction unit 161 corrects the parameters of the weight determination formula that the weight determination unit 142 uses to calculate the weights.
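One way to picture the correction of the weight-determination-formula parameters from the two losses is a gradient step on a combined objective. The mixing factor `alpha`, the function names, and the numbers below are illustrative assumptions rather than anything stated in the specification:

```python
def combined_loss(total_loss, env_loss, alpha=0.5):
    # Single objective mixing the summation loss (first loss) and the
    # external environment loss (second loss)
    return total_loss + alpha * env_loss

def correct_parameter(theta, grad_total, grad_env, lr=0.01, alpha=0.5):
    # Gradient-descent update of one weight-determination-formula parameter
    # using the gradient of the combined objective
    return theta - lr * (grad_total + alpha * grad_env)

theta = 1.0
theta = correct_parameter(theta, grad_total=0.4, grad_env=0.2, lr=0.1)
print(theta)  # 1.0 - 0.1 * (0.4 + 0.5 * 0.2) → 0.95
```

Larger `alpha` values push the shared parameters to track the external environment more closely; smaller values prioritize the summation loss.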
- FIG. 12 is a flow chart showing an example of the operation of the learning phase of the weight determination unit 142 according to the fourth embodiment.
- the weight determination unit 142 saves a predetermined weight determination formula as an initial value of the weight before the operation of the learning phase.
- steps S421 to S425 and S427 are the same operations as in FIG.
- the state determination device 13 acquires an image as teacher data for weight learning (step S421).
- the state determination device 13 selects one image from the teacher data for weight learning. Then, each of the plurality of determination models 110 determines the state of the road using the selected image (step S422). Then, the determination model 110 outputs the determination result to the output summing unit 121.
- the weight determination unit 142 applies the selected image to the weight determination formula to determine the weight (step S423).
- the output summing unit 121 sums the outputs of the determination models 110 using the weights determined by the weight determination unit 142 (step S424).
- the loss calculation unit 150 calculates the loss (total loss) between the correct label and the summation result of the output summing unit 121 (step S425).
- the weight determination unit 142 estimates the environment using the image (step S431).
- the external environment loss calculation unit 170 calculates the external environment loss (step S432).
- based on the loss (total loss) calculated by the loss calculation unit 150 and the loss (external environment loss) calculated by the external environment loss calculation unit 170, the parameter correction unit 161 corrects the parameters of the weight determination formula used by the weight determination unit 142 to calculate the weights (step S433).
- the state determination device 13 determines whether or not learning is completed (step S427). Specifically, the state determination device 13 determines whether a predetermined end condition is satisfied (for example, either or both losses fall below a threshold, a predetermined number of iterations has been executed, or the end of the teacher data for weight learning has been reached).
- if the end condition is not satisfied (No in step S427), the state determination device 13 returns to step S422.
- if the end condition is satisfied (Yes in step S427), the state determination device 13 ends the learning phase.
- state determination device 13 may change the order of execution of the operations from steps S422 to S425 and the operations from S431 to S432.
- the state determination device 13 may execute at least a part of the operations from steps S422 to S425 and the operations from S431 to S432 in parallel.
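The learning-phase loop above (steps S421 to S427 with the correction of step S433) can be sketched end to end as follows. The sketch shrinks the problem to one scalar parameter and two fixed model outputs, and omits the external environment loss of steps S431 to S432 for brevity, so all names and values are illustrative stand-ins rather than the patented procedure:

```python
import math

def learning_phase(samples, step=0.5, max_epochs=5000, threshold=1e-6):
    """Fit a scalar weight-formula parameter theta so that the weighted sum
    of two fixed model outputs (x1, x2) matches the correct label."""
    theta = 0.0                                    # initial parameter value
    for _ in range(max_epochs):                    # S427: bounded iterations
        worst = 0.0
        for x1, x2, label in samples:              # S422: per-image model outputs
            w = 1.0 / (1.0 + math.exp(-theta))     # S423: weight from the formula
            y = w * x1 + (1.0 - w) * x2            # S424: summation
            loss = (y - label) ** 2                # S425: total loss
            grad = 2.0 * (y - label) * (x1 - x2) * w * (1.0 - w)
            theta -= step * grad                   # S433: parameter correction
            worst = max(worst, loss)
        if worst < threshold:                      # S427: loss-based end condition
            break
    return theta

theta = learning_phase([(1.0, 0.0, 0.8)])
print(round(1.0 / (1.0 + math.exp(-theta)), 2))  # learned weight, close to 0.8
```

As the text notes, the per-sample steps could also be reordered or run in parallel; the loop structure here is just one valid ordering.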
- the state determination device 13 learns the weight determined by the weight determination unit 142 by using the teacher data for weight learning and the external environment.
- the state determination device 13 operates in the same manner as the state determination device 11 of the second embodiment and the state determination device 12 of the third embodiment. That is, in the determination phase, the state determination device 13 determines the state of the road by using the determination model 110, the output summing unit 121, the state determination unit 130, and the weight determination unit 142.
- the state determination device 13 acquires an image of the road to be determined from the predetermined device (step S411).
- the determination model 110 outputs the state of the road using the image of the road (step S412).
- the weight determination unit 142 applies the road image to the learned weight determination formula to determine the weights (step S416).
- the output summing unit 121 sums the outputs of the determination model 110 using the determined weights (step S417).
- the state determination unit 130 determines the state of the road using the summation result (step S414).
- the state determination unit 130 outputs the determination result (road condition) to a predetermined device (step S415).
- the state determination device 13 according to the fourth embodiment can obtain the effect of further improving the accuracy of determination in addition to the effects of the first embodiment to the third embodiment.
- the weight determination unit 142 learns the parameters of the weight determination formula using the external environment in addition to the teacher data for weight learning. Then, the weight determination unit 142 determines the weight using the learned weight determination formula in the determination phase.
- the output summing unit 121 sums the outputs of the determination models 110 using the determined weights. That is, the output summing unit 121 sums the outputs using the weights learned using the external environment in addition to the teacher data for weight learning. Therefore, the output summing unit 121 can perform a more appropriate summation.
- the state determination device 13 can further improve the accuracy of determining the state of the road.
- FIG. 13 is a block diagram showing an example of the configuration of the state determination system 51 including the state determination device 13 according to the fourth embodiment.
- the state determination system 51 includes the state determination device 13, a photographing device 20 and/or an image storage device 25, a display device 30, and an information providing device 40.
- the photographing device 20, the image storage device 25, and the display device 30 are the same as those in the first embodiment. Therefore, detailed description thereof will be omitted.
- the information providing device 40 outputs the external environment to the state determination device 13.
- the photographing device 20 may output at least a part of the external environment to the state determination device 13.
- the user or the like may appropriately determine whether the state determination device 13 acquires the external environment from the information providing device 40 or the photographing device 20.
- the state determination device 13 operates as described above, and outputs the state of the road to the display device 30 as a result of the determination.
- FIG. 14 is a diagram showing an example of the configuration of the state determination device 14 according to the fifth embodiment.
- the state determination device 14 includes a plurality of determination models 110 and an output summing unit 120.
- the determination model 110 is the same as the determination model 110 of the first embodiment to the fourth embodiment.
- the output summing unit 120 is the same as the output summing unit 120 of the first embodiment.
- Each configuration of the state determination device 14 operates in the same manner as the corresponding configuration in the state determination device 10 or the like of the first embodiment.
- the state determination device 14 may be configured by using the computer shown in FIG. Alternatively, the state determination system 50 may include the state determination device 14 instead of the state determination devices 10 to 12.
- the state determination device 14 has the effect of improving the accuracy of determining the state of the road.
- the state determination device 14 includes a plurality of determination models 110 and an output summing unit 120.
- the determination models 110 are each trained using teacher data that differ in at least one of the state of the moving body equipped with the photographing device that acquires road images, the state of the road, and the external environment.
- the output summing unit 120 sums the outputs from the plurality of determination models for the input road image.
- each configuration of the state determination device 14 operates in the same manner as the corresponding configuration in the state determination device 10 or the like of the first embodiment. That is, the output summing unit 120 of the state determination device 14 sums the determination results of the plurality of determination models 110 by using a predetermined method. Therefore, the state determination device 14 can output a more accurate road state as compared with the case of using one model.
- the user may use the summation result of the output summing unit 120 in the state determination device 14 as the state of the road.
- the user may provide the output of the state determination device 14 to a device (not shown) for determining the state of the road in detail to determine the state of the road.
- the state determination device 14 according to the fifth embodiment is the minimal configuration of the embodiments.
- a state determination device including an output summing means for summing the outputs from multiple determination models for an input road image.
- Appendix 2 The state determination device according to Appendix 1, including a state determination means that determines and outputs the state of the road based on the result of the summation in the output summing means.
- Appendix 3 The state determination device according to Appendix 1 or 2, wherein the plurality of determination models are each trained using a plurality of teacher data including road images in which at least one of the state of the moving body, the state of the road, and the external environment differs from each other.
- Appendix 4 The state determination device according to any one of Appendices 1 to 3, further including a weight determining means that determines, in response to the input road image, the weights used to sum the outputs of the plurality of determination models, wherein the output summing means sums the outputs of the plurality of determination models using the determined weights.
- Appendix 5 The state determination device according to Appendix 4, wherein the weight determining means learns parameters for determining the weights using teacher data for weight learning.
- Appendix 6 The state determination device according to Appendix 5, wherein the weight determination means further learns parameters for determining weights in response to an external environment.
- Appendix 7 A state determination system including: the state determination device according to any one of Appendices 1 to 5; a photographing device that acquires an image of the road to be determined and outputs it to the state determination device, and/or a storage device that stores the image of the road to be determined; and a display device that displays the state of the road output by the state determination device.
- Appendix 8 A state determination system including: the state determination device according to Appendix 6; a photographing device that captures an image of the road to be determined and outputs it to the state determination device, and/or a storage device that stores the image of the road to be determined; an information providing device that outputs the external environment to the state determination device; and a display device that displays the state of the road output by the state determination device.
- Appendix 9 A state determination method in which a state determination device including a plurality of determination models, each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment, sums the outputs from the plurality of determination models for an input road image.
- Appendix 10 The state determination method according to Appendix 9, which determines the state of the road based on the result of the summation.
- Appendix 11 The state determination method according to Appendix 9 or 10, wherein the plurality of determination models are each trained using a plurality of teacher data including road images in which at least one of the state of the moving body, the state of the road, and the external environment differs from each other.
- Appendix 12 The state determination method according to any one of Appendices 9 to 11, which determines, in response to the input road image, weights used to sum the outputs of the plurality of determination models, and sums the outputs of the plurality of determination models using the determined weights.
- Appendix 13 The state determination method according to Appendix 12, which learns parameters for determining the weights using teacher data for weight learning.
- Appendix 14 The state determination method according to Appendix 13, which further learns the parameters for determining the weights in response to the external environment.
- Appendix 15 A state determination method in which a state determination device executes the state determination method according to any one of Appendices 9 to 13, a photographing device acquires the image to be determined and outputs it to the state determination device and/or a storage device stores the image to be determined, and a display device displays the state of the road output by the state determination device.
- Appendix 16 A state determination method in which a state determination device executes the state determination method according to Appendix 14, a photographing device acquires an image of the road to be determined and outputs it to the state determination device and/or a storage device stores the image to be determined, an information providing device outputs the external environment to the state determination device, and a display device displays the state of the road output by the state determination device.
- Appendix 17 A recording medium recording a program that causes a computer including a plurality of determination models, each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment, to execute a process of summing the outputs from the plurality of determination models for an input road image.
- Appendix 18 The recording medium according to Appendix 17, which records a program that causes a computer to execute a process of determining the state of the road based on the summation result.
- Appendix 19 The recording medium according to Appendix 17 or 18, which records a program that causes a computer to execute a process of training the plurality of determination models using a plurality of teacher data including road images in which at least one of the state of the moving body, the state of the road, and the external environment differs from each other.
- Appendix 20 The recording medium according to any one of Appendices 17 to 19, which records a program that causes a computer to execute a process of determining, in response to the input road image, weights used to sum the outputs of the plurality of determination models, and a process of summing the outputs using the determined weights.
- Appendix 21 The recording medium according to Appendix 20, which records a program that causes a computer to execute a process of learning parameters for determining the weights using teacher data for weight learning.
- Appendix 22 The recording medium according to Appendix 21, which records a program that further causes a computer to execute a process of learning the parameters for determining the weights in response to the external environment.
- State determination device
11 State determination device
12 State determination device
13 State determination device
20 Photographing device
25 Image storage device
30 Display device
40 Information providing device
50 State determination system
51 State determination system
110 Determination model
120 Output summing unit
121 Output summing unit
130 State determination unit
140 Weight determination unit
141 Weight determination unit
142 Weight determination unit
150 Loss calculation unit
160 Parameter correction unit
161 Parameter correction unit
170 External environment loss calculation unit
610 CPU
620 ROM
630 RAM
640 Storage device
650 NIC
690 Recording medium
810 Vehicle
820 Drive recorder
830 Wireless communication path
840 Wireless base station
850 Network
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automation & Control Theory (AREA)
- Mathematical Physics (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
Abstract
Description
Included are: a plurality of determination models each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment; and an output summing means that sums the outputs from the plurality of determination models for an input road image.
Included are: the above state determination device; a photographing device that acquires the image to be determined and outputs the image to the state determination device, and/or a storage device that stores the image to be determined; and a display device that displays the state of the road output by the state determination device.
A state determination device including a plurality of determination models, each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment, sums the outputs from the plurality of determination models for an input road image.
The state determination device executes the state determination method described above, a photographing device acquires the image to be determined and outputs it to the state determination device and/or a storage device stores the image to be determined, and a display device displays the state of the road output by the state determination device.
A program is recorded that causes a computer, including a plurality of determination models each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment, to execute a process of summing the outputs from the plurality of determination models for an input road image.
Next, the first embodiment will be described with reference to the drawings.
First, the configuration of the state determination device 10 according to the first embodiment will be described with reference to the drawings.
(1) A determination model 110 trained using teacher data of sunny weather (hereinafter, the "sunny-weather model"),
(2) a determination model 110 trained using teacher data of cloudy weather (hereinafter, the "cloudy-weather model"), and
(3) a determination model 110 trained using teacher data of rainy weather (hereinafter, the "rainy-weather model").
Next, operations related to the state determination device 10 will be described with reference to the drawings.
First, the operation of the learning phase of the determination model 110 will be described.
Next, the operation of the determination phase in the state determination device 10 will be described.
Next, the effects of the state determination device 10 according to the first embodiment will be described.
Next, the hardware configuration of the state determination device 10 will be described.
FIG. 6 is a block diagram showing an example of the configuration of the state determination system 50 including the state determination device 10 according to the first embodiment.
Next, the second embodiment will be described with reference to the drawings.
FIG. 7 is a block diagram showing an example of the configuration of the state determination device 11 according to the second embodiment.
Y = a_1(p)X_1(p) + a_2(p)X_2(p) + … + a_n(p)X_n(p) … (1)
In equation (1) above, p is the image to be determined (specifically, for example, the pixel values of the image), and n is the number of determination models 110. i is the index of a determination model 110 (i = 1, 2, …, n). Y is the result of the summation. a_i(p) (i = 1, 2, …, n) is a function that determines the weight for the i-th determination model 110 given the image p. X_i(p) (i = 1, 2, …, n) is the output of the i-th determination model 110.
Next, the operation of the determination phase in the state determination device 11 according to the second embodiment will be described with reference to the drawings.
Next, the effects of the state determination device 11 according to the second embodiment will be described.
Next, the third embodiment will be described with reference to the drawings.
FIG. 9 is a block diagram showing an example of the configuration of the state determination device 12 according to the third embodiment.
Next, the operation of the state determination device 12 according to the third embodiment will be described with reference to the drawings.
Next, the effects of the state determination device 12 according to the third embodiment will be described.
Next, the fourth embodiment will be described with reference to the drawings.
FIG. 11 is a block diagram showing an example of the configuration of the state determination device 13 according to the fourth embodiment.
Next, the operation of the state determination device 13 according to the fourth embodiment will be described with reference to the drawings.
Next, the effects of the state determination device 13 according to the fourth embodiment will be described.
FIG. 13 is a block diagram showing an example of the configuration of the state determination system 51 including the state determination device 13 according to the fourth embodiment.
The fifth embodiment will be described with reference to the drawings.
FIG. 14 is a diagram showing an example of the configuration of the state determination device 14 according to the fifth embodiment.
The state determination device 14 has the effect of improving the accuracy of determining the state of the road.
(Appendix 1) A state determination device including: a plurality of determination models each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment; and an output summing means that sums the outputs from the plurality of determination models for an input road image.
(Appendix 2) The state determination device according to Appendix 1, including a state determination means that determines and outputs the state of the road based on the result of the summation in the output summing means.
(Appendix 3) The state determination device according to Appendix 1 or 2, wherein the plurality of determination models are each trained using a plurality of teacher data including road images in which at least one of the state of the moving body, the state of the road, and the external environment differs from each other.
(Appendix 4) The state determination device according to any one of Appendices 1 to 3, further including a weight determining means that determines, in response to the input road image, weights used to sum the outputs of the plurality of determination models, wherein the output summing means sums the outputs of the plurality of determination models using the determined weights.
(Appendix 5) The state determination device according to Appendix 4, wherein the weight determining means learns parameters for determining the weights using teacher data for weight learning.
(Appendix 6) The state determination device according to Appendix 5, wherein the weight determining means further learns the parameters for determining the weights in response to the external environment.
(Appendix 7) A state determination system including: the state determination device according to any one of Appendices 1 to 5; a photographing device that acquires an image of the road to be determined and outputs it to the state determination device, and/or a storage device that stores the image of the road to be determined; and a display device that displays the state of the road output by the state determination device.
(Appendix 8) A state determination system including: the state determination device according to Appendix 6; a photographing device that captures an image of the road to be determined and outputs it to the state determination device, and/or a storage device that stores the image of the road to be determined; an information providing device that outputs the external environment to the state determination device; and a display device that displays the state of the road output by the state determination device.
(Appendix 9) A state determination method in which a state determination device including a plurality of determination models, each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment, sums the outputs from the plurality of determination models for an input road image.
(Appendix 10) The state determination method according to Appendix 9, which determines the state of the road based on the result of the summation.
(Appendix 11) The state determination method according to Appendix 9 or 10, which trains the plurality of determination models using a plurality of teacher data including road images in which at least one of the state of the moving body, the state of the road, and the external environment differs from each other.
(Appendix 12) The state determination method according to any one of Appendices 9 to 11, which determines, in response to the input road image, weights used to sum the outputs of the plurality of determination models, and sums the outputs of the plurality of determination models using the determined weights.
(Appendix 13) The state determination method according to Appendix 12, which learns parameters for determining the weights using teacher data for weight learning.
(Appendix 14) The state determination method according to Appendix 13, which further learns the parameters for determining the weights in response to the external environment.
(Appendix 15) A state determination method in which a state determination device executes the state determination method according to any one of Appendices 9 to 13, a photographing device acquires the image to be determined and outputs it to the state determination device and/or a storage device stores the image to be determined, and a display device displays the state of the road output by the state determination device.
(Appendix 16) A state determination method in which a state determination device executes the state determination method according to Appendix 14, a photographing device acquires an image of the road to be determined and outputs it to the state determination device and/or a storage device stores the image to be determined, an information providing device outputs the external environment to the state determination device, and a display device displays the state of the road output by the state determination device.
(Appendix 17) A recording medium recording a program that causes a computer, including a plurality of determination models each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment, to execute a process of summing the outputs from the plurality of determination models for an input road image.
(Appendix 18) The recording medium according to Appendix 17, recording a program that causes a computer to execute a process of determining the state of the road based on the result of the summation.
(Appendix 19) The recording medium according to Appendix 17 or 18, recording a program that causes a computer to execute a process of training the plurality of determination models using a plurality of teacher data including road images in which at least one of the state of the moving body, the state of the road, and the external environment differs from each other.
(Appendix 20) The recording medium according to any one of Appendices 17 to 19, recording a program that causes a computer to execute a process of determining, in response to the input road image, weights used to sum the outputs of the plurality of determination models, and a process of summing the outputs of the plurality of determination models using the determined weights.
(Appendix 21) The recording medium according to Appendix 20, recording a program that causes a computer to execute a process of learning parameters for determining the weights using teacher data for weight learning.
(Appendix 22) The recording medium according to Appendix 21, recording a program that further causes a computer to execute a process of learning the parameters for determining the weights in response to the external environment.
11 State determination device
12 State determination device
13 State determination device
20 Photographing device
25 Image storage device
30 Display device
40 Information providing device
50 State determination system
51 State determination system
110 Determination model
120 Output summing unit
121 Output summing unit
130 State determination unit
140 Weight determination unit
141 Weight determination unit
142 Weight determination unit
150 Loss calculation unit
160 Parameter correction unit
161 Parameter correction unit
170 External environment loss calculation unit
610 CPU
620 ROM
630 RAM
640 Storage device
650 NIC
690 Recording medium
810 Vehicle
820 Drive recorder
830 Wireless communication path
840 Wireless base station
850 Network
Claims (22)
- A state determination device comprising: a plurality of determination models each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment; and an output summing means that sums the outputs from the plurality of determination models for an input road image.
- The state determination device according to claim 1, comprising a state determination means that determines and outputs the state of the road based on the result of the summation in the output summing means.
- The state determination device according to claim 1 or 2, wherein the plurality of determination models are each trained using a plurality of teacher data including road images in which at least one of the state of the moving body, the state of the road, and the external environment differs from each other.
- The state determination device according to any one of claims 1 to 3, further comprising a weight determining means that determines, in response to the input road image, weights used to sum the outputs of the plurality of determination models, wherein the output summing means sums the outputs of the plurality of determination models using the determined weights.
- The state determination device according to claim 4, wherein the weight determining means learns parameters for determining the weights using teacher data for weight learning.
- The state determination device according to claim 5, wherein the weight determining means further learns the parameters for determining the weights in response to the external environment.
- A state determination system comprising: the state determination device according to any one of claims 1 to 5; a photographing device that acquires an image of the road to be determined and outputs it to the state determination device, and/or a storage device that stores the image of the road to be determined; and a display device that displays the state of the road output by the state determination device.
- A state determination system comprising: the state determination device according to claim 6; a photographing device that captures an image of the road to be determined and outputs it to the state determination device, and/or a storage device that stores the image of the road to be determined; an information providing device that outputs the external environment to the state determination device; and a display device that displays the state of the road output by the state determination device.
- A state determination method in which a state determination device including a plurality of determination models, each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment, sums the outputs from the plurality of determination models for an input road image.
- The state determination method according to claim 9, which determines the state of the road based on the result of the summation.
- The state determination method according to claim 9 or 10, which trains the plurality of determination models using a plurality of teacher data including road images in which at least one of the state of the moving body, the state of the road, and the external environment differs from each other.
- The state determination method according to any one of claims 9 to 11, which determines, in response to the input road image, weights used to sum the outputs of the plurality of determination models, and sums the outputs of the plurality of determination models using the determined weights.
- The state determination method according to claim 12, which learns parameters for determining the weights using the teacher data for weight learning.
- The state determination method according to claim 13, which further learns the parameters for determining the weights in response to the external environment.
- A state determination method in which a state determination device executes the state determination method according to any one of claims 9 to 13, a photographing device acquires the image to be determined and outputs it to the state determination device and/or a storage device stores the image to be determined, and a display device displays the state of the road output by the state determination device.
- A state determination method in which a state determination device executes the state determination method according to claim 14, a photographing device acquires an image of the road to be determined and outputs it to the state determination device and/or a storage device stores the image to be determined, an information providing device outputs the external environment to the state determination device, and a display device displays the state of the road output by the state determination device.
- A recording medium recording a program that causes a computer, including a plurality of determination models each trained using teacher data that differ in at least one of the state of a moving body equipped with a photographing device that acquires road images, the state of the road, and the external environment, to execute a process of summing the outputs from the plurality of determination models for an input road image.
- The recording medium according to claim 17, recording a program that causes a computer to execute a process of determining the state of the road based on the result of the summation.
- The recording medium according to claim 17 or 18, recording a program that causes a computer to execute a process of training the plurality of determination models using a plurality of teacher data including road images in which at least one of the state of the moving body, the state of the road, and the external environment differs from each other.
- The recording medium according to any one of claims 17 to 19, recording a program that causes a computer to execute a process of determining, in response to the input road image, weights used to sum the outputs of the plurality of determination models, and a process of summing the outputs of the plurality of determination models using the determined weights.
- The recording medium according to claim 20, recording a program that causes a computer to execute a process of learning parameters for determining the weights using the teacher data for weight learning.
- The recording medium according to claim 21, recording a program that further causes a computer to execute a process of learning the parameters for determining the weights in response to the external environment.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/043471 WO2022107330A1 (ja) | 2020-11-20 | 2020-11-20 | 状態判定装置、状態判定システム、状態判定方法、及び、記録媒体 |
JP2022563539A JPWO2022107330A5 (ja) | 2020-11-20 | 状態判定装置、状態判定システム、状態判定方法、及び、プログラム | |
US18/036,354 US20230419687A1 (en) | 2020-11-20 | 2020-11-20 | State determination device, state determination method and recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/043471 WO2022107330A1 (ja) | 2020-11-20 | 2020-11-20 | 状態判定装置、状態判定システム、状態判定方法、及び、記録媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022107330A1 true WO2022107330A1 (ja) | 2022-05-27 |
Family
ID=81708704
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/043471 WO2022107330A1 (ja) | 2020-11-20 | 2020-11-20 | 状態判定装置、状態判定システム、状態判定方法、及び、記録媒体 |
Country Status (2)
Country | Link |
---|---|
US (1) | US20230419687A1 (ja) |
WO (1) | WO2022107330A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04328669A (ja) * | 1991-04-26 | 1992-11-17 | Toyo Electric Mfg Co Ltd | 統合ニューラルネットワーク及びその学習方式 |
JP2007136430A (ja) * | 2005-11-22 | 2007-06-07 | Kochi Univ Of Technology | いりこ等の選別方法とその装置 |
JP2009282783A (ja) * | 2008-05-22 | 2009-12-03 | Fuji Heavy Ind Ltd | リスク融合認識システム |
JP2018147261A (ja) * | 2017-03-06 | 2018-09-20 | Kddi株式会社 | モデル統合装置、モデル統合システム、方法およびプログラム |
JP2020013537A (ja) * | 2018-04-25 | 2020-01-23 | トヨタ自動車株式会社 | 路面状態推定装置及び路面状態推定方法 |
JP2020147961A (ja) * | 2019-03-12 | 2020-09-17 | 東芝インフラシステムズ株式会社 | 道路維持管理システム、舗装種別判定装置、舗装劣化判定装置、修繕優先度判定装置、道路維持管理方法、舗装種別判定方法、舗装劣化判定方法、修繕優先度判定方法 |
-
2020
- 2020-11-20 WO PCT/JP2020/043471 patent/WO2022107330A1/ja active Application Filing
- 2020-11-20 US US18/036,354 patent/US20230419687A1/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04328669A (ja) * | 1991-04-26 | 1992-11-17 | Toyo Electric Mfg Co Ltd | 統合ニューラルネットワーク及びその学習方式 |
JP2007136430A (ja) * | 2005-11-22 | 2007-06-07 | Kochi Univ Of Technology | いりこ等の選別方法とその装置 |
JP2009282783A (ja) * | 2008-05-22 | 2009-12-03 | Fuji Heavy Ind Ltd | リスク融合認識システム |
JP2018147261A (ja) * | 2017-03-06 | 2018-09-20 | Kddi株式会社 | モデル統合装置、モデル統合システム、方法およびプログラム |
JP2020013537A (ja) * | 2018-04-25 | 2020-01-23 | トヨタ自動車株式会社 | 路面状態推定装置及び路面状態推定方法 |
JP2020147961A (ja) * | 2019-03-12 | 2020-09-17 | 東芝インフラシステムズ株式会社 | 道路維持管理システム、舗装種別判定装置、舗装劣化判定装置、修繕優先度判定装置、道路維持管理方法、舗装種別判定方法、舗装劣化判定方法、修繕優先度判定方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022107330A1 (ja) | 2022-05-27 |
US20230419687A1 (en) | 2023-12-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10453256B2 (en) | Lane boundary detection data generation in virtual environment | |
US11263726B2 (en) | Method, apparatus, and system for task driven approaches to super resolution | |
EP3706072B1 (en) | Method, apparatus, and system for detecting degraded ground paint in an image | |
CN110155053A (zh) | 提供用于驾驶车辆的信息的方法和设备 | |
US9823085B2 (en) | System for calculating routes | |
CN115686005A (zh) | 训练自动驾驶系统的虚拟模型的系统及计算机执行方法 | |
CN109644144A (zh) | 无线网络优化 | |
US20120271864A1 (en) | Method for assisted road extrapolation from imagery | |
JP2019196680A (ja) | 舗装情報収集点検システム、舗装情報収集点検方法、及びプログラム | |
US11182607B2 (en) | Method, apparatus, and system for determining a ground control point from image data using machine learning | |
CN108831183A (zh) | 基于机器视觉的停车场管理系统 | |
JP5500388B2 (ja) | 撮影位置特定システム、撮影位置特定プログラム、及び撮影位置特定方法 | |
WO2021186989A1 (ja) | 劣化診断装置、劣化診断システム、劣化診断方法、及び、記録媒体 | |
US12202472B2 (en) | Apparatus and methods for predicting a state of visibility for a road object based on a light source associated with the road object | |
Neuhold et al. | Predicting and optimizing traffic flow at toll plazas | |
WO2019239477A1 (ja) | 地図生成装置および地図生成システム | |
US20230360407A1 (en) | Method, apparatus, and computer program product for map data generation from probe data imagery | |
KR102227649B1 (ko) | 자율주행 기능 검증을 위한 데이터베이스를 이용한 자율주행 검증장치 및 방법 | |
CN106504210A (zh) | 一种modis影像数据缺失修复方法 | |
WO2022107330A1 (ja) | 状態判定装置、状態判定システム、状態判定方法、及び、記録媒体 | |
US10489923B2 (en) | Estimating conditions from observations of one instrument based on training from observations of another instrument | |
Burghardt et al. | Contrast ratio of road markings in Poland-evaluation for machine vision applications based on naturalistic driving study | |
JP7468711B2 (ja) | 教師データ生成装置、教師データ生成方法、及び、プログラム | |
CN114252085A (zh) | 一种导航处理方法、装置以及设备 | |
US20230358564A1 (en) | Method, apparatus, and computer program product for probe data-based geometry generation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20962492 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022563539 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18036354 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20962492 Country of ref document: EP Kind code of ref document: A1 |