CN111932543B - Method, electronic device and storage medium for determining pixilated car damage data - Google Patents
Method, electronic device and storage medium for determining pixilated car damage data
- Publication number
- CN111932543B CN111932543B CN202011106235.0A CN202011106235A CN111932543B CN 111932543 B CN111932543 B CN 111932543B CN 202011106235 A CN202011106235 A CN 202011106235A CN 111932543 B CN111932543 B CN 111932543B
- Authority
- CN
- China
- Prior art keywords
- vehicle
- model
- determining
- vehicles
- injury
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 230000006378 damage Effects 0.000 title claims abstract description 224
- 238000000034 method Methods 0.000 title claims abstract description 32
- 208000027418 Wounds and injury Diseases 0.000 claims abstract description 126
- 208000014674 injury Diseases 0.000 claims abstract description 98
- 230000002776 aggregation Effects 0.000 claims abstract description 7
- 238000004220 aggregation Methods 0.000 claims abstract description 7
- 230000004931 aggregating effect Effects 0.000 claims abstract description 3
- 238000012545 processing Methods 0.000 claims description 15
- 238000012423 maintenance Methods 0.000 claims description 9
- 238000004590 computer program Methods 0.000 claims description 7
- 230000004044 response Effects 0.000 claims 1
- 238000010586 diagram Methods 0.000 description 11
- 238000004891 communication Methods 0.000 description 8
- 238000007405 data analysis Methods 0.000 description 8
- 238000007689 inspection Methods 0.000 description 7
- 230000006399 behavior Effects 0.000 description 5
- 230000006870 function Effects 0.000 description 5
- 238000013461 design Methods 0.000 description 4
- 230000003287 optical effect Effects 0.000 description 2
- 230000008439 repair process Effects 0.000 description 2
- 238000012163 sequencing technique Methods 0.000 description 2
- 230000007547 defect Effects 0.000 description 1
- 238000003708 edge detection Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 239000000835 fiber Substances 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 239000003973 paint Substances 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 230000009528 severe injury Effects 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
Abstract
The present disclosure provides a method, electronic device, and computer-readable storage medium for determining pixelated vehicle damage data. The method comprises the following steps: acquiring pictures of a plurality of surfaces of a vehicle captured with an area-array camera; determining a two-dimensional vehicle model; determining a vehicle damage image of the vehicle based on the pictures of the plurality of surfaces of the vehicle and the two-dimensional vehicle model; determining a pixelated vehicle damage image of the vehicle based on the vehicle damage image of the vehicle and the minimum pixel block of the two-dimensional vehicle model; determining a vehicle damage location of the vehicle based on the pixelated vehicle damage image of the vehicle to generate pixelated vehicle damage data for the vehicle; and acquiring pixelated vehicle damage data of a plurality of vehicles, aggregating the pixelated vehicle damage data of the plurality of vehicles with vehicle model as the aggregation condition, and determining a vehicle damage factor per unit time for each model of vehicle. With this scheme, the damage condition of a vehicle can be accurately determined, and an accurate vehicle damage model can in turn be derived.
Description
Technical Field
The present disclosure relates generally to the field of vehicle management technology, and more particularly, to a method, electronic device, and computer-readable storage medium for determining pixelated vehicle damage data.
Background
In the vehicle rental industry, when a customer returns a vehicle, rental company staff need to check the vehicle for damage to determine whether the customer should be charged for repairs. In the used-car trading industry, when a used-car trading company purchases a used car from a customer, the damage condition of the customer's vehicle must likewise be checked so that the vehicle can be appraised accurately.
Currently, one way to check a vehicle for damage is to use a paper vehicle inspection checklist. A staff member inspects the vehicle with the naked eye, judges the position and size of each instance of damage, and marks the damaged parts on a paper vehicle diagram. In this case, the inspection results for the same vehicle may differ greatly owing to differences in the subjective judgment of different inspectors. In addition, paper inspection checklists typically must be duplicated with carbon paper so that copies can be provided to the customer and to company personnel. On the one hand, the copy quality of a paper checklist cannot be guaranteed; on the other hand, paper checklists are easily lost, so the damage record cannot be traced.
Another way to check a vehicle for damage is to use a photo-based checklist. Front-end staff photograph the surfaces of the vehicle, and back-end staff or a computer analyze the photos to judge the position and size of the damage. However, because a photo is the product of the exposure of the camera's optical system, the light intensity, the camera's pixel count and resolution, the imaging algorithm, and even the color of the car body all affect photo quality, making a photo-based checklist unreliable. For example, the lighter the body color, the more easily overexposure occurs, so that some surface dents cannot be recognized from the photograph.
Disclosure of Invention
In view of at least one of the above problems, the present disclosure provides a scheme for determining pixelated vehicle damage data that can accurately determine the damage condition of a vehicle.
According to one aspect of the present disclosure, a method of determining pixelated vehicle damage data is provided. The method comprises the following steps: acquiring pictures of a plurality of surfaces of a vehicle captured with an area-array camera; establishing a two-dimensional vehicle model based on a three-dimensional vehicle model; determining a vehicle damage image of the vehicle based on the pictures of the plurality of surfaces of the vehicle and the two-dimensional vehicle model; determining a pixelated vehicle damage image of the vehicle based on the vehicle damage image of the vehicle and the minimum pixel block of the two-dimensional vehicle model; determining a vehicle damage location of the vehicle based on the pixelated vehicle damage image of the vehicle to generate pixelated vehicle damage data for the vehicle; acquiring pixelated vehicle damage data of a plurality of vehicles, wherein the pixelated vehicle damage data of each vehicle comprises the model of the vehicle, its travel time within a given time period, and its vehicle damage pixel count, the damage pixel count indicating the number of minimum pixel blocks contained in the vehicle's damage locations; aggregating the pixelated vehicle damage data of the plurality of vehicles with vehicle model as the aggregation condition, and determining, for each model, the sum of the damage pixel counts and the sum of the travel times within the given time period; and determining a vehicle damage factor per unit time for each model of vehicle based on the sum of the damage pixel counts for that model and the sum of the travel times within the given time period for that model.
According to another aspect of the present disclosure, an electronic device is provided. The electronic device includes: at least one processing unit; and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit, the instructions, when executed by the at least one processing unit, causing the electronic device to perform the steps of the above-described method.
According to yet another aspect of the present disclosure, a computer-readable storage medium is provided, having stored thereon computer program code, which, when executed, performs the method as described above.
Drawings
The present disclosure will be better understood and other objects, details, features and advantages thereof will become more apparent from the following description of specific embodiments of the disclosure given with reference to the accompanying drawings.
Fig. 1 shows a schematic diagram of a system for determining pixelated vehicular injury data, in accordance with an embodiment of the present disclosure.
Fig. 2 illustrates a flow diagram of a method for determining pixelated vehicular injury data, in accordance with an embodiment of the present disclosure.
Fig. 3 shows an exemplary schematic of a plan view of a vehicle according to an embodiment of the disclosure.
FIG. 4 shows a schematic diagram of a two-dimensional vehicle model in a planar rectangular coordinate system according to an embodiment of the disclosure.
Fig. 5 shows a schematic view of a vehicle damage image according to an embodiment of the present disclosure.
Fig. 6 shows a schematic diagram of a pixelated vehicle damage image, according to an embodiment of the disclosure.
FIG. 7 shows a flowchart of one embodiment of the step of obtaining a vehicle damage model according to the present disclosure.
FIG. 8 shows a flowchart of another embodiment of the step of obtaining a vehicle damage model according to the present disclosure.
FIG. 9 shows a flowchart of yet another embodiment of the step of obtaining a vehicle damage model according to the present disclosure.
FIG. 10 illustrates a block diagram of an electronic device suitable for implementing embodiments of the present disclosure.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The term "include" and variations thereof as used herein is meant to be inclusive in an open-ended manner, i.e., "including but not limited to". Unless specifically stated otherwise, the term "or" means "and/or". The term "based on" means "based at least in part on". The terms "one embodiment" and "some embodiments" mean "at least one example embodiment". The term "another embodiment" means "at least one additional embodiment". The terms "first," "second," and the like may refer to different or the same object.
Fig. 1 shows a schematic diagram of a system 1 for determining pixelated vehicle damage data, according to an embodiment of the present disclosure. As shown in fig. 1, the system 1 includes a plurality of clients 10 and a server 20 in communication with the plurality of clients 10. Here, each client 10 may be a mobile or fixed terminal, such as a tablet computer, notebook computer, or desktop computer, deployed at a store or vehicle inspection point of an enterprise such as a car rental company or a used-car trading company. The client 10 may communicate with the enterprise's server 20 to send information to and/or receive information from it. The client 10 and the server 20 can communicate through various means, such as an optical fiber network or the mobile Internet. The client 10 and the server 20 may each include at least one processing unit and at least one memory coupled to the at least one processing unit, the memory storing instructions executable by the at least one processing unit that, when executed, perform at least a portion of the method 100 described below. The specific structure of the client 10 and the server 20 is described below in conjunction with fig. 10.
Fig. 2 shows a flow diagram of a method 100 for determining pixelated vehicle damage data, in accordance with an embodiment of the present disclosure. The method 100 may be performed, for example, by the client 10 or the server 20 in the system 1 shown in fig. 1. Where the method 100 is performed by the client 10, the client 10 may obtain pictures of multiple surfaces of a vehicle to be inspected from the area-array cameras of a store or vehicle inspection point, determine pixelated vehicle damage data for the vehicle based on the pictures, and optionally obtain various vehicle damage models from the pixelated vehicle damage data of multiple vehicles. Where the method 100 is performed by the server 20, the server 20 may receive pictures of multiple surfaces of a vehicle to be inspected, captured with area-array cameras, from the client 10 of a store or vehicle inspection point, determine pixelated vehicle damage data for the vehicle based on the pictures, and further optionally obtain various vehicle damage models from the pixelated vehicle damage data of multiple vehicles. The method 100 is described in more detail below with reference to figs. 1 to 10, taking the server 20 as an example.
As shown in fig. 2, at step 110 the server 20 acquires pictures of a plurality of surfaces of the vehicle captured using area-array cameras. An area-array camera captures images in units of whole planes: using photosensitive elements such as CCDs (Charge Coupled Devices) arranged in a matrix, it can acquire an image of an entire surface at once. In the present disclosure, a plurality of area-array cameras may be installed at a store or vehicle inspection point to capture fixed-position images of the plurality of surfaces of the vehicle. In one embodiment, area-array cameras may be installed directly in front of, directly behind, to the left of, to the right of, and above the vehicle to capture pictures of the front surface, rear surface, left surface, right surface, and top surface of the vehicle, respectively. Considering that the front and rear of a vehicle are generally not flat but chamfered surfaces, and that these are surfaces where damage often occurs, in another embodiment area-array cameras may additionally be installed at the front left, front right, rear left, and rear right of the vehicle to capture pictures of the front-left, front-right, rear-left, and rear-right surfaces, respectively. In this case, when the vehicle enters the designated inspection position, the area-array cameras can photograph the respective surfaces of the vehicle simultaneously to obtain a picture of each surface.
At step 120, the server 20 determines a two-dimensional vehicle model based on a three-dimensional vehicle model. Specifically, the exterior surfaces of a three-dimensional (3D) vehicle model may first be unfolded and tiled to generate a plan view 200 of the 3D vehicle model. Fig. 3 illustrates an exemplary schematic of a plan view 200 of a vehicle according to an embodiment of the disclosure. As shown in fig. 3, the front surface 201, rear surface 202, left surface 203, right surface 204, and top surface 205 of the 3D vehicle model may be unfolded and tiled into the plan view 200. Note that, depending on the arrangement of the area-array cameras, the front surface 201 here may correspond to the picture of the front surface of the vehicle acquired in step 110 above, or to a stitched picture of the front, front-left, and front-right surfaces. Similarly, the rear surface 202 may correspond to the picture of the rear surface acquired in step 110, or to a stitched picture of the rear, rear-left, and rear-right surfaces.
In addition, the 3D vehicle model may be a unified, generic vehicle model, or a dedicated model for each specific vehicle type. Alternatively, in some embodiments, different 3D vehicle models may be used for different vehicle sizes, such as small cars, multi-purpose vehicles (MPVs), or sport utility vehicles (SUVs).
Next, a plane rectangular coordinate system is established for the plan view 200 of the 3D vehicle model, and the coordinate system is gridded in units of one minimum pixel block to generate the two-dimensional vehicle model 210. FIG. 4 shows a schematic diagram of a two-dimensional vehicle model 210 in a plane rectangular coordinate system XOY, in accordance with an embodiment of the present disclosure. As shown in fig. 4, the coordinate system XOY is established based on the plan view 200 of the 3D vehicle model, where the first dimension OX runs, for example, along the body-length direction of the plan view 200, and the second dimension OY runs, for example, along the body-width direction. The minimum pixel block 211 may be determined based on the minimum area that should be marked as damage. For example, in some cases there may be spot-like damage on the car body whose area is small enough that no additional maintenance is required; the minimum pixel block 211 can therefore be determined from the smallest defect area that warrants repair. As shown in fig. 4, after the plane rectangular coordinate system XOY containing the plan view 200 is gridded in units of the minimum pixel block 211, each point on the plan view 200 lies in exactly one grid cell (i.e., one minimum pixel block 211), so that each point on the body surface can be uniquely located.
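To make the gridding concrete, the following minimal Python sketch (not part of the patent; the 10 mm block size, the function name, and the coordinates are illustrative assumptions) maps a point on the tiled plan view to the unique minimum pixel block that contains it:

```python
# A minimal sketch of the gridding described above (illustrative only):
# the plan view is assumed to use millimeter coordinates and a
# hypothetical 10 mm minimum pixel block.
from typing import Tuple

BLOCK_SIZE_MM = 10.0  # assumed size of the minimum pixel block 211

def block_index(x_mm: float, y_mm: float,
                block_size: float = BLOCK_SIZE_MM) -> Tuple[int, int]:
    """Return the grid cell (column, row) that uniquely contains (x, y)."""
    return int(x_mm // block_size), int(y_mm // block_size)

print(block_index(105.0, 23.0))  # -> (10, 2): exactly one cell per point
```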
Here, the two-dimensional vehicle model 210 may be generated and stored in advance by the client 10 or the server 20 for retrieval each time a damage inspection is performed. Alternatively, the two-dimensional vehicle model 210 may be generated on the fly by the client 10 or the server 20 at each damage inspection. Further, depending on the 3D vehicle model used, the two-dimensional vehicle model 210 may be globally uniform, or a dedicated model per vehicle type or vehicle size.
Next, at step 130, a vehicle damage image 220 of the vehicle is determined based on the pictures of the plurality of surfaces of the vehicle acquired at step 110 and the two-dimensional vehicle model 210 determined at step 120. Fig. 5 shows a schematic view of a vehicle damage image 220 according to an embodiment of the present disclosure. Specifically, the pictures of the surfaces of the vehicle obtained in step 110 may be processed to extract the damage 221 in them, and the damage 221 mapped onto the two-dimensional vehicle model 210 to generate the vehicle damage image 220. Here, the damage 221 may be determined by performing edge detection on the pictures of the plurality of surfaces acquired in step 110, or by comparing those pictures with standard pictures of the corresponding surfaces (for example, factory pictures of the surfaces of vehicles of the same type provided by the manufacturer).
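Either extraction route can be illustrated with a short, hedged sketch. The following assumes OpenCV is installed; the file names and thresholds are invented for the example and are not prescribed by the patent:

```python
# Minimal sketch of damage extraction, assuming OpenCV (cv2) is available.
# File names and threshold values are illustrative assumptions.
import cv2

surface = cv2.imread("left_surface.png", cv2.IMREAD_GRAYSCALE)
reference = cv2.imread("left_surface_reference.png", cv2.IMREAD_GRAYSCALE)

# Option 1: edge detection directly on the captured picture.
edges = cv2.Canny(surface, 50, 150)

# Option 2: difference against a standard (factory) picture of the surface.
diff = cv2.absdiff(surface, reference)
_, damage_mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
# Non-zero regions of damage_mask are candidate damage 221, which can then
# be mapped onto the two-dimensional vehicle model 210.
```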
At step 140, a pixelated vehicle damage image 230 of the vehicle may be determined based on the vehicle damage image 220 of the vehicle and the minimum pixel block 211 of the two-dimensional vehicle model 210. Fig. 6 shows a schematic diagram of a pixelated vehicle damage image 230, according to an embodiment of the disclosure. As shown in fig. 6, the damage 221 on the vehicle damage image 220 shown in fig. 5 may be pixelated based on the minimum pixel block 211 to obtain a pixelated damage region 231. After all of the damage on the vehicle damage image 220 has been pixelated, the resulting image is referred to as the pixelated vehicle damage image 230.
Next, at step 150, the vehicle damage locations of the vehicle may be determined based on the pixelated vehicle damage image 230 of the vehicle to generate pixelated vehicle damage data for the vehicle. For example, as shown in fig. 6, the damage location of the pixelated damage 231 may be represented by the coordinates of its four vertices, R1(10,2), R2(13,2), R3(13,3), R4(10,3); its area is 3 minimum pixel blocks 211. Alternatively, the damage location of the pixelated damage 231 may be represented by the coordinates or numbers of the pixel points it occupies; for example, in units of the minimum pixel block 211, the damage location of the pixelated damage 231 may be represented as the pixel points with coordinates S1(11,3), S2(12,3), S3(13,3), or with numbers 51, 52, 53, i.e., occupying 3 pixel points. The location of the pixelated damage 231 may also be referred to hereinafter as the damage location 231. The pixelated vehicle damage data of a vehicle comprises at least the damage locations of all pixelated damage on the vehicle, e.g., the vertex coordinate set of each damage location, or the coordinates or numbers of its pixel points. Of course, the pixelated vehicle damage data may also include other information, as described below.
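The two representations above (vertex set vs. occupied pixel points) are interchangeable for axis-aligned rectangles. A minimal sketch follows, assuming a half-open cell-indexing convention; the patent's own example numbers its occupied points S1(11,3) through S3(13,3), so the exact convention used here is an assumption:

```python
# Minimal sketch: enumerate the minimum pixel blocks covered by an
# axis-aligned rectangular damage region given by its vertex coordinates.
# Coordinates are in grid units, matching R1(10,2)..R4(10,3) in the text.
from typing import List, Tuple

def blocks_in_rect(x0: int, y0: int, x1: int, y1: int) -> List[Tuple[int, int]]:
    """All grid cells inside the rectangle [x0, x1) x [y0, y1)."""
    return [(x, y) for x in range(x0, x1) for y in range(y0, y1)]

# Damage 231 with vertices R1(10,2), R2(13,2), R3(13,3), R4(10,3):
cells = blocks_in_rect(10, 2, 13, 3)
print(cells)       # [(10, 2), (11, 2), (12, 2)] -> 3 minimum pixel blocks
print(len(cells))  # damage pixel count = 3
```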
Thus, pixelated vehicle damage data is obtained, representing the damage information in the form of, for example, pixel coordinates; data in this form is easily stored digitally for subsequent use and traceability. Because the three-dimensional vehicle model is converted into a two-dimensional model and the pixelated damage image of the vehicle is determined on that two-dimensional model, the resulting pixelated damage data need only be expressed in two-dimensional coordinates, which greatly reduces the data volume and the computational complexity of subsequent processing compared with a three-dimensional coordinate representation.
In some embodiments, step 150 may be further followed by: generating an electronic checklist based on the pixelated vehicle damage data of the vehicle, the electronic checklist reproducing the vehicle's damage locations 231 in a two-dimensional coordinate system. That is, after the pixelated vehicle damage data is determined and stored, an electronic checklist may be generated from it for viewing by back-end workers. The electronic checklist may contain only the coordinates of each pixelated damage location, or may include both the plan view 200 of the vehicle and those coordinates. In the latter case, the electronic checklist is similar in form to the pixelated vehicle damage image 230 shown in fig. 6.
In other embodiments, step 150 may be followed by a step 170 (not shown in fig. 2), in which the pixelated vehicle damage data of a plurality of vehicles is statistically analyzed to obtain various vehicle damage models, such as an overall damage model, a damage model classified by vehicle type, a damage location model, a damage severity model, a damage model based on driving behavior, and so on, as described below. Here, in order to make the derived damage model as accurate as possible, the big data analysis should use the pixelated damage data of a large number of vehicles (e.g., at least 1000). Several embodiments of step 170 of obtaining a vehicle damage model according to the present disclosure are described below in conjunction with figs. 7-9. However, it will be understood by those skilled in the art that the embodiments described in figs. 7-9 are not mutually exclusive and may be implemented in combination. For example, pixelated damage data of a plurality of vehicles may be acquired, and a damage model classified by vehicle type as illustrated in fig. 7, an overall damage model as illustrated in fig. 8, a damage location model as illustrated in fig. 9, and the like may be determined in turn from that data.
FIG. 7 shows a flowchart of one embodiment of the step 170 of obtaining a vehicle damage model according to the present disclosure. In the embodiment shown in fig. 7, the damage condition of the vehicles (e.g., the damage factor per unit time) is subjected to big data analysis by vehicle type to obtain a damage model classified by vehicle type.
As shown in fig. 7, step 170 may include a sub-step 172, in which pixelated vehicle damage data of a plurality of vehicles may be acquired. The pixelated damage data of each vehicle may include a variety of information, such as the model of the vehicle, the coordinates of the vehicle's pixelated damage locations, the vehicle's travel time within a given time period (e.g., 1 month or 1 year), the vehicle's damage pixel count, the pixel points of the damage locations, and the like, which may be stored in a database in the form of a list. Here, the damage pixel count of a vehicle indicates the number of minimum pixel blocks contained in the vehicle's damage locations; for example, as shown in fig. 6, the damage location 231 contains 3 minimum pixel blocks, so the vehicle's damage pixel count is 3 (assuming the vehicle has no other damage). Note that the damage pixel count may be a separate entry in the vehicle's pixelated damage data, or may be derived from the coordinates of the pixelated damage locations in that data. In the latter case, taking the damage location 231 shown in fig. 6 as an example, the vertex coordinates R1(10,2), R2(13,2), R3(13,3), and R4(10,3) give a length of 3 and a width of 1, so it contains 3 minimum pixel blocks and the damage pixel count is 3. Table 1 below shows an exemplary list of the pixelated damage data of a plurality of vehicles acquired in sub-step 172. Note that table 1 is only one example; according to the present invention, the pixelated damage data may also include other information, such as the pixel points of the damage locations, as described below.
TABLE 1

Vehicle | Vehicle model | Travel time in given period | Damage location coordinates | Damage pixel count
---|---|---|---|---
V_1 | C1 | T_1 | E_11, E_12, E_13 | n_1
V_2 | C2 | T_2 | E_21 | n_2
… | … | … | … | …
V_N | C1 | T_N | E_N1, … | n_N
Here, N is the total number of vehicles acquired and, for the purpose of big data analysis, is a large integer, e.g., greater than 1000. For vehicle V_1, the vehicle model is C1 and the travel time within the given time period is T_1; it has three damage locations with coordinates E_11, E_12, E_13, and its damage pixel count is n_1 (i.e., the sum of the damage pixel counts of the three damage locations). For vehicle V_2, the vehicle model is C2 and the travel time within the given time period is T_2; it has one damage location with coordinates E_21, and its damage pixel count is n_2; and so on. Here, each damage location coordinate E is a set of vertex coordinates of the form R1(10,2), R2(13,2), R3(13,3), R4(10,3) for the damage location 231 shown in fig. 6. Furthermore, table 1 illustrates the acquired vehicles V_1, V_2, …, V_N with three example vehicle models C1, C2, and C3; this is merely exemplary and is not intended to limit the scope of the present disclosure.
In the embodiment shown in fig. 7, the acquired pixelated vehicle damage data of each vehicle includes at least the model of the vehicle, the vehicle's travel time within a given time period (e.g., 1 month or 1 year), and the vehicle's damage pixel count.
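One possible in-memory form of such a per-vehicle record is sketched below; all field names and values are illustrative assumptions rather than anything prescribed by the patent:

```python
# A minimal sketch of a per-vehicle pixelated damage record (names assumed).
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DamageRecord:
    vehicle_id: str                  # e.g. "V1"
    model: str                       # e.g. "C1"
    travel_hours: float              # travel time within the given period
    damage_pixels: int               # number of minimum pixel blocks damaged
    damage_vertices: List[Tuple[int, int]] = field(default_factory=list)

records = [
    DamageRecord("V1", "C1", 320.0, 3, [(10, 2), (13, 2), (13, 3), (10, 3)]),
    DamageRecord("V2", "C2", 410.0, 1),
]
```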
In sub-step 174, the pixelated vehicle damage data of the plurality of vehicles acquired in sub-step 172 may be aggregated with vehicle model as the aggregation condition to determine, for each model, the sum of the damage pixel counts and the sum of the travel times within the given time period. For example, for the pixelated damage data shown in table 1, the data may be aggregated with vehicle model C1 as the aggregation condition to determine the sum of the damage pixel counts for model C1, n_C1 = n_1 + n_5 + … + n_N, and the sum of the travel times, T_C1 = T_1 + T_5 + … + T_N. Similarly, the sum of the damage pixel counts n_C2 and the sum of the travel times T_C2 for model C2, and the sum of the damage pixel counts n_C3 and the sum of the travel times T_C3 for model C3, may also be determined.
Next, in sub-step 176, the vehicle damage factor per unit time for each model may be determined based on the sum of the damage pixel counts for that model and the sum of the travel times within the given time period for that model. For example, the damage factor per unit time for model C1 is R_C1 = n_C1 / T_C1, for model C2 it is R_C2 = n_C2 / T_C2, and for model C3 it is R_C3 = n_C3 / T_C3.
In this way, the damage factor per unit time can be computed for each vehicle model to characterize the damage condition of that model.
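Sub-steps 174 and 176 amount to a group-by over the records. A minimal sketch, reusing the assumed DamageRecord records from the earlier sketch:

```python
# Minimal sketch of the per-model aggregation (sub-step 174) and the
# damage factor per unit time (sub-step 176), using DamageRecord above.
from collections import defaultdict

def damage_factor_by_model(records):
    pixels = defaultdict(int)    # n_C: sum of damage pixel counts per model
    hours = defaultdict(float)   # T_C: sum of travel times per model
    for r in records:
        pixels[r.model] += r.damage_pixels
        hours[r.model] += r.travel_hours
    # R_C = n_C / T_C for each model C
    return {m: pixels[m] / hours[m] for m in pixels if hours[m] > 0}

print(damage_factor_by_model(records))  # e.g. {'C1': 0.0094, 'C2': 0.0024}
```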
In one embodiment, sub-step 176 may further include: determining the average travel time t of the plurality of vehicles within the given time period based on the pixelated damage data shown in table 1, where t = (T_1 + T_2 + … + T_N) / N; determining the average travel time of each model of vehicle within the given time period, i.e., t_C1 = T_C1 / M_1, t_C2 = T_C2 / M_2, t_C3 = T_C3 / M_3, where t_C1, t_C2, t_C3 are the average travel times within the given time period of the vehicles of models C1, C2, C3 among these vehicles, and M_1, M_2, M_3 are the numbers of vehicles of models C1, C2, C3, respectively; and determining whether the ratio between the average travel time of each model and the overall average travel time t is greater than a predetermined value. Assuming the predetermined value is 0.5, it can be determined separately whether the ratios t_C1/t, t_C2/t, t_C3/t are greater than 0.5. Only when the ratio is determined to be greater than the predetermined value is the damage factor per unit time of that model determined, based on the sum of the damage pixel counts of that model and the sum of the travel times within the given time period of that model. For example, only when the ratio t_C1/t is greater than the predetermined value 0.5 is the damage factor per unit time of model C1 determined as R_C1 = n_C1 / T_C1, based on the sum n_C1 of the damage pixel counts of model C1 and the sum T_C1 of the travel times of model C1 within the given time period. That is, in such an embodiment, the damage factor per unit time is calculated only for the vehicle models whose ratio of average usage time to the overall average usage time t exceeds a certain value (e.g., 0.5). For these models, the vehicles can be considered to have been in service long enough that the acquired damage data is representative, avoiding statistical distortion caused by the individual differences of a few vehicles.
Further, in such an embodiment, the damage factors per unit time of the respective models (R_C1, R_C2, R_C3) may also be sorted to obtain a ranking of the damage conditions of the various vehicle models. In some cases, the overall damage factor per unit time R of the plurality of vehicles shown in table 1 may also be calculated, where R = n / T with n = n_1 + n_2 + … + n_N, and the vehicle models whose damage factor per unit time exceeds R may be determined. These models can be flagged as having worse-than-average damage, which may indicate a problem with the vehicle design; this finding can be fed back to the company's operations department or to the vehicle's manufacturer to guide subsequent operation or to improve the design.
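The travel-time filter and the ranking can be layered on the previous sketch. The 0.5 threshold comes from the text; the function name and record layout remain assumptions:

```python
# Minimal sketch: keep only models whose average travel time is more than
# half the overall average (threshold 0.5 from the text), then rank the
# survivors by damage factor per unit time. Builds on the sketch above.
from collections import defaultdict

def ranked_damage_factors(records, threshold=0.5):
    hours = defaultdict(float)   # T_C per model
    counts = defaultdict(int)    # M per model
    for r in records:
        hours[r.model] += r.travel_hours
        counts[r.model] += 1
    overall_avg = sum(hours.values()) / sum(counts.values())  # t
    factors = damage_factor_by_model(records)
    kept = {m: f for m, f in factors.items()
            if (hours[m] / counts[m]) / overall_avg > threshold}
    return sorted(kept.items(), key=lambda kv: kv[1], reverse=True)

print(ranked_damage_factors(records))
```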
FIG. 8 shows a flowchart of another embodiment of the step 170 of obtaining a vehicle damage model according to the present disclosure. In the embodiment shown in fig. 8, the damage condition of the vehicles (e.g., the damage factor per unit time) is subjected to big data analysis regardless of vehicle model to obtain an overall vehicle damage model.
As shown in fig. 8, step 170 may include a sub-step 172', in which pixelated vehicle damage data of a plurality of vehicles may be acquired. Since this embodiment does not consider the influence of vehicle model on the damage condition, the acquired pixelated damage data may be the same as in the embodiment shown in fig. 7, or may include only each vehicle's travel time within the given time period and its damage pixel count.
In sub-step 174', the sum of the damage pixel counts of the vehicles acquired in sub-step 172' may be determined as n = n_1 + n_2 + … + n_N, together with the sum of the travel times of these vehicles within the given time period, T = T_1 + T_2 + … + T_N.
In sub-step 176', the damage factor per unit time of these vehicles may be determined based on the sum n of the damage pixel counts and the sum T of the travel times within the given time period, i.e., R = n / T.
In this way, the damage factor per unit time across a plurality of vehicles may be computed to determine the average damage condition R for a given operating time. This figure can subsequently guide company operations, for example pricing the repair cost of damage to rental cars, or setting the influence of damage on the price of a used car in a used-car transaction.
FIG. 9 shows a flowchart of yet another embodiment of the step 170 of obtaining a vehicle damage model according to the present disclosure. In the embodiment shown in fig. 9, the damage locations are subjected to big data analysis to determine a damage location model of the vehicles, i.e., the locations where damage is most likely to occur, so as to guide subsequent operation and maintenance.
Specifically, as shown in fig. 9, step 170 may include a sub-step 172'', in which pixelated vehicle damage data of a plurality of vehicles may be acquired. Here, the pixelated damage data of each vehicle may or may not include the contents of the pixelated damage data described for figs. 7 and 8, and further includes the pixel points of the vehicle's damage locations, which indicate the pixel point of each minimum pixel block contained in the damage locations. For example, the damage location 231 shown in fig. 6 may be expressed as the pixel point coordinates S1(11,3), S2(12,3), S3(13,3) in units of the minimum pixel block 211, or as the numbers 51, 52, 53.
Next, in sub-step 174 ", a total number of times each pixel point location in the pixelated vehicular injury image 230 is labeled as a vehicular injury location may be determined based on the pixelated vehicular injury data for the vehicles. For example, in the pixelated car wound image 230 shown in fig. 6, the 3 pixel points occupied by the car wound location 231 are respectively labeled as primary car wound locations.
In sub-step 176 ", a corresponding operation and maintenance operation may be performed based on the total number of times each pixel point location in the pixelated lane-defect image 230 is marked as a lane-defect location. For example, the number of pixel points in the pixelated car wound image 230 that are labeled as car wound locations the greatest total number of times may be determined and labeled as car wound prone locations. Further, these locations may be provided to an operating system, to provide a user with a vehicle use prompt by the operating system, or to a maintenance system, to prepare in advance, by the maintenance system, spare parts needed to repair these vehicle injury prone locations.
In addition, in some embodiments, the pixelated damage data acquired in step 150 may also be used for big data analysis of damage severity to obtain a damage severity model. Specifically, the number of pixel points in each damage location can be counted, and the damage graded as minor, normal, severe, and so on according to the count. For example, a damage location occupying fewer than 3 pixel points may be classified as minor damage, one occupying between 3 and 5 pixel points as normal damage, and one occupying more than 5 pixel points as severe damage.
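As a sketch of this grading rule (thresholds taken from the text; the function name is an assumption):

```python
# Minimal sketch of the damage severity grading described above.
def damage_severity(pixel_count: int) -> str:
    if pixel_count < 3:
        return "minor"
    elif pixel_count <= 5:
        return "normal"
    else:
        return "severe"

print(damage_severity(2), damage_severity(4), damage_severity(9))
# minor normal severe
```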
In addition, in some embodiments, the pixelated damage data acquired in step 150 may also be used for big data analysis of users' driving behavior to obtain a damage model based on driving behavior. For example, the number of damage pixel points may be counted with user name as the aggregation condition, and users whose damage pixel count exceeds a certain value or ranks near the top (e.g., the top 100 or the top 10%) may be marked as users with poor driving behavior, who can then receive additional safety guidance or be subject to adjusted service policies. Similarly, the damage pixel counts of the corresponding users may be aggregated by the users' age group, driving experience, gender, vehicle type, and so on, to derive the driving-behavior characteristics of users in each group and to offer them personalized product recommendations or service policies in the future.
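A hedged sketch of the per-user aggregation; the rental records and the top-10% cutoff handling are assumptions for illustration:

```python
# Minimal sketch: aggregate damage pixel counts per user and flag the
# top 10% as poor driving behavior (record layout is assumed).
from collections import defaultdict

rentals = [  # (user, damage pixel count for that rental) - assumed layout
    ("alice", 2), ("bob", 7), ("alice", 1), ("carol", 0), ("bob", 4),
]

per_user = defaultdict(int)
for user, pixels in rentals:
    per_user[user] += pixels

ranked = sorted(per_user.items(), key=lambda kv: kv[1], reverse=True)
cutoff = max(1, len(ranked) // 10)          # top 10%, at least one user
flagged = [user for user, _ in ranked[:cutoff]]
print(flagged)  # e.g. ['bob']
```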
FIG. 10 illustrates a block diagram of an electronic device 1000 suitable for implementing embodiments of the present disclosure. The electronic device 1000 may be, for example, a client 10 or a server 20 as described above.
As shown in fig. 10, electronic device 1000 may include one or more Central Processing Units (CPUs) 1010 (only one shown schematically) that may perform various suitable actions and processes in accordance with computer program instructions stored in Read Only Memory (ROM) 1020 or loaded from storage unit 1080 into Random Access Memory (RAM) 1030. In the RAM 1030, various programs and data required for the operation of the electronic apparatus 1000 can also be stored. The CPU 1010, ROM 1020, and RAM 1030 are connected to each other via a bus 1040. An input/output (I/O) interface 1050 is also connected to bus 1040.
A number of components in the electronic device 1000 are connected to the I/O interface 1050, including: an input unit 1060 such as a keyboard, a mouse, or the like; an output unit 1070 such as various types of displays, speakers, and the like; a storage unit 1080, such as a magnetic disk, optical disk, or the like; and a communication unit 1090 such as a network card, modem, wireless communication transceiver, or the like. The communication unit 1090 allows the electronic device 1000 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The method 100 described above may be performed, for example, by the CPU 1010 of the electronic device 1000 (e.g., the client 10 or the server 20). For example, in some embodiments, method 100 may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as storage unit 1080. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 1000 via the ROM 1020 and/or the communication unit 1090. When the computer program is loaded into RAM 1030 and executed by CPU 1010, one or more operations of method 100 described above may be performed. Further, the communication unit 1090 may support wired or wireless communication functions.
Those skilled in the art will appreciate that the electronic device 1000 shown in fig. 10 is merely illustrative. In some embodiments, the client 10 or server 20 may contain more or fewer components than the electronic device 1000.
With the scheme of the present disclosure, pixelating a vehicle's damage and projecting the pixelated data onto a standard electronic plan view yields damage data that is more objective and accurate, and the data can be stored digitally for subsequent tracing or big data analysis, providing guidance for a company's operation and maintenance.
A method 100 for determining pixelated vehicle damage data and an electronic device 1000 operable as the client 10 and server 20 in accordance with the present disclosure have been described above with reference to the figures. However, it will be appreciated by those skilled in the art that the steps of the method 100 are not limited to the order shown in the figures and described above, and may be performed in any other reasonable order. For example, steps 110 and 120 of the method 100 shown in fig. 2 may be performed in parallel or in other orders. Further, the electronic device 1000 need not include all of the components shown in fig. 10; it may include only those components necessary to perform the functions described in the present disclosure, and the manner in which these components are connected is not limited to the form shown in the drawings.
The present disclosure may be methods, apparatus, systems, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for carrying out various aspects of the present disclosure.
In one or more exemplary designs, the functions described in this disclosure may be implemented in hardware, software, firmware, or any combination thereof. For example, if implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium.
The units of the apparatus disclosed herein may be implemented using discrete hardware components, or may be integrally implemented on a single hardware component, such as a processor. For example, the various illustrative logical blocks, modules, and circuits described in connection with the disclosure may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both.
The previous description of the disclosure is provided to enable any person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the spirit or scope of the disclosure. Thus, the disclosure is not intended to be limited to the examples and designs described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (7)
1. A method of determining pixelated vehicle damage data, comprising:
acquiring pictures of a plurality of surfaces of a vehicle captured with an area-array camera;
determining a two-dimensional vehicle model based on a three-dimensional vehicle model;
determining a vehicle damage image of the vehicle based on the pictures of the plurality of surfaces of the vehicle and the two-dimensional vehicle model;
determining a pixelated vehicle damage image of the vehicle based on the vehicle damage image of the vehicle and the minimum pixel block of the two-dimensional vehicle model;
determining a vehicle damage location of the vehicle based on the pixelated vehicle damage image of the vehicle to generate pixelated vehicle damage data of the vehicle;
acquiring pixelated vehicle damage data of a plurality of vehicles, wherein the pixelated vehicle damage data of each vehicle comprises the model of the vehicle, the vehicle's travel time within a given time period, and the vehicle's damage pixel count, the damage pixel count indicating the number of minimum pixel blocks contained in the vehicle's damage locations;
aggregating the pixelated vehicle damage data of the plurality of vehicles with vehicle model as the aggregation condition, and determining a sum of the damage pixel counts of each model of vehicle and a sum of the travel times of each model of vehicle within the given time period; and
determining a vehicle damage factor per unit time for each model of vehicle based on the sum of the damage pixel counts of that model of vehicle and the sum of the travel times of that model of vehicle within the given time period,
wherein determining the two-dimensional vehicle model based on the three-dimensional vehicle model comprises:
unfolding and tiling a plurality of exterior surfaces of the three-dimensional vehicle model to produce a plan view of the three-dimensional vehicle model;
establishing a plane rectangular coordinate system for the plane view of the three-dimensional vehicle model; and
gridding the plane rectangular coordinate system in units of the minimum pixel block to generate the two-dimensional vehicle model.
2. The method of claim 1, further comprising:
generating an electronic checklist based on the pixelated vehicle damage data of the vehicle, the electronic checklist reproducing a vehicle damage location of the vehicle in a two-dimensional coordinate system.
3. The method of claim 1, wherein determining the vehicle damage factor per unit time for the model of vehicle further comprises:
determining an average travel time of the plurality of vehicles within the given time period based on the pixelated vehicle damage data of the plurality of vehicles;
determining whether a ratio between the average travel time of each model of vehicle within the given time period and the average travel time of the plurality of vehicles within the given time period is greater than a predetermined value; and
in response to determining that the ratio is greater than the predetermined value, determining the vehicle damage factor per unit time for the model of vehicle based on the sum of the damage pixel counts of that model of vehicle and the sum of the travel times of that model of vehicle within the given time period.
4. The method of claim 1, further comprising:
determining a sum of the damage pixel counts of the plurality of vehicles and a sum of the travel times of the plurality of vehicles within the given time period; and
determining a vehicle damage factor per unit time for the plurality of vehicles based on the sum of the damage pixel counts of the plurality of vehicles and the sum of the travel times of the plurality of vehicles within the given time period.
5. The method of claim 1, wherein the pixelated vehicle damage data of each vehicle of the plurality of vehicles further comprises pixel point locations of the vehicle's damage locations, indicating the pixel point location of each minimum pixel block occupied by the vehicle's damage locations, the method further comprising:
determining a total number of times each pixel point location in the pixelated vehicle damage image is labeled as a damage location based on the pixelated vehicle damage data of the plurality of vehicles; and
performing a corresponding operation and maintenance action based on the total number of times each pixel point location in the pixelated vehicle damage image is labeled as a damage location.
6. An electronic device, comprising:
at least one processing unit; and
at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit, the instructions, when executed by the at least one processing unit, causing the electronic device to perform the steps of the method of any of claims 1-5.
7. A computer readable storage medium having stored thereon computer program code which, when executed, performs the method of any of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011106235.0A CN111932543B (en) | 2020-10-16 | 2020-10-16 | Method, electronic device and storage medium for determining pixilated car damage data |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011106235.0A CN111932543B (en) | 2020-10-16 | 2020-10-16 | Method, electronic device and storage medium for determining pixilated car damage data |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111932543A CN111932543A (en) | 2020-11-13 |
CN111932543B true CN111932543B (en) | 2020-12-25 |
Family
ID=73334515
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011106235.0A Active CN111932543B (en) | 2020-10-16 | 2020-10-16 | Method, electronic device and storage medium for determining pixilated car damage data |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111932543B (en) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105138970B (en) * | 2015-08-03 | 2018-11-16 | 西安电子科技大学 | Classification of Polarimetric SAR Image method based on spatial information |
CN106971374B (en) * | 2016-01-13 | 2020-06-23 | 北大方正集团有限公司 | Image pixelation method and image pixelation system |
CN107180389B (en) * | 2017-05-10 | 2018-06-15 | 平安科技(深圳)有限公司 | People hinders Claims Resolution setting loss fee calculating method, device, server and medium |
GB2566491B (en) * | 2017-09-15 | 2022-04-06 | Atkins Ltd | Damage detection and repair system |
CN110930405B (en) * | 2020-01-19 | 2022-07-26 | 南京理工大学 | Cutter damage detection method based on image area division |
- 2020-10-16 CN CN202011106235.0A patent/CN111932543B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN111932543A (en) | 2020-11-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108683907B (en) | Optical module pixel defect detection method, device and equipment | |
US20190213689A1 (en) | Image-based vehicle damage determining method and apparatus, and electronic device | |
CN102542272B (en) | Information reading apparatus | |
TW201932827A (en) | Substrate defect inspection device, substrate defect inspection method, and storage medium | |
JP7054436B2 (en) | Detection system, information processing device, evaluation method and program | |
TW202009681A (en) | Sample labeling method and device, and damage category identification method and device | |
US20230044043A1 (en) | Method and system for automated grading and trading of numismatics and trading cards | |
CN102170545A (en) | Correction information calculating device, image processing apparatus, image display system, and image correcting method | |
CN112580707A (en) | Image recognition method, device, equipment and storage medium | |
CN112489240B (en) | Commodity display inspection method, inspection robot and storage medium | |
BR112021005196A2 (en) | object scanning for a network-based service | |
CN103383732A (en) | Image processing method and device | |
US20190244282A1 (en) | Computerized exchange network | |
CN112507923A (en) | Certificate copying detection method and device, electronic equipment and medium | |
CN116067671B (en) | A method, system and medium for testing vehicle paint quality | |
CN109102324B (en) | Model training method, and red packet material laying prediction method and device based on model | |
WO2024001309A1 (en) | Method and apparatus for generating and producing template for infrared thermal image analysis report | |
CN112287905A (en) | Vehicle damage identification method, device, equipment and storage medium | |
CN111932543B (en) | Method, electronic device and storage medium for determining pixilated car damage data | |
CN110135288B (en) | Method and device for quickly checking electronic certificate | |
JP2017219996A (en) | Population estimation system and population estimation method | |
CN109829401A (en) | Traffic sign recognition method and device based on double capture apparatus | |
CN114383564A (en) | Depth measurement method, device, device and storage medium based on binocular camera | |
CN113344084A (en) | Jewelry quality identification method and device based on image recognition | |
US8644618B2 (en) | Automatic evaluation of line weights |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |