
CN114419478B - Flight on/off block time identification method, device, equipment and storage medium - Google Patents


Info

Publication number
CN114419478B
CN114419478B (application CN202111489016.XA)
Authority
CN
China
Prior art keywords
time
rectangular coordinate
coordinate frame
time node
aircraft
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111489016.XA
Other languages
Chinese (zh)
Other versions
CN114419478A (en)
Inventor
陈超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ucloud Technology Co ltd
Original Assignee
Ucloud Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ucloud Technology Co ltd
Priority to CN202111489016.XA
Publication of CN114419478A
Application granted
Publication of CN114419478B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of image recognition, and discloses a flight on/off block time identification method, device, equipment and storage medium. The method comprises: acquiring, at preset time intervals, video images captured by a fixed camera at an aircraft stand; inputting the video images into a pre-trained aircraft identification model for identification, and outputting the circumscribed rectangular coordinate frame of the identified target aircraft; calculating, based on a preset overlap value calculation formula, a first overlap value and a second overlap value between the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame; recording a first time node when the first overlap value is valid and a second time node when the second overlap value is valid; and determining the on-block time or off-block time of the target aircraft based on the first time node and the second time node. The invention realizes automatic identification of aircraft on/off block times, reduces identification cost, and improves recording efficiency and accuracy.

Description

Flight on/off block time identification method, device, equipment and storage medium
Technical Field
The present invention relates to the field of image recognition technologies, and in particular to a flight on/off block time identification method, device, equipment and storage medium.
Background
With the rapid development of the air transportation industry, an airport needs to collect a number of time nodes for each flight in order to ensure flight safety and order. Because flight delays often occur for various subjective and objective reasons, the take-off and landing times of aircraft must be obtained accurately to facilitate management.
After an aircraft arrives and stops at its stand, wheel chocks are placed against its wheels to keep it from rolling (on-block), and the chocks are removed before the aircraft taxis out for take-off (off-block), so the landing or take-off time of an aircraft is usually determined from its on-block or off-block time. These times are generally determined and recorded by manual reporting, which introduces uncertainty and yields low accuracy and efficiency.
Disclosure of Invention
The main object of the present invention is to provide a flight on/off block time identification method, device, equipment and storage medium, aiming to solve the technical problem in the prior art that recording flight on/off block times is inaccurate and inefficient.
The first aspect of the present invention provides a flight on/off block time identification method, comprising the following steps:
acquiring, at preset time intervals, video images captured by a fixed camera at an aircraft stand;
inputting the video images into a pre-trained aircraft identification model for identification, and outputting the circumscribed rectangular coordinate frame of the identified target aircraft;
calculating, based on a preset overlap value calculation formula, a first overlap value and a second overlap value between the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame;
recording a first time node when the first overlap value is valid and a second time node when the second overlap value is valid;
and determining the on-block time or off-block time of the target aircraft based on the first time node and the second time node.
Optionally, in a first implementation manner of the first aspect of the present invention, the calculating, based on a preset overlap value calculation formula, a first overlap value and a second overlap value between the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame includes:
calculating a first overlap value between the circumscribed rectangular coordinate frame of the target aircraft and the preset reference circumscribed rectangular coordinate frame based on a preset first overlap value calculation formula;
calculating a second overlap value between the circumscribed rectangular coordinate frame of the target aircraft and the preset reference circumscribed rectangular coordinate frame based on a preset second overlap value calculation formula;
The first overlap value calculation formula and the second overlap value calculation formula are as follows:

IOU_1 = area(B_p ∩ B_a) / area(B_p ∪ B_a)

IOU_2 = area(B_p ∩ B_a) / area(B_a)

where IOU_1 represents the first overlap value, IOU_2 represents the second overlap value, B_p represents the circumscribed rectangular coordinate frame of the target aircraft, and B_a represents the reference circumscribed rectangular coordinate frame.
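A sketch of the overlap computation with boxes given as (x1, y1, x2, y2) corner coordinates. The patent's exact formulas are not reproduced in this text, so this assumes the first value is the standard intersection-over-union and the second the fraction of the reference frame covered; both choices and all names are illustrative:

```python
def box_area(b):
    # b = (x1, y1, x2, y2) with x2 > x1 and y2 > y1
    return (b[2] - b[0]) * (b[3] - b[1])

def overlap_values(bp, ba):
    """Return (iou1, iou2) for a detected frame bp against a reference frame ba."""
    # Intersection rectangle; width/height clamp to 0 when the boxes are disjoint.
    iw = max(0, min(bp[2], ba[2]) - max(bp[0], ba[0]))
    ih = max(0, min(bp[3], ba[3]) - max(bp[1], ba[1]))
    inter = iw * ih
    union = box_area(bp) + box_area(ba) - inter
    iou1 = inter / union          # classic intersection-over-union
    iou2 = inter / box_area(ba)   # fraction of the reference frame covered
    return iou1, iou2
```

For example, with bp = (0, 0, 2, 2) and ba = (1, 1, 3, 3), the intersection has area 1, giving iou1 = 1/7 and iou2 = 1/4.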
Optionally, in a second implementation manner of the first aspect of the present invention, the recording a first time node when the first overlap value is valid and a second time node when the second overlap value is valid includes:
judging whether the first overlap value falls within the range between a preset minimum overlap threshold and a preset maximum overlap threshold; if so, confirming that the first overlap value is valid, and recording the capture time of the video image as the first time node;
and, if the second overlap value likewise satisfies its validity condition, confirming that the second overlap value is valid, and recording the capture time of the video image as the second time node.
Optionally, in a third implementation manner of the first aspect of the present invention, the determining, based on the first time node and the second time node, the on-block time or off-block time of the target aircraft includes:
comparing the first time node with the second time node;
if the first time node is later than the second time node, determining the off-block time of the target aircraft to be the second time node;
and if the first time node is earlier than the second time node, determining the on-block time of the target aircraft to be the second time node.
Optionally, in a fourth implementation manner of the first aspect of the present invention, the flight on/off block time identification method further includes:
if the off-block time of the target aircraft is determined to be the second time node, judging whether the difference between the first time node and the second time node is within a first preset time range;
if the difference between the first time node and the second time node is within the first preset time range, judging whether the difference between the off-block time of the target aircraft and the scheduled take-off time of the current flight at the aircraft stand is within a second preset time range, and if so, determining that the off-block time of the target aircraft is correct;
if the on-block time of the target aircraft is determined to be the second time node, judging whether the difference between the first time node and the second time node is within the first preset time range;
if the difference between the first time node and the second time node is within the first preset time range, judging whether the difference between the on-block time of the target aircraft and the scheduled arrival time of the current flight at the aircraft stand is within the second preset time range, and if so, determining that the on-block time of the target aircraft is correct.
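A minimal sketch of this two-stage plausibility check, assuming the time nodes are numeric timestamps; the function and parameter names are illustrative, not from the patent:

```python
def verify_block_time(t1, t2, scheduled, max_node_gap, max_sched_gap):
    """Two-stage plausibility check for a detected on/off block time.

    t1, t2        -- first and second time nodes (seconds)
    scheduled     -- scheduled take-off (off-block case) or arrival (on-block case)
    max_node_gap  -- the "first preset time range": allowed |t1 - t2|
    max_sched_gap -- the "second preset time range": allowed |t2 - scheduled|
    """
    if abs(t1 - t2) > max_node_gap:
        return False  # the two detections are too far apart to belong together
    # t2 is the candidate block time; compare it against the schedule.
    return abs(t2 - scheduled) <= max_sched_gap
```

The same routine serves both branches; only the meaning of `scheduled` changes between the off-block and on-block cases.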
Optionally, in a fifth implementation manner of the first aspect of the present invention, before the acquiring, at preset time intervals, video images captured by a fixed camera at an aircraft stand, the method further includes:
acquiring sample video images of sample aircraft going on and off block, captured at the same aircraft stand;
inputting the sample video images into the aircraft identification model in sequence for identification, and outputting the identified circumscribed rectangular coordinate frame of each sample aircraft;
clustering the circumscribed rectangular coordinate frames of the sample aircraft by a preset clustering algorithm to obtain a plurality of clusters;
and computing the average, minimum and maximum circumscribed rectangular coordinate frames of the cluster containing the largest number of circumscribed rectangular coordinate frames, and taking the average circumscribed rectangular coordinate frame as the reference circumscribed rectangular coordinate frame.
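A sketch of this preprocessing step. The patent does not name the "preset clustering algorithm", so a simple greedy IoU-based grouping stands in for it here; all names are illustrative:

```python
def iou(a, b):
    # Standard intersection-over-union of two (x1, y1, x2, y2) boxes.
    iw = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    ih = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = iw * ih
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def cluster_boxes(boxes, thresh=0.5):
    """Greedy clustering: each box joins the first cluster whose seed it overlaps."""
    clusters = []
    for b in boxes:
        for c in clusters:
            if iou(c[0], b) >= thresh:
                c.append(b)
                break
        else:
            clusters.append([b])
    return clusters

def reference_frame(boxes, thresh=0.5):
    """Average, minimum-area and maximum-area boxes of the largest cluster."""
    biggest = max(cluster_boxes(boxes, thresh), key=len)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    avg = tuple(sum(b[i] for b in biggest) / len(biggest) for i in range(4))
    return avg, min(biggest, key=area), max(biggest, key=area)
```

The averaged box of the dominant cluster plays the role of the reference circumscribed rectangular coordinate frame; outlier detections (e.g. the aircraft mid-taxi) fall into smaller clusters and are ignored.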
Optionally, in a sixth implementation manner of the first aspect of the present invention, the calculation formula of the minimum overlap threshold is as follows:

Th(min) = area(B_min ∩ B_a) / area(B_min ∪ B_a)

wherein Th(min) is the minimum overlap threshold, B_min is the minimum circumscribed rectangular coordinate frame, and B_a is the reference circumscribed rectangular coordinate frame;

the calculation formula of the maximum overlap threshold is as follows:

Th(max) = area(B_max ∩ B_a) / area(B_max ∪ B_a)

wherein Th(max) is the maximum overlap threshold, B_max is the maximum circumscribed rectangular coordinate frame, and B_a is the reference circumscribed rectangular coordinate frame.
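A sketch of how the validity window for the first overlap value could be derived from the cluster statistics above, assuming both thresholds are intersection-over-union scores against the reference frame (the patent's exact threshold formulas are not reproduced in this text; all names are illustrative):

```python
def iou(a, b):
    # Standard intersection-over-union of two (x1, y1, x2, y2) boxes.
    iw = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    ih = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = iw * ih
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def validity_window(b_min, b_max, b_ref):
    """Derive (Th_min, Th_max) from the min/max frames of the largest cluster."""
    return iou(b_min, b_ref), iou(b_max, b_ref)

def first_value_valid(iou1, th_a, th_b):
    """The first overlap value is valid when it falls between the two thresholds."""
    lo, hi = sorted((th_a, th_b))
    return lo <= iou1 <= hi
```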
The second aspect of the present invention provides a flight on/off block time identification device, comprising:
an acquisition module, used for acquiring, at preset time intervals, video images captured by a fixed camera at an aircraft stand;
an identification module, used for inputting the video images into a pre-trained aircraft identification model for identification and outputting the circumscribed rectangular coordinate frame of the identified target aircraft;
a calculation module, used for calculating, based on a preset overlap value calculation formula, a first overlap value and a second overlap value between the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame;
a recording module, used for recording a first time node when the first overlap value is valid and a second time node when the second overlap value is valid;
and a determining module, used for determining the on-block time or off-block time of the target aircraft based on the first time node and the second time node.
Optionally, in a first implementation manner of the second aspect of the present invention, the calculation module is specifically used for:
calculating a first overlap value between the circumscribed rectangular coordinate frame of the target aircraft and the preset reference circumscribed rectangular coordinate frame based on a preset first overlap value calculation formula;
calculating a second overlap value between the circumscribed rectangular coordinate frame of the target aircraft and the preset reference circumscribed rectangular coordinate frame based on a preset second overlap value calculation formula;
The first overlap value calculation formula and the second overlap value calculation formula are as follows:

IOU_1 = area(B_p ∩ B_a) / area(B_p ∪ B_a)

IOU_2 = area(B_p ∩ B_a) / area(B_a)

where IOU_1 represents the first overlap value, IOU_2 represents the second overlap value, B_p represents the circumscribed rectangular coordinate frame of the target aircraft, and B_a represents the reference circumscribed rectangular coordinate frame.
Optionally, in a second implementation manner of the second aspect of the present invention, the recording module is specifically used for:
judging whether the first overlap value falls within the range between a preset minimum overlap threshold and a preset maximum overlap threshold; if so, confirming that the first overlap value is valid, and recording the capture time of the video image as the first time node;
and, if the second overlap value likewise satisfies its validity condition, confirming that the second overlap value is valid, and recording the capture time of the video image as the second time node.
Optionally, in a third implementation manner of the second aspect of the present invention, the determining module is specifically used for:
comparing the first time node with the second time node;
if the first time node is later than the second time node, determining the off-block time of the target aircraft to be the second time node;
and if the first time node is earlier than the second time node, determining the on-block time of the target aircraft to be the second time node.
Optionally, in a fourth implementation manner of the second aspect of the present invention, the flight on/off block time identification device further includes:
a first verification module, used for judging, if the off-block time of the target aircraft is determined to be the second time node, whether the difference between the first time node and the second time node is within a first preset time range;
a second verification module, used for judging, if the difference between the first time node and the second time node is within the first preset time range, whether the difference between the off-block time of the target aircraft and the scheduled take-off time of the current flight at the aircraft stand is within a second preset time range, and if so, determining that the off-block time of the target aircraft is correct;
a third verification module, used for judging, if the on-block time of the target aircraft is determined to be the second time node, whether the difference between the first time node and the second time node is within the first preset time range;
and a fourth verification module, used for judging, if the difference between the first time node and the second time node is within the first preset time range, whether the difference between the on-block time of the target aircraft and the scheduled arrival time of the current flight at the aircraft stand is within the second preset time range, and if so, determining that the on-block time of the target aircraft is correct.
Optionally, in a fifth implementation manner of the second aspect of the present invention, the flight on/off block time identification device further includes:
a sample acquisition module, used for acquiring sample video images of sample aircraft going on and off block, captured at the same aircraft stand;
a sample identification module, used for inputting the sample video images into the aircraft identification model in sequence for identification and outputting the identified circumscribed rectangular coordinate frame of each sample aircraft;
a sample clustering module, used for clustering the circumscribed rectangular coordinate frames of the sample aircraft by a preset clustering algorithm to obtain a plurality of clusters;
and a sample statistics module, used for computing the average, minimum and maximum circumscribed rectangular coordinate frames of the cluster containing the largest number of circumscribed rectangular coordinate frames, and taking the average circumscribed rectangular coordinate frame as the reference circumscribed rectangular coordinate frame.
Optionally, in a sixth implementation manner of the second aspect of the present invention, the calculation formula of the minimum overlap threshold is as follows:

Th(min) = area(B_min ∩ B_a) / area(B_min ∪ B_a)

wherein Th(min) is the minimum overlap threshold, B_min is the minimum circumscribed rectangular coordinate frame, and B_a is the reference circumscribed rectangular coordinate frame;

the calculation formula of the maximum overlap threshold is as follows:

Th(max) = area(B_max ∩ B_a) / area(B_max ∪ B_a)

wherein Th(max) is the maximum overlap threshold, B_max is the maximum circumscribed rectangular coordinate frame, and B_a is the reference circumscribed rectangular coordinate frame.
The third aspect of the present invention provides an electronic device, comprising a memory and at least one processor, the memory storing instructions, the at least one processor calling the instructions in the memory so that the electronic device executes the above flight on/off block time identification method.
The fourth aspect of the present invention provides a computer-readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the above flight on/off block time identification method.
In the technical solution provided by the present invention, images are acquired at preset time intervals; the acquired images are identified by a preset aircraft identification model to obtain the circumscribed rectangular coordinate frame of the aircraft; the overlap between that frame and the reference circumscribed rectangular coordinate frame is calculated by the overlap value calculation formula; valid time nodes are recorded; and the on-block time or off-block time of the aircraft is identified from those time nodes. The method automatically collects and identifies, in real time, the images captured by the fixed camera at the aircraft stand, and judges whether the aircraft is at the target position through the preset overlap value calculation formula, thereby realizing automatic identification of aircraft on/off block times, greatly improving time-recording efficiency and accuracy, and reducing identification cost.
Drawings
FIG. 1 is a schematic diagram of a first embodiment of the flight on/off block time identification method in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a second embodiment of the flight on/off block time identification method in an embodiment of the present invention;
FIG. 3 is a schematic diagram of a third embodiment of the flight on/off block time identification method in an embodiment of the present invention;
FIG. 4 is a schematic diagram of a first embodiment of the flight on/off block time identification device in an embodiment of the present invention;
FIG. 5 is a schematic diagram of a second embodiment of the flight on/off block time identification device in an embodiment of the present invention;
FIG. 6 is a schematic diagram of a third embodiment of the flight on/off block time identification device in an embodiment of the present invention;
FIG. 7 is a schematic diagram of an embodiment of an electronic device in an embodiment of the present invention.
Detailed Description
The embodiments of the present invention provide a flight on/off block time identification method, device, equipment and storage medium, which automatically identify aircraft through a preset aircraft identification model and calculate the overlap between the aircraft and a preset reference circumscribed rectangular coordinate frame so as to determine the aircraft's on-block or off-block time. Automatic identification of aircraft on/off block times greatly improves time-recording efficiency and accuracy and reduces labor cost.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
For easy understanding, a specific flow of an embodiment of the present invention is described below. Referring to fig. 1, a first embodiment of the flight on/off block time identification method in an embodiment of the present invention includes:
101. Acquire, at preset time intervals, video images captured by a fixed camera at an aircraft stand.
It can be understood that the execution body of the present invention may be a flight on/off block time identification device, and may also be a terminal or a server, which is not limited herein. The embodiment of the present invention is described taking a server as the execution body.
In this embodiment, a camera for fixed shooting is installed in the airport to film a fixed aircraft stand.
In this embodiment, the captured monitoring video is obtained, and frames are extracted from it at preset time intervals to obtain the images to be detected, i.e., the video images used for aircraft identification.
In this embodiment, video images may be acquired at intervals of a preset time, or may be acquired at intervals of a preset frame number.
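As an illustration, the two sampling policies (a fixed time interval versus a fixed frame-number interval) coincide once the stream's frame rate is known. A minimal sketch of the index computation follows; the decoding itself (e.g. with OpenCV's VideoCapture) is omitted, and all names are illustrative:

```python
def sample_frame_indices(fps, duration_s, interval_s):
    """Frame indices to grab when sampling every interval_s seconds from a
    stream running at fps frames per second for duration_s seconds."""
    step = max(1, round(fps * interval_s))  # the equivalent frame-count interval
    total = int(fps * duration_s)
    return list(range(0, total, step))
```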
102. Input the video image into a pre-trained aircraft identification model for identification, and output the circumscribed rectangular coordinate frame of the identified target aircraft.
In this embodiment, the aircraft identification model recognizes, through image recognition technology, whether an aircraft exists in the image and, if so, marks the circumscribed rectangular coordinate frame of the recognized aircraft.
In this embodiment, the aircraft identification model is trained with pictures of the whole aircraft and of parts such as the nose and tail, taken from different directions.
In this embodiment, the target aircraft is the aircraft identified in the video image by the aircraft identification model.
In this embodiment, the circumscribed rectangular coordinate frame is a circumscribed rectangular frame containing the target object; it marks the position of the target object and is represented by coordinates.
In this embodiment, the representation of the circumscribed rectangular coordinate frame is not limited. Optionally, in one embodiment, it is represented by its top-left and bottom-right vertex coordinates, or by one fixed vertex coordinate together with its width and height.
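The two representations mentioned above are interchangeable; a minimal sketch of the conversion, assuming the fixed vertex is the top-left corner (function names are illustrative):

```python
def corners_to_xywh(box):
    """(x1, y1, x2, y2) -> (x, y, w, h) with (x, y) the top-left vertex."""
    x1, y1, x2, y2 = box
    return (x1, y1, x2 - x1, y2 - y1)

def xywh_to_corners(box):
    """(x, y, w, h) -> (x1, y1, x2, y2)."""
    x, y, w, h = box
    return (x, y, x + w, y + h)
```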
In this embodiment, the method for training the aircraft identification model is not limited, and includes, but is not limited to, Mask R-CNN, YOLO and Faster R-CNN.
Optionally, in an embodiment, a Mask R-CNN model is used to train the aircraft identification model. The Mask R-CNN model sequentially comprises a feature extraction network, an RPN (Region Proposal Network), an ROI Align (Region of Interest Align) layer, and an FCN (Fully Convolutional Network). The feature extraction network extracts a feature map from the sample image. The RPN places preset anchor frames on the feature map, judges whether each anchor frame contains a recognition target, retains those that do, and adjusts their positions on the feature map to obtain the pre-selected frames of the sample image. The ROI Align layer locates each pre-selected frame on the feature map, fuses the pre-selected frame with the target feature map, and divides and pools it to obtain a labeled feature map. The FCN predicts each pixel of the labeled feature map to obtain the prediction result corresponding to the sample image.
Based on the Mask R-CNN model, training the aircraft identification model comprises the following steps:
(1) inputting the aircraft sample images and their corresponding label information into a preset Mask R-CNN model;
(2) extracting a target feature map of each aircraft sample image through the feature extraction network;
(3) inputting the target feature map into the RPN so as to generate, according to preset anchor frame information, the pre-selected frames corresponding to the target feature map;
(4) inputting the pre-selected frames and the target feature map into the ROI Align layer, fusing the pre-selected frames with the target feature map, and dividing and pooling the pre-selected frames to obtain a labeled feature map;
(5) inputting the labeled feature map into the FCN so as to predict each pixel of the labeled feature map and output the prediction result corresponding to the sample image;
(6) optimizing the parameters of the Mask R-CNN model according to the prediction results and the label information until the model converges, yielding the aircraft identification model.
103. Calculate, based on a preset overlap value calculation formula, a first overlap value and a second overlap value between the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame.
In this embodiment, the overlap value represents the degree of overlap of two circumscribed rectangular coordinate frames.
In this embodiment, the overlap value calculation formula generally uses the intersection ratio of the two circumscribed rectangular coordinate frames.
In this embodiment, the preset reference circumscribed rectangular coordinate frame is the circumscribed rectangular coordinate frame of an aircraft parked at the aircraft stand, used as the reference coordinate frame.
In this embodiment, the first overlap value is used to judge whether the target aircraft is within a specified range, namely a region at a certain distance from the aircraft stand, and the second overlap value is used to judge whether the target aircraft is at the aircraft stand.
104. Record a first time node when the first overlap value is valid and a second time node when the second overlap value is valid.
In this embodiment, the first overlap value being valid means that it satisfies a preset condition used to judge whether the target aircraft is at a specified position; in this embodiment, that position is the region at a certain distance from the aircraft stand.
In this embodiment, the second overlap value being valid means that it satisfies another preset condition used to judge whether the target aircraft is at a specified position; in this embodiment, that position is the aircraft stand itself.
In this embodiment, the time node is the capture time of the video image for which the overlap value is valid.
105. Determining the on-block time or off-block time of the target aircraft based on the first time node and the second time node.
In this embodiment, the first time node is the time node at which the target aircraft is at a position a certain distance from the aircraft stand, and the second time node is the time node at which the target aircraft is at the aircraft stand.
In this embodiment, whether the target aircraft is departing or arriving is determined from the order of the first time node and the second time node, which in turn determines whether an on-block or an off-block event has occurred.
In this embodiment, since the second time node is the time node at which the target aircraft is at the aircraft stand, the second time node is the on-block time or off-block time of the target aircraft.
In the embodiment of the invention, images are acquired at preset time intervals, the acquired images are recognized by a preset aircraft recognition model to obtain the circumscribed rectangular coordinate frame of the aircraft, the overlap between this frame and a reference circumscribed rectangular coordinate frame is calculated by an overlap value calculation formula, valid time nodes are recorded, and the on-block time or off-block time of the aircraft is identified from these time nodes. The method automatically acquires and recognizes, in real time, images captured by a fixed camera at the aircraft stand, and determines whether the aircraft is at the target position by means of the preset overlap value calculation formula, thereby realizing automatic identification of aircraft on/off block times, greatly improving the efficiency and accuracy of time recording, and reducing the cost of identifying these times.
Referring to fig. 2, a second embodiment of the flight on/off block time identification method according to an embodiment of the present invention includes:
201. Acquiring, at preset time intervals, video images captured by a fixed camera at the aircraft stand;
202. Inputting the video images into a pre-trained aircraft recognition model for recognition, and outputting the circumscribed rectangular coordinate frame of the recognized target aircraft;
203. Calculating a first overlap value of the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame based on a preset first overlap value calculation formula;
Optionally, in an embodiment, the first overlap value calculation formula is as follows:
IOU_1 = (B_p ∩ B_a) / (B_p ∪ B_a)
where IOU_1 represents the first overlap value, B_p represents the circumscribed rectangular coordinate frame of the target aircraft, and B_a represents the reference circumscribed rectangular coordinate frame.
In this embodiment, B_p ∩ B_a represents the intersection of the two frames, that is, the area of their overlapping region, and B_p ∪ B_a represents their union, that is, the total area covered by the two frames. This intersection-over-union expresses the degree of overlap between the target aircraft and the reference stand position and is used to determine whether the aircraft is present at the target position; in this embodiment it is used to determine whether the target aircraft is within a region at a certain distance from the aircraft stand.
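The intersection-over-union above can be computed directly from the corner coordinates of the two frames. A minimal sketch in Python; the (x1, y1, x2, y2) top-left/bottom-right tuple layout is an assumption for illustration, not prescribed by the patent:

```python
def iou_union(box_p, box_a):
    """IOU_1: intersection area over union area of two boxes.

    Boxes are (x1, y1, x2, y2) tuples giving the top-left and
    bottom-right corners, matching the vertex representation used
    for the circumscribed rectangular coordinate frames.
    """
    # Corners of the intersection rectangle (may be empty).
    ix1, iy1 = max(box_p[0], box_a[0]), max(box_p[1], box_a[1])
    ix2, iy2 = min(box_p[2], box_a[2]), min(box_p[3], box_a[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_p = (box_p[2] - box_p[0]) * (box_p[3] - box_p[1])
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    union = area_p + area_a - inter
    return inter / union if union else 0.0
```

Identical frames give 1.0 and disjoint frames give 0.0, so the value grades how far the detected frame has moved into the reference region.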
204. Calculating a second overlap value of the circumscribed rectangular coordinate frame of the target aircraft and the preset reference circumscribed rectangular coordinate frame based on a preset second overlap value calculation formula;
Optionally, in an embodiment, the second overlap value calculation formula is as follows:
IOU_2 = (B_p ∩ B_a) / B_a
where IOU_2 represents the second overlap value, B_p represents the circumscribed rectangular coordinate frame of the target aircraft, and B_a represents the reference circumscribed rectangular coordinate frame.
In this embodiment, B_p ∩ B_a represents the intersection of the two frames, that is, the area of their overlapping region.
In this embodiment, the B_a in the denominator represents the area of the reference circumscribed rectangular coordinate frame.
In this embodiment, the ratio of the intersection area to the area of the reference circumscribed rectangular coordinate frame expresses the degree of overlap between the target aircraft and the reference stand position, and is used to determine whether the aircraft is present at the aircraft stand. Because the position of a parked aircraft is relatively stable, the IOU_2 formula gives a more accurate judgment of whether the aircraft has reached the aircraft stand.
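The second overlap value only changes the denominator: the union is replaced by the fixed area of the reference frame. A hedged sketch, again assuming (x1, y1, x2, y2) top-left/bottom-right tuples:

```python
def iou_reference(box_p, box_a):
    """IOU_2: intersection area divided by the area of B_a only."""
    ix1, iy1 = max(box_p[0], box_a[0]), max(box_p[1], box_a[1])
    ix2, iy2 = min(box_p[2], box_a[2]), min(box_p[3], box_a[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    return inter / area_a if area_a else 0.0
```

Because the denominator ignores the size of the detected frame, IOU_2 saturates at 1.0 as soon as the detected frame fully covers the reference frame, which makes it less sensitive to detection jitter once the aircraft is parked.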
205. Judging whether the first overlap value lies between a preset minimum overlap threshold and a preset maximum overlap threshold; if so, confirming that the first overlap value is valid and recording the capture time of the video image as the first time node;
In this embodiment, the maximum and minimum overlap thresholds are used to determine whether the target aircraft is at the specified position: when the first overlap value lies between them, the target aircraft is within a region at a certain distance from the aircraft stand.
In this embodiment, when the target aircraft is determined to be within that region, the capture time of the current video image is recorded as the first time node.
Optionally, in an embodiment, the calculation formula of the minimum overlap threshold is as follows:
Th(min) = (B_min ∩ B_a) / (B_min ∪ B_a)
where Th(min) is the minimum overlap threshold, B_min is the minimum circumscribed rectangular coordinate frame, and B_a is the reference circumscribed rectangular coordinate frame;
the calculation formula of the maximum overlap threshold is as follows:
Th(max) = (B_max ∩ B_a) / (B_max ∪ B_a)
where Th(max) is the maximum overlap threshold, B_max is the maximum circumscribed rectangular coordinate frame, and B_a is the reference circumscribed rectangular coordinate frame.
In this embodiment, the minimum circumscribed rectangular coordinate frame is the smallest-area circumscribed rectangular coordinate frame of an aircraft observed at the aircraft stand, and the maximum circumscribed rectangular coordinate frame is the largest-area circumscribed rectangular coordinate frame of an aircraft observed at the aircraft stand.
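Assuming the thresholds are the intersection-over-union of the extreme sample frames with the reference frame B_a (an assumption for illustration; the patent's exact threshold formula may differ), they fall out directly from the smallest and largest observed frames:

```python
def box_iou(b1, b2):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(b1[0], b2[0]), max(b1[1], b2[1])
    ix2, iy2 = min(b1[2], b2[2]), min(b1[3], b2[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    a1 = (b1[2] - b1[0]) * (b1[3] - b1[1])
    a2 = (b2[2] - b2[0]) * (b2[3] - b2[1])
    return inter / (a1 + a2 - inter)

def overlap_thresholds(b_min, b_max, b_a):
    """Th(min), Th(max) from the extreme sample frames and B_a."""
    return box_iou(b_min, b_a), box_iou(b_max, b_a)
```

Both thresholds lie below 1, since neither extreme frame coincides with B_a exactly; a first overlap value between them indicates an aircraft near, but not yet on, the stand.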
206. Judging whether the second overlap value is greater than the maximum overlap threshold; if so, confirming that the second overlap value is valid and recording the capture time of the video image as the second time node;
In this embodiment, when the second overlap value is greater than the maximum overlap threshold, the target aircraft is determined to be at the aircraft stand, and the capture time of the current video image is recorded as the second time node.
207. Determining the on-block time or off-block time of the target aircraft based on the first time node and the second time node;
Optionally, in an embodiment, the determining the on-block time or off-block time of the target aircraft based on the first time node and the second time node includes:
comparing the first time node with the second time node;
if the first time node is later than the second time node, determining that the off-block time of the target aircraft is the second time node;
if the first time node is earlier than the second time node, determining that the on-block time of the target aircraft is the second time node.
In this embodiment, the first time node is the time node at which the target aircraft is at a position a certain distance from the aircraft stand, and the second time node is the time node at which the target aircraft is at the aircraft stand.
In this embodiment, whether the target aircraft is departing or arriving is determined from the order of the two time nodes. If the first time node is later than the second time node, the target aircraft has moved from the aircraft stand to a position some distance away: the aircraft is departing and the recorded time is the off-block time. If the first time node is earlier than the second time node, the target aircraft has moved from that position to the aircraft stand: the aircraft has arrived and the recorded time is the on-block time.
In this embodiment, since the second time node is the time node at which the target aircraft is at the aircraft stand, the second time node is the on-block time or off-block time of the target aircraft.
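The ordering rule of step 207 can be sketched as a small helper; representing the time nodes as plain numeric timestamps (e.g. Unix seconds) is an assumption for illustration:

```python
def classify_block_event(t_region, t_stand):
    """Decide the event type from the two recorded time nodes.

    t_region: first time node (aircraft in the region near the stand)
    t_stand:  second time node (aircraft on the stand)
    The stand-side node is always the reported block time.
    """
    if t_region > t_stand:
        # The stand was occupied first, then the nearby region:
        # the aircraft is leaving, so t_stand is the off-block time.
        return "off-block", t_stand
    # Region first, then the stand: the aircraft is arriving.
    return "on-block", t_stand
```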
208. If the off-block time of the target aircraft is determined to be the second time node, judging whether the difference between the first time node and the second time node is within a first preset time range;
In this embodiment, the first preset time range is an empirical value. For example, if the target aircraft taxis out of the reference circumscribed rectangular coordinate frame within one minute of leaving the aircraft stand, the first preset time range can be set to one minute.
209. If the difference between the first time node and the second time node is within the first preset time range, judging whether the difference between the off-block time of the target aircraft and the departure time of the current flight at the aircraft stand is within a second preset time range; if so, determining that the off-block time of the target aircraft is correct;
In this embodiment, since the aircraft goes off blocks before taking off, the flight departure time can be used to verify whether the identified off-block time is correct.
In this embodiment, the second preset time range is likewise an empirical value, for example 30 minutes.
210. If the on-block time of the target aircraft is determined to be the second time node, judging whether the difference between the first time node and the second time node is within the first preset time range;
211. If the difference between the first time node and the second time node is within the first preset time range, judging whether the difference between the on-block time of the target aircraft and the arrival time of the current flight at the aircraft stand is within the second preset time range; if so, determining that the on-block time of the target aircraft is correct.
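Steps 208 to 211 reduce to two window checks. The sketch below uses the example windows mentioned in the text (one minute between the two nodes, 30 minutes against the flight time); both are empirical values, not fixed by the method:

```python
def verify_block_time(t_region, t_stand, flight_time,
                      node_window_s=60, flight_window_s=30 * 60):
    """Return True if the recorded block time passes both checks.

    t_region / t_stand: the two recorded time nodes, in seconds
    flight_time: departure time (off-block case) or arrival time
        (on-block case) of the flight currently assigned to the stand
    """
    # First check: the two time nodes must be close together.
    if abs(t_region - t_stand) > node_window_s:
        return False
    # Second check: the block time must be near the flight's own time.
    return abs(t_stand - flight_time) <= flight_window_s
```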
In the embodiment of the invention, images are acquired at preset time intervals, the acquired images are recognized by a preset aircraft recognition model to obtain the circumscribed rectangular coordinate frame of the target aircraft, the overlap between this frame and the reference circumscribed rectangular coordinate frame is calculated by the IOU_1 and IOU_2 formulas, the time nodes at which the target aircraft is at the specified positions are recorded, and the on-block time or off-block time of the aircraft is identified from these time nodes. The embodiment automatically acquires and recognizes, in real time, images captured by a fixed camera at the aircraft stand, and determines whether the aircraft is at the target position by means of the preset overlap value calculation formula, thereby realizing automatic identification of aircraft on/off block times, greatly improving the efficiency and accuracy of time recording, and reducing labor cost. Applying the IOU_2 formula to judge whether the target aircraft is at the aircraft stand gives higher accuracy, and verifying the identified on/off block times against the flight times reduces the error rate.
Referring to fig. 3, a third embodiment of the flight on/off block time identification method according to an embodiment of the present invention includes:
301. Acquiring sample video images of sample aircraft going on and off blocks, captured at the same aircraft stand;
In this embodiment, historically captured video images may be used, or video images of aircraft at the aircraft stand may be obtained by running aircraft trials and recording them.
302. Inputting the sample video images into the aircraft identification model in sequence for identification, and outputting the identified circumscribed rectangular coordinate frames of each sample aircraft;
in this embodiment, the circumscribed rectangular coordinate frame is represented by coordinates of its upper left corner vertex and lower right corner vertex.
303. Clustering the circumscribed rectangular coordinate frames of each sample aircraft by adopting a preset clustering algorithm to obtain a plurality of clusters;
In this embodiment, the clustering method used is not limited, and includes, but is not limited to, k-means and the EM algorithm.
304. Counting the average, minimum, and maximum circumscribed rectangular coordinate frames of the cluster containing the largest number of circumscribed rectangular coordinate frames, and taking the average circumscribed rectangular coordinate frame as the reference circumscribed rectangular coordinate frame;
In this embodiment, the average circumscribed rectangular coordinate frame is obtained by averaging the top-left and bottom-right vertices of all circumscribed rectangular coordinate frames in the cluster; the frame represented by the resulting coordinates is the average circumscribed rectangular coordinate frame. The minimum circumscribed rectangular coordinate frame is the one with the smallest area, and the maximum circumscribed rectangular coordinate frame is the one with the largest area.
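Once the sample frames have been grouped (for example with k-means), the statistics of step 304 are straightforward. A sketch, with frames as (x1, y1, x2, y2) corner tuples, an assumed layout:

```python
def cluster_statistics(frames):
    """Average, minimum-area, and maximum-area frame of one cluster.

    The average frame (used as the reference frame B_a) averages the
    top-left and bottom-right corner coordinates over the cluster.
    """
    n = len(frames)
    avg = tuple(sum(f[i] for f in frames) / n for i in range(4))

    def area(f):
        return (f[2] - f[0]) * (f[3] - f[1])

    return avg, min(frames, key=area), max(frames, key=area)
```

Applied to the largest cluster, the first element becomes the reference frame and the other two feed the minimum and maximum overlap thresholds.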
305. Acquiring, at preset time intervals, video images captured by a fixed camera at the aircraft stand;
306. Inputting the video images into a pre-trained aircraft recognition model for recognition, and outputting the circumscribed rectangular coordinate frame of the recognized target aircraft;
307. Calculating a first overlap value and a second overlap value of the circumscribed rectangular coordinate frame of the target aircraft and the preset reference circumscribed rectangular coordinate frame based on a preset overlap value calculation formula;
308. Recording a first time node when the first overlap value is valid and a second time node when the second overlap value is valid;
309. Determining the on-block time or off-block time of the target aircraft based on the first time node and the second time node.
In the embodiment of the invention, sample video images are acquired and recognized to obtain all circumscribed rectangular coordinate frames, which are clustered by a clustering algorithm to obtain the reference circumscribed rectangular coordinate frame used for judgment; images are then acquired at preset time intervals and recognized by a preset aircraft recognition model to obtain the circumscribed rectangular coordinate frame of the aircraft, the overlap between this frame and the reference circumscribed rectangular coordinate frame is calculated by the overlap value calculation formula, valid time nodes are recorded, and the on-block time or off-block time of the aircraft is identified from these time nodes. By classifying the circumscribed rectangular coordinate frames in the sample video images with a clustering method and computing over the cluster with the most samples, noise is reduced and the accuracy of the reference circumscribed rectangular coordinate frame is improved.
The method for identifying flight on/off block times in the embodiment of the present invention is described above; the device for identifying flight on/off block times in the embodiment of the present invention is described below. Referring to fig. 4, a first embodiment of the flight on/off block time identification device in the embodiment of the present invention includes:
the acquisition module 401, configured to acquire, at preset time intervals, video images captured by a fixed camera at the aircraft stand;
The recognition module 402 is configured to input the video image into a pre-trained aircraft recognition model for recognition, and output a circumscribed rectangular coordinate frame of the recognized target aircraft;
The calculating module 403 is configured to calculate a first overlap value and a second overlap value of the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame based on a preset overlap value calculation formula;
a recording module 404, configured to record a first time node when the first overlap value is valid and a second time node when the second overlap value is valid;
A determining module 405, configured to determine the on-block time or off-block time of the target aircraft based on the first time node and the second time node.
Optionally, in an embodiment, the computing module 403 is specifically configured to:
calculating a first overlapping degree value of an circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame based on a preset first overlapping degree value calculation formula;
calculating a second overlapping degree value of the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame based on a preset second overlapping degree value calculation formula;
The first overlap value calculation formula and the second overlap value calculation formula are as follows:
IOU_1 = (B_p ∩ B_a) / (B_p ∪ B_a)
IOU_2 = (B_p ∩ B_a) / B_a
where IOU_1 represents the first overlap value, IOU_2 represents the second overlap value, B_p represents the circumscribed rectangular coordinate frame of the target aircraft, and B_a represents the reference circumscribed rectangular coordinate frame.
Optionally, in an embodiment, the recording module 404 is specifically configured to:
judge whether the first overlap value lies between the preset minimum overlap threshold and the preset maximum overlap threshold, and if so, confirm that the first overlap value is valid and record the capture time of the video image as the first time node;
judge whether the second overlap value is greater than the maximum overlap threshold, and if so, confirm that the second overlap value is valid and record the capture time of the video image as the second time node.
Optionally, in an embodiment, the determining module 405 is specifically configured to:
compare the first time node with the second time node;
if the first time node is later than the second time node, determine that the off-block time of the target aircraft is the second time node;
if the first time node is earlier than the second time node, determine that the on-block time of the target aircraft is the second time node.
In the embodiment of the invention, images are acquired at preset time intervals, the acquired images are recognized by a preset aircraft recognition model to obtain the circumscribed rectangular coordinate frame of the aircraft, the overlap between this frame and the reference circumscribed rectangular coordinate frame is calculated by an overlap value calculation formula, valid time nodes are recorded, and the on-block time or off-block time of the aircraft is identified from these time nodes. The embodiment automatically acquires and recognizes, in real time, images captured by a fixed camera at the aircraft stand, and determines whether the aircraft is at the target position by means of the preset overlap value calculation formula, thereby realizing automatic identification of aircraft on/off block times, greatly improving the efficiency and accuracy of time recording, and reducing labor cost.
Referring to fig. 5, a second embodiment of the flight on/off block time identification device according to the embodiment of the present invention includes:
the acquisition module 401, configured to acquire, at preset time intervals, video images captured by a fixed camera at the aircraft stand;
the recognition module 402, configured to input the video image into a pre-trained aircraft recognition model for recognition, and output a circumscribed rectangular coordinate frame of the recognized target aircraft;
the calculating module 403, configured to calculate a first overlap value and a second overlap value of the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame based on a preset overlap value calculation formula;
a recording module 404, configured to record a first time node when the first overlap value is valid and a second time node when the second overlap value is valid;
a determining module 405, configured to determine the on-block time or off-block time of the target aircraft based on the first time node and the second time node;
a first verification module 406, configured to judge, if the off-block time of the target aircraft is determined to be the second time node, whether the difference between the first time node and the second time node is within a first preset time range;
a second verification module 407, configured to judge, if the difference between the first time node and the second time node is within the first preset time range, whether the difference between the off-block time of the target aircraft and the departure time of the current flight at the aircraft stand is within a second preset time range, and if so, determine that the off-block time of the target aircraft is correct;
a third verification module 408, configured to judge, if the on-block time of the target aircraft is determined to be the second time node, whether the difference between the first time node and the second time node is within the first preset time range;
a fourth verification module 409, configured to judge, if the difference between the first time node and the second time node is within the first preset time range, whether the difference between the on-block time of the target aircraft and the arrival time of the current flight at the aircraft stand is within the second preset time range, and if so, determine that the on-block time of the target aircraft is correct.
In the embodiment of the invention, images are acquired at preset time intervals, the acquired images are recognized by a preset aircraft recognition model to obtain the circumscribed rectangular coordinate frame of the target aircraft, the overlap between this frame and the reference circumscribed rectangular coordinate frame is calculated by the IOU_1 and IOU_2 formulas, the time nodes at which the target aircraft is at the specified positions are recorded, and the on-block time or off-block time of the aircraft is identified from these time nodes. The embodiment automatically acquires and recognizes, in real time, images captured by a fixed camera at the aircraft stand, determines whether the aircraft is at the target position by means of the preset overlap value calculation formula, thereby realizing automatic identification of aircraft on/off block times, greatly improving the efficiency and accuracy of time recording and reducing labor cost, and verifies the identified on/off block times against the flight times, reducing the error rate and achieving higher accuracy.
Referring to fig. 6, a third embodiment of the flight on/off block time identification device according to the embodiment of the present invention includes:
the acquisition module 401, configured to acquire, at preset time intervals, video images captured by a fixed camera at the aircraft stand;
the recognition module 402, configured to input the video image into a pre-trained aircraft recognition model for recognition, and output a circumscribed rectangular coordinate frame of the recognized target aircraft;
the calculating module 403, configured to calculate a first overlap value and a second overlap value of the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame based on a preset overlap value calculation formula;
a recording module 404, configured to record a first time node when the first overlap value is valid and a second time node when the second overlap value is valid;
a determining module 405, configured to determine the on-block time or off-block time of the target aircraft based on the first time node and the second time node;
a sample acquiring module 410, configured to acquire sample video images of sample aircraft going on and off blocks, captured at the same aircraft stand;
a sample recognition module 411, configured to sequentially input the sample video images into the aircraft recognition model for recognition, and output the recognized circumscribed rectangular coordinate frame of each sample aircraft;
a sample clustering module 412, configured to cluster the circumscribed rectangular coordinate frames of the sample aircraft by using a preset clustering algorithm to obtain a plurality of clusters;
a sample statistics module 413, configured to count the average, minimum, and maximum circumscribed rectangular coordinate frames of the cluster containing the largest number of circumscribed rectangular coordinate frames, and take the average circumscribed rectangular coordinate frame as the reference circumscribed rectangular coordinate frame.
In the embodiment of the invention, sample video images are acquired and recognized to obtain all circumscribed rectangular coordinate frames, which are clustered by a clustering algorithm to obtain the reference circumscribed rectangular coordinate frame used for judgment; images are then acquired at preset time intervals and recognized by a preset aircraft recognition model to obtain the circumscribed rectangular coordinate frame of the aircraft, the overlap between this frame and the reference circumscribed rectangular coordinate frame is calculated by the overlap value calculation formula, valid time nodes are recorded, and the on-block time or off-block time of the aircraft is identified from these time nodes. By classifying the circumscribed rectangular coordinate frames in the sample video images with a clustering method and computing over the cluster with the most samples, noise is reduced and the accuracy of the reference circumscribed rectangular coordinate frame is improved.
The above fig. 4, fig. 5 and fig. 6 describe the flight on/off block time identification device in the embodiment of the present invention in detail from the perspective of modular functional entities; the electronic device in the embodiment of the present invention is described in detail below from the perspective of hardware processing.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device 500 may vary considerably in configuration or performance, and may include one or more processors (central processing units, CPU) 510 (e.g., one or more processors), a memory 520, and one or more storage media 530 (e.g., one or more mass storage devices) storing application programs 533 or data 532, where the memory 520 and the storage medium 530 may be transient or persistent storage. The program stored on the storage medium 530 may include one or more modules (not shown), each of which may include a series of instruction operations on the electronic device 500. Further, the processor 510 may be arranged to communicate with the storage medium 530 and execute the series of instruction operations in the storage medium 530 on the electronic device 500.
The electronic device 500 may also include one or more power supplies 540, one or more wired or wireless network interfaces 550, one or more input/output interfaces 560, and/or one or more operating systems 531, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, and the like. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 7 does not limit the electronic device, which may include more or fewer components than shown, combine certain components, or arrange the components differently.
The invention also provides an electronic device, comprising a memory and a processor, wherein the memory stores computer-readable instructions which, when executed by the processor, cause the processor to perform the steps of the flight on/off block time identification method in the above embodiments.
The present invention also provides a computer-readable storage medium, which may be a non-volatile or a volatile computer-readable storage medium, storing instructions that, when run on a computer, cause the computer to perform the steps of the flight on/off block time identification method.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods according to the embodiments of the present invention. The storage medium includes a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or various other media capable of storing program code.
While the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those skilled in the art that the foregoing embodiments may be modified or equivalents may be substituted for some of the features thereof, and that the modifications or substitutions do not depart from the spirit and scope of the embodiments of the invention.
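A minimal sketch of the on/off block determination that such a device carries out is given below. All function and variable names are hypothetical, the thresholds are illustrative only, and a standard intersection-over-union is used in place of the patent's two overlap-value formulas, which are not reproduced in this text:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
    union = area(box_a) + area(box_b) - inter
    return inter / union if union else 0.0

def classify_time_nodes(detections, ref_box, th_min, th_max):
    """Scan (timestamp, box) detections against the reference stand box.

    Returns (first_node, second_node):
      first_node  - aircraft at a preset distance from the stand
                    (overlap between th_min and th_max)
      second_node - aircraft on the stand (overlap above th_max)
    """
    first_node = second_node = None
    for ts, box in detections:
        v = iou(box, ref_box)
        if th_min <= v <= th_max and first_node is None:
            first_node = ts
        elif v > th_max and second_node is None:
            second_node = ts
    return first_node, second_node

def block_time(first_node, second_node):
    """Approach seen before the stand -> on-block; after -> off-block."""
    if first_node < second_node:
        return ("on-block", second_node)
    return ("off-block", second_node)
```

An arriving aircraft first produces a partial overlap with the reference box (first time node) and then a near-total overlap (second time node), so the second time node is taken as the on-block time; a departing aircraft produces the two events in the opposite order.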

Claims (7)

1. A flight on/off block time identification method, comprising:
collecting, at preset time intervals, video images captured by a fixed camera at an aircraft parking position;
inputting the video images into a pre-trained aircraft recognition model for recognition, and outputting a circumscribed rectangular coordinate frame of the recognized target aircraft;
calculating, based on preset overlap value calculation formulas, a first overlap value and a second overlap value between the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame, wherein the preset reference circumscribed rectangular coordinate frame is the circumscribed rectangular coordinate frame of an aircraft parked at the aircraft parking position;
recording a first time node at which the first overlap value is valid and a second time node at which the second overlap value is valid, wherein the first time node is the time at which the target aircraft is at a preset distance from the aircraft parking position, and the second time node is the time at which the target aircraft is at the aircraft parking position; and
determining an on-block time or an off-block time of the target aircraft based on the first time node and the second time node;
wherein calculating the first overlap value and the second overlap value comprises:
calculating the first overlap value between the circumscribed rectangular coordinate frame of the target aircraft and the preset reference circumscribed rectangular coordinate frame based on a preset first overlap value calculation formula; and
calculating the second overlap value between the circumscribed rectangular coordinate frame of the target aircraft and the preset reference circumscribed rectangular coordinate frame based on a preset second overlap value calculation formula;
wherein, in the first and second overlap value calculation formulas, IOU1 denotes the first overlap value, IOU2 denotes the second overlap value, Bp denotes the circumscribed rectangular coordinate frame of the target aircraft, and Ba denotes the reference circumscribed rectangular coordinate frame;
wherein recording the first time node and the second time node comprises:
determining whether the first overlap value lies between a preset minimum overlap threshold and a preset maximum overlap threshold, and if so, confirming that the first overlap value is valid and recording the capture time of the video image as the first time node; and
determining whether the second overlap value is greater than the maximum overlap threshold, and if so, confirming that the second overlap value is valid and recording the capture time of the video image as the second time node;
wherein determining the on-block time or the off-block time of the target aircraft based on the first time node and the second time node comprises:
comparing the first time node with the second time node;
if the first time node is later than the second time node, determining the off-block time of the target aircraft to be the second time node; and
if the first time node is earlier than the second time node, determining the on-block time of the target aircraft to be the second time node.

2. The flight on/off block time identification method according to claim 1, further comprising:
if the off-block time of the target aircraft is determined to be the second time node, determining whether the difference between the first time node and the second time node is within a first preset time range;
if the difference between the first time node and the second time node is within the first preset time range, determining whether the difference between the off-block time of the target aircraft and the departure time of the current flight at the aircraft parking position is within a second preset time range, and if so, confirming that the off-block time of the target aircraft is correct;
if the on-block time of the target aircraft is determined to be the second time node, determining whether the difference between the first time node and the second time node is within the first preset time range; and
if the difference between the first time node and the second time node is within the first preset time range, determining whether the difference between the on-block time of the target aircraft and the arrival time of the current flight at the aircraft parking position is within the second preset time range, and if so, confirming that the on-block time of the target aircraft is correct.

3. The flight on/off block time identification method according to claim 1, wherein, before collecting the video images at preset time intervals, the method further comprises:
obtaining sample video images of sample aircraft going on and off blocks, captured at the same aircraft parking position;
inputting the sample video images into the aircraft recognition model in sequence for recognition, and outputting the circumscribed rectangular coordinate frame of each recognized sample aircraft;
clustering the circumscribed rectangular coordinate frames of the sample aircraft using a preset clustering algorithm to obtain a plurality of clusters; and
computing the average, minimum, and maximum circumscribed rectangular coordinate frames of the cluster containing the largest number of frames, and taking the average circumscribed rectangular coordinate frame as the reference circumscribed rectangular coordinate frame.

4. The flight on/off block time identification method according to claim 3, wherein, in the calculation formula for the minimum overlap threshold, Th(min) denotes the minimum overlap threshold, Bmin denotes the minimum circumscribed rectangular coordinate frame, and Ba denotes the reference circumscribed rectangular coordinate frame; and in the calculation formula for the maximum overlap threshold, Th(max) denotes the maximum overlap threshold, Bmax denotes the maximum circumscribed rectangular coordinate frame, and Ba denotes the reference circumscribed rectangular coordinate frame.

5. A flight on/off block time identification device, comprising:
an acquisition module, configured to collect, at preset time intervals, video images captured by a fixed camera at an aircraft parking position;
a recognition module, configured to input the video images into a pre-trained aircraft recognition model for recognition and output a circumscribed rectangular coordinate frame of the recognized target aircraft;
a calculation module, configured to calculate, based on preset overlap value calculation formulas, a first overlap value and a second overlap value between the circumscribed rectangular coordinate frame of the target aircraft and a preset reference circumscribed rectangular coordinate frame, wherein the preset reference circumscribed rectangular coordinate frame is the circumscribed rectangular coordinate frame of an aircraft parked at the aircraft parking position;
a recording module, configured to record a first time node at which the first overlap value is valid and a second time node at which the second overlap value is valid, wherein the first time node is the time at which the target aircraft is at a preset distance from the aircraft parking position, and the second time node is the time at which the target aircraft is at the aircraft parking position; and
a determination module, configured to determine an on-block time or an off-block time of the target aircraft based on the first time node and the second time node;
wherein the calculation module is specifically configured to: calculate the first overlap value based on a preset first overlap value calculation formula, and calculate the second overlap value based on a preset second overlap value calculation formula, in which IOU1 denotes the first overlap value, IOU2 denotes the second overlap value, Bp denotes the circumscribed rectangular coordinate frame of the target aircraft, and Ba denotes the reference circumscribed rectangular coordinate frame;
wherein the recording module is specifically configured to: determine whether the first overlap value lies between a preset minimum overlap threshold and a preset maximum overlap threshold, and if so, confirm that the first overlap value is valid and record the capture time of the video image as the first time node; and determine whether the second overlap value is greater than the maximum overlap threshold, and if so, confirm that the second overlap value is valid and record the capture time of the video image as the second time node;
wherein the determination module is specifically configured to: compare the first time node with the second time node; if the first time node is later than the second time node, determine the off-block time of the target aircraft to be the second time node; and if the first time node is earlier than the second time node, determine the on-block time of the target aircraft to be the second time node.

6. An electronic device, comprising a memory and at least one processor, the memory storing instructions, wherein the at least one processor calls the instructions in the memory to cause the electronic device to perform the flight on/off block time identification method according to any one of claims 1 to 4.

7. A computer-readable storage medium storing instructions which, when executed by a processor, implement the flight on/off block time identification method according to any one of claims 1 to 4.
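The reference-box derivation of claim 3 — clustering the sample bounding boxes and taking the average box of the most populated cluster — can be sketched as follows. The patent only specifies a "preset clustering algorithm"; a plain k-means over 4-dimensional box vectors is assumed here, and all names are hypothetical:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means over (x1, y1, x2, y2) box vectors; returns the clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each box to the nearest center (squared Euclidean distance).
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            clusters[i].append(p)
        # Recompute centers; keep the old center if a cluster went empty.
        centers = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return clusters

def reference_box(sample_boxes, k=2):
    """Average / minimum / maximum box of the largest cluster.

    The average box serves as the reference circumscribed frame; the minimum
    and maximum boxes (by area) feed the overlap-threshold formulas of claim 4.
    """
    largest = max(kmeans(sample_boxes, k), key=len)
    avg = tuple(sum(c) / len(largest) for c in zip(*largest))
    area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
    return avg, min(largest, key=area), max(largest, key=area)
```

Clustering discards outlier detections (e.g., a tug or a partially visible aircraft) so that the reference frame reflects the typical parked position.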
CN202111489016.XA 2021-12-07 2021-12-07 Flight on/off block time identification method, device, equipment and storage medium Active CN114419478B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111489016.XA CN114419478B (en) 2021-12-07 2021-12-07 Flight on/off block time identification method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114419478A (en) 2022-04-29
CN114419478B true CN114419478B (en) 2025-02-28

Family

ID=81265613

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111489016.XA Active CN114419478B (en) 2021-12-07 2021-12-07 Flight on/off block time identification method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114419478B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116342537A (en) * 2023-03-27 2023-06-27 重庆中科云从科技有限公司 Wheel block state recognition method, control device and readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109697426A (en) * 2018-12-24 2019-04-30 北京天睿空间科技股份有限公司 Flight based on multi-detector fusion shuts down berth detection method
CN109902676A (en) * 2019-01-12 2019-06-18 浙江工业大学 A Parking Violation Detection Algorithm Based on Dynamic Background

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11651689B2 (en) * 2019-08-19 2023-05-16 Here Global B.V. Method, apparatus, and computer program product for identifying street parking based on aerial imagery
CN110910655A (en) * 2019-12-11 2020-03-24 深圳市捷顺科技实业股份有限公司 Parking management method, device and equipment
CN111292353B (en) * 2020-01-21 2023-12-19 成都恒创新星科技有限公司 Parking state change identification method
CN113436440A (en) * 2021-06-28 2021-09-24 浙江同善人工智能技术有限公司 Auxiliary early warning monitoring system for temporary parking

Similar Documents

Publication Publication Date Title
US20170032514A1 (en) Abandoned object detection apparatus and method and system
US20180096209A1 (en) Non-transitory computer-readable storage medium, event detection apparatus, and event detection method
CN109658454B (en) Pose information determination method, related device and storage medium
US20160379049A1 (en) Video monitoring method, video monitoring system and computer program product
CN105631418A (en) People counting method and device
CN110889328B (en) Method, device, electronic equipment and storage medium for detecting road traffic condition
RU2016146354A (en) DETECTION OF THE STATE USING IMAGE PROCESSING
KR20200002066A (en) Method for detecting vehicles and apparatus using the same
CN111091057A (en) Information processing method and device and computer readable storage medium
CN114419478B (en) Flight on/off block time identification method, device, equipment and storage medium
KR102308892B1 (en) System and method for traffic measurement of image based
CN108229473A (en) Vehicle annual inspection label detection method and device
CN112686162B (en) Method, device, equipment and storage medium for detecting clean state of warehouse environment
CN108021949B (en) Crowd crowding degree detection method, device and system and electronic equipment
CN107403429B (en) Method for quickly and automatically acquiring parameters of periodic sequence image model
CN115496814B (en) Floor calibration method based on distorted image correction and related device
CN113298811B (en) Automatic counting method, device and equipment for number of people in intelligent classroom and storage medium
CN111027390A (en) Object class detection method and device, electronic equipment and storage medium
CN110580708A (en) A fast moving detection method, device and electronic equipment
CN113033443A (en) Unmanned aerial vehicle-based automatic pedestrian crossing facility whole road network checking method
CN114092956A (en) Store passenger flow statistical method and device, computer equipment and storage medium
CN111242054B (en) Method and device for detecting capture rate of detector
KR100865531B1 (en) Separation method of individual walks by clustering foreground pixels
CN115171220A (en) Abnormal traffic early warning method and device and electronic equipment
US12548171B2 (en) Information processing apparatus, method and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant