
CN109102702A - Vehicle speed measurement method based on video vehicle detection and radar signal fusion

Vehicle speed measurement method based on video vehicle detection and radar signal fusion

Info

Publication number
CN109102702A
CN109102702A (application CN201810973204.1A)
Authority
CN
China
Prior art keywords
vehicle
radar
video
coordinate
target
Prior art date
Legal status
Pending
Application number
CN201810973204.1A
Other languages
Chinese (zh)
Inventor
韩玉兵
任洁心
王尧
Current Assignee
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CN201810973204.1A priority Critical patent/CN109102702A/en
Publication of CN109102702A publication Critical patent/CN109102702A/en
Pending legal-status Critical Current


Classifications

    • G08G 1/052 — Traffic control systems for road vehicles; detecting movement of traffic to be counted or controlled, with provision for determining speed or overspeed
    • G01S 13/42 — Radar systems; simultaneous measurement of distance and other co-ordinates
    • G01S 13/58 — Radar systems; velocity or trajectory determination systems; sense-of-movement determination systems
    • G01S 13/867 — Combination of radar systems with cameras
    • G06T 7/277 — Image analysis; analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T 7/60 — Image analysis; analysis of geometric attributes
    • G06T 7/66 — Image analysis; analysis of geometric attributes of image moments or centre of gravity
    • G06T 7/73 — Image analysis; determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10016 — Indexing scheme for image analysis or image enhancement; image acquisition modality: video; image sequence
    • G06T 2207/30232 — Indexing scheme for image analysis or image enhancement; subject of image: surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Geometry (AREA)
  • Multimedia (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a vehicle speed measurement method based on the fusion of video vehicle detection and radar signals. A camera collects vehicle video, which is used to identify and track vehicles; the echo signal received by a radar is used to determine each vehicle's speed, range, and azimuth. From the radar-measured range and azimuth, the vehicle's centroid coordinates and direction of travel are determined and matched against the centroid coordinates and direction of travel obtained from video detection, and the vehicle's speed is then displayed on the video. The invention can monitor the speeds of multiple vehicles across multiple lanes simultaneously, achieves high speed-measurement accuracy, improves the visibility of road traffic monitoring, and has a low production cost.

Description

Vehicle speed measurement method based on video vehicle detection and radar signal fusion
Technical field
The present invention relates to vehicle speed measurement technology, and in particular to a vehicle speed measurement method based on the fusion of video vehicle detection and radar signals.
Background art
With the development of the national economy, car ownership in China keeps growing, making the conflict between roads and vehicles increasingly acute. Simply adding more vehicles and roads cannot achieve coordinated operation of roads and traffic. To optimize road utilization and improve traffic-management capability, vehicle speed measurement needs to be carried out with intelligent transportation systems.
Current vehicle speed measurement methods mainly include video speed measurement, inductive-loop speed measurement, and radar speed measurement. Video speed measurement divides the pixel-coordinate displacement of a travelling vehicle by the number of elapsed frames and multiplies by a fixed scale factor to obtain the speed; the method is simple, but its error is large. Inductive-loop speed measurement places two coils a fixed distance apart: timing starts when the vehicle enters the first coil and stops when it leaves the second coil, and the speed is the distance divided by the elapsed time; the accuracy is high, but the coils are easily damaged, maintenance is expensive, and only one vehicle can be measured at a time. Radar speed measurement uses the Doppler effect, deriving the speed from the difference between the transmitted frequency and the received frequency; it is unaffected by weather, highly accurate, and widely used in intelligent transportation systems, but radar alone cannot intuitively show the driving situation of road vehicles.
Summary of the invention
The purpose of the present invention is to provide a vehicle speed measurement method based on video vehicle detection and radar signal fusion that improves both the accuracy of speed measurement and the visibility of the measurement results.
The technical solution for realizing the aim of the invention is a vehicle speed measurement method based on video vehicle detection and radar signal fusion, characterized in that the steps are as follows:
Step 1, vehicle identification and tracking: collect vehicle video with a camera, and identify and track the vehicles;
Step 2, speed measurement: use the echo signal received by the radar to determine each vehicle's speed, range, and azimuth;
Step 3, result fusion: from the radar-measured range and azimuth, determine the vehicle's centroid coordinates and direction of travel, match them against the centroid coordinates and direction of travel obtained from video detection, and display the vehicle's speed on the video.
Compared with the prior art, the notable advantages of the invention are: 1) a single radar covers multiple lanes, giving high accuracy at low cost; 2) combining video object detection with the radar signal completes the speed measurement, so the driving speed of vehicles can be observed visually and clearly; 3) vehicle detection is performed with deep learning, which is more accurate and faster; 4) when Kalman filter tracking is applied to a vehicle, the true vehicle position is used as the initial value of the track, which makes the tracking more accurate.
Description of the drawings
Fig. 1 is the waveform of the upper sweep band of the radar of the invention.
Fig. 2 is the waveform of the lower sweep band of the radar of the invention.
Fig. 3 shows the mounting position of the radar of the invention.
Fig. 4 is the flow chart of target detection of the invention.
Fig. 5 is the flow chart of radar signal processing of the invention.
Fig. 6 is the flow chart of vehicle speed measurement of the invention.
Detailed description of the embodiments
The invention is further described below with reference to the drawings and specific embodiments.
The vehicle speed measurement method based on video vehicle detection and radar signal fusion of the present invention includes the following steps:
Step 1, vehicle identification and tracking: collect vehicle video with a camera, and identify and track the vehicles. As a specific embodiment, vehicle identification uses the YOLOv3 algorithm, and vehicle tracking uses Kalman filter tracking.
The process of vehicle identification with the YOLOv3 algorithm is shown in Fig. 4; the specific method is as follows:
Each frame read from the camera is normalized and divided into an S × S grid, and the grid cell containing the object's centre is responsible for detecting that object. Each bounding box has five parameters: the target centre coordinates (x0, y0), the width w, the height h, and the bounding-box confidence. The centre coordinates (x0, y0) are normalized to the range 0-1 relative to the grid cell containing the target centre, and w and h are normalized to the range 0-1 relative to the width and height of the image itself;
The bounding-box confidence is determined from the normalized bounding-box parameters as Confidence = Pr(object) · IOU_pred^truth, where Pr(object) indicates whether the box contains a target: Pr(object) = 1 when a target is present in the box and Pr(object) = 0 otherwise. IOU_pred^truth expresses the accuracy of the bounding box and is computed as IOU_pred^truth = area(Box_truth ∩ Box_pred) / area(Box_truth ∪ Box_pred), where Box_truth is the true object bounding box and Box_pred is the predicted object bounding box. Windows with low confidence are removed by thresholding, redundant windows are removed by non-maximum suppression, and the final target detection box is output.
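The overlap measure and the non-maximum-suppression step described above can be sketched in Python as follows (an illustrative sketch, not the patent's implementation; the corner-format boxes and the IoU threshold value are assumptions):

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def non_max_suppression(boxes, scores, iou_thresh=0.45):
    """Keep the highest-scoring boxes and drop redundant overlapping ones."""
    order = np.argsort(scores)[::-1]          # indices sorted by descending confidence
    keep = []
    while order.size > 0:
        best = order[0]
        keep.append(int(best))
        rest = order[1:]
        # discard every remaining box that overlaps the kept box too strongly
        order = np.array([i for i in rest if iou(boxes[best], boxes[i]) < iou_thresh])
    return keep
```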
Kalman filter tracking uses the vehicle centroid coordinates as the track point and divides tracking into the four stages of initialization, prediction, matching and correction. The steps are as follows:
The pixel coordinates at which a vehicle first appears in the image are used as the initial value of the Kalman filter track;
Kalman prediction is performed for all current moving targets. Since the motion of an object over a short time can be regarded as uniform linear motion, the state-transition matrix A is defined as A = [1, 0, Δt, 0; 0, 1, 0, Δt; 0, 0, 1, 0; 0, 0, 0, 1], where Δt is the time for the video to read one frame; substituting the state matrix and the estimate of the current frame into the state-transition equation gives the state prediction for the next frame;
Multiple tracked targets are then matched. Because the time to process one frame is short, a vehicle cannot travel far between frames, so a threshold T is set. Let x_{k+1}, x_k be the horizontal pixel coordinates of the vehicle centroid in the next and current frames, and y_{k+1}, y_k the corresponding vertical coordinates; the Euclidean distance between the pixel coordinates of a vehicle in consecutive frames is D = sqrt((x_{k+1} − x_k)² + (y_{k+1} − y_k)²). When the minimum Euclidean distance is less than the threshold T and neither the corresponding tracked vehicle nor the detected vehicle has already been marked as matched, the detected target and the tracked target are associated; otherwise the vehicle match is judged to have failed;
Finally, the Kalman filter estimate is corrected: combining the estimated vehicle centroid coordinates and the measurement through the Kalman gain yields the optimal estimate of the current vehicle centroid coordinates. Using the Kalman filter, the invention achieves real-time estimation of the target without lag.
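A constant-velocity Kalman tracker matching this description can be sketched as follows (illustrative only; the noise covariances and the initial state covariance are assumed values not specified in the patent):

```python
import numpy as np

class CentroidKalmanTracker:
    """Constant-velocity Kalman filter over a vehicle centroid (u, v) in pixels."""

    def __init__(self, u0, v0, dt):
        # State: [u, v, du/dt, dv/dt]; initialized from the first detection.
        self.x = np.array([u0, v0, 0.0, 0.0], dtype=float)
        self.A = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)   # only (u, v) is measured
        self.P = np.eye(4) * 10.0          # state covariance (assumed)
        self.Q = np.eye(4) * 0.01          # process noise (assumed)
        self.R = np.eye(2) * 1.0           # measurement noise (assumed)

    def predict(self):
        self.x = self.A @ self.x
        self.P = self.A @ self.P @ self.A.T + self.Q
        return self.x[:2]                  # predicted centroid used for matching

    def correct(self, u, v):
        z = np.array([u, v], dtype=float)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                  # corrected (optimal) centroid estimate
```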
Step 2, speed measurement: the echo signal received by the radar is used to determine each vehicle's speed, range, and azimuth. The radar transmits a signal through the transmitter and the transmit-receive switch, and the receiver picks up the weak signal reflected back when the wave meets an object. Since the received signal is a high-frequency signal, it has to be processed; the flow chart of the radar signal processing is shown in Fig. 5. As a specific embodiment, the speed measurement proceeds as follows:
The echo signal received by the receiver is A/D converted and mixed down to a beat signal, an FFT is applied with a window function, and non-coherent accumulation is performed on the resulting spectrum to suppress interference from the clutter of other targets;
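The windowed FFT and non-coherent accumulation can be sketched as follows (illustrative; the Hanning window and the one-row-per-sweep frame layout are assumptions):

```python
import numpy as np

def accumulated_spectrum(beat_frames, window=np.hanning):
    """Windowed FFT of each beat-signal sweep, followed by non-coherent accumulation.

    beat_frames: 2-D array, one row of beat-signal samples per frequency sweep.
    Returns the magnitude spectrum averaged over sweeps (non-coherent accumulation).
    """
    n = beat_frames.shape[1]
    w = window(n)
    spectra = np.abs(np.fft.rfft(beat_frames * w, axis=1))   # magnitude per sweep
    return spectra.mean(axis=0)                              # non-coherent accumulation
```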
Ordered-statistic CFAR detection is then performed: the processing result is compared with a threshold, and if it exceeds the threshold a target is declared present; the speed and the range are then decoupled. Because the frequency of the beat signal is composed of speed and range information, the phase difference between adjacent frames is calculated from the peak spectral line of the spectrum, and the range is solved using the unambiguous range and the beat-signal frequency, yielding the unambiguous range to the target; substituting into the relation f = 2μR/c + 2f0V/c then gives the radial velocity of the vehicle, where f is the beat-signal frequency, μ is the chirp rate, c is the speed of light, R is the range of the target, V is the radial velocity of the target, and f0 is the frequency of the radar transmit signal;
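Assuming the beat-frequency relation given above, the radial velocity can be recovered from a detected beat frequency and the resolved range as in this sketch (the radar parameters in the example are placeholders, not the patent's values):

```python
C = 3.0e8  # speed of light, m/s

def radial_velocity(f_beat, rng, chirp_rate, f0):
    """Solve f_beat = 2*mu*R/c + 2*f0*V/c for the radial velocity V."""
    return (f_beat - 2.0 * chirp_rate * rng / C) * C / (2.0 * f0)

# Example with placeholder parameters (24 GHz carrier, 150 MHz swept in 1 ms):
v = radial_velocity(f_beat=31_000.0, rng=30.0, chirp_rate=150e6 / 1e-3, f0=24e9)
print(f"radial velocity ≈ {v:.1f} m/s")   # ≈ 6.2 m/s for these placeholder values
```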
The azimuth information exploits the directivity of the radar antenna: when the antenna beam axis is aligned with the target the echo signal is strongest, and the echo weakens as the beam axis deviates from the target, which determines the direction of the target. In the present invention the azimuth φ of the vehicle is determined from the wave path difference, where λ is the wavelength of the radar and ΔR is the measured wave path difference.
As a more specific embodiment, the transmit waveform of the radar is a stepped multi-order linear frequency-shift-keying (MS-LFSK) waveform. The idea of the MS-LFSK waveform design is to divide the waveform into an upper sweep band and a lower sweep band, as shown in Figs. 1 and 2; each band generates several signals of different frequencies, which in turn produce several different unambiguous ranges. The upper sweep band consists of M linear frequency-shift-keying signals, the frequency-keying dwell time of each LFSK signal is T_step, the swept bandwidth is B, and the (k+1)-th and k-th signals differ by a fixed frequency step f_shift = f_{k+1} − f_k; the lower sweep band has the same structure as the upper sweep band.
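One simplified reading of the MS-LFSK frequency plan is sketched below (the carrier frequency, bandwidth, number of steps, and dwell time are assumed example values, and the exact step pattern of the patented waveform may differ):

```python
import numpy as np

def ms_lfsk_frequency_plan(f0=24e9, bandwidth=150e6, n_steps=8, t_step=25e-6):
    """Simplified MS-LFSK burst: an upper band of n_steps frequencies rising
    across `bandwidth` (adjacent steps differ by a fixed f_shift), each held
    for t_step, followed by a lower band of the same structure sweeping down."""
    f_shift = bandwidth / (n_steps - 1)            # fixed step between adjacent signals
    upper = f0 + np.arange(n_steps) * f_shift      # upper sweep band
    lower = upper[::-1]                            # lower sweep band, same structure
    freqs = np.concatenate([upper, lower])
    times = np.arange(freqs.size) * t_step         # dwell of t_step per frequency
    return times, freqs
```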
Step 3, result fusion: from the radar-measured range and azimuth, the vehicle's centroid coordinates and direction of travel are determined and matched against the centroid coordinates and direction of travel obtained from video detection, and the vehicle's speed is displayed on the video. As a specific embodiment, the fusion proceeds as follows:
First match: the direction of travel detected by the radar is compared with the direction of travel from video tracking; if they agree, the first match succeeds and the second match is entered, otherwise the match fails;
Second match: the three-dimensional spatial coordinates of the vehicle detected by the radar are projected into the image coordinate system and the centroid coordinates of the vehicle are computed. Let the radar be mounted at height H at an angle θ to the horizontal, and let the radar-measured range to the vehicle be R and the azimuth be ψ; in the radar coordinate system the vehicle lies at X = R sin ψ and Z = R cos ψ, and the missing vertical coordinate is Y = H − h1. Using the monocular camera calibration model [u, v, 1] = K × (R × [X, Y, Z] + T), the three-dimensional coordinates of the vehicle are projected into the image coordinate system to obtain the computed centroid coordinates (u_cal, v_cal), where K is the camera intrinsic matrix obtained by Zhang Zhengyou's checkerboard calibration, and R and T are here the rotation matrix and translation vector of the radar relative to the camera (distinct from the measured range R above). The Euclidean distance D between the radar-derived vehicle centroid (u_cal, v_cal) and the video-detected vehicle centroid (u′, v′) is then calculated; when D is smaller than the threshold T, the vehicle detected by the radar and the vehicle detected by the video are judged to be the same vehicle, and the radar-measured speed is displayed above that vehicle in the video; when D is larger than the threshold T, the match is considered failed.
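The radar-to-image projection and the centroid-distance check can be sketched as follows (illustrative; the function names, the R_rc/T_rc naming for the extrinsics, and the threshold value are assumptions for the sketch):

```python
import numpy as np

def project_radar_to_image(rng, azimuth, radar_height, target_height, K, R_rc, T_rc):
    """Project a radar measurement (range, azimuth) to pixel coordinates.

    X = R*sin(psi), Z = R*cos(psi) in the radar frame, Y = H - h1;
    then [u, v, 1] ~ K (R_rc [X, Y, Z] + T_rc).
    """
    X = rng * np.sin(azimuth)
    Z = rng * np.cos(azimuth)
    Y = radar_height - target_height
    p_cam = R_rc @ np.array([X, Y, Z]) + T_rc      # radar frame -> camera frame
    uvw = K @ p_cam                                # camera frame -> image plane
    return uvw[:2] / uvw[2]                        # normalize homogeneous coordinates

def same_vehicle(radar_uv, video_uv, threshold=30.0):
    """Second-stage match: centroids closer than the threshold are the same vehicle."""
    return np.linalg.norm(np.asarray(radar_uv) - np.asarray(video_uv)) < threshold
```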
Embodiment 1
In order to verify the validity of the method of the present invention, a road section is selected and vehicle speed measurement is carried out as follows:
One frame of video is acquired and the picture is normalized to 416 × 416, then split into a 13 × 13 grid; if the centre of a vehicle falls in a grid cell, that cell is responsible for detecting the vehicle.
Each grid cell generates 3 bounding boxes, and each bounding box uses a sigmoid function to predict the centre of the vehicle, the probability that a target lies inside the bounding box, and the confidence of the detected class. Pr(object) · IOU_pred^truth is computed, the bounding box with the largest Pr(object) · IOU_pred^truth is selected for the prediction, and the detection box of the target is finally output.
The centre of the output detection box is used as the initial value of Kalman filter tracking, and Kalman filter tracking is applied to the current vehicle. When 2 cars are tracked at the same time, the Euclidean distance between the pixel coordinates of a vehicle in consecutive frames is calculated: if the minimum Euclidean distance is less than the threshold 50, the vehicles tracked in the two frames are judged to be the same, and the second-smallest distance is examined to continue the matching; if it is greater than the threshold 50, the vehicle track is judged lost and a new track is established.
At the same time, the radar transmits the MS-LFSK signal, and the echo signal is A/D sampled and the data rearranged to obtain the beat signal. An FFT is applied to each group of LFSK beat signals to obtain the spectrum W_k[N], and periodogram (non-coherent) accumulation over the groups gives the amplitude-frequency response. Ordered-statistic CFAR detection then finds the cells that exceed the threshold and appear as spectral peaks. For a detected cell p, the frequency corresponding to the spectral line of that cell is computed, the phase of the peak spectral line p in the spectrum is calculated, and the phase difference Δφ between adjacent groups is obtained. From the unambiguous range, the phase difference Δφ and the beat frequency f_MS-LFSK, the corresponding ambiguous range is calculated; range-ambiguity resolution and range-velocity decoupling then yield the range and the speed.
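The ordered-statistic CFAR detection step can be illustrated with the following sketch (the numbers of reference and guard cells, the ordered-statistic rank k, and the scaling factor alpha are assumed values):

```python
import numpy as np

def os_cfar(spectrum, n_ref=16, n_guard=2, k=12, alpha=6.0):
    """Ordered-statistic CFAR: for each cell, the noise level is estimated as the
    k-th smallest of the surrounding reference cells (guard cells excluded);
    a cell is declared a detection if it exceeds alpha times that estimate."""
    n = len(spectrum)
    half = n_ref // 2
    detections = []
    for i in range(n):
        left = spectrum[max(0, i - half - n_guard): max(0, i - n_guard)]
        right = spectrum[i + n_guard + 1: i + n_guard + 1 + half]
        ref = np.concatenate([left, right])
        if len(ref) < k:
            continue                      # not enough reference cells near the edges
        noise = np.sort(ref)[k - 1]       # k-th ordered statistic of the reference cells
        if spectrum[i] > alpha * noise:
            detections.append(i)
    return detections
```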
The azimuth φ of the vehicle can be determined from the wave path difference: with λ the wavelength of the radar, measuring the wave path difference ΔR determines the direction φ of the target.
The range, speed and azimuth of the vehicles detected by the radar are transferred by the host computer to the video processing port. The video processor records the centroid coordinates of each vehicle when detection starts and keeps computing the centroid coordinates during tracking; from the differences of the centroid coordinates and the mounting position of the camera, the direction of travel of the vehicle can be judged from the video. The speed transmitted by the radar likewise indicates the direction of travel through its sign. If the direction of travel detected from the video agrees with the direction indicated by the radar-transmitted speed, the second matching step is entered; otherwise the match is considered failed.
In the second matching step, let the radar be mounted at height H at an angle θ to the horizontal, installed as shown in Fig. 3. The radar measures the vehicle at range R and azimuth ψ; in the radar coordinate system the vehicle lies at X = R sin ψ and Z = R cos ψ, and the missing vertical coordinate is Y = H − h1. Using the monocular calibration model [u, v, 1] = K × (R × [X, Y, Z] + T), the three-dimensional coordinates of the vehicle are projected into the image coordinate system and the centroid coordinates (u_cal, v_cal) are computed, where K is the camera intrinsic matrix obtained by Zhang Zhengyou's checkerboard calibration, R and T are the rotation matrix and translation vector of the radar relative to the camera, determined from the mounting positions of the two devices, and [u, v] are the pixel coordinates of the projected three-dimensional point. The video detection likewise yields a set of vehicle centroid coordinates (u_real, v_real). With the threshold T set to 30, if sqrt((u_cal − u_real)² + (v_cal − v_real)²) ≤ 30, the vehicle detected by the video and the vehicle measured by the radar are judged to be the same vehicle and the speed of the corresponding vehicle is displayed in the video; if sqrt((u_cal − u_real)² + (v_cal − v_real)²) > 30, the match fails.
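Using the projection and matching sketch introduced under the result-fusion step, this second matching rule amounts to the following usage example (the calibration values are placeholders):

```python
import numpy as np
# Reuses project_radar_to_image and same_vehicle from the earlier sketch.
# Placeholder calibration: 1000-px focal length, principal point at (640, 360),
# radar and camera assumed co-located and aligned.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
R_rc, T_rc = np.eye(3), np.zeros(3)

radar_uv = project_radar_to_image(rng=35.0, azimuth=np.deg2rad(5.0),
                                  radar_height=6.0, target_height=1.5,
                                  K=K, R_rc=R_rc, T_rc=T_rc)
video_uv = (712.0, 498.0)                      # centroid from the video detector
print(same_vehicle(radar_uv, video_uv))        # True: within the 30-pixel threshold
```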
This embodiment shows that the invention can measure the speed of several cars simultaneously and can use the video to display the driving situation of road vehicles intuitively. The method therefore not only has high speed-measurement accuracy, but is also highly visual and inexpensive.

Claims (7)

1. A vehicle speed measurement method based on video vehicle detection and radar signal fusion, characterized in that the steps are as follows:
Step 1, vehicle identification and tracking: collect vehicle video with a camera, and identify and track the vehicles;
Step 2, speed measurement: use the echo signal received by the radar to determine each vehicle's speed, range, and azimuth;
Step 3, result fusion: from the radar-measured range and azimuth, determine the vehicle's centroid coordinates and direction of travel, match them against the centroid coordinates and direction of travel obtained from video detection, and display the vehicle's speed on the video.
2. The vehicle speed measurement method based on video vehicle detection and radar signal fusion according to claim 1, characterized in that in step 1 vehicle identification uses the YOLOv3 algorithm and vehicle tracking uses Kalman filter tracking.
3. The vehicle speed measurement method based on video vehicle detection and radar signal fusion according to claim 2, characterized in that the specific method of carrying out vehicle identification with the YOLOv3 algorithm in step 1 is:
Each frame read from the camera is normalized and divided into an S × S grid, and the grid cell containing the object's centre is responsible for detecting that object. Each bounding box has five parameters: the target centre coordinates (x0, y0), the width w, the height h, and the bounding-box confidence. The centre coordinates (x0, y0) are normalized to the range 0-1 relative to the grid cell containing the target centre, and w and h are normalized to the range 0-1 relative to the width and height of the image itself;
The bounding-box confidence is determined from the normalized bounding-box parameters as Confidence = Pr(object) · IOU_pred^truth, where Pr(object) indicates whether the box contains a target: Pr(object) = 1 when a target is present in the box and Pr(object) = 0 otherwise. IOU_pred^truth expresses the accuracy of the bounding box and is computed as IOU_pred^truth = area(Box_truth ∩ Box_pred) / area(Box_truth ∪ Box_pred), where Box_truth is the true object bounding box and Box_pred is the predicted object bounding box. Windows with low confidence are removed by thresholding, redundant windows are removed by non-maximum suppression, and the final target detection box is output.
4. The vehicle speed measurement method based on video vehicle detection and radar signal fusion according to claim 2, characterized in that the Kalman filter tracking of step 1 uses the vehicle centroid coordinates as the track point and divides tracking into the four stages of initialization, prediction, matching and correction; the specific method is:
The pixel coordinates at which the vehicle first appears in the image are used as the initial value of the Kalman filter track;
Kalman prediction is performed for all current moving targets; since the motion of an object over a short time can be regarded as uniform linear motion, the state-transition matrix A is defined as A = [1, 0, Δt, 0; 0, 1, 0, Δt; 0, 0, 1, 0; 0, 0, 0, 1], where Δt is the time for the video to read one frame, and substituting the state matrix and the estimate of the current frame into the state-transition equation gives the state prediction for the next frame;
Multiple tracked targets are matched; because the time to process one frame is short, a vehicle cannot travel far between frames, so a threshold T is set; with x_{k+1}, x_k the horizontal and y_{k+1}, y_k the vertical pixel coordinates of the vehicle centroid in the next and current frames, the Euclidean distance D = sqrt((x_{k+1} − x_k)² + (y_{k+1} − y_k)²) between the pixel coordinates of the vehicle in consecutive frames is computed; when the minimum Euclidean distance is less than the threshold T and neither the corresponding tracked vehicle nor the detected vehicle has already been marked as matched, the detected target and the tracked target are associated, otherwise the vehicle match is judged to have failed;
The Kalman filter estimate is corrected: combining the estimated vehicle centroid coordinates and the measurement through the Kalman gain gives the optimal estimate of the current vehicle centroid coordinates.
5. The vehicle speed measurement method based on video vehicle detection and radar signal fusion according to claim 1, characterized in that the specific method of the speed measurement in step 2 is:
The echo signal received by the receiver is A/D converted and mixed down to a beat signal, an FFT is applied with a window function, and non-coherent accumulation is performed on the resulting spectrum to suppress interference from the clutter of other targets;
Ordered-statistic CFAR detection is performed: the processing result is compared with a threshold, and if it exceeds the threshold a target is declared present; the speed and the range are then decoupled. Because the frequency of the beat signal is composed of speed and range information, the phase difference between adjacent frames is calculated from the peak spectral line of the spectrum, the range is solved using the unambiguous range and the beat-signal frequency to obtain the unambiguous range to the target, and substituting into the relation f = 2μR/c + 2f0V/c gives the radial velocity of the vehicle, where f is the beat-signal frequency, μ is the chirp rate, c is the speed of light, R is the range of the target, V is the radial velocity of the target, and f0 is the frequency of the radar transmit signal;
The azimuth φ of the vehicle is determined from the wave path difference, where λ is the wavelength of the radar and ΔR is the measured wave path difference.
6. The vehicle speed measurement method based on video vehicle detection and radar signal fusion according to claim 1, characterized in that in step 2 the radar uses a multi-order linear frequency-shift-keying waveform divided into an upper sweep band and a lower sweep band, each band generating several signals of different frequencies and thereby several different unambiguous ranges.
7. The vehicle speed measurement method based on video vehicle detection and radar signal fusion according to claim 1, characterized in that the specific method of matching the video-detected and radar-detected targets in step 3 is:
First match: the direction of travel detected by the radar is compared with the direction of travel from video tracking; if they agree, the first match succeeds and the second match is entered, otherwise the match fails;
Second match: the three-dimensional spatial coordinates of the vehicle detected by the radar are projected into the image coordinate system and the centroid coordinates of the vehicle are computed. Let the radar be mounted at height H at an angle θ to the horizontal, the radar-measured range to the vehicle be R and the azimuth be ψ; in the radar coordinate system the vehicle lies at X = R sin ψ and Z = R cos ψ, and the missing vertical coordinate is Y = H − h1. Using the monocular camera calibration model [u, v, 1] = K × (R × [X, Y, Z] + T), the three-dimensional coordinates of the vehicle are projected into the image coordinate system to obtain the centroid coordinates (u_cal, v_cal), where K is the camera intrinsic matrix obtained by Zhang Zhengyou's checkerboard calibration, and R and T are here the rotation matrix and translation vector of the radar relative to the camera. The Euclidean distance D between the radar-derived vehicle centroid (u_cal, v_cal) and the video-detected vehicle centroid (u′, v′) is calculated; when D is smaller than the threshold T, the vehicle detected by the radar and the vehicle detected by the video are judged to be the same vehicle, and the radar-measured speed is displayed above that vehicle in the video; when D is larger than the threshold T, the match is considered failed.
CN201810973204.1A 2018-08-24 2018-08-24 Vehicle speed measurement method based on video vehicle detection and radar signal fusion Pending CN109102702A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810973204.1A CN109102702A (en) 2018-08-24 2018-08-24 Vehicle speed measuring method based on video encoder server and Radar Signal Fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810973204.1A CN109102702A (en) 2018-08-24 2018-08-24 Vehicle speed measuring method based on video encoder server and Radar Signal Fusion

Publications (1)

Publication Number Publication Date
CN109102702A true CN109102702A (en) 2018-12-28

Family

ID=64851400

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810973204.1A Pending CN109102702A (en) 2018-08-24 2018-08-24 Vehicle speed measuring method based on video encoder server and Radar Signal Fusion

Country Status (1)

Country Link
CN (1) CN109102702A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103227963A (en) * 2013-03-20 2013-07-31 西交利物浦大学 Static surveillance video abstraction method based on video moving target detection and tracing
CN104424804A (en) * 2013-09-10 2015-03-18 上海弘视通信技术有限公司 Intelligent speed-measuring method with single radar and multiple lanes and system thereof
CN106373394A (en) * 2016-09-12 2017-02-01 深圳尚桥交通技术有限公司 Vehicle detection method and system based on video and radar
CN106951879A (en) * 2017-03-29 2017-07-14 重庆大学 Multi-feature fusion vehicle detection method based on camera and millimeter wave radar
CN108022258A (en) * 2017-10-20 2018-05-11 南京邮电大学 Real-time multi-target tracking based on the more frame detectors of single and Kalman filtering
CN107991671A (en) * 2017-11-23 2018-05-04 浙江东车智能科技有限公司 A kind of method based on radar data and vision signal fusion recognition risk object

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
He Junjie (何俊杰): "Design and Implementation of an Automotive Collision-Avoidance Radar Based on Multi-Order Linear Frequency Shift Keying", China Master's Theses Full-text Database (Electronic Journal) *
Luo Qingsheng (罗庆生): "Bionic Quadruped Robot Technology", 30 April 2016 *
Li Zhou (黎洲): "Real-Time Vehicle Detection Based on the YOLO_v2 Model", China Mechanical Engineering *

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109613537A (en) * 2019-01-16 2019-04-12 南京奥杰智能科技有限公司 A kind of hologram radar
CN109859468A (en) * 2019-01-30 2019-06-07 淮阴工学院 Multilane traffic volume based on YOLOv3 counts and wireless vehicle tracking
CN110060298A (en) * 2019-03-21 2019-07-26 径卫视觉科技(上海)有限公司 A kind of vehicle location and attitude and heading reference system based on image and corresponding method
CN110060298B (en) * 2019-03-21 2023-06-20 径卫视觉科技(上海)有限公司 Image-based vehicle position and posture determining system and corresponding method
CN110045365A (en) * 2019-03-26 2019-07-23 西北工业大学 A kind of image target positioning method based on radar information
CN110045365B (en) * 2019-03-26 2023-03-14 西北工业大学 Image target positioning method based on radar information
CN109935080A (en) * 2019-04-10 2019-06-25 武汉大学 A monitoring system and method for real-time calculation of traffic flow on a traffic line
CN109935080B (en) * 2019-04-10 2021-07-16 武汉大学 A monitoring system and method for real-time calculation of traffic flow on a traffic line
CN110351478A (en) * 2019-05-21 2019-10-18 江苏看见云软件科技有限公司 A kind of unmanned plane acquiring and transmission system handling violation information
CN112017239A (en) * 2019-05-31 2020-12-01 北京市商汤科技开发有限公司 Target object orientation determining method, intelligent driving control method, device and equipment
CN110660220B (en) * 2019-10-08 2022-01-28 五邑大学 Urban rail train priority distribution method and system
CN110660220A (en) * 2019-10-08 2020-01-07 五邑大学 Urban rail train priority distribution method and system
CN113012445A (en) * 2019-12-19 2021-06-22 富士通株式会社 Intelligent traffic control system and control method thereof
CN113255397A (en) * 2020-02-10 2021-08-13 富士通株式会社 Target detection method and device
WO2021164006A1 (en) * 2020-02-21 2021-08-26 华为技术有限公司 Vehicle speed measurement method and device, vehicle acceleration measurement method and device, and storage medium
US12326499B2 (en) 2020-02-27 2025-06-10 Bayerische Motoren Werke Aktiengesellschaft Method for detecting moving objects in the surroundings of a vehicle, and motor vehicle
CN115151836A (en) * 2020-02-27 2022-10-04 宝马汽车股份有限公司 Method for detecting a moving object in the surroundings of a vehicle and motor vehicle
CN111383462A (en) * 2020-03-19 2020-07-07 天津职业技术师范大学(中国职业培训指导教师进修中心) New traffic accident scene video speed measurement method based on four-point plane homography
CN111833598A (en) * 2020-05-14 2020-10-27 山东科技大学 A method and system for automatic monitoring of highway UAV traffic incidents
CN111754798A (en) * 2020-07-02 2020-10-09 上海电科智能系统股份有限公司 Method for realizing detection of vehicle and surrounding obstacles by fusing roadside laser radar and video
CN111986232B (en) * 2020-08-13 2021-09-14 上海高仙自动化科技发展有限公司 Target object detection method, target object detection device, robot and storage medium
CN111986232A (en) * 2020-08-13 2020-11-24 上海高仙自动化科技发展有限公司 Target object detection method, target object detection device, robot and storage medium
CN112071069A (en) * 2020-09-17 2020-12-11 吉林大学 A method for diagnosing brake failure of freight vehicles on long downhill sections
CN112071069B (en) * 2020-09-17 2021-07-30 吉林大学 A method for diagnosing brake failure of freight vehicles on long downhill sections
CN112147632A (en) * 2020-09-23 2020-12-29 中国第一汽车股份有限公司 Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm
CN112529955A (en) * 2020-12-08 2021-03-19 北京首科丰汇科技有限公司 Road normalization and speed recovery method and device for expressway
CN112529955B (en) * 2020-12-08 2023-09-29 北京首科丰汇科技有限公司 Road normalization and speed recovery method and device for expressway
CN112818170A (en) * 2021-01-25 2021-05-18 浙江大华技术股份有限公司 Mobile parameter display method and device
CN112990128A (en) * 2021-04-27 2021-06-18 电子科技大学 Multi-vehicle speed measuring method based on video tracking
CN113850995A (en) * 2021-09-14 2021-12-28 华设设计集团股份有限公司 Event detection method, device and system based on tunnel radar vision data fusion
CN113850995B (en) * 2021-09-14 2022-12-27 华设设计集团股份有限公司 Event detection method, device and system based on tunnel radar vision data fusion
CN114023081A (en) * 2021-11-02 2022-02-08 北京世纪好未来教育科技有限公司 Vehicle speed measurement method, device and system, storage medium
CN114023081B (en) * 2021-11-02 2023-01-13 北京世纪好未来教育科技有限公司 Vehicle speed measurement method, device and system, and storage medium
CN114462468A (en) * 2021-12-14 2022-05-10 浙江大华技术股份有限公司 Fusion method, device and computer readable storage medium
CN114202748A (en) * 2021-12-20 2022-03-18 上海智眸智能科技有限责任公司 Illegal lane change detection method, detection device and storage medium
CN114973663A (en) * 2022-05-16 2022-08-30 浙江机电职业技术学院 An intelligent roadside unit device based on edge computing
CN114973663B (en) * 2022-05-16 2023-08-29 浙江机电职业技术学院 Intelligent road side unit device based on edge calculation
CN114937358A (en) * 2022-05-20 2022-08-23 内蒙古工业大学 A method of multi-lane traffic flow statistics on expressways
CN114758511B (en) * 2022-06-14 2022-11-25 深圳市城市交通规划设计研究中心股份有限公司 Sports car overspeed detection system, method, electronic equipment and storage medium
CN114758511A (en) * 2022-06-14 2022-07-15 深圳市城市交通规划设计研究中心股份有限公司 Sports car overspeed detection system, method, electronic equipment and storage medium
CN115472022A (en) * 2022-09-06 2022-12-13 同盾科技有限公司 Fusion speed measuring method, fusion speed measuring device, storage medium and electronic equipment
CN115472022B (en) * 2022-09-06 2024-03-22 同盾科技有限公司 Fusion speed measuring method, fusion speed measuring device, storage medium and electronic equipment
CN116721552B (en) * 2023-06-12 2024-05-14 北京博宏科元信息科技有限公司 Non-motor vehicle overspeed identification recording method, device, equipment and storage medium
CN116721552A (en) * 2023-06-12 2023-09-08 北京博宏科元信息科技有限公司 Non-motor vehicle overspeed identification recording method, device, equipment and storage medium
CN117238143A (en) * 2023-09-15 2023-12-15 北京卓视智通科技有限责任公司 Traffic data fusion method, system and device based on radar double-spectrum camera
CN117238143B (en) * 2023-09-15 2024-03-22 北京卓视智通科技有限责任公司 Traffic data fusion method, system and device based on radar double-spectrum camera
CN120011829A (en) * 2025-04-17 2025-05-16 华芯程(杭州)科技有限公司 Layout fuzzy matching method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN109102702A (en) Vehicle speed measuring method based on video encoder server and Radar Signal Fusion
US11340332B2 (en) Method and apparatus for processing radar data
WO2022141914A1 (en) Multi-target vehicle detection and re-identification method based on radar and video fusion
AU2014202300B2 (en) Traffic monitoring system for speed measurement and assignment of moving vehicles in a multi-target recording module
WO2021104497A1 (en) Positioning method and system based on laser radar, and storage medium and processor
CN106405556B (en) Vehicle target information detection identifying system and its signal processing method
CN109212531A (en) The method for determining target vehicle orientation
US20240053468A1 (en) Vehicle-mounted bsd millimeter wave radar based method for obstacle recognition at low speed
US12326499B2 (en) Method for detecting moving objects in the surroundings of a vehicle, and motor vehicle
GB2292860A (en) Tracking system
Cui et al. 3D detection and tracking for on-road vehicles with a monovision camera and dual low-cost 4D mmWave radars
CN110379178A (en) Pilotless automobile intelligent parking method based on millimetre-wave radar imaging
CN113835074B (en) Dynamic people flow monitoring method based on millimeter wave radar
KR101968327B1 (en) Apparatus and method for compensating distance of track
CN115061113B (en) Target detection model training method and device for radar and storage medium
Sengupta et al. Automatic radar-camera dataset generation for sensor-fusion applications
CN114184256B (en) Water level measurement method under multi-target background
JPH0980146A (en) Radar apparatus
CN112285698B (en) Multi-target tracking device and method based on radar sensor
CN210572736U (en) Low, small and slow target automatic detection, tracking and identification system
Argüello et al. Radar classification for traffic intersection surveillance based on micro-Doppler signatures
CN118033626B (en) Target tracking speed measurement method and system based on double radars
CN116879863B (en) Multi-target measuring method and system for continuous wave 4D millimeter wave radar
CN114839615B (en) 4D millimeter wave radar target course angle fitting method and storage medium
Deng et al. Robust target detection, position deducing and tracking based on radar camera fusion in transportation scenarios

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20181228