
CN117291910B - River and lake pollution discharge motion capturing method based on unmanned aerial vehicle image acquisition - Google Patents


Info

Publication number
CN117291910B
CN117291910B (application CN202311569725.8A)
Authority
CN
China
Prior art keywords
river
lake
image
area
sewage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311569725.8A
Other languages
Chinese (zh)
Other versions
CN117291910A (en)
Inventor
植挺生
刘勇
邓永俊
陈建生
邓超河
严如灏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Guangyu Technology Development Co Ltd
Original Assignee
Guangdong Guangyu Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Guangyu Technology Development Co Ltd
Priority to CN202311569725.8A
Publication of CN117291910A
Application granted
Publication of CN117291910B
Legal status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T 7/0002 — Image analysis; inspection of images, e.g. flaw detection
    • G06T 7/11 — Segmentation; region-based segmentation
    • G06T 7/13 — Edge detection
    • G06T 2207/30181 — Indexing scheme for image analysis; earth observation
    • G06V 20/17 — Terrestrial scenes taken from planes or by drones
    • G06V 20/182 — Network patterns, e.g. roads or rivers
    • G06V 20/188 — Vegetation
    • Y02A 90/30 — Technologies for adaptation to climate change; assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Remote Sensing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a river and lake pollution discharge motion capture method based on unmanned aerial vehicle image acquisition, relating to the technical field of pollution discharge motion capture. The method comprises the following steps: step S1, carrying an imaging device on an unmanned aerial vehicle; step S2, controlling the unmanned aerial vehicle to acquire river and lake images; step S3, analyzing the river and lake images; step S4, obtaining daily data of the river and lake; step S5, analyzing the daily data of the river and lake; step S6, analyzing the river and lake flow direction and the furthest pollution discharge distance; step S7, controlling the unmanned aerial vehicle to shoot a sewage image; step S8, analyzing the sewage image to obtain the final polluted area of the river and lake. The invention addresses the problems that the existing pollution discharge motion capturing technology can hardly determine the end point of the pollution discharge motion from river and lake data analysis, and that its analysis of the polluted water area relies on a single method.

Description

River and lake pollution discharge motion capturing method based on unmanned aerial vehicle image acquisition
Technical Field
The invention relates to the technical field of pollution discharge trend capturing, and in particular to a river and lake pollution discharge trend capturing method based on unmanned aerial vehicle image acquisition.
Background
Pollution discharge trend capturing identifies and evaluates the discharge condition and change of pollutants by monitoring and analyzing the behavior and trend of discharge sources. It mainly involves monitoring and collecting the discharge quantity, discharge concentration, and discharge composition of various pollution sources, and determining their discharge trends and patterns of change by processing and analyzing these data.
In the existing image-acquisition-based pollution discharge trend capturing technology, the polluted water area caused by river and lake pollution discharge is usually obtained by analyzing pictures of an already polluted area; it is difficult to determine the position of the polluted water area from the source by analyzing river and lake data. Such technology can only photograph the whole water body comprehensively and cannot inspect a specific area in a targeted manner, so the required data processing load is large and the analysis efficiency is low. Moreover, performing only a single gray-value analysis of an image may produce large analysis errors due to illumination. For example, the invention patent with publication number CN116229276A discloses a computer-vision-based method for detecting pollution discharged into rivers; it analyzes the existing polluted water area after comprehensively photographing the river or lake, does not use river and lake data to determine the position of the polluted water area, and performs gray-value analysis only once, so the error is too large and the analysis result is not reliable enough. In summary, the existing image-acquisition-based pollution discharge trend capturing technology can hardly determine the end point of the pollution discharge trend from river and lake data analysis, and its single analysis method for the polluted water area leads to low analysis efficiency and results of low confidence.
Disclosure of Invention
The invention aims to solve, at least to a certain extent, one of the technical problems in the prior art. It addresses the problems that the existing image-acquisition-based pollution discharge motion capturing technology can hardly determine the end point of the pollution discharge motion from river and lake data analysis, and that its single analysis method for the polluted water area leads to low analysis efficiency and results of low confidence.
To this end, the application provides a river and lake pollution discharge trend capturing method based on unmanned aerial vehicle image acquisition. The trend capturing method includes:
step S1, carrying an imaging device through an unmanned aerial vehicle;
s2, controlling the unmanned aerial vehicle to acquire images of rivers and lakes, and marking the acquired images as river and lake images;
s3, analyzing the river and lake image to obtain a preliminary pollution area of the river and the lake;
s4, obtaining river and lake flow rate, river and lake flow velocity and river and lake digestion amount of the river and lake;
s5, analyzing the flow rate of the river and the lake, the flow rate of the river and the lake and the digestion amount of the river and the lake to obtain the flow direction of the river and the lake and the furthest pollution discharge distance of the river and the lake;
s6, analyzing the flow direction of the river and the lake and the furthest sewage discharge distance to obtain the sewage positioning of the river and the lake;
s7, controlling the unmanned aerial vehicle to go to the position where the sewage is positioned, performing image shooting, and marking the shot image as a sewage image;
and S8, analyzing the sewage image to obtain a final polluted area of the river and the lake.
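The steps above can be chained as in the following minimal sketch. Every function here is a hypothetical placeholder for the drone control, sensors, and recognition models the patent describes; the gray threshold of 150 and the coefficients α=1000, β=0.1 are the embodiment's example values, and the pixel lists are illustrative only.

```python
FIRST_GRAY_THRESHOLD = 150  # example value from the embodiment

def analyze_image(gray_pixels):
    """Steps S3/S8: indices of pixels that emit a pollution area signal
    (gray value at or below the first gray threshold)."""
    return [i for i, g in enumerate(gray_pixels) if g <= FIRST_GRAY_THRESHOLD]

def furthest_distance(flow, velocity, digestion, alpha=1000.0, beta=0.1):
    """Step S5: discharge distance formula, reconstructed from the
    embodiment's worked example (an inference, not the patent's image)."""
    return alpha * flow / (beta * velocity * digestion)

def capture_discharge_trend(river_gray, sewage_gray, flow, velocity, digestion):
    """Chains S3-S8 on already-acquired data; drone control (S1, S2, S7)
    and buoy-based flow direction tracking are left out of the sketch."""
    preliminary_area = analyze_image(river_gray)             # step S3
    distance = furthest_distance(flow, velocity, digestion)  # steps S4-S6
    final_area = analyze_image(sewage_gray)                  # step S8
    return preliminary_area, distance, final_area
```

With the embodiment's sensor readings (86 L/min, 55 m/min, 28 L), the sketch reproduces the 558.44 m furthest discharge distance.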
Further, the step S2 includes the following sub-steps:
step S201, controlling the unmanned aerial vehicle to go to a specified detection point;
step S202, shooting the river and the lake, and marking the shot image as a river and lake image.
Further, the step S3 includes the following sub-steps:
step S301, extracting plant images in river and lake images through an intelligent image extraction model;
step S302, analyzing the plant image through an intelligent object recognition model to obtain plant types;
step S303, comparing the plant types with the polluted plant types in the polluted plant database, and outputting a normal plant signal if the plant types are different from the polluted plant types; outputting a contaminated plant signal if the plant species is the same as the contaminated plant species;
step S304, if a contaminated plant signal is output, graying the river and lake image and marking it as a river and lake gray map, extracting the contour of the water area where the contaminated plant is located in the river and lake gray map through the contour extraction model, expanding the extracted water area contour by a first area multiple and marking the resulting area as the river and lake polluted area, and simultaneously obtaining the unmanned aerial vehicle coordinates and marking them as the river and lake polluted area coordinates;
step S305, a river and lake gray level diagram is obtained, the gray level value of each pixel point is extracted, the gray level value is compared with a first gray level threshold value, and if the gray level value is smaller than or equal to the first gray level threshold value, a pollution area signal is output; if the gray value is larger than the first gray threshold value, outputting a normal area signal;
step S306, collecting all pollution area signals, integrating and marking pixel points outputting the pollution area signals as a preliminary pollution area, and simultaneously acquiring unmanned aerial vehicle coordinates and marking the unmanned aerial vehicle coordinates as pollution coordinates.
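The gray-value comparison in steps S305 and S306 can be sketched as follows; the threshold of 150 is the embodiment's example value, and the tiny test image is illustrative only.

```python
# First gray threshold; the value 150 is the embodiment's example.
FIRST_GRAY_THRESHOLD = 150

def preliminary_pollution_mask(gray_rows):
    """gray_rows: grayscale image as a list of rows of 0-255 values.
    Returns a same-shaped mask where True marks pixels that emit a
    pollution area signal (gray value at or below the threshold)."""
    return [[g <= FIRST_GRAY_THRESHOLD for g in row] for row in gray_rows]

# Illustrative 2x2 image: 153 and 162 read as normal, 144 and 90 as polluted.
mask = preliminary_pollution_mask([[153, 144], [162, 90]])
```

The True pixels would then be integrated into the preliminary pollution area, with the unmanned aerial vehicle coordinates recorded as the pollution coordinates.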
Further, the contour extraction model performs binarization processing on the river and lake gray map to obtain a river and lake binary map, and then obtains the color of a target pixel point and the colors of the adjacent pixel points surrounding it. If the target pixel point is black and all of its adjacent pixel points are black, the target pixel point is deleted. After all pixel points are analyzed, the retained image is the contour of the water area.
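A minimal sketch of this interior-pixel deletion, assuming a binary image given as nested lists with 0 for black and 1 for white (the patent does not fix a data layout, and the border handling here is an assumption):

```python
def water_contour(binary):
    """binary: list of rows where 0 (black) marks water-area pixels and
    1 (white) the rest. Deletes every black pixel whose eight
    neighbours are all black, so only the boundary of the water region
    survives, as the contour extraction model describes. Border pixels
    are kept, since they lack a full eight-pixel neighbourhood."""
    h, w = len(binary), len(binary[0])
    contour = [[1] * w for _ in range(h)]  # start with an all-white image
    for y in range(h):
        for x in range(w):
            if binary[y][x] != 0:
                continue  # white pixels are never part of the contour
            if 0 < y < h - 1 and 0 < x < w - 1 and all(
                binary[ny][nx] == 0
                for ny in (y - 1, y, y + 1)
                for nx in (x - 1, x, x + 1)
            ):
                continue  # interior pixel: all neighbours black, delete it
            contour[y][x] = 0  # boundary pixel: keep as black
    return contour

# A 4x4 solid black block keeps only its one-pixel border.
contour = water_contour([[0] * 4 for _ in range(4)])
```

On the 4×4 block, the four interior pixels are deleted and the twelve border pixels remain as the extracted contour.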
Further, the step S4 includes the following sub-steps:
step S401, installing a flowmeter and a water flow speed sensor in a river or a lake;
step S402, obtaining river and lake flow and river and lake flow rate;
step S403, reading a pollution discharge database to obtain the current digestion amount of the river and the lake.
Further, the step S5 includes the following sub-steps:
step S501, throwing a buoy into a river and a lake through an unmanned aerial vehicle;
step S502, controlling the unmanned aerial vehicle to follow the buoy through the intelligent visual tracking model, highlighting the line through which the buoy passes, and marking the line as the river and lake flow direction;
step S503, calculating the furthest pollution discharge distance from the river and lake flow, the river and lake flow velocity, and the river and lake digestion amount through the pollution discharge distance calculation formula;
the pollution discharge distance calculation formula is configured as: L = (α × F) / (β × V × P); wherein L is the furthest pollution discharge distance, F is the river and lake flow, V is the river and lake flow velocity, P is the river and lake digestion amount, α is a preset flow coefficient, and β is a preset flow coefficient.
Further, the step S6 includes the following sub-steps:
step S601, obtaining the river and lake flow direction and the furthest pollution discharge distance;
step S602, based on the river and lake flow direction, acquiring coordinates of a water area with the furthest pollution discharge distance from a pollution discharge point, and marking the coordinates as sewage positioning.
Further, the step S7 includes the following sub-steps:
step S701, controlling the unmanned aerial vehicle to go to the position where the sewage is positioned;
step S702, performing image shooting on the sewage positioning through the unmanned aerial vehicle, and marking the shot image as a sewage image.
Further, the step S8 includes the following sub-steps:
step S801, obtaining a sewage image;
step S802, extracting plant images in the sewage images through an intelligent image extraction model;
step S803, analyzing the plant image through the intelligent object recognition model to obtain plant types;
step S804, comparing the plant types with the polluted plant types in the polluted plant database, and outputting a normal plant signal if the plant types are different from the polluted plant types; outputting a contaminated plant signal if the plant species is the same as the contaminated plant species;
step S805, if a contaminated plant signal is output, graying the sewage image and marking it as a sewage gray map, extracting the contour of the water area where the contaminated plant is located in the sewage gray map through the contour extraction model, expanding the extracted water area contour by the first area multiple and marking the resulting area as the river and lake polluted area, and simultaneously acquiring the unmanned aerial vehicle coordinates and marking them as the river and lake polluted area coordinates;
step S806, a sewage gray level map is obtained, a gray level value of each pixel point is extracted, the gray level value is compared with a first gray level threshold value, and if the gray level value is smaller than or equal to the first gray level threshold value, a pollution area signal is output; if the gray value is larger than the first gray threshold value, outputting a normal area signal;
step S807, collecting all pollution area signals, integrating the pixel points that output pollution area signals and marking them as the final pollution area, and simultaneously acquiring the unmanned aerial vehicle coordinates and marking them as the sewage coordinates;
step S808, calculating the distance between the pollution coordinate and the sewage coordinate from their longitudes and latitudes, marking the result as the coordinate distance, and comparing the coordinate distance with a first error threshold; if the coordinate distance is smaller than or equal to the first error threshold, outputting an actual pollution area signal; if the coordinate distance is larger than the first error threshold, outputting a larger-error signal;
step S809, if the actual pollution area signal is output, marking the final pollution area as the river and lake pollution area, marking the sewage coordinates as the river and lake pollution area coordinates, and counting all river and lake pollution area coordinates and storing them in the pollution area database; if the larger-error signal is output, analyzing the pollution discharge movement of the river and lake again.
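The longitude/latitude distance in step S808 can be computed with the haversine formula; the patent specifies neither the distance model nor the first error threshold, so the spherical-Earth model and the 100 m default below are placeholders:

```python
import math

def coordinate_distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in metres between two
    latitude/longitude coordinates, on a spherical Earth model."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def classify_final_area(pollution_coord, sewage_coord,
                        first_error_threshold_m=100.0):
    """Step S808: at or below the threshold, output an actual pollution
    area signal; above it, a larger-error signal. The 100 m default is
    a hypothetical threshold, not a value given in the patent."""
    d = coordinate_distance_m(*pollution_coord, *sewage_coord)
    return "actual pollution area" if d <= first_error_threshold_m else "larger error"
```

A larger-error result would trigger step S809's re-analysis of the river and lake discharge movement.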
The invention has the following beneficial effects: the detection area is photographed and preliminarily analyzed to determine whether it is a river and lake pollution area; the flow velocity, flow, and flow direction of the river and lake are then analyzed; after the river and lake pollution area is found, the specific pollution area is analyzed through images, and the final river and lake pollution area is obtained by comparison with the position information of the preliminary pollution area.
The invention identifies river and lake images, judges the position of the polluted water area by recognizing algae plants in the images, and obtains the specific range of the polluted water area through contour extraction. It can thereby rapidly and accurately identify the polluted water area caused by river and lake pollution discharge, improving both the timeliness of the pollution discharge trend analysis and the accuracy of the analysis result.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a schematic illustration of the unmanned aerial vehicle of the present invention;
In the figure: 1, high-definition camera.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Embodiment one: referring to fig. 1, in a first aspect, the present application provides a river and lake pollution discharge motion capturing method based on unmanned aerial vehicle image acquisition, which includes the following steps: step S1, carrying an imaging device on an unmanned aerial vehicle; step S2, controlling the unmanned aerial vehicle to acquire river and lake images; step S3, analyzing the river and lake images; step S4, obtaining daily data of the river and lake; step S5, analyzing the daily data of the river and lake; step S6, analyzing the river and lake flow direction and the furthest pollution discharge distance; step S7, controlling the unmanned aerial vehicle to shoot a sewage image; and step S8, analyzing the sewage image to obtain the final polluted area of the river and lake.
Specifically, the river and lake pollution discharge motion capturing method based on unmanned aerial vehicle image acquisition comprises the following steps:
step S1, carrying a camera device through an unmanned aerial vehicle; step S1 comprises the following sub-steps:
step S101, installing a high-definition camera device in the unmanned plane;
step S102, installing a navigation positioning system in the unmanned aerial vehicle;
referring to fig. 2, in a specific implementation, the unmanned aerial vehicle is an unmanned aerial vehicle in the prior art, the high-definition camera 1 in the prior art is adopted by the high-definition camera device, the shooting direction is generally vertical downward shooting, and the GPS navigation positioning system in the prior art is adopted by the navigation positioning system.
Step S2, controlling the unmanned aerial vehicle to acquire images of rivers and lakes, and marking the acquired images as river and lake images; step S2 comprises the following sub-steps:
step S201, controlling the unmanned aerial vehicle to go to a specified detection point;
step S202, shooting a river and a lake, and marking the shot image as a river and a lake image;
in a specific implementation, the longitude and latitude of the detection point is (32.988219,117.765097), and the unmanned aerial vehicle is controlled to go to the coordinate point with the longitude and latitude of (32.988219,117.765097) based on the GPS navigation positioning system.
S3, analyzing the river and lake image to obtain a preliminary pollution area of the river and the lake; step S3 comprises the following sub-steps:
step S301, extracting plant images in river and lake images through an intelligent image extraction model;
step S302, analyzing the plant image through an intelligent object recognition model to obtain plant types;
step S303, comparing the plant types with the polluted plant types in the polluted plant database, and outputting a normal plant signal if the plant types are different from the polluted plant types; outputting a contaminated plant signal if the plant species is the same as the contaminated plant species;
step S304, if a contaminated plant signal is output, graying the river and lake image and marking it as a river and lake gray map, extracting the contour of the water area where the contaminated plant is located in the river and lake gray map through the contour extraction model, expanding the extracted water area contour by a first area multiple and marking the resulting area as the river and lake polluted area, and simultaneously obtaining the unmanned aerial vehicle coordinates and marking them as the river and lake polluted area coordinates;
in the implementation, the first area multiple is set to be 1.5, and because the extracted outline is the outline of the area where the plant is located, the extracted outline is usually smaller than the outline of the actual pollution range, and the accuracy of judging the pollution area can be improved by setting the first area multiple; the intelligent picture extraction model adopts an image extraction model in the prior art, and the intelligent object recognition model adopts an object recognition model in the prior art; contaminated plant species in the contaminated plant database include: algae, duckweed, needle-leaf fir, and sundew; extracting plant images in river and lake images, analyzing the plant images to obtain the algae as plant types, and comparing to obtain the same plant types as the polluted plant types, so as to output polluted plant signals; graying the river and lake image, marking the river and lake image as a river and lake gray map, and obtaining coordinates (32.987659,117.764652) of a river and lake polluted area after extracting the river and lake polluted area;
step S305, a river and lake gray level diagram is obtained, the gray level value of each pixel point is extracted, the gray level value is compared with a first gray level threshold value, and if the gray level value is smaller than or equal to the first gray level threshold value, a pollution area signal is output; if the gray value is larger than the first gray threshold value, outputting a normal area signal;
step S306, collecting all pollution area signals, integrating and marking pixel points outputting the pollution area signals as a preliminary pollution area, and simultaneously acquiring unmanned aerial vehicle coordinates and marking the unmanned aerial vehicle coordinates as pollution coordinates;
in specific implementation, the first gray threshold is set to 150, the gray value of the pixel point 1 in the river and lake gray image is 153, and if the gray value is larger than the first gray threshold through comparison, a normal area signal is output; when the gray value of the pixel 121 is 144, comparing to obtain a gray value smaller than or equal to a first gray threshold value, and outputting a polluted area signal; the pollution area signals are acquired and output by the pixel points 121, 122, 123 and 124, and then are integrated and marked as a preliminary pollution area, and the pollution coordinates are acquired (32.987767,117.753419).
The contour extraction model performs binarization processing on the river and lake gray map to obtain a river and lake binary map, and then obtains the color of a target pixel point and the colors of the adjacent pixel points surrounding it. If the target pixel point is black and all of its adjacent pixel points are black, the target pixel point is deleted. After all pixel points are analyzed, the retained image is the contour of the water area;
in the implementation, the color of the obtained target pixel point is black, the color of the adjacent pixel point 1 is black, the color of the adjacent pixel point 2 is black, the color of the adjacent pixel point 3 is black, the color of the adjacent pixel point 4 is black, the color of the adjacent pixel point 5 is black, the color of the adjacent pixel point 6 is black, the color of the adjacent pixel point 7 is black, and the color of the adjacent pixel point 8 is black, and then the target pixel point is deleted.
Step S4, obtaining river and lake flow rate, river and lake flow velocity and river and lake digestion amount of the river and lake; step S4 comprises the following sub-steps:
step S401, installing a flowmeter and a water flow speed sensor in a river or a lake;
step S402, obtaining river and lake flow and river and lake flow rate;
step S403, reading a pollution discharge database to obtain the current digestion amount of the river and the lake;
in specific implementation, the flowmeter in the prior art is adopted, the water flow rate sensor in the prior art is adopted, and the obtained river and lake flow rate is 86L/min, the river and lake flow rate is 55m/min, and the river and lake digestion amount is 28L.
S5, analyzing the flow rate of the river and the lake, the flow rate of the river and the lake and the digestion amount of the river and the lake to obtain the flow direction of the river and the lake and the furthest pollution discharge distance of the river and the lake; step S5 comprises the following sub-steps:
step S501, throwing a buoy into a river and a lake through an unmanned aerial vehicle;
step S502, controlling the unmanned aerial vehicle to follow the buoy through the intelligent visual tracking model, highlighting the line through which the buoy passes, and marking the line as the river and lake flow direction;
step S503, calculating the furthest pollution discharge distance from the river and lake flow, the river and lake flow velocity, and the river and lake digestion amount through the pollution discharge distance calculation formula;
the pollution discharge distance calculation formula is configured as: L = (α × F) / (β × V × P); wherein L is the furthest pollution discharge distance, F is the river and lake flow, V is the river and lake flow velocity, P is the river and lake digestion amount, α is a preset flow coefficient, and β is a preset flow coefficient; α and β are both constants, and both are greater than zero;
in specific implementation, alpha is set to 1000, beta is set to 0.1, an intelligent visual tracking model adopts the existing visual tracking technology, the flow direction of the river and the lake is obtained through analysis, the flow rate F of the river and the lake is 86L/min, the flow rate V of the river and the lake is 55m/min, the digestion amount P of the river and the lake is 28L, the furthest pollution discharge distance L is 558.44m through calculation, and the calculation result is reserved in two decimal places.
S6, analyzing the flow direction of the river and the lake and the furthest sewage discharge distance to obtain the sewage positioning of the river and the lake; step S6 comprises the following sub-steps:
step S601, obtaining the river and lake flow direction and the furthest pollution discharge distance;
step S602, acquiring coordinates of a water area with the farthest pollution discharge distance from a pollution discharge point based on the river and lake flow direction, and marking the coordinates as sewage positioning;
in specific implementation, the flow direction of the river and the lake is obtained, the furthest pollution discharge distance is 558.44m, the pollution discharge point coordinate is 32.987332,117.76064, the coordinate point of the water area at 558.44m is obtained along the flow direction line of the river and the lake along with the pollution discharge point coordinate as the starting point, and the sewage is positioned 32.987099,117.764429.
Step S7, controlling the unmanned aerial vehicle to go to the position where the sewage is positioned, performing image shooting, and marking the shot image as a sewage image; step S7 comprises the following sub-steps:
step S701, controlling the unmanned aerial vehicle to go to the position where the sewage is positioned;
step S702, performing image shooting on the position where the sewage is positioned by a high-definition camera device carried by the unmanned aerial vehicle, and marking the shot image as a sewage image;
in a specific implementation, the unmanned aerial vehicle is controlled to go to a coordinate point (32.987099,117.764429), and a sewage image is shot.
S8, analyzing the sewage image to obtain a final polluted area of the river and the lake; step S8 comprises the following sub-steps:
step S801, obtaining a sewage image;
step S802, extracting plant images in the sewage images through an intelligent image extraction model;
step S803, analyzing the plant image through the intelligent object recognition model to obtain plant types;
step S804, comparing the plant types with the polluted plant types in the polluted plant database, and outputting a normal plant signal if the plant types are different from the polluted plant types; outputting a contaminated plant signal if the plant species is the same as the contaminated plant species;
step S805, if a polluted plant signal is output, carrying out graying treatment on the sewage image and marking it as a sewage gray level map, carrying out contour extraction on the water area where the polluted plant is located in the sewage gray level map through a contour extraction model, expanding the contour of the extracted water area by a first area multiple, marking the expanded water area as a river and lake pollution area, and simultaneously acquiring the unmanned aerial vehicle coordinates and marking them as river and lake pollution area coordinates;
in a specific implementation, the first area multiple is set to 1.5; a sewage image is acquired, the plant image in the sewage image is extracted, the plant image is analyzed through the intelligent object recognition model, and the plant type obtained is algae; comparison shows the plant type is the same as a polluted plant type, so a polluted plant signal is output; the sewage image is grayed and marked as a sewage gray level map, the contour of the water area is extracted and expanded by 1.5 times, the obtained area is marked as a river and lake pollution area, and the river and lake pollution area coordinates (32.987225, 117.764613) are obtained;
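The "expand the contour by a first area multiple" operation of step S805 can be illustrated by scaling a polygonal contour about its vertex average so that the enclosed area grows by the chosen multiple (linear dimensions scale by the square root of the multiple). This is a minimal sketch under assumptions: the patent does not specify the expansion method, and the vertex average is used here in place of a true polygon centroid.

```python
import math

def expand_contour(points, area_multiple):
    """Scale a polygon about the average of its vertices so that its
    enclosed area grows by area_multiple (linear scale = sqrt of it)."""
    s = math.sqrt(area_multiple)
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return [(cx + s * (x - cx), cy + s * (y - cy)) for x, y in points]

def polygon_area(points):
    """Shoelace formula for the area of a simple polygon."""
    a = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        a += x1 * y2 - x2 * y1
    return abs(a) / 2.0

# A 10 x 10 water-area contour expanded by the first area multiple 1.5.
square = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
expanded = expand_contour(square, 1.5)  # area: 100 -> 150
```

Uniform scaling multiplies area by the square of the linear factor, which is why the factor sqrt(1.5) yields exactly the 1.5-times area expansion named in the embodiment.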
step S806, a sewage gray level map is obtained, a gray level value of each pixel point is extracted, the gray level value is compared with a first gray level threshold value, and if the gray level value is smaller than or equal to the first gray level threshold value, a pollution area signal is output; if the gray value is larger than the first gray threshold value, outputting a normal area signal;
in a specific implementation, the gray value of pixel point 1 extracted from the sewage gray level map is 162; comparison shows the gray value is larger than the first gray threshold, so a normal area signal is output;
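The per-pixel threshold test of step S806 and the subsequent integration of pollution area signals amount to a simple masking pass over the gray level map. A minimal sketch (the threshold value 150 is assumed for illustration, since the first gray threshold is not given in the text):

```python
FIRST_GRAY_THRESHOLD = 150  # assumed value; the patent does not state it

def pixel_signal(gray_value, threshold=FIRST_GRAY_THRESHOLD):
    """Step S806 rule: gray value <= threshold -> pollution area signal,
    gray value > threshold -> normal area signal."""
    return "pollution" if gray_value <= threshold else "normal"

def collect_final_pollution_area(gray_map, threshold=FIRST_GRAY_THRESHOLD):
    """Integrate the (row, col) positions of every pixel that outputs
    a pollution area signal, as in step S807."""
    return [(r, c)
            for r, row in enumerate(gray_map)
            for c, value in enumerate(row)
            if pixel_signal(value, threshold) == "pollution"]

gray_map = [[120, 162], [140, 200]]
final_area = collect_final_pollution_area(gray_map)  # [(0, 0), (1, 0)]
```

Under the assumed threshold, the pixel with gray value 162 exceeds the first gray threshold and therefore yields a normal area signal, matching the rule of step S806.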
step S807, collecting all pollution area signals, integrating the pixel points outputting pollution area signals and marking them as the final pollution area, and simultaneously acquiring the unmanned aerial vehicle coordinates and marking them as sewage coordinates;
step S808, calculating the distance between the longitude and latitude of the pollution coordinate and the sewage coordinate, marking the calculation result as a coordinate distance, comparing the coordinate distance with a first error threshold, and outputting an actual pollution area signal if the coordinate distance is smaller than or equal to the first error threshold; outputting a signal with larger error if the coordinate distance is larger than the first error threshold value;
step S809, if the actual pollution area signal is output, marking the final pollution area as a river and lake pollution area, marking the sewage coordinates as river and lake pollution area coordinates, and counting all river and lake pollution area coordinates and storing them in the pollution area database; if the signal with larger error is output, analyzing the drainage trend of the river and the lake again;
in a specific implementation, the first error threshold is set to 20 m; the sewage coordinates (32.987225, 117.764613) and the pollution coordinates (32.987767, 117.753419) are obtained, the coordinate distance calculated from the longitudes and latitudes is 20.66 m, comparison shows the coordinate distance is larger than the first error threshold, a signal with larger error is output, and the drainage trend of the river and the lake is analyzed again.
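The longitude-and-latitude distance of step S808 can be computed with the haversine formula, one common choice for great-circle distance (the patent does not name the distance formula it actually uses), and compared against the first error threshold:

```python
import math

FIRST_ERROR_THRESHOLD_M = 20.0  # metres, per the embodiment

def haversine_m(lat1, lon1, lat2, lon2, radius=6_371_000.0):
    """Great-circle distance in metres between two latitude/longitude points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius * math.asin(math.sqrt(a))

def classify_coordinate_distance(sewage, pollution):
    """Step S808 rule: distance <= first error threshold -> actual
    pollution area signal; otherwise -> signal with larger error."""
    d = haversine_m(sewage[0], sewage[1], pollution[0], pollution[1])
    return "actual pollution area" if d <= FIRST_ERROR_THRESHOLD_M else "larger error"
```

A larger-error result sends the method back to re-analyze the drainage trend, so this comparison acts as the consistency check between the predicted and observed pollution locations.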
Embodiment two: in a second aspect, the present application provides an electronic device comprising a processor and a memory storing computer readable instructions which, when executed by the processor, perform the steps of any of the methods described above. Through the above technical solution, the processor and the memory are interconnected and communicate with each other through a communication bus and/or other form of connection mechanism (not shown). The memory stores a computer program executable by the processor; when the electronic device runs, the processor executes the computer program to perform the method in any of the alternative implementations of the above embodiments, so as to realize the following functions: controlling an unmanned aerial vehicle to collect river and lake images; analyzing the river and lake images to obtain a preliminary pollution area of the river and lake; obtaining the river and lake flow rate, river and lake flow velocity and river and lake digestion amount of the river and lake; analyzing the river and lake flow rate, river and lake flow velocity and river and lake digestion amount to obtain the river and lake flow direction and the furthest pollution discharge distance; analyzing the river and lake flow direction and the furthest pollution discharge distance to obtain the sewage positioning of the river and lake; controlling the unmanned aerial vehicle to go to the position of the sewage positioning and perform image shooting, marking the shot image as a sewage image; and analyzing the sewage image to obtain the final pollution area of the river and lake.
Embodiment three: in a third aspect, the present application provides a storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of any of the methods described above. Through the above technical solution, the computer program, when executed by the processor, performs the method in any of the alternative implementations of the above embodiments, so as to realize the following functions: controlling an unmanned aerial vehicle to collect river and lake images; analyzing the river and lake images to obtain a preliminary pollution area of the river and lake; obtaining the river and lake flow rate, river and lake flow velocity and river and lake digestion amount of the river and lake; analyzing the river and lake flow rate, river and lake flow velocity and river and lake digestion amount to obtain the river and lake flow direction and the furthest pollution discharge distance; analyzing the river and lake flow direction and the furthest pollution discharge distance to obtain the sewage positioning of the river and lake; controlling the unmanned aerial vehicle to go to the position of the sewage positioning and perform image shooting, marking the shot image as a sewage image; and analyzing the sewage image to obtain the final pollution area of the river and lake.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied therein. The storage medium may be implemented by any type or combination of volatile or nonvolatile memory devices, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection shown or discussed between components may be indirect coupling or communication connection through some communication interfaces, devices, or units, and may be in electrical, mechanical, or other form.

Claims (8)

1. A river and lake pollution discharge motion capturing method based on unmanned aerial vehicle image acquisition is characterized by comprising the following steps:
step S1, carrying an imaging device through an unmanned aerial vehicle;
s2, controlling the unmanned aerial vehicle to acquire images of rivers and lakes, and marking the acquired images as river and lake images;
s3, analyzing the river and lake image to obtain a preliminary pollution area of the river and the lake; the step S3 includes the following sub-steps:
step S301, extracting plant images in river and lake images through an intelligent image extraction model;
step S302, analyzing the plant image through an intelligent object recognition model to obtain plant types;
step S303, comparing the plant types with the polluted plant types in the polluted plant database, and outputting a normal plant signal if the plant types are different from the polluted plant types; outputting a contaminated plant signal if the plant species is the same as the contaminated plant species;
step S304, if a polluted plant signal is output, carrying out graying treatment on a river and lake image and marking the river and lake image as a river and lake gray image, carrying out contour extraction on a water area where a polluted plant is located in the river and lake gray image through a contour extraction model, expanding the contour of the extracted water area by a first area multiple, marking the extracted water area as a river and lake polluted area, and simultaneously obtaining unmanned plane coordinates and marking the unmanned plane coordinates as river and lake polluted area coordinates;
step S305, a river and lake gray level diagram is obtained, the gray level value of each pixel point is extracted, the gray level value is compared with a first gray level threshold value, and if the gray level value is smaller than or equal to the first gray level threshold value, a pollution area signal is output; if the gray value is larger than the first gray threshold value, outputting a normal area signal;
step S306, collecting all pollution area signals, integrating and marking pixel points outputting the pollution area signals as a preliminary pollution area, and simultaneously acquiring unmanned aerial vehicle coordinates and marking the unmanned aerial vehicle coordinates as pollution coordinates;
s4, obtaining river and lake flow rate, river and lake flow velocity and river and lake digestion amount of the river and lake;
s5, analyzing the flow rate of the river and the lake, the flow rate of the river and the lake and the digestion amount of the river and the lake to obtain the flow direction of the river and the lake and the furthest pollution discharge distance of the river and the lake;
s6, analyzing the flow direction of the river and the lake and the furthest sewage discharge distance to obtain the sewage positioning of the river and the lake;
s7, controlling the unmanned aerial vehicle to go to the position where the sewage is positioned, performing image shooting, and marking the shot image as a sewage image;
and S8, analyzing the sewage image to obtain a final polluted area of the river and the lake.
2. The river and lake blowdown trend capturing method based on unmanned aerial vehicle image acquisition according to claim 1, wherein the step S2 comprises the following sub-steps:
step S201, controlling the unmanned aerial vehicle to go to a specified detection point;
step S202, shooting the river and the lake, and marking the shot image as a river and lake image.
3. The river and lake blowdown trend capturing method based on unmanned aerial vehicle image acquisition according to claim 2, wherein the contour extraction model obtains a river and lake binary image by performing binarization processing on the river and lake gray level map, obtains the color of a target pixel point and the colors of the pixel points adjacent to the target pixel point, deletes the target pixel point if the color of the target pixel point is black and the colors of all corresponding adjacent pixel points are black, analyzes all pixel points in this way, and takes the retained image as the water area contour.
4. The river and lake blowdown trend capturing method based on unmanned aerial vehicle image acquisition according to claim 3, wherein the step S4 comprises the following sub-steps:
step S401, installing a flowmeter and a water flow speed sensor in a river or a lake;
step S402, obtaining river and lake flow and river and lake flow rate;
step S403, reading a pollution discharge database to obtain the current digestion amount of the river and the lake.
5. The river and lake blowdown trend capturing method based on unmanned aerial vehicle image acquisition according to claim 4, wherein the step S5 comprises the following sub-steps:
step S501, throwing a buoy into a river and a lake through an unmanned aerial vehicle;
step S502, controlling the unmanned aerial vehicle to follow-up the cursory through the intelligent visual tracking model, and highlighting a line through which the cursory passes and marking the line as the river and lake flow direction;
step S503, calculating the flow rate of the river and the lake and the flow velocity of the river and the lake by a pollution discharge distance calculation formula to obtain the furthest pollution discharge distance;
the blowdown distance calculation formula is configured to calculate the furthest pollution discharge distance from the flow of the river and the lake, the flow rate of the river and the lake, and the digestion amount of the river and the lake; wherein L is the furthest pollution discharge distance, F is the flow of the river and the lake, V is the flow rate of the river and the lake, P is the digestion amount of the river and the lake, alpha is a preset flow coefficient, and beta is a preset flow coefficient.
6. The river and lake blowdown trend capturing method based on unmanned aerial vehicle image acquisition according to claim 5, wherein the step S6 comprises the following sub-steps:
step S601, obtaining the river and lake flow direction and the furthest pollution discharge distance;
step S602, based on the river and lake flow direction, acquiring coordinates of a water area with the furthest pollution discharge distance from a pollution discharge point, and marking the coordinates as sewage positioning.
7. The river and lake blowdown trend capturing method based on unmanned aerial vehicle image acquisition according to claim 6, wherein the step S7 comprises the following sub-steps:
step S701, controlling the unmanned aerial vehicle to go to the position where the sewage is positioned;
in step S702, the unmanned aerial vehicle performs image capturing on the location where the sewage is located, and marks the captured image as a sewage image.
8. The river and lake blowdown trend capturing method based on unmanned aerial vehicle image acquisition according to claim 7, wherein the step S8 comprises the following sub-steps:
step S801, obtaining a sewage image;
step S802, extracting plant images in the sewage images through an intelligent image extraction model;
step S803, analyzing the plant image through the intelligent object recognition model to obtain plant types;
step S804, comparing the plant types with the polluted plant types in the polluted plant database, and outputting a normal plant signal if the plant types are different from the polluted plant types; outputting a contaminated plant signal if the plant species is the same as the contaminated plant species;
step S805, if a polluted plant signal is output, carrying out graying treatment on the sewage image and marking it as a sewage gray level map, carrying out contour extraction on the water area where the polluted plant is located in the sewage gray level map through a contour extraction model, expanding the contour of the extracted water area by a first area multiple, marking the expanded water area as a river and lake pollution area, and simultaneously acquiring the unmanned aerial vehicle coordinates and marking them as river and lake pollution area coordinates;
step S806, a sewage gray level map is obtained, a gray level value of each pixel point is extracted, the gray level value is compared with a first gray level threshold value, and if the gray level value is smaller than or equal to the first gray level threshold value, a pollution area signal is output; if the gray value is larger than the first gray threshold value, outputting a normal area signal;
step S807, collecting all pollution area signals, integrating and marking pixel points outputting the pollution area signals as a final pollution area, and simultaneously acquiring unmanned plane coordinates and marking the unmanned plane coordinates as sewage coordinates;
step S808, calculating the distance between the longitude and latitude of the pollution coordinate and the sewage coordinate, marking the calculation result as a coordinate distance, comparing the coordinate distance with a first error threshold, and outputting an actual pollution area signal if the coordinate distance is smaller than or equal to the first error threshold; outputting a signal with larger error if the coordinate distance is larger than the first error threshold value;
step S809, if the actual pollution area signal is output, marking the final pollution area as a river and lake pollution area, marking the sewage coordinates as river and lake pollution area coordinates, and counting all river and lake pollution area coordinates and storing the coordinates in a pollution area database; if the error is larger, the drainage movement of the river and the lake is analyzed again.
CN202311569725.8A 2023-11-23 2023-11-23 River and lake pollution discharge motion capturing method based on unmanned aerial vehicle image acquisition Active CN117291910B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311569725.8A CN117291910B (en) 2023-11-23 2023-11-23 River and lake pollution discharge motion capturing method based on unmanned aerial vehicle image acquisition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311569725.8A CN117291910B (en) 2023-11-23 2023-11-23 River and lake pollution discharge motion capturing method based on unmanned aerial vehicle image acquisition

Publications (2)

Publication Number Publication Date
CN117291910A CN117291910A (en) 2023-12-26
CN117291910B true CN117291910B (en) 2024-04-09

Family

ID=89253844

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311569725.8A Active CN117291910B (en) 2023-11-23 2023-11-23 River and lake pollution discharge motion capturing method based on unmanned aerial vehicle image acquisition

Country Status (1)

Country Link
CN (1) CN117291910B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0762714A (en) * 1993-08-26 1995-03-07 Fuji Electric Co Ltd Analysis method of pollutant diffusion state of water distribution network
CN101021543A (en) * 2007-03-13 2007-08-22 上海市环境监测中心 Water polletion source on-line dynamic tracking monitoring method and system
CN110244011A (en) * 2019-06-26 2019-09-17 熊颖郡 The river blowdown of unmanned plane monitors analyzing and alarming system automatically
CN112765906A (en) * 2021-01-08 2021-05-07 同济大学 Method for calculating diffusion coefficient of pollutants in water flow containing plants
CN115035416A (en) * 2022-08-10 2022-09-09 广东广宇科技发展有限公司 Method and system for quickly identifying polluted water source, electronic equipment and storage medium
CN115062071A (en) * 2022-06-09 2022-09-16 中国标准化研究院 A method and system for water pollution diffusion analysis in rivers
CN116152748A (en) * 2023-04-19 2023-05-23 水利部交通运输部国家能源局南京水利科学研究院 River and lake supervision method and system based on cyanobacteria identification

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10867178B2 (en) * 2019-01-31 2020-12-15 Palantir Technologies Inc. Systems and methods for coherent monitoring


Also Published As

Publication number Publication date
CN117291910A (en) 2023-12-26

Similar Documents

Publication Publication Date Title
CN112528878B (en) Method and device for detecting lane line, terminal equipment and readable storage medium
CN111178236B (en) Parking space detection method based on deep learning
CN112102369B (en) Autonomous inspection method, device, equipment and storage medium for water surface floating target
CN111382704B (en) Vehicle line pressing violation judging method and device based on deep learning and storage medium
CN112967283B (en) Target identification method, system, equipment and storage medium based on binocular camera
CN111209780A (en) Lane line attribute detection method and device, electronic device and readable storage medium
CN112966665A (en) Pavement disease detection model training method and device and computer equipment
CN109977776A (en) A kind of method for detecting lane lines, device and mobile unit
CN112016514B (en) Traffic sign recognition method, device, equipment and storage medium
CN112837384B (en) Vehicle marking method and device and electronic equipment
CN111898491A (en) Method and device for identifying reverse driving of vehicle and electronic equipment
CN109916415B (en) Road type determination method, device, equipment and storage medium
CN114926786A (en) Ship water gauge tracking method and device, storage medium and electronic equipment
CN113435350A (en) Traffic marking detection method, device, equipment and medium
CN116824516A (en) Road construction safety monitoring and management system
Kong et al. An automatic and accurate method for marking ground control points in unmanned aerial vehicle photogrammetry
CN117291910B (en) River and lake pollution discharge motion capturing method based on unmanned aerial vehicle image acquisition
CN113223064A (en) Method and device for estimating scale of visual inertial odometer
CN113014876B (en) Video monitoring method and device, electronic equipment and readable storage medium
JP6916975B2 (en) Sign positioning system and program
CN114581890B (en) Method and device for determining lane line, electronic equipment and storage medium
CN110659577A (en) Blind road obstacle detection method for smart phone platform
CN115439792A (en) Monitoring method and system based on artificial intelligence
CN111639640A (en) License plate recognition method, device and equipment based on artificial intelligence
Andò et al. A Vision-Based Solution for Water Level Monitoring in Flash Floods Scenario

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant