
CN115586789B - Vision-based wind driven generator inspection unmanned aerial vehicle hovering point confirmation and inspection method - Google Patents


Info

Publication number
CN115586789B
Authority
CN
China
Prior art keywords
blade
image
driven generator
wind driven
target
Prior art date
Legal status
Active
Application number
CN202211507946.8A
Other languages
Chinese (zh)
Other versions
CN115586789A (en)
Inventor
肖玄 (Xiao Xuan)
杨润 (Yang Run)
李成 (Li Cheng)
Current Assignee
Chengdu Rainpoo Technology Co., Ltd.
Original Assignee
Chengdu Rainpoo Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Chengdu Rainpoo Technology Co., Ltd.
Priority to CN202211507946.8A
Publication of CN115586789A
Application granted
Publication of CN115586789B
Legal status: Active

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D1/0816 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft to ensure stability
    • G05D1/0825 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft to ensure stability using mathematical models
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106 Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02E REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E10/00 Energy generation through renewable energy sources
    • Y02E10/70 Wind energy
    • Y02E10/72 Wind turbines with rotation axis in wind direction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Algebra (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Wind Motors (AREA)

Abstract

The invention discloses a vision-based method for confirming the hovering points of a wind driven generator inspection unmanned aerial vehicle, together with an inspection method based on those hovering points. In the hovering-point confirmation method, the directions directly in front of and/or directly behind the wind driven generator are first determined by a specific visual procedure, and the hovering points used during inspection are then confirmed to lie directly above the hub center line of the wind driven generator. The inspection method operates from the hovering points confirmed in this way. With a purely visual payload on the unmanned aerial vehicle, the scheme determines the directions directly in front of and/or directly behind the wind driven generator, and from them the current yaw angle, without stopping the machine; inspection from the confirmed hovering points therefore facilitates blade image recognition and clear imaging of blade features.

Description

Vision-based wind driven generator inspection unmanned aerial vehicle hovering point confirmation and inspection method
Technical Field
The invention relates to the technical field of aerial photography, and in particular to a vision-based method for confirming the hovering points of a wind driven generator inspection unmanned aerial vehicle and for carrying out the inspection.
Background
A wind driven generator converts wind energy into electrical energy and is an important component of new-energy power generation; China has become the fastest-growing wind power market in the world. A wind driven generator generally comprises blades, a generator, a yaw regulator, a tower, an overspeed safety system, an energy storage device and the like. The blades are the most basic, most critical and most expensive components of the machine, and their good condition is the primary factor in ensuring normal, stable operation of the unit.
Because the blades work fully exposed to the natural environment and are eroded by wind and sand, surface defects such as abrasion and perforation commonly appear after a period of use. These defects reduce generating efficiency and, in severe cases, create safety hazards: when a blade fails, and in particular when a single blade fractures, the balance between the blades is destroyed and the vibration of the generator set increases sharply; if no protection measures are in place, or the unit protection does not intervene in time, serious damage is done to the shafting and the tower, and in the worst case the whole unit may be destroyed. During long-term operation, blade failure is usually a gradual process, and effective early monitoring of surface damage can prevent serious accidents such as blade fracture. Regular inspection and maintenance of the wind driven generator is therefore an important part of wind power operation and maintenance.
At present, the more common ways of detecting blade surface damage are close visual inspection at height by workers (climbing or rope access), image inspection with ground-based thermal imaging equipment, and visual inspection with a ground telescope or long-focal-length camera. Limited by the performance of the inspection equipment and the inspection mode, these approaches suffer from low efficiency and high risk. With the maturing of unmanned aerial vehicle and aerial photography technology, manual or automatic inspection of the wind turbine surface by unmanned aerial vehicle has appeared in the prior art; specific schemes include those provided by patent applications CN202011275910.2, CN201911393629.6, CN202210088479.3, CN202010115183.7, CN201910753262.8 and CN202110643559.6.
Today it is quite common to use unmanned aerial vehicles for wind driven generator inspection, and the trend is towards completely autonomous, intelligent automatic inspection. Under normal conditions, when an unmanned aerial vehicle performs an inspection according to a set inspection mode, the shooting positions and characteristic positions of the wind driven generator must be determined. Regarding such position determination, the prior art includes publication CN110910349A, "Wind turbine generator system state acquisition method based on aerial vision", and publication CN113759960A, "Fan blade and tower barrel inspection and identification system and method based on an unmanned aerial vehicle", both of which disclose related methods such as confirming the hub center point of the wind driven generator.
Further optimization of the technology for detecting blade surface damage on wind driven generators is of positive significance for advancing the development of clean energy in China.
Disclosure of Invention
In view of the need to further optimize the technology for detecting blade surface damage on wind driven generators, which is of positive significance for advancing the development of clean energy in China, the invention provides a vision-based method for confirming the hovering points of a wind driven generator inspection unmanned aerial vehicle and an inspection method. The provided technical scheme confirms the hovering points of the inspection unmanned aerial vehicle; specifically, after the directions directly in front of and/or directly behind the wind driven generator are determined, the image-acquisition hovering points of the inspection unmanned aerial vehicle are determined.
To this end, the vision-based method for confirming the hovering points of the wind driven generator inspection unmanned aerial vehicle solves the problem through the following technical points: the hovering points of the unmanned aerial vehicle during inspection are confirmed to lie directly above the hub center line of the wind driven generator by means of the confirmed directions directly in front of and/or directly behind the wind driven generator;
the direction directly in front of and/or directly behind is confirmed by the following steps:
S1. An annular route for the unmanned aerial vehicle flight is confirmed according to the position coordinates of the wind driven generator and the height of the hub center line: the annular route lies outside the rotation track of the wind driven generator blades, lies in the horizontal plane at that height, and is centered on the position coordinates;
S2. The unmanned aerial vehicle photographs the blades of the wind driven generator while flying the annular route, on which a plurality of shooting points are set. If the camera's field of view is offset to one side of the wind driven generator, then even at a shooting point directly in front of or behind the machine the blades will show different visual lengths in the photograph, so the lengths acquired in step S3 would not be suitable for the comparison and the accuracy of the front/rear judgement would suffer. The following is therefore preferably set: at each shooting point the lens faces the center of the annular route, and every blade lies entirely within the photograph's imaging area;
S3. The visual length of each blade in each photograph is obtained, and for each photograph the difference between the maximum and minimum blade lengths is computed; the direction directly in front of or directly behind the wind driven generator is determined from these differences;
the shooting point corresponding to the photograph with the smallest difference lies directly in front of or directly behind the wind driven generator. A sketch of this selection logic is given below.
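The selection logic of steps S1 to S3 can be sketched in Python as follows. This is a minimal illustration only, not the patented implementation: the waypoint construction, the number of shooting points and the per-photo data structure (visual blade lengths in pixels) are assumptions made for the example.

```python
import math

def circular_route(center_lon, center_lat, hub_height_m, radius_m, num_points):
    """S1 sketch: evenly spaced shooting points on a horizontal circle around the
    turbine, each with a heading that makes the lens face the circle center.
    A simple equirectangular lon/lat offset is used purely for illustration."""
    deg_per_m_lat = 1.0 / 111320.0
    deg_per_m_lon = deg_per_m_lat / math.cos(math.radians(center_lat))
    waypoints = []
    for i in range(num_points):
        az = 2.0 * math.pi * i / num_points                 # azimuth of the waypoint
        lon = center_lon + radius_m * math.sin(az) * deg_per_m_lon
        lat = center_lat + radius_m * math.cos(az) * deg_per_m_lat
        heading_deg = (math.degrees(az) + 180.0) % 360.0    # face the circle center
        waypoints.append((lon, lat, hub_height_m, heading_deg))
    return waypoints

def pick_front_or_rear(photos):
    """S3 sketch: photos is a list of dicts such as
    {'point_index': 7, 'blade_lengths_px': [412.0, 405.5, 398.7]}.
    The photo whose (max - min) visual blade length is smallest marks the
    shooting point closest to directly in front of or behind the turbine."""
    def spread(photo):
        return max(photo['blade_lengths_px']) - min(photo['blade_lengths_px'])
    best = min(photos, key=spread)
    return best['point_index'], spread(best)
```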
In the prior art, confirming the position of the hub center point of the wind driven generator, or the position of the aerial camera relative to the wind driven generator, is one of the key technologies in any set inspection method. For example, to determine the shooting position, the hub center point can be obtained first and then used as the characteristic point from which the hovering shooting position of the aerial camera is confirmed. The hub center point is conventionally obtained by image recognition, exploiting the fact that the hub appears circular in the front view; but when the viewpoint lies behind the wind driven generator, the rear end of the nacelle is not necessarily a regular shape, so the center point of the nacelle rear end cannot be obtained directly by conventional image recognition of the nacelle end. Conventional shooting-position confirmation also includes the following approach: since the shooting position is related to the position of the wind driven generator and to its yaw angle, parameters such as the position and yaw angle, or the directions directly in front of and directly behind the machine, are confirmed first and the shooting position of the aerial camera is then confirmed from them. In this process the position of the wind driven generator (for example the longitude and latitude of the installation site) is a known value, but the yaw angle may change while the machine operates; the specific value generally exists only in the machine's computer data and cannot be read intuitively. Having an unmanned aerial vehicle pilot confirm the yaw angle visually is unfavorable to flight-inspection implementation and makes the result strongly operator-dependent. Alternatively, existing intelligent inspection schemes mainly fly the unmanned aerial vehicle to the front of a stopped wind driven generator and then perform the intelligent inspection steps using visual identification and lidar mapping, so the inspection depends heavily on additional equipment.
For the purpose of wind driven generator inspection, this scheme provides a vision-based method for confirming the hovering points of the inspection unmanned aerial vehicle; the obtained hovering points can be used to carry out the unmanned aerial vehicle inspection method for the wind driven generator. Unlike hovering-point confirmation methods in the prior art, this scheme confirms the directions directly in front of and directly behind the wind driven generator purely from the visual characteristics of the photographs.
In the above solution it will be appreciated that the hub center line of the wind driven generator extends along the directly-front / directly-rear direction. In step S1 the position coordinates may be the installation position of the wind driven generator, and the height of the hub center line is also a fixed value normally recorded by the power plant. Regarding the radius of the annular route, a person skilled in the art may set it, according to specific needs, so that the route lies outside the rotation track of the blades; the method can then be carried out while the wind driven generator is operating. If the machine is stopped, the radius may be reduced further, down to the intersection with the rotation track. Since the shooting points on the annular route cannot be infinitely many, increasing the radius r of the route improves the azimuth-recognition accuracy of the shooting-point positions, while decreasing r improves the efficiency of confirming them. Meanwhile, when the wind driven generator operates, the hub center line is generally regarded as horizontal and only the yaw angle is adjusted as required, so once r is confirmed the annular route is uniquely planned. In step S2 the unmanned aerial vehicle carries the aerial camera and photographs the wind driven generator from multiple points along the annular route; images taken from different shooting points have different shooting azimuths, so the blades image differently in them. "Every blade lies entirely within the photograph's imaging area" means that all parts of each blade that can be captured in the current shot fall inside the imaging area, which facilitates acquiring the visual blade lengths in step S3. In step S3, because the blades are distributed at different positions around the hub circumference, the same blade has a different visual length at each shooting point and different blades have different visual lengths at the same shooting point; the visual length can be determined from the number of pixels between the two ends of a blade. A blade shown in a photograph does not necessarily extend from tip to root, because it may be partly blocked by the nacelle and hub; nevertheless, if a shooting point lies directly in front of, directly behind, or closest to directly in front of or behind the wind driven generator, the minimum difference of step S3 is obtained there, so that direction can be found by taking the shooting point corresponding to the minimum difference. Front is distinguished from rear by simple image recognition of the differing shapes of the hub and the nacelle.
Then, on the basis of the confirmed directions directly in front of and/or directly behind the wind driven generator, the hovering shooting points for inspection are set directly above the hub center line of the wind driven generator.
In summary, in the above method for confirming the hovering points of the wind driven generator inspection unmanned aerial vehicle, the directions directly in front of and directly behind the machine are confirmed using data inherent to, or directly obtainable from, the wind driven generator, assisted by judging the visual lengths of the blades in the images. The whole method can be completed with an existing aerial camera and with simple image-feature recognition, and with a purely visual payload the current yaw angle of the wind driven generator can be analyzed, by determining the directions directly in front of and/or behind it, without stopping the machine.
The annular route set above allows the method to be carried out without stopping the wind driven generator.
With the confirmed hovering points, the aerial camera always shoots areas below the hovering point during the inspection method provided below, i.e. from top to bottom, so the background of the photographs is the ground or the sea surface, which has high contrast against the usually pure-white blades and makes blade image extraction and recognition easier. If the shooting points photographed from bottom to top, the photographs would have the sky as background, with little contrast against the blades, making blade extraction difficult; direct sunlight could also cause glare and overexposure, which is unfavorable to the clarity of features in the image.
A person skilled in the art may confirm the directions directly in front of and/or directly behind the wind driven generator in any of the following ways: after the shooting point directly in front or directly behind is confirmed, the direction is confirmed from that shooting point and the center of the annular route; or from the orientation of that shooting point; or, after the shooting points both in front and behind are confirmed, from the line connecting the two shooting points; or, after the blade roots are identified as in the scheme below, for a three-bladed wind driven generator the hub center and the nacelle center are identified from the three blade-root coordinates and the direction is confirmed from the line connecting them.
Further technical schemes of the vision-based method for confirming the hovering points of the wind driven generator inspection unmanned aerial vehicle are as follows:
As described above, the difference depends on the distribution of the shooting points on the annular route, and the shooting points cannot be infinitely many. When, among the planned shooting points, the one whose photograph gives the smallest difference still deviates from directly in front of or behind by more than a set range, the confirmed front/rear direction, hub center line, or current yaw angle of the wind driven generator exceeds the allowable deviation. If an allowable deviation is set directly on the difference itself, it is not universal for wind driven generators with different blade lengths: to keep the direction, hub-center-line or yaw-angle determination accurate, a machine with longer blades could tolerate a larger difference and one with shorter blades a smaller one. The method is therefore further configured so that step S3 also includes a difference-optimization step, namely:
the minimum difference is divided by the maximum or minimum blade length, and the result is compared with a set judgement threshold:
when the result is less than or equal to the judgement threshold, the shooting point corresponding to the photograph with the smallest difference is confirmed as the shooting point directly in front of or directly behind the wind driven generator;
when the result is greater than the judgement threshold, the distribution of shooting points in step S2 is adjusted and a new result is obtained; the adjusted result is compared with the judgement threshold again, and only when a result less than or equal to the threshold is reached is the shooting point corresponding to the photograph with the smallest difference confirmed as the shooting point directly in front of or directly behind the wind driven generator. With this scheme, after the minimum difference is obtained, the result is computed and compared with the judgement threshold to confirm whether the current minimum difference is acceptable; if it is, the shooting point associated with that photograph is used to confirm the direction directly in front of or behind the wind driven generator; if not, the shooting points on the annular route are adjusted and the direction is re-confirmed by the same comparison. A person skilled in the art will understand that this step improves azimuth-confirmation accuracy through a check and adjustment of the difference and adapts the difference-optimization step to wind driven generators of different sizes; depending on the required azimuth accuracy and the way the shooting points on the annular route are planned, it is not an essential step in every embodiment. A minimal sketch follows.
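A minimal sketch of the difference-optimization step, under the assumptions that blade lengths are measured in pixels and that 0.05 is merely an illustrative judgement threshold (the description leaves its value open):

```python
def normalized_difference(blade_lengths_px):
    """Divide the (max - min) visual blade length by the maximum blade length.
    Dividing by the minimum length instead, as also allowed above, only changes
    the scale of the judgement threshold."""
    longest = max(blade_lengths_px)
    if longest == 0:
        return float('inf')          # all blades occluded: never acceptable
    return (longest - min(blade_lengths_px)) / longest

def front_rear_point_accepted(blade_lengths_px, threshold=0.05):
    """True when the candidate photo's normalized difference is within the
    judgement threshold; otherwise the shooting-point distribution on the
    annular route should be adjusted and steps S2-S3 repeated."""
    return normalized_difference(blade_lengths_px) <= threshold
```

Usage sketch: if the best photo on the current annular route is not accepted, densify or redistribute the shooting points and re-run the comparison.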
The visual length of a blade is obtained by identifying the relative positions of its two ends in the photograph, specifically by the following steps:
S3-1. Identify the blades in the photograph and add a target bounding box to each blade;
S3-2. Confirm the blade root of each blade, specifically:
establish the minimum circumscribed rectangle of the midpoints of all target bounding boxes;
count how many vertices of each target bounding box lie inside the minimum circumscribed rectangle; if exactly one vertex of a target bounding box lies inside it, that vertex is the blade root of the blade corresponding to that bounding box;
if a target bounding box has two vertices inside the minimum circumscribed rectangle, form a polygon from each of the two candidate vertices together with the blade-root vertices of the other target bounding boxes, and take the candidate vertex that gives the polygon with the smallest area as the blade root of the blade corresponding to that bounding box;
S3-3. Obtain the visual length of each blade in each photograph from the confirmed blade root and blade tip of the blade. In the prior art the target bounding box is a rectangle, the blade tip and blade root are different vertices of it, and the midpoint of each bounding box is the intersection of its diagonals. An existing wind driven generator generally has three blades, and at the shooting point directly in front of or behind, or closest to directly in front of or behind on the annular route, the typical situation is that the bounding boxes of two blades each have one vertex inside the minimum circumscribed rectangle while the bounding box of the third blade has two. The vertex serving as the blade root can be confirmed as above, and the vertex serving as the blade tip then follows unambiguously, so the blade length is taken as the distance from the root vertex to the tip vertex; these two vertices are generally a pair of diagonal vertices of the bounding box, and the polygon is generally a triangle. As one skilled in the art will understand, when a shooting point on the annular route deviates greatly from directly in front or behind, part of a blade may be blocked by the hub and nacelle; the blade root of step S3-2 is then simply the end of the blade farther from the tip in the camera's view. When a blade is completely blocked its length may be taken as zero, and when the camera lies on the extension line of a single blade the visual length of that blade cannot be obtained, in which case the shooting point can be ruled out directly, because valid whole-blade length data are missing. In particular, for the case where the outline of the nacelle tail may be irregular, the method can be used not only to determine the midpoint of the hub but also the midpoint of the nacelle end, specifically: the midpoint of the nacelle end / the midpoint of the hub is confirmed from the blade roots determined in the photograph with the smallest difference. When used to judge the hub center, the result is more accurate than directly identifying the hub outline and taking its midpoint, because the image region covered by the target bounding boxes is larger than the hub itself. The center position can be obtained by simple computation, exploiting the fact that the blade roots are generally distributed on a circle centered on that point. A sketch of the bounding-box procedure follows.
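A sketch of steps S3-1 to S3-3, under the assumptions that blade detection returns axis-aligned bounding boxes (xmin, ymin, xmax, ymax), that the rotor has three blades in the typical configuration described above, and that fully occluded blades have already been filtered out; the tie-break for a box with two candidate vertices follows the smallest-polygon rule.

```python
import math

def blade_lengths_from_boxes(boxes):
    """boxes: one axis-aligned target bounding box (xmin, ymin, xmax, ymax) per
    detected blade.  The blade root is the box vertex that falls inside the
    minimum rectangle circumscribing all box midpoints; the blade tip is taken
    as the diagonally opposite vertex.  Returns the visual lengths in pixels."""
    mids = [((x0 + x1) / 2.0, (y0 + y1) / 2.0) for x0, y0, x1, y1 in boxes]
    mx0, my0 = min(m[0] for m in mids), min(m[1] for m in mids)
    mx1, my1 = max(m[0] for m in mids), max(m[1] for m in mids)

    def inside(p):
        return mx0 <= p[0] <= mx1 and my0 <= p[1] <= my1

    corners = [[(x0, y0), (x1, y0), (x1, y1), (x0, y1)] for x0, y0, x1, y1 in boxes]
    roots = [None] * len(boxes)

    # First pass: boxes with exactly one vertex inside the midpoint rectangle.
    for i, cs in enumerate(corners):
        hits = [c for c in cs if inside(c)]
        if len(hits) == 1:
            roots[i] = hits[0]

    def tri_area(a, b, c):
        return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

    # Second pass: boxes with two candidate vertices; choose the vertex giving the
    # smallest polygon (a triangle for a three-blade rotor) with the other roots.
    for i, cs in enumerate(corners):
        if roots[i] is not None:
            continue
        hits = [c for c in cs if inside(c)] or cs   # fall back to all corners if atypical
        known = [r for r in roots if r is not None]
        roots[i] = min(hits, key=lambda c: tri_area(c, known[0], known[1]))

    lengths = []
    for (x0, y0, x1, y1), root in zip(boxes, roots):
        tip = (x0 + x1 - root[0], y0 + y1 - root[1])    # diagonally opposite corner
        lengths.append(math.hypot(tip[0] - root[0], tip[1] - root[1]))
    return lengths
```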
To facilitate automatic execution of the scheme, the data recording is implemented as follows: in the difference-optimization step a fluctuation curve is established to reflect how the result of each photograph varies along the annular route; the fluctuation curve is filtered to obtain a correction curve, the correction curve is compared with the set judgement threshold, and either the shooting point directly in front of or behind the wind driven generator is confirmed or a new fluctuation curve is acquired, until that shooting point is finally obtained. Compared with recording the data directly in a table, recording it as a fluctuation curve and filtering it into a correction curve carries more data and makes the result intuitive. More specifically, the fluctuation curve may be set up in a table in which the vertical direction corresponds to the result of each photograph and the horizontal direction corresponds to the shooting time or azimuth data. A minimal filtering sketch follows.
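A minimal sketch of the fluctuation-curve step. The description only says the curve is "filtered"; a moving-average filter is used here as one illustrative choice, and the threshold value is again an assumption.

```python
def moving_average(values, window=5):
    """Smooth the fluctuation curve of per-photo results into a correction curve."""
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        segment = values[max(0, i - half): i + half + 1]
        smoothed.append(sum(segment) / len(segment))
    return smoothed

def confirm_from_curve(samples, threshold=0.05):
    """samples: list of (shooting_azimuth_deg, normalized_difference) collected
    along the annular route.  Returns the azimuth of the confirmed front/rear
    shooting point, or None if the curve must be re-acquired with a different
    shooting-point distribution."""
    azimuths = [a for a, _ in samples]
    corrected = moving_average([r for _, r in samples])
    i_min = min(range(len(corrected)), key=corrected.__getitem__)
    return azimuths[i_min] if corrected[i_min] <= threshold else None
```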
This scheme also discloses a vision-based wind driven generator inspection method: at the unmanned aerial vehicle hovering points confirmed by the hovering-point confirmation method, an aerial camera mounted on a gimbal is used to collect image data of the wind driven generator blades;
meanwhile, the specific image data acquisition process comprises the following steps:
at the hovering point, the field-of-view area is adjusted by changing the camera lens direction through the gimbal;
the direction is changed so that the field-of-view area on the rotation plane of the target blade translates along a straight line within that plane;
when the target blade rotates into the field-of-view area, the camera acquires an image;
the images acquired at the hovering point comprise a first image and a second image, where the first image is acquired when the target blade is at a first position on its rotation track and the second image is acquired when the target blade is at a second position, the target blade reaching the second position after rotating 180 degrees from the first position in the rotation plane.
In the prior art the blades of a wind driven generator are generally installed with a certain elevation angle and cone angle; because of the cone angle, the rotation plane can be regarded as the plane perpendicular to the rotation axis of the blades, i.e. the plane of the circle traced by any point on a blade. When the gimbal drives the lens so that the field-of-view area translates along a straight line, a sufficiently large lens rotation sweeps out a strip-shaped photographable area on the rotation plane, so the target blade can enter the field of view at two positions: a first position and a second position, the blade reaching the second position after rotating 180 degrees from the first. To realize the translation of the field-of-view area along a straight line, the lens may rotate about a single axis on the gimbal, that axis being perpendicular to the line. The target blade is understood as the blade to be inspected among the three blades, and may be some or all of the blades of the wind driven generator. A sketch of the capture logic at a hovering point is given below.
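The capture logic at a hovering point might look like the sketch below. The `gimbal` and `camera` objects and the `blade_angle_deg()` callback (returning the target blade's current angle on its rotation plane, e.g. from onboard image recognition) are hypothetical interfaces, not part of the patent; the sketch only illustrates triggering the first image at the first position and the second image after a further 180 degrees of rotation.

```python
import time

def capture_first_and_second(gimbal, camera, blade_angle_deg,
                             first_angle_deg=0.0, tolerance_deg=3.0, timeout_s=120.0):
    """Wait for the target blade to pass the first position, capture the first
    image, then wait for the 180-degree-opposite position and capture the second."""
    def near(a, b):
        return abs((a - b + 180.0) % 360.0 - 180.0) <= tolerance_deg

    first_img, second_img = None, None
    deadline = time.monotonic() + timeout_s
    while second_img is None and time.monotonic() < deadline:
        angle = blade_angle_deg()
        if first_img is None and near(angle, first_angle_deg):
            gimbal.point_at_target_blade()     # hypothetical call: put the blade in the FOV
            first_img = camera.capture()       # hypothetical call: take one photo
        elif first_img is not None and near(angle, first_angle_deg + 180.0):
            gimbal.point_at_target_blade()
            second_img = camera.capture()
        time.sleep(0.05)                       # simple polling; a real system would use events
    return first_img, second_img
```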
Compared with the prior art, the technical characteristic of this scheme is that, with the unmanned aerial vehicle hovering and the lens direction adjusted through the gimbal, the captured images comprise the first image and the second image. With this scheme, not only can the blades be inspected without stopping their rotation, but from the points of view of coverage and image-feature identification the route planning is also simple.
Specifically, the scheme exploits the fact that a first image and a second image can be obtained at the hovering point by changing the camera lens direction through the gimbal. When the target blade has rotated 180 degrees relative to the first position, the surface exposed to the field of view changes: for example, when the hovering point lies above and in front of the blade, the first image clearly shows details of the top surface (leading edge) and the upper front side (upper windward side) of the target blade, while the second image clearly shows details of the bottom surface (trailing edge, relative to the top surface) and the lower front side (lower windward side, relative to the upper side). When the target blade is photographed at other relative positions, the same target area on the blade still appears in different field-of-view areas in the first and second images, so its shooting angle changes, the occluded and clearly imaged areas change correspondingly, and blade surface defects are reflected in the images in different ways; in particular, for wind-erosion defects, which arise easily on blades, oblique shooting more readily captures defect details with obvious contrast against the surrounding area. More blade-surface image feature information is therefore obtained, improving the defect recognition rate and the clarity of the defects.
In summary, this scheme obtains image information of different positions of the target blade at the same hovering point and extracts blade surface feature information at different shooting angles. Compared with prior-art practice during wind driven generator inspection, such as flying the unmanned aerial vehicle in a spiral around the target blade, having it follow and track the blade while the machine rotates, or rotating the gimbal to change the shooting area and cover blade segments at different radial positions, this scheme obtains relatively comprehensive and clear surface feature information of the target blade through a simple field-of-view adjustment. With fewer hovering points and a shorter shooting route, image information of the target area of the target blade and/or clear image feature information can be obtained, achieving comprehensive inspection and convenient route planning.
As a person skilled in the art will recognize, conventional inspection of an existing wind driven generator generally uses an image-acquisition form that covers the whole blade surface as far as possible in order to guarantee feature extraction; for any target blade the whole surface is divided into several areas and a single shot covers only one area. As a preferred mode, from the point of view of image quality, the camera therefore photographs the target blade several times at a single hovering point, for the blade at the first position and at the second position, each shot covering a different part of the blade surface; given the blade's shape, the areas are generally arranged in sequence along the blade length. The later use of the images, for example stitching them into a complete wind driven generator blade or selecting a single photograph for viewing, may specify the relative positions of two adjacent areas. The scheme proposed above differs from the prior art in that the images acquired at the hovering point comprise a first image and a second image whose shooting requirements are met by transforming the field-of-view area; it should therefore be understood that how many image acquisitions are performed on the target blade at the first or second position at a hovering point is within the contemplation of the present scheme.
Because the unmanned aerial vehicle hovering points are confirmed on the basis of the confirmed directions directly in front of and behind, besides the advantages of background rejection and image quality, shooting the first or second image from directly in front of or behind the wind driven generator, with as few hovering points as possible while covering the blade surface as comprehensively as possible, means the blades in the obtained photographs are blocked by the hub and nacelle as little as possible, which benefits the comprehensiveness of the inspection data.
Further technical schemes of the vision-based wind driven generator inspection method are as follows:
As one approach, the whole field of view at a hovering point can be fully used to collect images of every collectable area of the target blade under that hovering point, simplifying route planning and improving inspection efficiency; composing the first image and the second image from several images lets each single image reflect blade surface details more clearly; and defining the acquisition order of the first and second images guarantees acquisition efficiency. The technical scheme is: the first image and the second image each comprise a plurality of images;
among all images acquired by the camera at the hovering point:
all of the collectable area on the target blade at the first position is covered by the first images;
all of the collectable area on the target blade at the second position is covered by the second images;
during shooting, the second images are acquired only after all the first images have been acquired;
the first and second images are acquired as follows: the gimbal moves the field-of-view area along the radial direction of the target blade, and the images are collected in sequence from one end of the target blade to the other.
When this scheme is implemented, given the long, slender structure of the blade, the several images included in the first image and in the second image may correspond to areas arranged in sequence along the blade length. "All images acquired by the camera at the hovering point cover all the collectable areas" means the field of view available in the current hovering state of the aerial camera is used as fully as possible. Requiring the first images to be acquired before the second images keeps each single rotation of the gimbal small while still satisfying the shooting settings, which shortens the image-acquisition time; the acquisition modes defined above fix the acquisition order for the same purpose. In one specific implementation, image acquisition is completed as follows: after the unmanned aerial vehicle hovers, images of the target blade at the first position are acquired first, the gimbal driving the lens so that the field-of-view area moves along the radial direction of the blade and the first images are collected from blade root to blade tip or from blade tip to blade root; after the first images are completed, the gimbal drives the lens towards the target blade at the second position and all the second images are collected, likewise in order from root to tip or from tip to root. A person skilled in the art will note that when the target blade has a predetermined inspection range, the images included in the first and second images may cover that range. A sketch of this acquisition order is given below.
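A sketch of the acquisition order described above: all first images before all second images, with the field of view stepped along the blade's radial direction. The gimbal pitch angles and the `wait_for_position` helper are hypothetical placeholders, not part of the patent.

```python
def radial_sweep(gimbal, camera, pitch_steps_deg):
    """Step the field of view along the blade (root -> tip or tip -> root),
    taking one image per gimbal step."""
    images = []
    for pitch in pitch_steps_deg:
        gimbal.set_pitch(pitch)        # hypothetical gimbal call
        images.append(camera.capture())
    return images

def acquire_first_then_second(gimbal, camera, pitch_steps_deg, wait_for_position):
    """Finish every first image before starting the second images, so each single
    gimbal adjustment stays small.  wait_for_position('first'/'second') is a
    hypothetical helper that blocks until the target blade reaches that position."""
    wait_for_position('first')
    first_images = radial_sweep(gimbal, camera, pitch_steps_deg)
    wait_for_position('second')
    # Sweeping back in the opposite order is just one way of shortening gimbal travel.
    second_images = radial_sweep(gimbal, camera, list(reversed(pitch_steps_deg)))
    return first_images, second_images
```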
In order to obtain, through later image stitching, an integrated image or model that intuitively reflects the surface condition of every position on the target blade, the scheme is set as follows: the images corresponding to two adjacent acquisition areas in the first image have a set overlap rate;
the images corresponding to two adjacent acquisition areas in the second image have a set overlap rate. The overlap rate is used for stitching the related images; the stitching itself can use conventional feature recognition, and the specific overlap rate is set according to the camera performance, the requirements of the stitching software used, and so on. A sketch relating the overlap rate to the gimbal step is given below.
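One way to turn a set overlap rate into a gimbal step angle, assuming an approximately constant standoff distance to a locally flat blade segment; the 70 % overlap, 40-degree field of view and 25 m standoff in the usage note are illustrative values only, not taken from the patent.

```python
import math

def gimbal_step_for_overlap(fov_deg, standoff_m, overlap=0.7):
    """Angular step between adjacent shots that leaves the requested overlap
    between their footprints on the blade.

    footprint = 2 * d * tan(fov / 2); adjacent footprints may advance by
    (1 - overlap) * footprint, which is converted back into an angle."""
    footprint_m = 2.0 * standoff_m * math.tan(math.radians(fov_deg) / 2.0)
    advance_m = (1.0 - overlap) * footprint_m
    return math.degrees(math.atan2(advance_m, standoff_m))

# Example: a 40-degree field of view at 25 m standoff with 70 % overlap gives a
# footprint of about 18.2 m, an advance of about 5.5 m, and a step of roughly 12 degrees.
```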
When the wind driven generator operates, the leading edge of the blade is the surface that cuts the wind, and the leading edge and the windward side are the areas most seriously eroded by the wind. To reflect the image features of these areas more clearly, the scheme is set so that the hovering position of the unmanned aerial vehicle, or the position of the target blade when images are acquired, satisfies: when the camera acquires an image, the image target area on the target blade lies below the lens. In other words, the camera completes image acquisition from above the image-acquisition area. If the camera is above the front face of the blade, the areas with good surface-feature recognition quality are the leading edge and the upper windward side at the first position, and the trailing edge and the lower windward side at the second position; when the camera position is adjusted to above the back side of the blade, they are the leading edge and the upper leeward side at the first position, and the trailing edge and the lower leeward side at the second position, so a comprehensive blade surface image can be obtained from two points. In addition, shooting from top to bottom gives photographs whose background is the ground or the sea surface, with high contrast against the usually pure-white blades, making blade image extraction and recognition easier; shooting from bottom to top would give photographs with the sky as background, with little contrast against the blades, making extraction difficult, and direct sunlight could cause glare and overexposure, which is unfavorable to feature clarity. In implementation, for the first and second positions set out above, each involving several images, it is preferable that the target blade enters the field-of-view area and is photographed when it is level with the ground, so that the blade occupies more pixels in each photograph; at the same time the visual similarity of adjacent blade areas is high, the background is easy to identify and remove, and the image features of the blade are highly consistent, which effectively guarantees mapping quality when the three-dimensional blade surface is modeled later and thus benefits the judgement of surface defects. That is: the first image and the second image are images acquired when the target blade has rotated to a horizontal attitude; a simple horizontal-attitude check is sketched below. To obtain comprehensive blade inspection data, the hovering points comprise a first hovering point on the front side of the hub center line of the wind driven generator and a second hovering point on the rear side of the hub center line.
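A simple horizontal-attitude check that could trigger the capture, assuming the blade detector returns an axis-aligned bounding box; the aspect-ratio threshold is an illustrative value, not from the patent.

```python
def blade_is_horizontal(box, min_aspect_ratio=4.0):
    """box = (xmin, ymin, xmax, ymax) of the target blade in image pixels.
    A long, slender blade lying level with the ground produces a bounding box
    that is much wider than it is tall, so the width/height ratio serves as a
    crude horizontal-attitude trigger."""
    x0, y0, x1, y1 = box
    width, height = abs(x1 - x0), abs(y1 - y0)
    return height > 0 and (width / height) >= min_aspect_ratio
```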
The invention has the following beneficial effects:
In the above method for confirming the hovering points of the wind driven generator inspection unmanned aerial vehicle, the directions directly in front of and directly behind are confirmed using data inherent to, or directly obtainable from, the wind driven generator, assisted by judging the visual lengths of the blades in the images. The whole method can be completed with an existing aerial camera, the hovering-point confirmation needs only simple image-feature recognition and thus serves intelligent automatic inspection, and with a purely visual payload the current yaw angle of the wind driven generator is analyzed, by determining the directions directly in front of and/or behind it, without stopping the machine.
The annular route set above allows the method to be carried out without stopping the wind driven generator.
With the confirmed hovering points, the aerial camera always shoots areas below the hovering point during the inspection method provided above, i.e. from top to bottom, so the background of the photographs is the ground or the sea surface, with high contrast against the usually pure-white blades, making blade image extraction and recognition easier; shooting from bottom to top would give photographs with the sky as background, with little contrast against the blades, making extraction difficult, and direct sunlight could cause glare and overexposure, which is unfavorable to feature clarity.
Drawings
FIG. 1 is a schematic diagram, in one embodiment of the vision-based wind driven generator inspection unmanned aerial vehicle hovering-point confirmation method provided by this disclosure, reflecting the position of the unmanned aerial vehicle hovering point relative to the wind driven generator during inspection;
FIG. 2 is a schematic view of the shooting flow at a hovering point directly in front of the wind driven generator in one embodiment of the vision-based wind driven generator inspection method provided by this disclosure;
FIG. 3 is a schematic view of the shooting flow at a hovering point directly behind the wind driven generator in one embodiment of the vision-based wind driven generator inspection method provided by this disclosure;
FIG. 4 is a schematic view of the coverage areas on the target blade of the shooting flows shown in FIG. 2 and FIG. 3;
FIG. 5 is a schematic view of the blade areas covered by the images included in the first image and the second image in one embodiment of the vision-based wind driven generator inspection method provided by this disclosure;
FIG. 6 is a schematic view of the division of the exterior areas of a prior-art wind driven generator blade;
FIG. 7 is a schematic diagram explaining related technical terms in an implementation of the unmanned aerial vehicle hovering-point confirmation method provided in embodiment 11;
FIG. 8 shows the fluctuation curve in one embodiment;
FIG. 9 shows a correction curve in one embodiment.
The reference numerals in FIG. 6 are: 1, trailing edge; 2, leading edge; 3, windward side; 4, blade tip.
Detailed Description
The present invention will be described in further detail with reference to the following examples, but the present invention is not limited to the following examples:
Example 1:
As shown in FIGS. 1 to 6, in the vision-based method for confirming the hovering points of the wind driven generator inspection unmanned aerial vehicle, the hovering points of the unmanned aerial vehicle during inspection are confirmed to lie directly above the hub center line of the wind driven generator by means of the confirmed directions directly in front of and/or directly behind the wind driven generator;
the direction directly in front of and/or directly behind is confirmed by the following steps:
S1. An annular route for the unmanned aerial vehicle flight is confirmed according to the position coordinates of the wind driven generator and the height of the hub center line: the annular route lies outside the rotation track of the wind driven generator blades, lies in the horizontal plane at that height, and is centered on the position coordinates;
S2. The unmanned aerial vehicle photographs the blades of the wind driven generator while flying the annular route, on which a plurality of shooting points are set; at each shooting point the lens faces the center of the annular route, and every blade lies entirely within the photograph's imaging area;
S3. The visual length of each blade in each photograph is obtained, and for each photograph the difference between the maximum and minimum blade lengths is computed; the direction directly in front of or directly behind the wind driven generator is determined from these differences;
the shooting point corresponding to the photograph with the smallest difference lies directly in front of or directly behind the wind driven generator.
Example 2:
This example is further refined on the basis of example 1:
step S3 further includes a difference-optimization step, namely:
the minimum difference is divided by the maximum or minimum blade length, and the result is compared with a set judgement threshold:
when the result is less than or equal to the judgement threshold, the shooting point corresponding to the photograph with the smallest difference is confirmed as the shooting point directly in front of or directly behind the wind driven generator;
when the result is greater than the judgement threshold, the distribution of shooting points in step S2 is adjusted and a new result is obtained; the adjusted result is compared with the judgement threshold again, and only when a result less than or equal to the threshold is reached is the shooting point corresponding to the photograph with the smallest difference confirmed as the shooting point directly in front of or directly behind the wind driven generator.
Example 3:
This example is further refined on the basis of example 1:
the visual length of a blade is obtained by identifying the relative positions of its two ends in the photograph, specifically by the following steps:
S3-1. Identify the blades in the photograph and add a target bounding box to each blade;
S3-2. Confirm the blade root of each blade, specifically:
establish the minimum circumscribed rectangle of the midpoints of all target bounding boxes;
count how many vertices of each target bounding box lie inside the minimum circumscribed rectangle; if exactly one vertex of a target bounding box lies inside it, that vertex is the blade root of the blade corresponding to that bounding box;
if a target bounding box has two vertices inside the minimum circumscribed rectangle, form a polygon from each of the two candidate vertices together with the blade-root vertices of the other target bounding boxes, and take the candidate vertex that gives the polygon with the smallest area as the blade root of the blade corresponding to that bounding box;
S3-3. Obtain the visual length of each blade in each photograph from the confirmed blade root and blade tip 4 of the blade.
Example 4:
This example is further refined on the basis of example 3:
the midpoint of the nacelle end / the midpoint of the hub of the wind driven generator is confirmed from the blade roots of the blades determined in the photograph with the smallest difference. A minimal sketch follows.
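Because the blade roots of a three-blade rotor lie on a circle centered on the hub, the midpoint of the hub or nacelle end in the photograph can be estimated as the circumcenter of the three root points. A minimal sketch follows, with image-pixel coordinates assumed.

```python
def circumcenter(p1, p2, p3):
    """Center of the circle through three blade-root points."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-9:
        raise ValueError("blade roots are collinear; no unique circle")
    ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    return ux, uy

# Example: three roots 120 degrees apart on a circle of radius 10 around the origin
# circumcenter((0.0, 10.0), (8.66, -5.0), (-8.66, -5.0)) -> approximately (0.0, 0.0)
```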
Example 5:
This example is further refined on the basis of example 2:
in the difference-optimization step a fluctuation curve is established to reflect how the result of each photograph varies along the annular route; the fluctuation curve is filtered to obtain a correction curve, which is compared with the set judgement threshold; either the shooting point directly in front of or behind the wind driven generator is confirmed or a new fluctuation curve is acquired, until that shooting point is finally obtained.
Example 6:
On the basis of example 1, this embodiment provides a vision-based wind driven generator inspection method: wind driven generator blade image data are collected, using an aerial camera mounted on a gimbal, at the unmanned aerial vehicle hovering points confirmed by the hovering-point confirmation method of any one of examples 1 to 5;
The specific image data acquisition process comprises the following steps:
at the hover point, the gimbal changes the camera lens direction to adjust the field-of-view area;
the direction is changed as follows: the field-of-view area translates along a line parallel to the rotation plane of the target blade;
when the target blade rotates into the field-of-view area, the camera acquires an image;
the images acquired at the hover point include a first image and a second image: the first image is acquired by the camera when the target blade is at a first position on its rotation track, the second image is acquired when the target blade is at a second position, and the target blade at the first position reaches the second position by rotating 180 degrees in the rotation plane.
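The 180-degree relation between the two capture positions reduces to a simple angular test. The sketch below assumes the target blade's rotation angle is available (for example from onboard image tracking); the function name and the tolerance value are illustrative assumptions.

def should_capture(blade_angle_deg, first_position_deg, tol_deg=2.0):
    """True when the target blade's rotation angle is within tol_deg of the
    first position. The second position is the same test shifted by 180 deg:
    should_capture(angle, (first_position_deg + 180.0) % 360.0)."""
    diff = (blade_angle_deg - first_position_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= tol_deg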
Example 7:
This example is a further refinement of Example 6:
the first image and the second image each comprise a plurality of images;
among all images acquired by the camera at the hover point:
the entire collectable region of the target blade at the first position is covered by the first image;
the entire collectable region of the target blade at the second position is covered by the second image;
during shooting, the second images are acquired after all the first images have been acquired;
The first image and the second image are acquired as follows: the gimbal moves the field-of-view area along the radial direction of the target blade, and the images are collected in sequence from one end of the target blade to the other.
When the wind driven generator is in operation, each blade can be divided into four regions as seen from outside: the PS surface (windward surface 3), the SS surface (leeward surface, on the back side of windward surface 3), the leading edge 2 (wind cut-in side) and the trailing edge 1 (wind cut-out side), as shown in fig. 6. In operation the leading edge 2 is the surface that cuts into the wind, so the leading edge 2 and the windward surface 3 are generally the regions with the most severe wind erosion. To reflect the image features of these regions more clearly, the hover position of the unmanned aerial vehicle is set so that, when the camera acquires images, the target blade is located below the lens; that is, image acquisition is performed from above the acquisition area.
If the camera is positioned above the front face of the blade, the regions whose surface features are captured with good quality are the leading edge 2 and the upper side of the windward surface 3 at the first position, and the trailing edge 1 and the lower side of the windward surface 3 at the second position. If the camera is then positioned above the back face of the blade, the corresponding regions are the leading edge 2 and the upper side of the leeward surface at the first position, and the trailing edge 1 and the lower side of the leeward surface at the second position. A complete set of blade surface images can therefore be obtained from the two points.
In addition, shooting from top to bottom places the ground or the sea surface in the background of the photo, giving high contrast with the usually pure-white blade and making blade image extraction and recognition easier. If the point were shot from bottom to top, the photo would have the sky as background, with low contrast against the blade, making blade image extraction difficult; direct sunlight could also cause glare and overexposure, which is detrimental to feature definition in the image.
In implementation, for the first and second positions described above, where each position involves multiple images, it is preferable to shoot when the target blade has entered the field-of-view area and is level with the ground. This maximizes the number of pixels occupied by the blade in each photo, and the high visual similarity of adjacent blade regions makes the background easy to identify and remove. The resulting blade images have high feature consistency, which effectively guarantees mapping quality and helps in judging blade surface defects during later three-dimensional modelling of the blade surface.
Example 8:
This example is a further refinement of Example 6:
The images corresponding to two adjacent acquisition areas in the first image have a set overlapping rate;
The images corresponding to the two adjacent acquisition areas in the second image have a set overlapping rate.
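The set overlap rate determines how many field-of-view areas are needed along the blade. The sketch below assumes a known per-image footprint along the blade and a 30% default overlap; both values and the function name are illustrative assumptions.

import math

def acquisition_segments(blade_length_m, footprint_m, overlap=0.3):
    """footprint_m: along-blade extent covered by one image at the working
    distance. Returns how many field-of-view areas are needed so that
    adjacent images overlap by at least the set rate."""
    step = footprint_m * (1.0 - overlap)
    return max(1, math.ceil((blade_length_m - footprint_m) / step) + 1)

# Example: a 60 m blade with 9 m of blade per image and 30% overlap needs
# acquisition_segments(60, 9, 0.3) == 10 areas.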
Example 9:
This example is a further refinement of Example 6:
The hover position of the unmanned aerial vehicle, or the position of the target blade when its images are acquired, satisfies: when the camera acquires images, the image target area on the target blade is located below the lens.
Example 10:
This example is a further refinement of Example 9:
The first image and the second image are images acquired when the target blade rotates to a horizontal posture;
The hover points include a first hover point located forward of the wind turbine hub centerline and a second hover point located rearward of the wind turbine hub centerline.
Example 11:
Following the same concept as the above embodiments, this embodiment provides a specific implementation of the unmanned aerial vehicle hover-point confirmation method.
It mainly comprises the following steps:
Step 1: The drone is controlled to fly to hub centerline height H and to fly around the wind driven generator's centerline along an annular course of radius r, as shown in fig. 1.
Step 2: in the surrounding flight process, the right front A0 and the right rear B0 of the center of the wind wheel hub are visually identified, and the specific steps are as follows:
Step 2-1: identifying the blades, adding target bounding boxes to the blades, the midpoints of each target bounding box and the 4 vertices of each target bounding box may be found, as shown by the rectangular boxes surrounding each blade and the midpoints in these rectangular boxes shown in FIG. 7.
Step 2-2: the root of each blade is identified.
The method is as follows. Establish the minimum circumscribed rectangle (hereinafter simply the circumscribed rectangle) from the midpoints of the three target bounding boxes, as shown by the rectangle built from the three midpoints in fig. 7. If only one vertex of a target bounding box lies inside the circumscribed rectangle, that vertex is the blade root of the corresponding blade (in fig. 7, the vertices of the bounding boxes of the upper-left and lower-left blades that lie inside the circumscribed rectangle are the roots of those blades). If two vertices of a target bounding box lie inside the circumscribed rectangle (as for the upper-right blade in fig. 7), each of those vertices is combined with the blade-root vertices of the other two bounding boxes to form a triangle, and the vertex of the triangle with the smallest area is taken as the blade root of that blade (i.e., the two in-rectangle vertices of the upper-right blade's bounding box each form a triangle with the roots of the upper-left and lower-left blades, and the vertex giving the smaller triangle area is the root of the upper-right blade).
In fig. 7, the reference numerals (starting from 5) denote, in order: the target bounding box of the upper-left blade, the target bounding box of the lower-left blade, the target bounding box of the upper-right blade, the circumscribed rectangle, and the hub center.
As shown in fig. 7, the hub center coordinates can be obtained from the three blade-root coordinates (marked by dots in fig. 7), for example by averaging, using their known relationship to the hub center and nacelle center. Similarly, if the viewpoint is located right behind the wind driven generator, the nacelle center coordinates can be obtained.
Step 2-3: the blade length is determined.
The method is as follows: the distance from the target bounding box vertex determined as the blade root to the vertex determined as the blade tip 4 (typically the diagonally opposite vertex of the bounding box) is taken as the blade length.
Step 2-4: for each shot on the circular course, a wave curve can be obtained from (longest blade length-shortest blade length)/shortest blade length, as shown in fig. 8.
Step 2-5: filtering the above-mentioned fluctuation curve results in a filtered curve, as shown in fig. 9.
Step 2-6: the trough is selected to be the right front or the right rear of the wind driven generator, and the position is judged to be the right front or the right rear if the trough is less than 0.08. (it should be noted that, in fig. 8 and fig. 9, different longitudinal positions are used for different ratios of the pointers, different transverse positions are used for different shooting times of the pointers, and the longer wave trough is because the unmanned aerial vehicle hovers at the position for one time, when a plurality of values smaller than 0.08 are provided according to different shooting point distribution setting modes on the annular navigation line, it is indicated that a plurality of shooting points meeting the set condition can be used for determining to be located right in front of or right behind the wind driven generator, but the shooting point corresponding to the smallest value should be selected as the shooting position located right in front of or right behind the wind driven generator).
Further, once the front or rear position is obtained, whether the unmanned aerial vehicle is in front can be judged by recognizing the hub in the image, since the hub is mounted on the front of the wind driven generator.
Step 3: The unmanned aerial vehicle is controlled to rise vertically by a distance h from position A0 and hover at point A to shoot the front image of the wind driven generator; after shooting is completed it flies to point B to shoot the rear image, where B is the point obtained by raising position B0 vertically by the same distance h, and h can be set manually. Alternatively, the unmanned aerial vehicle can be controlled to shoot at B first and then at A.
Example 12:
This embodiment provides a specific implementation of the inspection method:
When the unmanned aerial vehicle hovers at point A shown in fig. 1, the first and second shooting flows are as shown in the left and right diagrams of fig. 2, giving the first and second shooting-flow coverage areas indicated by arrows in fig. 4; when it hovers at point B, the shooting flows are as shown in the left and right diagrams of fig. 3, giving the third and fourth shooting-flow coverage areas indicated by arrows in fig. 4.
Taking the first shooting flow as an illustration, fig. 2 shows that blade image acquisition is divided into several acquisition segments along the length of the blade: segment 1 is closest to the turbine center, segment 2 is the next one outward, and so on up to segment n. With a three-bladed wind driven generator, once all three blades have been captured in the field-of-view area corresponding to segment 1 during one rotation, the gimbal rotates and refocuses the camera onto the field-of-view area of segment 2 and captures the three blades passing through it, and so on until the three blades have been captured in the area of segment n; the gimbal then rotates the camera to carry out the second shooting flow. The unmanned aerial vehicle then flies to point B and acquisition proceeds in the same way until the whole shooting process is completed. A control-flow sketch of one shooting flow is given below.
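The nested structure of one shooting flow can be sketched as follows; the two callbacks stand in for gimbal and camera commands and are assumptions, not a real drone API.

def run_shooting_flow(n_segments, aim_at_segment, wait_and_capture, n_blades=3):
    """Hold the field of view on segment 1 (nearest the hub) until all blades
    have swept through it, then step outward segment by segment up to n."""
    for segment in range(1, n_segments + 1):
        aim_at_segment(segment)          # gimbal rotates and refocuses
        for _ in range(n_blades):
            wait_and_capture(segment)    # snap each blade as it passes through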
Example 13:
This embodiment provides a specific implementation of the inspection method on the basis of Example 12:
The specific inspection process is as follows:
step 1: the blade is divided into n acquisition segments, A, B, C, D, E respectively, as shown in fig. 5. Adjacent acquisition segments may partially overlap. Ensuring that each part of the blade is acquired and has a corresponding image overlapping rate.
Step 2: through manual control or unmanned aerial vehicle fly to the point position A shown in FIG. 1 automatically to hover at the point position A, keep motionless with unmanned aerial vehicle camera lens alignment blade's A position through the cloud platform, when the blade rotates through A position, the camera takes a candid photograph. The fan blade rotates once, and the camera can sequentially capture the image information of the A positions of the blade 1, the blade 2 and the blade 3. The unmanned aerial vehicle camera has automatic quick focusing function, and when the blade rotates to the A position, automatic quick focusing is accomplished and image acquisition is accomplished.
Step 3: after the position A of the three blades is collected, the lens is aligned to the position B of the blades through rotation of the cradle head, image collection … is performed on the position B of the three blades until the image information of the three blades is collected at the point position A, the images are first images, and then the second images of the three blades are collected at the point position A.
Step 4: through manual control or unmanned aerial vehicle automatic fly to the point position B shown in FIG. 1 and hover at the point position B, repeat above-mentioned step 2 and step 3, the image information of three blades when gathering the point position B, these images include first image and second image, carry out the second image acquisition after having gathered the first image at point position B.
And 5, synthesizing a longer complete fan blade photo according to the acquired image information.
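The embodiment does not prescribe a particular synthesis method; one readily available option is OpenCV's generic scan stitcher, shown below as an illustrative choice. A purpose-built mosaic along the known blade axis may be more robust on low-texture white blades.

import cv2

def stitch_blade_images(image_paths):
    """Merge the overlapping per-segment photos of one blade into a single
    long blade image."""
    images = [cv2.imread(p) for p in image_paths]
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, mosaic = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return mosaic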
The foregoing is a further detailed description of the invention in connection with specific preferred embodiments, and it is not intended that the invention be limited to these descriptions. Other embodiments of the invention, which are apparent to those skilled in the art to which the invention pertains without departing from its technical scope, shall be covered by the protection scope of the invention.

Claims (9)

1. A vision-based method for confirming the hover point of a wind driven generator inspection unmanned aerial vehicle, characterized in that the hover point of the unmanned aerial vehicle during inspection is confirmed as a point positioned right above the center line of the hub of the wind driven generator, by means of the confirmed position right in front of and/or right behind the wind driven generator;
the method for confirming the right front and/or the right rear comprises the following steps:
S1, confirming an annular route for unmanned aerial vehicle flight according to the position coordinates of the wind driven generator and the height of the hub center line: the annular route lies outside the rotation track of the wind driven generator blades, lies in the horizontal plane at that height, and has the position coordinates on its central axis;
S2, the unmanned aerial vehicle shoots the blades of the wind driven generator in the flight process of the annular route, the annular route is provided with a plurality of shooting points, and each blade is integrally positioned in a photo imaging area of each shooting point;
s3, obtaining the length of each blade in each photo under the visual state, and determining the right front or right rear of the wind driven generator according to the difference value of the maximum blade length and the minimum blade length for each photo;
The shooting point corresponding to the photo with the smallest difference value is positioned right in front of or right behind the wind driven generator;
the length of the blade in the visual state is obtained by identifying the relative positions of the two ends of the blade in the photo, specifically:
S3-1, identifying the blades in the photo and adding a target bounding box to each blade;
S3-2, confirming the blade root of each blade, specifically:
establishing the minimum circumscribed rectangle associated with all midpoints according to the midpoint of each target bounding box;
judging how many vertices of each target bounding box lie inside the minimum circumscribed rectangle; if only one vertex of a target bounding box lies inside the minimum circumscribed rectangle, that vertex is the blade root of the blade corresponding to the target bounding box;
if a target bounding box has two vertices inside the minimum circumscribed rectangle, forming a polygon from each of the two vertices together with the blade-root vertices of the other target bounding boxes, and taking the vertex that yields the polygon of minimum area as the blade root of the blade corresponding to that target bounding box;
S3-3, obtaining the length of each blade in each photo in the visual state according to the confirmed blade root and blade tip of each blade.
2. The vision-based wind turbine inspection unmanned aerial vehicle hover point confirmation method of claim 1, wherein,
In the step S3, a difference optimizing step is further included, where the difference optimizing step is:
Dividing the minimum difference value by the maximum blade length or the minimum blade length, and comparing the obtained result with a set judging threshold value:
when the result is smaller than or equal to the judging threshold value, confirming that the shooting point corresponding to the photo with the smallest difference value is located at the shooting point right in front of or right behind the wind driven generator;
and when the result is greater than the judgment threshold, adjusting the shooting-point distribution in step S2 to obtain a new result and comparing the adjusted result with the judgment threshold again, until a result smaller than or equal to the judgment threshold is obtained, and then confirming the shooting point corresponding to the photo with the smallest difference value as the shooting point right in front of or right behind the wind driven generator.
3. The vision-based wind turbine inspection unmanned aerial vehicle hover point confirmation method of claim 1 or 2, wherein each shooting point lens faces the center of the circular course.
4. The vision-based wind driven generator inspection unmanned aerial vehicle hover-point confirmation method according to claim 2, wherein a fluctuation curve is established in the difference optimization step, the fluctuation curve reflecting how the result of each photo varies along the annular route; the fluctuation curve is filtered to obtain a correction curve, the correction curve is compared with the set judgment threshold, and the shooting point right in front of or right behind the wind driven generator is confirmed or the fluctuation curve is reacquired, so that the shooting point right in front of or right behind the wind driven generator is finally obtained.
5. A vision-based inspection method for a wind turbine, characterized in that, at the unmanned aerial vehicle hover point confirmed by the hover-point confirmation method according to any one of claims 1 to 4, image data of the wind turbine blades is collected using an aerial camera mounted on a gimbal;
wherein the specific image data acquisition process comprises the following steps:
at the hover point, the gimbal changes the camera lens direction to adjust the field-of-view area;
the direction is changed as follows: the field-of-view area translates along a line parallel to the rotation plane of the target blade;
when the target blade rotates into the field-of-view area, the camera acquires an image;
the images acquired at the hover point include a first image and a second image, wherein the first image is an image acquired by the camera when the target blade is at a first position on its rotation track, the second image is an image acquired by the camera when the target blade is at a second position, and the target blade at the first position reaches the second position by rotating 180 degrees in the rotation plane.
6. The vision-based wind turbine inspection method of claim 5, wherein the first image and the second image each comprise a plurality of images;
among all images acquired by the camera at the hover point:
the entire collectable region of the target blade at the first position is covered by the first image;
the entire collectable region of the target blade at the second position is covered by the second image;
during shooting, the second images are acquired after all the first images have been acquired;
the first image and the second image are acquired as follows: the gimbal moves the field-of-view area along the radial direction of the target blade, and the images are collected in sequence from one end of the target blade to the other.
7. The vision-based wind turbine inspection method of claim 5 or 6, wherein images corresponding to two adjacent acquisition areas in the first image have a set overlapping rate;
The images corresponding to the two adjacent acquisition areas in the second image have a set overlapping rate.
8. The vision-based wind turbine inspection method of claim 5, wherein the hover position of the unmanned aerial vehicle, or the position of the target blade when its images are acquired, satisfies: when the camera acquires images, the image target area on the target blade is located below the lens.
9. The vision-based wind turbine inspection method of claim 8, wherein the first and second images are images acquired when the target blade is rotated to a horizontal attitude;
the hover points include a first hover point located forward of the wind turbine hub centerline and a second hover point located rearward of the wind turbine hub centerline.
CN202211507946.8A 2022-11-29 2022-11-29 Vision-based wind driven generator inspection unmanned aerial vehicle hovering point confirmation and inspection method Active CN115586789B (en)

GR01 Patent grant