
CN114136982B - Underwater robot and camera protection shell detection method thereof - Google Patents


Info

Publication number
CN114136982B
Authority
CN
China
Prior art keywords
image
protective shell
acquiring
detecting
protective
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111429254.1A
Other languages
Chinese (zh)
Other versions
CN114136982A (en)
Inventor
王张宁
宋昱
孟强祥
邹志勤
郭海娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Ningyuan Precision Machinery Manufacturing Co ltd
Original Assignee
Suzhou Ningyuan Precision Machinery Manufacturing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Ningyuan Precision Machinery Manufacturing Co ltd filed Critical Suzhou Ningyuan Precision Machinery Manufacturing Co ltd
Priority to CN202111429254.1A priority Critical patent/CN114136982B/en
Publication of CN114136982A publication Critical patent/CN114136982A/en
Application granted granted Critical
Publication of CN114136982B publication Critical patent/CN114136982B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B63 SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63C LAUNCHING, HAULING-OUT, OR DRY-DOCKING OF VESSELS; LIFE-SAVING IN WATER; EQUIPMENT FOR DWELLING OR WORKING UNDER WATER; MEANS FOR SALVAGING OR SEARCHING FOR UNDERWATER OBJECTS
    • B63C11/00 Equipment for dwelling or working underwater; Means for searching for underwater objects
    • B63C11/52 Tools specially adapted for working underwater, not otherwise provided for
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/94 Investigating contamination, e.g. dust
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854 Grading and classifying of flaws
    • G01N2021/8858 Flaw counting
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Chemical & Material Sciences (AREA)
  • Pathology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Quality & Reliability (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Ocean & Marine Engineering (AREA)
  • Studio Devices (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract


This specification discloses a detection method and detection device for the protective shell of an underwater robot camera, where the protective shell is arc-shaped. The detection method includes the following steps: acquiring a first image of the protective shell; performing image enhancement processing on the first image to obtain a second image; obtaining the number of closed regions and/or the defect coefficient of the second image; judging whether the number of closed regions exceeds a first threshold and/or whether the defect coefficient exceeds a second threshold; if the number of closed regions exceeds the first threshold and/or the defect coefficient exceeds the second threshold, the protective shell does not meet the requirements; otherwise, the protective shell meets the requirements. The detection method and detection device can assess the cleanliness and integrity of the arc-shaped protective shell.

Description

Underwater robot and method for detecting its camera protective shell
Technical Field
The specification relates to the technical field of underwater exploration, in particular to a detection method and a detection device for an underwater robot camera protective shell.
Background
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
In the field of underwater robotic exploration, cameras are a vital component of an underwater robot or other monitoring system. The main technical means of underwater exploration is to shoot images or video information through a camera.
However, due to the complex underwater environment, cameras installed on underwater robots or other monitoring systems are prone to contamination. Therefore, the outside of the camera is generally provided with a protective shell, so that the damage to the camera caused by other factors such as physics or environment is avoided.
Many protective shells are made of hemispherical transparent material, which lets the camera capture a 180-degree external field of view. However, the external environment is complex; in particular, equipment such as underwater robots works in water for long periods, so not only does sediment accumulate on the outside of the protective shell, but other aquatic organisms also adhere to it. Therefore, to ensure that the camera works properly, the cleanliness and integrity of the protective shell must be detected.
It should be noted that the foregoing description of the technical background is only for the purpose of facilitating a clear and complete description of the technical solutions of the present specification and for the convenience of understanding by those skilled in the art. The above-described solutions are not considered to be known to the person skilled in the art simply because they are set forth in the background section of the present description.
Disclosure of Invention
In view of the shortcomings of the prior art, the purpose of this specification is to provide a detection method and a detection device for an underwater robot camera protective shell that can detect the cleanliness of an arc-shaped protective shell and whether it is intact.
In order to achieve the above object, an embodiment of the present disclosure provides a method for detecting a protective case of an underwater robot camera, where the protective case is arc-shaped, the method comprising the steps of:
Acquiring a first image of the protective shell;
Performing image enhancement processing on the first image to obtain a second image;
acquiring the number of closed areas and/or defect coefficients of the second image;
Judging whether the number of the closed areas exceeds a first threshold value and/or judging whether the defect coefficient exceeds a second threshold value;
if the number of the closed areas exceeds the first threshold value and/or the defect coefficient exceeds the second threshold value, the protective shell does not meet the requirements;
Otherwise, the protective shell meets the requirements.
As a preferred embodiment, the protective shell is hemispherical, and the diameter of the protective shell is 20 cm-200 cm.
As a preferred embodiment, the protective shell has a detection area comprising a plurality of sub-areas;
in the step of acquiring the first images of the protective shell, simultaneously acquiring the first images of the sub-areas;
In the step of performing image enhancement processing on the first image, image enhancement processing is performed on the first image of each sub-region in parallel.
As a preferred embodiment, the detection area comprises four rectangular sub-areas, each of which has an equal width and an equal length.
As a preferred embodiment, the image enhancement processing includes correction, filtering, quantization, white balancing.
As a preferred embodiment, the step of performing image enhancement processing on the first image to obtain a second image includes:
Processing the first image by using a correction algorithm to obtain a corrected image;
And carrying out interpolation processing on the gaps of the corrected image to obtain the second image.
As a preferred embodiment, the step of acquiring the number of closed areas of the second image includes:
detecting an edge of the second image;
detecting the closeness of the edge and simultaneously acquiring the number of the closed areas.
As a preferred embodiment, the step of detecting the edge of the second image includes:
calculating a gradient of the second image;
processing the second image using a template matrix;
And calculating the final gradient amplitude and gradient direction.
As a preferred embodiment, the step of acquiring the defect coefficients of the second image includes:
obtaining the directional derivatives of all pixel points of the bright-field image;
obtaining the directional derivatives of all pixel points of the dark-field image;
calculating the defect coefficient of the second image from the two derivative fields.
An embodiment of this specification also provides a detection device for the protective shell, where the protective shell is arc-shaped; the detection device includes:
the acquisition module is used for acquiring a first image of the protective shell;
The processing module is used for carrying out image enhancement processing on the first image to obtain a second image;
the acquisition module is used for acquiring the number of closed areas and/or defect coefficients of the second image;
the judging module is used for judging whether the number of the closed areas exceeds a first threshold value and/or judging whether the defect coefficient exceeds a second threshold value, if the number of the closed areas exceeds the first threshold value and/or the defect coefficient exceeds the second threshold value, the protective shell does not meet the requirements, otherwise, the protective shell meets the requirements;
The processing module is respectively and electrically connected with the acquisition module and the acquisition module, and the judging module is electrically connected with the acquisition module.
An embodiment of this specification also provides an underwater robot comprising a camera, a protective shell for protecting the camera, and a detection device for detecting contamination of the protective shell, the detection device comprising:
the acquisition module is used for acquiring a first image of the protective shell;
The processing module is used for carrying out image enhancement processing on the first image to obtain a second image;
the acquisition module is used for acquiring the number of closed areas and/or defect coefficients of the second image;
the judging module is used for judging whether the number of the closed areas exceeds a first threshold value and/or judging whether the defect coefficient exceeds a second threshold value; if the number of the closed areas exceeds the first threshold value and/or the defect coefficient exceeds the second threshold value, the protective shell does not meet the requirements, otherwise the protective shell meets the requirements;
The processing module is respectively and electrically connected with the acquisition module and the acquisition module, and the judging module is electrically connected with the acquisition module.
The detection method and detection device for the protective shell have the following beneficial effects. After the first image of the protective shell is acquired, image enhancement processing is applied to obtain a second image that is closer to the actual curved-surface condition of the protective shell. The number of closed regions and/or the defect coefficient of the second image is then obtained. The number of closed regions reflects the amount of attachments on the protective shell: the more closed regions, the more attachments there are, the less clean the shell is, and the more it needs to be cleaned or replaced. The defect coefficient reflects the amount of defects on the protective shell: the larger the defect coefficient, the more defects there are, and the more the shell needs to be replaced or repaired. It is then judged whether the number of closed regions exceeds a first threshold and/or whether the defect coefficient exceeds a second threshold; if either is exceeded, the protective shell is unqualified, otherwise it is qualified. The detection method and detection device provided by this embodiment can therefore detect the cleanliness and integrity of the arc-shaped protective shell, ensuring that the camera inside it works normally.
Specific embodiments of the present description have been disclosed in detail with reference to the following description and drawings, indicating the manner in which the principles of the present description may be employed. It should be understood that the embodiments of the present description are not limited in scope thereby.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments in combination with or instead of the features of the other embodiments.
It should be emphasized that the term "comprises/comprising" when used herein is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps or components.
Drawings
In order to more clearly illustrate the embodiments of the present description or the solutions of the prior art, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present description, and that other drawings may be obtained according to these drawings without inventive faculty for a person skilled in the art.
Fig. 1 is a flowchart of steps of a method for detecting a protective case according to the present embodiment;
fig. 2 is a schematic structural view of a protective case according to the present embodiment;
fig. 3 is a schematic structural diagram of a detection area of a protective shell according to the present embodiment;
Fig. 4 is a flowchart showing steps of performing image enhancement processing on a first image in the present embodiment;
FIG. 5 is a graph comparing the processing efficiency of the prior art and the present embodiment when processing different deformation levels;
fig. 6 is a schematic diagram of a closure detection flow provided in the present embodiment;
Fig. 7 is a schematic structural diagram of a light assembly according to the present embodiment;
fig. 8 is a schematic structural diagram of a detection device of a protective case according to the present embodiment;
Fig. 9 is a schematic structural view of a cleaning device of an underwater robot camera assembly provided in the present embodiment;
fig. 10 is a schematic structural view of an underwater robot according to the present embodiment;
FIG. 11 is a front view of the front end of the underwater robot of FIG. 2;
FIG. 12 is a front view of the front end of the underwater robot showing the direction of rotation of the shield;
Fig. 13 is an enlarged view of the outer surface of the protective cover.
Reference numerals illustrate:
10. protective shell; 60. acquisition module; 2. processing module; 3. acquisition module; 40. judgment module; 5. control piece; 6. concave lens; 7. convex lens; 8. baffle;
C1, C2, C3, C4 sub-regions.
Detailed Description
In order to make the technical solutions in the present specification better understood by those skilled in the art, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is obvious that the described embodiments are only some embodiments of the present specification, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are intended to be within the scope of the present disclosure.
It will be understood that when an element is referred to as being "disposed on" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical," "horizontal," "left," "right," and the like are used herein for illustrative purposes only and are not meant to be the only embodiment.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this specification belongs. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the description. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Please refer to fig. 1. This embodiment provides a method for detecting the underwater robot camera protective shell 10, where the protective shell 10 is arc-shaped. The detection method comprises the following steps:
step S10, collecting a first image of the protective shell 10;
step S20, performing image enhancement processing on the first image to obtain a second image;
step S30, acquiring the number of closed areas and/or defect coefficients of the second image;
step S40, judging whether the number of the closed areas exceeds a first threshold value and/or judging whether the defect coefficient exceeds a second threshold value;
step S50, if the number of the closed areas exceeds a first threshold value and/or the defect coefficient exceeds a second threshold value, the protective shell 10 does not meet the requirement;
step S60, if not, the protective shell 10 meets the requirements.
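The decision logic of steps S40–S60 can be sketched as follows. This is a minimal illustration only; the function name and the placeholder threshold values are assumptions, not taken from the patent:

```python
# Illustrative sketch of steps S40-S60. Threshold values are placeholders,
# not values specified by the patent.

def shell_meets_requirements(num_closed_regions, defect_coef,
                             first_threshold=50, second_threshold=0.35):
    """Return True if the protective shell passes, per steps S40-S60:
    it fails when the closed-region count exceeds the first threshold
    and/or the defect coefficient exceeds the second threshold."""
    if num_closed_regions > first_threshold or defect_coef > second_threshold:
        return False  # step S50: shell needs cleaning, repair, or replacement
    return True       # step S60: shell meets the requirements
```

Either condition alone is sufficient to fail the shell, matching the "and/or" wording of the claims.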
Because the protective housing 10 is arc-shaped, the detection method of the protective housing 10 provided in the embodiment performs image enhancement processing on the acquired first image, so that the acquired second image is closer to the actual curved surface condition.
In step S30, the number of closed regions of the second image, the defect coefficient of the second image, or both may be obtained. The number of closed regions reflects the amount of attachments on the protective case 10, such as adhered sediment or particulate matter. The greater the number of closed regions, the more attachments there are, the less clean the protective case 10 is, and the more it fails the requirements and needs to be cleaned or replaced. The defect coefficient reflects the amount of defects on the protective case 10, such as cracks caused by physical collision or flaws in the protective case 10 itself. The larger the defect coefficient, the more defects there are, and the more the protective case 10 fails the requirements and needs to be replaced or repaired.
The detection method of the protective housing 10 provided in this embodiment can detect the cleanliness of the arc-shaped protective housing 10 or whether the arc-shaped protective housing is intact, so as to ensure that the camera in the protective housing 10 works normally.
In step S10, a camera or other acquisition module 60 may be selected to acquire images of the protective case 10. Preferably, as shown in fig. 2, the protective shell 10 is hemispherical. The material and the size of the protective case 10 are not limited in this embodiment, and may be transparent glass. The area of the protective case 10 should not be excessively large in consideration of the performance and processing time of the module performing each step, the mechanical properties, and the like. In one embodiment, the protective shell 10 may be provided with a diameter of 20 cm to 200 cm.
In the present embodiment, the protective case 10 has a detection area, and the first image is an image corresponding to the detection area. Because the protective case 10 is arc-shaped, the first images collected at different positions have different angles; if the number of closed regions and/or the defect coefficient were obtained directly from the first image, the result would deviate from the actual curved-surface condition and the data would be inaccurate, which is why step S20 is needed. The detection area may also be referred to as the OIA (Object Image Area).
Specifically, in order to make the detection method faster, the detection area may include a plurality of sub-areas, and in step S10, the first images of the sub-areas are acquired simultaneously, and in step S20, the first images of the sub-areas are subjected to image enhancement processing in parallel simultaneously.
In this embodiment, the division of the subareas may be performed according to the difference between the distance and the angle from the arc surface of the protective case 10 to the acquisition module 60. Preferably, as shown in fig. 3, the detection area may include four rectangular sub-areas C1, C2, C3, C4, where the algorithm performance is optimal. Each sub-region is equal in width and equal in length.
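The four-sub-region parallel enhancement can be sketched as follows. This is an illustrative Python sketch: the quadrant layout of the four equal rectangles and the stand-in `enhance` normalization are assumptions, since the patent does not specify the sub-region layout or the enhancement kernel:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def split_quadrants(img):
    """Split an image (H x W array, H and W even) into four equal
    rectangular sub-regions C1..C4 (quadrant layout assumed here)."""
    h2, w2 = img.shape[0] // 2, img.shape[1] // 2
    return [img[:h2, :w2], img[:h2, w2:], img[h2:, :w2], img[h2:, w2:]]

def enhance(sub):
    # Stand-in for the per-sub-region enhancement of step S20.
    return sub.astype(np.float32) / 255.0

def enhance_parallel(img):
    """Enhance the four sub-regions in parallel and reassemble them."""
    subs = split_quadrants(img)
    with ThreadPoolExecutor(max_workers=4) as pool:
        c1, c2, c3, c4 = pool.map(enhance, subs)
    return np.vstack([np.hstack([c1, c2]), np.hstack([c3, c4])])
```

A thread pool stands in for whatever parallel hardware path the patent's buffer pipeline uses; the point is only that the four sub-images are processed concurrently and then reassembled.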
In step S20, the image enhancement processing may include correction, filtering, quantization, white balancing, and the like. The step S20 may specifically include processing the first image with a correction algorithm to obtain a corrected image, and performing interpolation processing on a gap of the corrected image to obtain a second image. The interpolation process can ensure the signal continuity requirements of the subsequent algorithm.
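The gap interpolation after correction can be illustrated with a simple linear fill. The patent does not specify the interpolation kernel, so the use of `np.interp` here is an assumption:

```python
import numpy as np

def fill_gaps(row, valid):
    """Linearly interpolate missing samples in one image row.

    `row` is a 1-D float array; `valid` is a boolean mask marking the
    pixels that survived the geometric correction. Interpolating the
    gaps restores the signal continuity required by later algorithms.
    """
    x = np.arange(row.size)
    return np.interp(x, x[valid], row[valid])
```

Applying this row by row (or along both axes) yields a gap-free corrected image ready for edge detection.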
As shown in fig. 4, which is one embodiment of step S20, the input first image passes through the image buffer S103 and pixel interpolation S104 to output the second image. Address management S101 records the memory address information corresponding to each sub-region, buffer writing S102 performs the write operation, and buffer reading S105 transmits the processed data back to the image buffer S103. The image buffer S103 comprises 4 buffers, storing the data of the 4 sub-images respectively. Pixel interpolation S104 performs interpolation before image output. The sub-image correction processing module S106 may be used for image enhancement, so that the resulting second image is closer to the actual condition of the arc-shaped protective case 10.
In this embodiment, the acquired first image may be converted into a 12-bit, 16-bit, or 20-bit image to accommodate a variety of image compression formats. Enhancing the 12-, 16-, and 20-bit first-image data over the four sub-regions in parallel markedly improves the correction speed, and is particularly effective for image-deformation processing. As shown in fig. 5, the left graph plots the average number of pixels processed per unit time when the normal flow handles different deformation levels, and the right graph plots the same quantity when the four sub-regions are processed in parallel as in this embodiment. The abscissa is the image deformation coefficient (the larger the coefficient, the greater the deformation); the ordinate is the average number of pixels processed per unit time (the larger the number, the higher the efficiency). The comparison shows that this embodiment is markedly faster, and the larger the deformation coefficient, the greater the speed-up.
In step S30, the step of acquiring the number of closed areas of the second image may specifically include detecting edges of the second image, detecting closeness of the edges, and simultaneously acquiring the number of closed areas.
Detecting the edges of the second image enhances the attachments and makes the image's particulates more conspicuous, so that foreign matter or particulate matter can be identified. Edge detection uses a first-order method, introducing a gradient operator to compute the central difference of the second image's pixels.
In this embodiment, the step of detecting the edge of the second image may specifically include calculating a gradient of the second image, processing the second image using a template matrix, and calculating a final gradient magnitude and gradient direction.
Wherein the gradient of the second image may be calculated by central differences:

Gx(x, y) = L(x+1, y) − L(x−1, y)

Gy(x, y) = L(x, y+1) − L(x, y−1)

In the above two formulas, x represents the horizontal-axis coordinate of the second image, y represents the vertical-axis coordinate, and L represents the pixel gray value at the corresponding coordinate; Gx represents the x-direction gradient and Gy the y-direction gradient.

The second image is then processed with the template matrix, and the final gradient magnitude can be calculated as

G = √(Gx² + Gy²)

with the gradient direction estimated as

θ = arctan(Gy / Gx).
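The gradient computation can be sketched directly from these formulas. Central differences, magnitude, and direction follow the equations above; leaving the border pixels at zero is an implementation detail assumed here:

```python
import numpy as np

def gradient_edges(L):
    """Central-difference gradient, magnitude, and direction of a
    gray-level image L. Border pixels are left at zero for simplicity
    (the patent does not specify border handling)."""
    L = L.astype(np.float64)
    gx = np.zeros_like(L)
    gy = np.zeros_like(L)
    gx[:, 1:-1] = L[:, 2:] - L[:, :-2]   # Gx(x,y) = L(x+1,y) - L(x-1,y)
    gy[1:-1, :] = L[2:, :] - L[:-2, :]   # Gy(x,y) = L(x,y+1) - L(x,y-1)
    magnitude = np.sqrt(gx**2 + gy**2)
    direction = np.arctan2(gy, gx)       # robust form of arctan(Gy/Gx)
    return magnitude, direction
```

`np.arctan2` is used instead of a bare arctan so that the direction is well defined when Gx is zero.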
In this embodiment, after the enhanced edge processing, the attachments are labeled by a closure detection algorithm. Closed regions at the edges are marked according to a certain threshold; where such regions exist as foreign matter or particulate matter, their number is counted. When the number of closed regions exceeds the first threshold, it is judged that there is too much particulate matter, and a signal is output for an external system to process.
As shown in fig. 6, this is one embodiment of a closure detection procedure. In this embodiment, detecting the closure of the edge comprises the steps of:
Step S201, start.
Step S202, scanning the whole second image pixel information.
Step S203, starting from the n-th marked edge (n = 1, 2, …).
And S204, acquiring the characteristic information of the edge.
Step S205, building a hash table using the adjacent-pixel pre-judgment method.
Step S206, calculating a hash value for each closed segment near the edge.
Step S207, judging the similarity of the closure hash values of adjacent pixels.
Step S208, determining continuity by utilizing the edge characteristics.
Step S209, recording the continuous edge in the closed record list.
Step S210, judging whether the path is closed.
Step S211, if the path is closed, updating the hash table.
Step S212, if the path is not closed, judging whether to process the next marked edge, and if the next marked edge is processed, returning to step S204.
Step S213, if the next marked edge is not processed, ending.
Step S214, after step S211, judging whether the hash value is better than the original table, if so, returning to step S206, and if not, returning to step S207.
In this embodiment, the number of closed regions is recorded as the edge-closure detection completes. If the number of closed regions exceeds the first threshold, it is judged that there is too much particulate matter; the protective case 10 then needs to be cleaned or replaced, and a signal can be output for an external system to process.
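The closed-region counting can be illustrated with a simplified stand-in: plain 4-connected flood-fill labeling of a binary attachment mask, rather than the hash-table closure detection of steps S201–S214:

```python
import numpy as np
from collections import deque

def count_closed_regions(mask):
    """Count connected foreground regions in a binary attachment mask.
    Simplified stand-in for the closure detection flow: each 4-connected
    blob of marked pixels is counted as one closed region."""
    mask = np.asarray(mask, dtype=bool)
    seen = np.zeros_like(mask)
    count = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and not seen[i, j]:
                count += 1                      # found a new closed region
                q = deque([(i, j)])
                seen[i, j] = True
                while q:                        # flood-fill the whole blob
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            q.append((ny, nx))
    return count
```

The returned count is what step S40 compares against the first threshold.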
In step S30, the step of acquiring the defect coefficient of the second image may specifically include:
Step S301, obtaining the directional derivatives of all pixel points of the bright-field image;
Step S302, obtaining the directional derivatives of all pixel points of the dark-field image;
Step S303, calculating the defect coefficient of the second image from the two derivative fields.
The order of steps S301 and S302 is not fixed. Obtaining bright- and dark-field transmission images of the protective case 10 enhances the defect target, thereby facilitating defect detection. Bright-field imaging (Bright Field Image) passes the transmitted beam through the objective stop at the back focal plane of the objective while blocking the diffracted beam, yielding the pattern contrast BFI. Dark-field imaging (Dark Field Image) tilts the incident beam direction by an angle of 2θ so that the diffracted beam passes through the objective stop while the transmitted beam is blocked, yielding the pattern contrast DFI. Applying bright/dark-field dislocation to the same curved surface yields the defect coefficient.
Assuming that the defect coefficient of the protective case 10 when it is acceptable is a%, the second threshold is the sum of a% and a tolerance. The tolerance may be 10%, i.e., the second threshold is (a+10)%.
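The threshold arithmetic above can be written out directly (helper names are illustrative):

```python
def second_threshold(a_percent, tolerance=10.0):
    # Second threshold = acceptable defect coefficient a% plus a tolerance (10% here).
    return a_percent + tolerance

def coefficient_acceptable(defect_percent, a_percent, tolerance=10.0):
    # The protective case passes when the measured defect coefficient
    # does not exceed (a + tolerance)%.
    return defect_percent <= second_threshold(a_percent, tolerance)
```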
In one embodiment, as shown in FIG. 7, the defect coefficients of the second image are acquired by light assemblies disposed on either side of the acquisition module 60. Each light assembly may include a light source, a concave lens 6, a convex lens 7, and a baffle 8 disposed in sequence, wherein the light source is located on the side away from the protective case 10, and the baffle 8 is located on the side close to the protective case 10. The baffle 8 is movable in the direction perpendicular to the light path and is provided with a slit for passing light. The centers of the concave lens 6 and the convex lens 7 are aligned. When the slit moves to be coaxial with the vertical axis of the lenses, the light source irradiates the curved surface through the two lenses and the slit, and the image acquired by the acquisition module 60 is a bright field image. When the slit moves to half the distance from the vertical axis of the lenses (i.e., halfway between the axis and the edge of the lens), the light source irradiates the curved surface through the two lenses and the slit, and the image acquired by the acquisition module 60 is a dark field image.
Please refer to fig. 8. The present embodiment also provides a detection device for the protective case 10. The protective case 10 is arc-shaped, and the detection device includes an acquisition module 60, a processing module 2, an acquiring module 3 and a judging module 40.
Wherein the acquisition module 60 is used for acquiring a first image of the protective case 10. The processing module 2 is configured to perform image enhancement processing on the first image to obtain a second image. The acquiring module 3 is configured to acquire the number of closed regions and/or the defect coefficient of the second image. The judging module 40 is configured to judge whether the number of closed areas exceeds a first threshold and/or whether the defect coefficient exceeds a second threshold; if the number of closed areas exceeds the first threshold and/or the defect coefficient exceeds the second threshold, the protective case 10 does not meet the requirement, otherwise the protective case 10 meets the requirement. The processing module 2 is electrically connected with the acquisition module 60 and the acquiring module 3, and the judging module 40 is electrically connected with the acquiring module 3.
The embodiments of the detection device correspond to the embodiments of the detection method, can solve the same technical problems as the embodiments of the detection method, and correspondingly achieve the same technical effects, which are not repeated herein.
In this embodiment, the acquisition module 60 may be a camera or video camera. The processing module 2 performs image enhancement processing on the first image, which belongs to pre-processing. The acquiring module 3 acquires the number of closed areas and/or the defect coefficient of the second image, which belongs to post-processing. After the judging module 40 obtains a result, if the protective case 10 meets the requirement, the protective case 10 can continue to work, and the detection method provided in this embodiment can be executed again after a predetermined time interval; if the protective case 10 does not meet the requirement, the protective case 10 needs to be cleaned, repaired or replaced. Specifically, the acquisition module 60, the processing module 2, the acquiring module 3 and the judging module 40 may be electrically connected to the control element 5, so as to control the detection method to proceed smoothly as a whole.
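The wiring of the four modules can be sketched as a small pipeline. The class and callable names below are illustrative assumptions; the enhancement, counting and defect-coefficient routines are passed in rather than implemented, mirroring how the processing, acquiring and judging modules are chained in this embodiment.

```python
class DetectionDevice:
    """Illustrative pipeline: acquisition -> enhancement -> feature extraction -> judgment."""

    def __init__(self, enhance, count_closed, defect_coeff,
                 first_threshold, second_threshold):
        self.enhance = enhance                    # processing module 2
        self.count_closed = count_closed          # acquiring module 3 (closed areas)
        self.defect_coeff = defect_coeff          # acquiring module 3 (defect coefficient)
        self.first_threshold = first_threshold
        self.second_threshold = second_threshold

    def check(self, first_image):
        second_image = self.enhance(first_image)
        n_closed = self.count_closed(second_image)
        coeff = self.defect_coeff(second_image)
        # Judging module 40: True means the protective case meets the requirement.
        return n_closed <= self.first_threshold and coeff <= self.second_threshold
```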
Referring to fig. 9-13, one embodiment of the present disclosure also provides an underwater robot including a camera and a protective shell protecting the camera. The underwater robot further comprises a detection device (detecting member 4) for detecting contamination of the protective shell. The detection device can detect contamination of the protective shell using the detection method in the above embodiments. The underwater robot further comprises a cleaning device for cleaning the protective shell when the protective shell does not meet the requirement.
Specifically, the cleaning device is a cleaning device 103 for the camera 104 of the underwater robot 100. The underwater robot 100 includes the camera 104, and the cleaning device 103 includes the protective case 10, a cleaning assembly, a driving assembly, the detecting member 4 and the control member 5.
The underwater robot 100 has a front end 101 and a rear end 102 in the forward direction X. Since the camera 104 is generally provided at the front end 101 of the underwater robot 100, the protective casing 10 is also correspondingly located at the front end 101. Of course, in other embodiments, the camera 104 may be located at the rear end 102, the side surface, or other positions of the underwater robot 100, and the protective housing 10 may be located at the rear end 102, the side surface, or other positions, so long as the protective housing 10 is in sealing connection with the underwater robot 100.
A cradle head can also be arranged in the protective shell 10 to control the camera 104, so that the camera 104 is more stable. The protective housing 10 is hemispherical, and the edge of the protective housing 10 is hermetically connected with the underwater robot 100. The material of the protective shell 10 may be toughened glass, or other transparent materials with a certain hardness.
As shown in fig. 10 to 13, the cleaning assembly may include a first cleaning arm 21. The first cleaning arm 21 is arc-shaped, both ends of the first cleaning arm 21 are located at the edge of the protective case 10, and the middle portion of the first cleaning arm 21 is bent toward the center of the protective case 10. When the first cleaning arm 21 works, its arc-shaped structure forms a descending channel with the outer surface 11 of the protective case 10, so that larger aquatic organisms or attachments can be pushed away from the body 105, avoiding the secondary pollution that would be caused by attachments accumulating on the first cleaning arm 21.
As shown in fig. 13, a cleaning arm 23 may be further provided on the portion of the outer surface 11 enclosed by the first cleaning arm 21 and the edge of the protective case 10, which can improve the cleaning effect and further clean the portion of the protective case 10 near the edge, which is more difficult to clean than the center of the protective case 10. One end of the cleaning arm 23 is located at the edge of the protective case 10 between the two ends of the first cleaning arm 21, and the other end extends toward the middle portion of the first cleaning arm 21 and is spaced apart from it by a predetermined distance so that dirt can be discharged from the gap.
Further, the cleaning assembly may also include a second cleaning arm 22. The second cleaning arm 22 extends from the edge of the protective case 10 toward the center of the protective case 10, extends to the center of the protective case 10 or beyond the center of the protective case 10. That is, one end of the second cleaning arm 22 is located at the edge of the protective case 10 and the other end is located at the center of the protective case 10 or beyond the center of the protective case 10, which is advantageous in reducing resistance, and the second cleaning arm 22 is mainly used to remove dirt in the vicinity of the center of the hemispherical protective case 10. Preferably, the other end of the second cleaning arm 22 is located at the center of the protective case 10, i.e., at the apex of the hemispherical shape, so that the path the second cleaning arm 22 passes through can cover the entire outer surface 11 of the protective case 10 after the protective case rotates one round around the center thereof, thereby making the cleaning more comprehensive.
In a specific embodiment, the cleaning assembly may include two first cleaning arms 21 and one second cleaning arm 22. One end of the second cleaning arm 22 is located at the lowest point of the edge of the protective case 10, and the two first cleaning arms 21 are symmetrically distributed about the line connecting the second cleaning arm 22 and the center of the protective case 10. Preferably, the two first cleaning arms 21 are identical in shape. The two first cleaning arms 21 may be located at the leftmost and rightmost ends of the edge of the protective case 10, respectively. Alternatively, the two first cleaning arms 21 and the one second cleaning arm 22 are respectively positioned at the three vertices of a regular triangle, so that the cleaning assembly is distributed uniformly as a whole, which can achieve a better cleaning effect.
The intermediate portions of the first cleaning arms 21 do not reach the center of the protective case 10, so that the two first cleaning arms 21 do not collide. In this embodiment, the camera 104 is near the highest point of the edge of the protective case 10 so that the photographing range is not blocked by the second cleaning arm 22.
In the present embodiment, one end of the cleaning assembly, the detecting member 4 and the control member 5 may be fixed to the body 105 of the underwater robot 100. Specifically, the portion of the cleaning assembly located at the edge of the protective case 10 is secured to the body 105 of the underwater robot 100. As shown in fig. 1, the detecting element 4 and the control element 5 may be fixed in the body 105, the detecting element 4 and the control element 5 may be aligned in the advancing direction X of the underwater robot 100, and both are higher than the lowest point of the protective case 10. As shown in fig. 10, the detecting element 4 and the control element 5 may instead be fixed to the outside of the body 105, aligned in a direction perpendicular to the advancing direction X of the underwater robot 100, and located under the protective case 10.
As shown in fig. 9, 10 and 11, the detecting member 4 is located between the control member 5 and the protective case 10. The detecting element 4 may determine whether the contamination level of the protective case 10 exceeds the threshold by using an image recognition method, and the detection range of the detecting element 4 may coincide with the shooting range/visual range of the camera 104, but may be larger or slightly smaller than the shooting range/visual range in other embodiments. The detection range of the detecting piece 4 mainly includes the upper half of the protective case 10 because the camera 104 is located in the upper half area inside the protective case 10. As shown in fig. 1 and 3, a detection range of the detecting member 4 is shown between two dotted lines.
The drive assembly may include a motor 31 and a first gear 32. The motor 31 includes a drive shaft 33, and the drive shaft 33 is used to drive the first gear 32 to rotate. The protective housing 10 is provided with a second gear, and the first gear 32 is in meshed connection with the second gear, so that the motor 31 can drive the first gear 32 and the second gear to rotate through the driving shaft 33, and the rotation of the protective housing 10 is realized. Preferably, the center of the protective case 10 is stationary, i.e., the protective case 10 rotates around the center thereof. As shown in fig. 12, the dashed arrow indicates the direction of rotation of the protective shell 10, which may rotate counterclockwise, and in other embodiments may also rotate clockwise.
Specifically, the detecting element 4 and the control element 5 may be connected by a wired communication cable to realize two-way communication, using protocols such as Ethernet or RS485. The detecting element 4 sends the detected image or the value of the dirt degree to the control element 5 through the cable; the control element 5 judges whether the dirt degree exceeds the threshold, and if so, sends a start signal through the cable to control the motor 31 to start working.
In a specific application scenario, the detecting element 4 detects the degree of soiling of the outer surface 11 of the protective case 10, and when the degree of soiling exceeds the threshold, the control element 5 controls the driving assembly to start so that the protective case 10 begins to rotate. When rotating, the outer surface 11 of the protective case 10 passes the first cleaning arm 21, the second cleaning arm 22, and the other first cleaning arm 21 in this order. The first cleaning arm 21 is responsible for removing larger attachments, such as underwater organisms and large-particle attachments, attached to the outer surface 11 of the protective case 10; the second cleaning arm 22 is responsible for removing medium-sized particle attachments; and the other first cleaning arm 21 is responsible for removing smaller particle attachments. Of course, the order in which the dirt contacts the cleaning arms may differ, and the application is not limited in this regard. After the protective case 10 rotates for one period, the control member 5 returns a command to the detecting member 4, and the detecting member 4 executes the detection program again; if the degree of soiling does not exceed the threshold, the operation stops, otherwise the cleaning operation is executed again until the condition is satisfied. One period may be a 360° rotation of the protective case 10, or another desired angle.
An embodiment of the present disclosure further provides a cleaning method of an underwater robot camera, the cleaning method employing the cleaning apparatus as described in any one of the above, and including the steps of: detecting the contamination level of the outer surface of the protective case using the detecting member; judging whether the contamination level exceeds the threshold; when the contamination level exceeds the threshold, controlling, by the control member, the driving assembly to start; when the contamination level does not exceed the threshold, performing the step of detecting the contamination level of the protective case using the detecting member again after a predetermined time interval; after the driving assembly starts, driving the protective case to rotate by a predetermined angle while the cleaning assembly cleans the outer surface of the protective case; and after the protective case rotates by the predetermined angle, returning to the step of detecting the contamination level of the protective case using the detecting member.
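The detect-clean-recheck loop of this cleaning method can be sketched as a minimal control loop. The function names and the `max_cycles` safety cap are illustrative assumptions; `detect` stands in for the detecting member and `rotate_one_period` for one rotation-and-cleaning period of the driving assembly.

```python
def cleaning_cycle(detect, rotate_one_period, threshold, max_cycles=10):
    """detect() -> contamination level; rotate_one_period() rotates the shell
    by one period while the cleaning arms wipe it. Returns periods performed."""
    cycles = 0
    # Keep cleaning until the detection passes (or a safety cap is reached).
    while detect() > threshold and cycles < max_cycles:
        rotate_one_period()
        cycles += 1
    return cycles
```

With a contamination level that drops each period, the loop stops as soon as the detecting step passes, matching the method above.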
It should be noted that, in the description of the present specification, the terms "first," "second," and the like are used for descriptive purposes only and to distinguish between similar objects, and there is no order of preference therebetween, nor should it be construed as indicating or implying relative importance. In addition, in the description of the present specification, unless otherwise indicated, the meaning of "a plurality" is two or more.
The use of the terms "comprises" or "comprising" to describe combinations of elements, components, or steps herein also contemplates embodiments consisting essentially of such elements, components, or steps. By using the term "may" herein, it is intended that any attribute described as "may" be included is optional.
Multiple elements, components, parts or steps can be provided by a single integrated element, component, part or step. Alternatively, a single integrated element, component, part or step may be divided into separate plural elements, components, parts or steps. The disclosure of "a" or "an" to describe an element, component, section or step is not intended to exclude other elements, components, sections or steps.
It is to be understood that the above description is intended to be illustrative, and not restrictive. Many embodiments and many applications other than the examples provided will be apparent to those of skill in the art upon reading the above description. The scope of the present teachings should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. The disclosures of all articles and references, including patent applications and publications, are incorporated herein by reference for the purpose of completeness. The omission in the following claims of any aspect of the subject matter disclosed herein is not a disclaimer of such subject matter, nor should it be regarded that the inventors did not consider such subject matter to be part of the disclosed subject matter.

Claims (5)

1. The detection method of the underwater robot camera protective shell is characterized in that the protective shell is arc-shaped, and comprises the following steps:
Acquiring a first image of the protective shell;
Performing image enhancement processing on the first image to obtain a second image;
acquiring the number of closed areas and/or defect coefficients of the second image;
Judging whether the number of the closed areas exceeds a first threshold value and/or judging whether the defect coefficient exceeds a second threshold value;
if the number of the closed areas exceeds the first threshold value and/or the defect coefficient exceeds the second threshold value, the protective shell is not satisfactory;
otherwise, the protective shell meets the requirements;
the protective shell has a detection area comprising a plurality of sub-areas;
in the step of acquiring the first images of the protective shell, simultaneously acquiring the first images of the sub-areas;
In the step of performing image enhancement processing on the first image, performing image enhancement processing on the first image of each sub-region in parallel;
The step of performing image enhancement processing on the first image to obtain a second image includes:
Processing the first image by using a correction algorithm to obtain a corrected image;
Interpolation processing is carried out on the gaps of the correction image, so that the second image is obtained;
the step of acquiring the number of closed areas of the second image includes:
detecting an edge of the second image;
Detecting the closeness of the edge, and simultaneously acquiring the number of the closed areas;
the step of detecting edges of the second image comprises:
calculating a gradient of the second image;
processing the second image using a template matrix;
calculating the final gradient amplitude and gradient direction;
the step of acquiring the defect coefficients of the second image includes:
obtaining directional derivatives of all pixel points of the bright field imaging image;
obtaining directional derivatives of all pixel points of the dark field imaging image;
calculating a defect coefficient of the second image.
2. The method for detecting the underwater robot camera protective casing according to claim 1, wherein the protective casing is hemispherical, and the diameter of the protective casing is 20 cm to 200 cm.
3. The method of detecting an underwater robot camera protective case of claim 1, wherein the detection area includes four rectangular sub-areas, each of which has an equal width and an equal length.
4. The method for detecting the underwater robot camera protective case of claim 1, wherein the image enhancement processing includes correction, filtering, quantization, and white balance.
5. The underwater robot is characterized by comprising a camera and a protective shell for protecting the camera, wherein the protective shell is arc-shaped, the underwater robot further comprises a detection device for detecting contamination of the protective shell, the protective shell is provided with a detection area, the detection area comprises a plurality of subareas, and the detection device comprises:
the acquisition module is used for acquiring the first images of the protective shell and simultaneously acquiring the first images of all the subareas;
The processing module is used for carrying out image enhancement processing on the first image to obtain a second image, and simultaneously carrying out image enhancement processing on the first image of each subarea in parallel;
the acquiring module is used for acquiring the number of closed areas and/or defect coefficients of the second image;
the judging module is used for judging whether the number of the closed areas exceeds a first threshold value and/or judging whether the defect coefficient exceeds a second threshold value; if the number of the closed areas exceeds the first threshold value and/or the defect coefficient exceeds the second threshold value, the protective shell does not meet the requirement, otherwise, the protective shell meets the requirement; the underwater robot further comprises a cleaning device for cleaning the protective shell when the protective shell does not meet the requirement;
the processing module is electrically connected with the acquisition module and the acquiring module respectively, and the judging module is electrically connected with the acquiring module;
The obtaining the second image includes:
Processing the first image by using a correction algorithm to obtain a corrected image;
Interpolation processing is carried out on the gaps of the correction image, so that the second image is obtained;
the acquiring the number of closed areas of the second image includes:
detecting an edge of the second image;
Detecting the closeness of the edge, and simultaneously acquiring the number of the closed areas;
the detecting the edge of the second image includes:
calculating a gradient of the second image;
processing the second image using a template matrix;
calculating the final gradient amplitude and gradient direction;
The obtaining the defect coefficients of the second image includes:
obtaining directional derivatives of all pixel points of the bright field imaging image;
obtaining directional derivatives of all pixel points of the dark field imaging image;
calculating a defect coefficient of the second image.
CN202111429254.1A 2021-11-29 2021-11-29 Underwater robot and camera protection shell detection method thereof Active CN114136982B (en)

Publications (2)

Publication Number Publication Date
CN114136982A CN114136982A (en) 2022-03-04
CN114136982B true CN114136982B (en) 2025-01-14



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant