CN110443243B - Water level monitoring method, storage medium, network device and water level monitoring system - Google Patents
- Publication number: CN110443243B
- Application number: CN201910727330.3A
- Authority: CN (China)
- Prior art keywords: water level, scene, image, water, value
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G01F23/04 — Indicating or measuring liquid level or level of fluent solid material by dip members, e.g. dip-sticks
- G01F23/80 — Arrangements for signal processing
- G06V20/10 — Terrestrial scenes
- G06V20/63 — Scene text, e.g. street names
- G06V30/10 — Character recognition
Abstract
The application discloses a water level monitoring method, a storage medium, a network device and a water level monitoring system. The water level monitoring method comprises the following steps: identifying, in a water level gauge scene image, the character on the water level gauge closest to the water surface; acquiring, from the scene image, a local area image containing the character and the water surface; determining the scene type of the water surface from the local area image; matching a water level line detection method corresponding to the scene type to detect the position coordinates of the water level line relative to the character, wherein different scene types correspond to different detection methods; and calculating the water level value of the water surface from the position coordinates. By identifying the scene type of the water surface in the scene image and applying a detection method matched to that type, the influence of the scene environment of the water area where the water level gauge is located on the monitoring result can be eliminated, and the accuracy of the measured water level value is improved.
Description
Technical Field
The present disclosure relates to the field of water level monitoring technologies, and in particular, to a water level monitoring method, a storage medium, a network device, and a water level monitoring system.
Background
Water levels on rivers, lakes, roads and the like all need to be measured and monitored; such monitoring concerns the national economy and people's livelihood and is very important for industry, agriculture and transportation. In recent years, as urbanization has accelerated, serious accidents endangering life have frequently occurred at locations lacking water level monitoring, so technical measures are urgently needed to address this problem.
Non-contact water level monitoring methods generally do not consider the influence of special scene environments on the monitoring result, for example a water surface with clear reflections or severe ripples; the water surface conditions in such environments readily interfere with the measured and calculated water level value.
Disclosure of Invention
The application mainly provides a water level monitoring method, a storage medium, a network device and a water level monitoring system, aiming to solve the problem that the water level measurement result is easily disturbed by the scene environment of the water area where the water level gauge is located.
In order to solve the technical problem, one technical solution adopted by the application is to provide a water level monitoring method. The method comprises the following steps: identifying, in a water level gauge scene image, the character on the water level gauge closest to the water surface; acquiring, from the scene image, a local area image containing the character and the water surface; determining the scene type of the water surface from the local area image; matching a water level line detection method corresponding to the scene type to detect the position coordinates of the water level line relative to the character, wherein different scene types correspond to different detection methods; and calculating the water level value of the water surface from the position coordinates.
In order to solve the above technical problem, another technical solution adopted by the application is to provide a storage medium. The storage medium stores program data which, when executed by a processor, implement the steps of the water level monitoring method described above.
In order to solve the above technical problem, a further technical solution adopted by the application is to provide a network device. The network device comprises a processor and a memory connected with each other; the memory stores a computer program, and the processor implements the steps of the water level monitoring method described above when executing the computer program.
In order to solve the above technical problem, yet another technical solution adopted by the application is to provide a water level monitoring system. The system comprises a network camera and the above network device in communication connection with each other; the network camera is used for acquiring the water level gauge scene image.
The beneficial effects of the application are as follows. Different from the prior art, the application discloses a water level monitoring method, a storage medium, a network device and a water level monitoring system. By identifying the scene type of the water surface in the water level gauge scene image and applying a water level line detection method matched to that type, the influence of the scene environment of the water area where the gauge is located on the monitoring result is eliminated, and the accuracy of the measured water level value is improved.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow chart illustrating a water level monitoring method according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart of step 10 of FIG. 1;
FIG. 3 is a schematic illustration of a water gauge scene image corresponding to a first scene;
FIG. 4 is a schematic illustration of a local area image corresponding to FIG. 3;
FIG. 5 is a schematic flow chart of step 30 of FIG. 1;
FIG. 6 is a schematic illustration of an edge feature image corresponding to the local area image of FIG. 4;
FIG. 7 is a schematic illustration of a water gauge scene image corresponding to a second scene;
FIG. 8 is a schematic illustration of an edge feature image corresponding to the local area image of FIG. 7;
FIG. 9 is a schematic illustration of a saturation component image corresponding to the local area image of FIG. 7;
FIG. 10 is a schematic illustration of a water gauge scene image corresponding to a third scene;
FIG. 11 is a schematic illustration of an edge feature image corresponding to the local area image of FIG. 10;
FIG. 12 is a schematic illustration of a saturation component image corresponding to the local area image of FIG. 10;
FIG. 13 is a flow chart illustrating the water level line detection method corresponding to the first scene in step 40 of FIG. 1;
FIG. 14 is a schematic illustration of the edge feature image of FIG. 6 in cooperation with a sliding window;
FIG. 15 is a flow chart illustrating the water level line detection method corresponding to the second scene in step 40 of FIG. 1;
FIG. 16 is a schematic illustration of a binary image corresponding to FIG. 9 in cooperation with a sliding window;
FIG. 17 is a schematic flow chart of step 420 of FIG. 15;
FIG. 18 is a flow chart illustrating a water level line detection method corresponding to the third scenario in step 40 of FIG. 1;
FIG. 19 is a schematic illustration of a grayscale image corresponding to the local area image of FIG. 10 in cooperation with a mirror image;
FIG. 20 is a schematic flow chart of step 50 of FIG. 1;
FIG. 21 is a schematic structural diagram of an embodiment of a storage medium provided herein;
FIG. 22 is a block diagram illustrating an embodiment of a network device provided herein;
fig. 23 is a schematic structural diagram of an embodiment of a water level monitoring system provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and "third" in the embodiments of the present application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first," "second," or "third" may explicitly or implicitly include at least one of the feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless explicitly specified otherwise. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein may be combined with other embodiments.
Referring to fig. 1, a schematic flow chart of an embodiment of the water level monitoring method provided in the present application is shown.
Step 10: and identifying the character closest to the water surface on the water level ruler in the scene image of the water level ruler.
The water level gauge scene image is an image of the water level gauge and the water area in which it stands, with the gauge in contact with the water. The position where the gauge meets the water surface is the water level line; by reading the scale value of the gauge at the water level line, the water level value of the water area, i.e. the depth value of the water, can be obtained.
Optionally, the water level gauge scene image is captured by a network camera device, such as a dome camera or another network monitoring camera.
After the scene image is obtained, image recognition techniques can identify the character on the water level gauge closest to the water surface, i.e. the lowest character that is completely recognizable and exposed above the water surface.
The characters are the marks on the water level gauge that represent the water level value, and may be Arabic numerals, Roman numerals, letters and the like.
Specifically, referring to fig. 2, the step of identifying the character closest to the water surface on the water level ruler in the water level ruler scene image in step 10 may be performed as follows.
Step 11: and detecting the scene image of the water level gauge by adopting a deep learning model so as to identify the water level gauge.
The deep learning model is trained on a large number of water level gauge scene images so that it can identify water level gauges in various scenes; when a scene image of any water area is input, the model detects the area of the image in which the gauge is located.
After the water level gauge is identified, the coordinates of the water level gauge in the scene image of the water level gauge can be obtained, so that the characters on the water level gauge can be identified in step 12.
For example, the water gauge in fig. 3 is identified by using the deep learning model, and the position coordinates of the water gauge in fig. 3 are recorded to mark the area of the water gauge in the scene image of the water gauge.
Step 12: the characters on the water level gauge are identified to determine the character closest to the water surface.
After the water level ruler is identified, character recognition is carried out on the area of the water level ruler scene image where the water level ruler is located, and the character closest to the water surface is determined.
For example, if the characters are Arabic numerals, digit recognition is performed on the gauge region.
The machine cannot know by itself which character is the one closest to the water surface. Since a water level gauge is in practice inserted into the water vertically, the detection direction of the deep learning model can be set according to this natural rule: the last character that can be completely recognized from top to bottom along the vertical direction of the image is the character closest to the water surface.
After the character closest to the water surface is identified, its coordinates, for example the coordinates of its bottommost point, may also be obtained to facilitate acquiring the local area image in the subsequent step 20.
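As a concrete illustration of the top-to-bottom rule above, the selection can be sketched as follows. The tuple layout for OCR detections is a hypothetical convention for this sketch, not something defined by the patent.

```python
# Sketch: choose the character nearest the water surface from OCR results.
# Each detection is assumed to be (label, x, y_top, y_bottom) in image
# coordinates with y increasing downward; the gauge hangs vertically, so
# the detection with the largest bottom coordinate is the last character
# fully visible above the water.
def nearest_character(detections):
    """Return the detection lowest in the image (closest to the water)."""
    return max(detections, key=lambda d: d[3])

chars = [("8", 10, 40, 60), ("7", 10, 90, 110), ("6", 10, 140, 160)]
print(nearest_character(chars))  # ('6', 10, 140, 160)
```

The bottom coordinate of the returned tuple is the reference point used when cropping the local area image in step 20.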
Step 20: and acquiring a local area image containing characters and a water surface from the water level gauge scene image.
The local area image is a region-of-interest (ROI) image containing the character closest to the water surface and part of the water surface.
Referring to fig. 4, fig. 4 is a partial area image taken from the water level scale scene image of fig. 3, the partial area image including the characters closest to the water surface and the water surface.
Specifically, after the character closest to the water surface is identified, the scene image is processed with image operators and cropping functions to obtain the local area image containing that character and the water surface, which simplifies the subsequent determination of the scene type and the identification of the water level line.
When the character closest to the water surface is identified in step 12, its coordinates in the scene image are also determined, and the local area image is computed from these coordinates.
For example, if the character is a numeral, the local area image may extend from 1/3 of the character height above its top to 3 character heights below its bottom, and 1/4 of the character width beyond its left and right sides. This ensures that the local area image contains the character closest to the water surface and part of the water surface, and that no second complete numeral appears in it. Other ranges may also be used, which is not limited in this application.
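The crop described above can be sketched as a small bounds computation. The margin fractions are the illustrative values from the text; the function name and argument layout are assumptions of this sketch.

```python
# Sketch of the ROI computation: extend 1/3 of the character height above
# its top, 3 character heights below its bottom, and 1/4 of the character
# width on each side, clamped to the image. These margins are the example
# values given in the description, not fixed by the patent.
def roi_bounds(x_left, x_right, y_top, y_bottom, img_w, img_h):
    h = y_bottom - y_top
    w = x_right - x_left
    top = max(0, int(y_top - h / 3))
    bottom = min(img_h, int(y_bottom + 3 * h))
    left = max(0, int(x_left - w / 4))
    right = min(img_w, int(x_right + w / 4))
    return left, top, right, bottom

# Character box (100, 200)-(140, 230) in a 640x480 image.
print(roi_bounds(100, 140, 200, 230, 640, 480))  # (90, 190, 150, 320)
```

The generous bottom margin is what guarantees that part of the water surface falls inside the ROI even when the water has receded well below the character.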
Step 30: and determining the scene type of the water surface according to the local area image.
In the application, the scene types of the water surface are divided into a first scene, a second scene and a third scene: the first scene is a turbid water surface without reflection, the second scene is a clear water surface with severe ripples but without reflection, and the third scene is a clear water surface with reflection.
The local area image can be divided into a character area and a water surface area, a boundary between the character area and the water surface area is a water level line, and position coordinates of the water level line in the local area image are obtained, so that a water level value can be obtained.
And determining the scene type of the water surface according to the local area image, namely determining the scene type corresponding to the water surface according to the characteristics of the water surface area in the local area image.
Referring to fig. 4 and 6, if the water surface belongs to the first scene, the character region in the corresponding edge feature image has rich edge features while the water surface region has essentially none, i.e. the gray-scale values over the water surface are nearly uniform. The gray-scale difference at the boundary between the two regions is therefore large and the boundary is obvious, so the first scene and the position coordinates of its water level line can be identified.
Referring to fig. 7, if the water surface belongs to the second scene, the water surface region has severe ripples. Referring to fig. 8, in the corresponding edge feature image the water surface region also has rich edge features, and the gray-scale values on either side of the boundary between the character region and the water surface region differ little, so the water level line cannot be accurately identified from edge gray-scale differences. Referring to fig. 9, the ripple texture must therefore be suppressed: the second scene is identified from the difference between the average saturation component of the top region and that of the bottom region of the image, and the local area image is converted into a binary image so that the position coordinates of the water level line can be identified.
Referring to fig. 10, if the water surface belongs to neither the first scene nor the second scene, it is determined to be the third scene, and the position coordinates of the water level line are identified from its features. Referring to fig. 11, in the edge feature image of the third scene the water surface region also has rich edge features and cannot be clearly distinguished from the character region; referring to fig. 12, in the saturation component image the difference between the saturation components of the water surface region and the character region is not significant. The third scene can thus be identified from these two features.
Specifically, referring to fig. 5, for the identification of the first scene, the following steps may be adopted.
Step 31: and converting the local area image into an edge feature image.
Converting the local area image into an edge feature image means first converting it from an RGB (Red, Green, Blue) image into a gray-scale image, then convolving the gray-scale image with a Canny, Sobel, Roberts, Prewitt or Laplacian operator to obtain the edge feature image corresponding to the local area image.
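One possible realization of this conversion is a plain Sobel gradient magnitude, sketched below with numpy only. This is an illustrative choice among the operators the text lists, not the patent's prescribed one.

```python
import numpy as np

# Minimal Sobel-style edge map for a grayscale image: convolve with the
# horizontal and vertical Sobel kernels and take the gradient magnitude.
def sobel_edges(gray):
    g = gray.astype(float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
    ky = kx.T
    h, w = g.shape
    out = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = g[i - 1:i + 2, j - 1:j + 2]
            out[i, j] = np.hypot((patch * kx).sum(), (patch * ky).sum())
    return out

# A vertical step edge responds strongly along the boundary column only.
img = np.zeros((5, 6))
img[:, 3:] = 255
edges = sobel_edges(img)
```

In a flat (turbid, reflection-free) water region every 3x3 patch is uniform, so both kernel responses cancel and the edge map stays near zero there, which is exactly the property step 32 measures.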
Referring to fig. 4 and 6, fig. 4 is a local area image corresponding to a first scene, and fig. 6 is an edge feature image corresponding to the local area image. The water level line at the interface of the water surface area and the character area is clearly shown in fig. 6, thereby facilitating the acquisition of the position coordinates of the water level line relative to the character.
Step 32: and acquiring the average gray-scale value of the pixels in the bottom area of the edge feature image.
The bottom area of the edge feature image contains the water surface, that is, the bottom area of the edge feature image at least contains a part of the water surface area.
Optionally, the bottom region is included in the water surface region, that is, the bottom region is a part of the water surface region, and the average gray scale value of the pixels in the bottom region is the average gray scale value of the pixels in the water surface region.
Optionally, the bottom region includes the entire water surface region and a partial character region.
In the edge feature image, each pixel corresponds to a determined gray-scale value. The ratio of the sum of the gray-scale values of all pixels in the bottom region to the number of those pixels gives the average gray-scale value of a single pixel in the bottom region, i.e. the average gray-scale value of the bottom region.
Step 33: and judging whether the average gray scale value of the bottom area is less than or equal to a preset gray scale threshold value or not.
Referring to fig. 6, the first scene is a turbid water surface without reflection. In the corresponding edge feature image there are essentially no edge features in the water surface region: the gray-scale values of its pixels are nearly uniform and small, while most pixels in the character region have large gray-scale values. The difference between the two regions is therefore large and their boundary is obvious.
Thus, in the edge feature image of the first scene, the bottom region contains at least part of the water surface region, so the average gray-scale value of its pixels is small. By setting a suitable preset gray-scale threshold, whether the scene type is the first scene can be identified from whether this average is less than or equal to the threshold, regardless of whether the bottom region also contains part of the character region.
If the water surface region has rich edge features, the average gray-scale value of the bottom region is large; comparison with the preset gray-scale threshold then reveals this, and the scene type is identified as not being the first scene.
Therefore, preset gray-scale thresholds for local area images of different specifications can be obtained by analyzing a large number of first-scene and non-first-scene images, so as to accurately judge whether the scene corresponding to the water surface is the first scene.
If so, go to step 34.
Step 34: and confirming that the scene type of the water surface is a first scene.
If the acquired average gray-scale value of the bottom region is less than or equal to the preset gray-scale threshold, it is determined that the scene type of the water surface is the first scene.
If the acquired average gray-scale value of the bottom region is greater than the preset gray-scale threshold, it is determined that the scene type of the water surface is not the first scene, and step 35 is executed.
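Steps 32-34 amount to one mean-and-threshold check, sketched below. The strip fraction (bottom 25%) and the threshold value are illustrative assumptions; the patent only says the threshold is tuned on labelled images.

```python
import numpy as np

# Sketch of the first-scene test: mean gray level of the bottom strip of
# the edge feature image compared against a preset threshold. Both the
# strip fraction and the default threshold are placeholder values.
def is_first_scene(edge_img, gray_threshold=10.0, strip=0.25):
    h = edge_img.shape[0]
    bottom = edge_img[int(h * (1 - strip)):, :]
    return bool(bottom.mean() <= gray_threshold)

calm = np.zeros((40, 20))          # flat water surface: no edge response
rippled = np.full((40, 20), 80.0)  # rich edges reach the bottom strip
print(is_first_scene(calm), is_first_scene(rippled))  # True False
```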
If the scene type of the water surface is identified not to be the first scene, whether the scene type of the water surface is the second scene or not is continuously confirmed.
Specifically, with continued reference to FIG. 5, for the identification of the second scene, the following steps may be employed.
Step 35: and converting the local area image into a saturation component image.
Specifically, as shown in fig. 9, converting the local area image into a saturation component image means first converting it from the RGB color space into the HSV (Hue, Saturation, Value) color space, and then extracting the saturation component as an image.
Step 36: and acquiring a saturation component difference value of the average saturation component value of the top area and the average saturation component value of the bottom area on the saturation component image.
The top region of the saturation component image contains at least a portion of the character region, and the bottom region of the saturation component image contains at least a portion of the water surface region.
Optionally, the top region is contained within the character region and the bottom region is contained within the water surface region. Alternatively, the top area is contained within the character area and the bottom area contains the entire water surface area and a portion of the character area. Alternatively, the top area includes the entire character area and a part of the water surface area, and the bottom area is included in the water surface area.
In the saturation component image, each pixel corresponds to a determined saturation component value. The ratio of the sum of the saturation component values of all pixels in the top region to the number of those pixels gives the average saturation component value of the top region; the average for the bottom region is obtained in the same way. The saturation component difference is then the difference between these two averages.
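Steps 35-36 can be sketched with the standard RGB-to-HSV conversion from the Python standard library. The strip sizes are illustrative assumptions of this sketch.

```python
import colorsys
import numpy as np

# Sketch: convert an RGB patch to HSV per pixel, then compare the mean
# saturation of the top strip (characters) with the bottom strip (water).
def saturation_difference(rgb, strip=0.25):
    h, w, _ = rgb.shape
    sat = np.array([[colorsys.rgb_to_hsv(*(rgb[i, j] / 255.0))[1]
                     for j in range(w)] for i in range(h)])
    k = max(1, int(h * strip))
    return sat[:k].mean() - sat[-k:].mean()

# Vivid red "characters" over a fully desaturated gray "water surface".
img = np.zeros((8, 4, 3))
img[:4] = [255, 0, 0]      # saturated top (saturation 1.0)
img[4:] = [120, 120, 120]  # gray bottom (saturation 0.0)
print(saturation_difference(img))  # 1.0
```

A large positive difference is the signature of the second scene: saturated character paint above a dull rippled surface.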
Step 37: and judging whether the saturation component difference value is greater than or equal to a preset saturation component threshold value.
Referring to fig. 7 and 8, the second scene is a clear water surface without reflection but with severe ripples. In the corresponding edge feature image the ripples give the water surface region rich texture and edge features, so it cannot be clearly distinguished from the character region; the ripple interference must therefore be filtered out.
It is also observed that in images of the second scene the water surface region has lower saturation and darker color than the character region. Referring to fig. 9, the local area image can be converted into a saturation component image based on this characteristic, which suppresses the ripple texture in the water surface region and makes it convenient to judge the scene type from the saturation difference between the water surface region and the character region.
Similarly, by analyzing a large number of second-scene and non-second-scene images, preset saturation component thresholds for local area images of different specifications are obtained, so as to accurately judge whether the scene corresponding to the water surface is the second scene.
Judge whether the saturation component difference is greater than or equal to the preset saturation component threshold: if yes, execute step 38; if no, execute step 39.
Step 38: and confirming that the scene type of the water surface is a second scene.
And if the acquired saturation component difference is greater than or equal to a preset saturation component threshold, determining that the scene type of the water surface in the local area image is a second scene.
If the obtained saturation component difference is smaller than the preset saturation component threshold, it is determined that the scene type of the water surface in the local area image is not the second scene, and step 39 is executed.
Step 39: and confirming that the scene type of the water surface is a third scene.
And if the average gray-scale value of the bottom area pixel in the edge feature image corresponding to the local area image is greater than a preset gray-scale threshold value, and the saturation component difference value in the saturation component image corresponding to the local area image is less than a preset saturation component threshold value, determining that the scene type of the water surface in the local area image is a third scene.
Referring to fig. 10, the third scene is a water surface with clear reflection. Referring to fig. 11, in the corresponding edge feature image, the water surface region contains a clear reflection of the water level gauge and therefore has rich edge features.
Referring to fig. 12, in the corresponding saturation component image, the difference between the hue, saturation and brightness of the water surface area and the character area is not obvious enough.
Therefore, whether the scene type of the water surface in the local area image is the third scene can be confirmed by utilizing the characteristics, and the position coordinate of the water level line is obtained according to the similarity of the reflection and the water level gauge.
Step 40: and matching a water level line detection method corresponding to the scene type to detect the position coordinates of the water level line relative to the characters.
In the first scene, the edge features of the water level gauge in the local area image are rich, the water surface area is basically free of the edge features, and the water level line is obvious in the edge feature image.
In the second scene, the water level gauge in the local area image and the edge characteristics of the water surface are rich, but the difference of the saturation component values of the water level gauge area and the water surface area is large.
In the third scene, the reflection in the water area and the characters on the water level ruler are in a mirror image relationship, the texture similarity is extremely high, and the difference of the saturation component values is small.
Therefore, different scene types have different influence factors on the detection of the water level line, so that different water level line detection methods are correspondingly adopted to improve the accuracy of the detection of the water level line.
Therefore, after the scene type of the water surface is determined, the water level line detection method corresponding to the scene type is matched to detect the position coordinates of the water level line relative to the character closest to the water surface. Different scene types correspond to different water level line detection methods.
The position coordinates of the water level line with respect to the character closest to the water surface can be understood as the position coordinates of a point on the water level line relative to a point on the top, middle, or bottom of the character. Once the actual height of the character is determined, the position coordinates of the water level line with respect to any point on the character can be determined.
Specifically, referring to fig. 13, when the scene type of the water surface is the first scene, the position coordinates of the water level line with respect to the character are obtained as follows.
Step 410: and setting a sliding window in the edge feature image.
As shown in fig. 14, the sliding window is a window area with a certain specification, and can detect and process pixels in an area defined by the sliding window, and acquire position coordinates of the water level line relative to the character according to the characteristics of the boundary between the water surface area and the character area.
Optionally, the window width of the sliding window is equal to the width of the edge feature image, and the window height is the height of the character. Alternatively, the window width and the window height may be of other specifications. The shape of the sliding window is not limited to a rectangle, and may be a circle, an ellipse, or the like.
For ease of illustration, the sliding window is drawn outside the edge feature image in fig. 14; in practice, it is placed within the area of the edge feature image. The sliding windows and mirror images described below likewise lie within the area of their corresponding images and are not described again.
Step 411: the sliding window traverses the edge feature image line by line.
As shown in fig. 14, the sliding window is, for example, rectangular, with a window width equal to the width of the edge feature image and a window height equal to the height of the character. The sliding window traverses the edge feature image line by line from top to bottom.
And the sliding windows with other shapes or specifications traverse the edge characteristic image line by line according to the rule and are not repeated.
Step 412: and respectively acquiring the difference value between the average gray-scale value of the pixels in the upper half area and the average gray-scale value of the pixels in the lower half area of the sliding window.
As shown in fig. 14, in the process of traversing the edge feature image line by line, each time the sliding window moves down one line to a new position, the difference between the average gray-scale value of the pixels in its upper half area and that of the pixels in its lower half area is computed again, so that a plurality of difference values are recorded.
The boundary between the upper half area and the lower half area of the sliding window may be arbitrarily set, and for example, the boundary is set at each position such as one-half, one-third, or two-thirds of the sliding window in the height direction.
Step 413: the maximum value among the differences of the average gray scale values is determined to acquire the position coordinates of the water level line with respect to the character according to the boundary line of the upper half area and the lower half area corresponding to the maximum value.
As shown in fig. 14, the maximum value among the differences of the average gray-scale values is determined; the boundary between the upper half area and the lower half area of the sliding window corresponding to this maximum value is the water level line. Since the position coordinates of the sliding window are known, the position coordinates of this boundary in the edge feature image, that is, the position coordinates of the water level line in the edge feature image, can be determined, and the position coordinates of the water level line relative to the character can then be obtained.
In the edge feature image corresponding to the first scene, the water surface area has substantially no edge feature, the edge feature of the character area is rich, and the difference between the gray levels at the two sides of the water level line where the water surface area and the character area intersect is the largest, so that the maximum value of the differences of the average gray levels is determined, that is, the position coordinate of the water level line corresponding to the maximum value is determined.
It should be noted that the boundary between the upper half area and the lower half area in the sliding window does not affect the determination of the maximum value no matter where the boundary is disposed in the sliding window in the height direction.
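As an illustrative sketch only (not part of the patent disclosure), the sliding-window procedure of steps 410 to 413 for the first scene can be expressed as follows; the function name, the NumPy array representation, and the midpoint boundary position are assumptions:

```python
import numpy as np

def detect_water_line_first_scene(edge_img, char_height, split_ratio=0.5):
    """Slide a full-width window down the edge feature image and return
    the row index where the upper/lower mean gray-level difference peaks.

    edge_img: 2-D uint8 array (the edge feature image).
    char_height: window height in pixels (height of one gauge character).
    split_ratio: position of the upper/lower boundary inside the window
                 (0.5 = midpoint; the patent notes this is arbitrary).
    """
    h, _ = edge_img.shape
    split = int(char_height * split_ratio)
    best_row, best_diff = 0, -np.inf
    for top in range(0, h - char_height + 1):    # traverse line by line, top to bottom
        window = edge_img[top:top + char_height, :].astype(np.float64)
        upper_mean = window[:split, :].mean()    # character side: rich edges, bright
        lower_mean = window[split:, :].mean()    # water side: almost no edges, dark
        diff = upper_mean - lower_mean
        if diff > best_diff:                     # record the maximum difference
            best_diff, best_row = diff, top + split
    return best_row                              # boundary row = water level line
```

The maximum difference occurs when the window boundary coincides with the transition from the edge-rich character region to the nearly featureless water region.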
Referring to fig. 15, when the scene type of the water surface is the second scene, the position coordinates of the water level line relative to the character can be obtained according to the following steps.
Step 420: and converting the saturation component image into a binary image.
In the saturation component image, the saturation component values of the character region and the water surface region have a large difference, and the saturation component values on the two sides of the water line jump, that is, the saturation component values on the two sides of the water line have a large difference, and the water line is not a horizontal straight line. Referring to fig. 16, after the saturation component image is converted into a binary image, a boundary between the character region and the water surface region is more obvious, that is, the water level line is more obvious, thereby facilitating obtaining the position coordinates of the water level line.
Referring to fig. 17, the following steps may be performed to convert the saturation component image into a binary image.
Step 4201: and acquiring the binarization threshold corresponding to each column of pixels in the saturation component image column by column.
The saturation component image is converted into a gray-scale image, the gray-scale value of each pixel on the gray-scale image is acquired, and the binarization threshold corresponding to each column of pixels is acquired according to the following formula.
Here, thre_j is the binarization threshold of the j-th column of pixels, M is the number of pixel columns in the saturation component image, N is the number of pixel rows in the saturation component image, p_ij is the gray-scale value of the pixel in the i-th row and j-th column, max(p_j) is the maximum gray-scale value in the j-th column of pixels, and min(p_j) is the minimum gray-scale value in the j-th column of pixels.
Step 4202: and carrying out binarization processing on each column of pixels based on the binarization threshold corresponding to each column of pixels.
Binarization is performed on each column of pixels based on its corresponding binarization threshold: each gray-scale value in the column is compared with that threshold, pixels whose gray-scale value is greater than or equal to the threshold are assigned a first gray-scale value, and pixels whose gray-scale value is smaller than the threshold are assigned a second gray-scale value. For example, the first gray-scale value is 255 and the second gray-scale value is 0. The first and second gray-scale values may also take other values, provided the boundary in the binary image remains clearly distinguishable.
And after each row of pixels are subjected to binarization processing, a binary image can be obtained. Then execution continues at step 421.
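The column-wise binarization of steps 4201 and 4202 can be sketched as follows. Note that the exact threshold formula appears only as an image in the source document, so the midpoint of each column's extreme gray-scale values is assumed here as a stand-in threshold; the function and parameter names are also illustrative:

```python
import numpy as np

def binarize_by_column(sat_gray, first=255, second=0):
    """Binarize each column of a gray-scale saturation image with its
    own threshold (step 4201), then apply it (step 4202).

    sat_gray: 2-D uint8 array (saturation component image as gray-scale).
    first/second: output values for pixels >= / < the column threshold.
    """
    out = np.empty_like(sat_gray)
    for j in range(sat_gray.shape[1]):
        col = sat_gray[:, j].astype(np.float64)
        thre_j = (col.max() + col.min()) / 2.0   # ASSUMED threshold; the
        # patent's actual formula (shown only as an image) may differ.
        out[:, j] = np.where(col >= thre_j, first, second)
    return out
```

Per-column thresholds adapt to local illumination, so the character/water boundary stays distinguishable even when lighting varies across the gauge.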
Step 421: and setting a sliding window in the binary image.
And setting a sliding window in the binary image, wherein the sliding window is the same as the sliding window in the edge feature image and is not repeated.
Step 422: and traversing the binary image line by the sliding window.
When traversing, the sliding window moves through the binary image line by line from top to bottom.
Step 423: and respectively obtaining the difference value of the average binary value of the pixels in the upper half area and the average binary value of the pixels in the lower half area of the sliding window.
Referring to fig. 18, in the process of traversing the binary image line by line, each time the sliding window moves down one line to a new position, the difference between the average binarized value of the pixels in its upper half area and that of the pixels in its lower half area is computed again, so that a plurality of difference values are recorded.
Similarly, the boundary between the upper half area and the lower half area of the sliding window can be set arbitrarily, and is not described in detail.
Step 424: and determining the maximum value in the difference value of the average binary values to acquire the position coordinates of the water level line relative to the character according to the boundary line of the upper half area and the lower half area corresponding to the maximum value.
And determining the maximum value in the difference value of the average binary values, wherein the boundary of the upper half area and the lower half area in the sliding window corresponding to the maximum value is the water level line, the position coordinates of the sliding window are known, and further the position coordinates of the boundary in the binary image can be determined, namely the position coordinates of the water level line in the binary image can be determined, and further the position coordinates of the water level line relative to the characters can be obtained.
In the binary image, the difference in gray-scale values is largest on the two sides of the water level line where the water surface region and the character region meet. Therefore, when the boundary between the upper half area and the lower half area of the sliding window approximately coincides with the water level line, the difference of the average binarized values between the two halves reaches its maximum. Determining the maximum value among the differences of the average binarized values thus determines the position coordinates of the water level line.
Referring to fig. 18, when the scene type of the water surface is the third scene, the position coordinates of the water level line with respect to the character may be obtained as follows.
Step 430: and converting the local area image into a gray scale image.
Step 431: a mirror image of the character is acquired.
Referring to fig. 19, in the gray image, the character is selected to be subjected to a mirror image process, and a mirror image of the character is obtained.
Specifically, a window area containing the character is selected from the gray image to perform mirror image processing, so as to obtain a mirror image of the window area. The width of the window region is the width of the grayscale image, and the height of the window region is the height of the character. Of course, the width and the height of the window region may have other values, which is not limited in this application.
Step 432: the mirror image is traversed through the grayscale image line by line.
When traversing, the mirror image moves through the gray-scale image line by line from top to bottom.
Step 433: and respectively acquiring the similarity of the mirror image and the gray image area covered by the mirror image.
In the process of traversing the gray-scale image line by line, each time the mirror image moves down one line to a new position, the similarity between the mirror image and the gray-scale image area it covers is computed again, so that a plurality of similarity values are recorded.
Specifically, the gray-scale values of corresponding pixels in the mirror image and the gray-scale image area covered by it are subtracted to calculate the mean squared error between the two. The smaller the mean squared error, the greater the similarity between the mirror image and the covered area, that is, the more likely the covered area is the reflection of the character's window area on the water surface.
Wherein, the similarity is the reciprocal of the mean square error.
Step 434: and determining the maximum value of the similarity so as to obtain the position coordinates of the water level line relative to the characters according to the position coordinates of the mirror image corresponding to the maximum value.
Determining the maximum value of the similarity determines the position of the reflection of the character's window area in the water surface area. The position coordinates of the mirror image corresponding to the maximum value, that is, the position coordinates of the character's reflection in the water surface area, are then obtained. Since the character and its reflection are symmetric about the water level line, the position coordinates of the water level line relative to the character can be determined.
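The third-scene procedure of steps 430 to 434 can be sketched as follows. The similarity is the reciprocal of the mean squared error as described above, and the symmetry of the character and its reflection about the water level line gives the water line row; the function and variable names are illustrative assumptions:

```python
import numpy as np

def locate_reflection(gray, char_top, char_height):
    """Find the reflection of the character window in the water area and
    return the water level line row.

    gray: 2-D uint8 array (the local area image as gray-scale).
    char_top, char_height: vertical extent of the character window.
    """
    h, _ = gray.shape
    template = gray[char_top:char_top + char_height, :].astype(np.float64)
    mirror = template[::-1, :]                   # vertical mirror of the character
    best_top, best_sim = None, -np.inf
    # traverse the area below the character line by line
    for top in range(char_top + char_height, h - char_height + 1):
        region = gray[top:top + char_height, :].astype(np.float64)
        mse = np.mean((mirror - region) ** 2)
        sim = np.inf if mse == 0 else 1.0 / mse  # similarity = reciprocal of MSE
        if sim > best_sim:
            best_sim, best_top = sim, top
    # character and reflection are symmetric about the water level line,
    # so the water line lies midway between the character's bottom edge
    # and the reflection's top edge
    return (char_top + char_height + best_top) // 2
```

A synthetic usage example: a character pattern at rows 10–19 whose mirror appears at rows 31–40 is symmetric about row 25, which the function recovers as the water level line.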
Step 50: and calculating the water level value of the water surface according to the position coordinates.
In the local area image, after the position coordinates of the water level line relative to the character are obtained, the position coordinates can be compared with the coordinate height of the actual scale size of the water level gauge on the local area image, and the water level value can then be calculated.
Referring to fig. 20, the step of calculating the water level value of the water surface from the position coordinates in step 50 may be performed as follows.
Step 51: and calculating the actual distance between the waterline and the character in the actual scene according to the position coordinate and the actual scale size of the water level ruler.
The position coordinates of the water level line relative to the character in the height direction of the character are obtained, the coordinate height corresponding to a unit scale of the water level gauge or to the character is identified, and the ratio of the position coordinates to the coordinate height is acquired. Combined with the distance value corresponding to a unit scale in the actual scene, the actual distance of the water level line relative to the character in the actual scene is then calculated.
Step 52: and calculating the water level value according to the actual height and the actual distance represented by the characters.
The water level value is calculated according to the actual height represented by the character and the actual distance, that is, the difference between the actual height represented by the character and the actual distance of the water level line relative to the character is acquired, and the specific water level value of the water area is then calculated.
In other embodiments, the water level base of the water area is also calibrated in advance to avoid the need for an excessively long water gauge inserted below the water surface, and step 53 is further performed.
Step 53: and acquiring the sum of the water level value and the water level base number, and further determining the actual water level value.
The water level of a water area fluctuates within a certain range. If the historical lowest water level value of the water area is large, the required water level gauge is correspondingly long; however, most of the portion of the gauge extending below the water surface is shielded by the water all year round and cannot be used, resulting in waste of the gauge.
Therefore, the water level base number of the water area can be calibrated in advance, and the actual water level value of the water area can be obtained by acquiring the water level value and the water level base number after acquiring the water level value on the water level gauge.
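The calculation of steps 51 to 53 can be sketched as follows; all parameter names are illustrative assumptions, and a linear pixel-to-distance mapping via the unit scale is assumed:

```python
def water_level_value(char_actual_height_m, pixel_offset, pixels_per_unit,
                      unit_scale_m, base_level_m=0.0):
    """Convert the pixel offset of the water line below a character into a
    water level reading (illustrative sketch; names not from the patent).

    char_actual_height_m: actual height represented by the character (m).
    pixel_offset: water line's pixel distance below the character.
    pixels_per_unit: pixels covered by one gauge scale unit in the image.
    unit_scale_m: real-world length of one scale unit (m).
    base_level_m: pre-calibrated water level base number (step 53).
    """
    # step 51: pixel offset -> actual distance via the scale ratio
    actual_distance = pixel_offset / pixels_per_unit * unit_scale_m
    # step 52: height marked by the character minus the offset below it
    level = char_actual_height_m - actual_distance
    # step 53: add the pre-calibrated base to obtain the actual level
    return level + base_level_m
```

For example, a water line 50 pixels below a 2.0 m character mark, with 100 pixels per 0.1 m scale unit and a 10.0 m base, gives 2.0 − 0.05 + 10.0 = 11.95 m.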
Referring to fig. 21, fig. 21 is a schematic structural diagram of an embodiment of a storage medium provided in the present application.
The storage medium 60 stores program data 61, and the program data 61, when executed by the processor, implements the water level monitoring method as described in fig. 1 to 20.
The program data 61 is stored in a storage medium 60 and includes instructions for causing a network device (which may be a router, a personal computer, a server, etc.) or a processor to perform all or part of the steps of the methods described in the various embodiments of the present application.
Alternatively, the storage medium 60 may be various media that can store the program data 61, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Referring to fig. 22, fig. 22 is a schematic structural diagram of an embodiment of a network device provided in the present application.
The network device 70 includes a processor 72 and a memory 71 connected to each other; the memory 71 stores a computer program, and the processor 72 executes the computer program to implement the water level monitoring method described in fig. 1 to 20.
Referring to fig. 23, fig. 23 is a schematic structural diagram of an embodiment of a water level monitoring system provided in the present application.
The water level monitoring system 80 comprises a network camera 81 and the network device 70 as described above, the network camera 81 is in communication connection with the network device 70, the network camera 81 is used for acquiring a water level gauge scene image and transmitting the water level gauge scene image to the network device 70, and the network device 70 performs detection and analysis on the water level gauge scene image, so as to determine the water level value of the water area.
Different from the prior art, the application discloses a water level monitoring method, a storage medium, network equipment and a water level monitoring system. By identifying the scene type of the water surface in the scene image of the water level gauge, a water level line detection method matched with the scene type is provided, so that the influence of the scene environment of the water area where the water level gauge is located on the water level value monitoring result is eliminated, and the accuracy of measuring the water level value is improved.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the storage medium embodiment and the electronic device embodiment, since they are substantially similar to the method embodiment, the description is relatively simple, and the relevant points can be referred to the partial description of the method embodiment.
The application is operational with numerous general purpose or special purpose computing system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings, or which are directly or indirectly applied to other related technical fields, are intended to be included within the scope of the present application.
Claims (14)
1. A water level monitoring method, comprising:
identifying the character on the water level ruler closest to the water surface in the water level ruler scene image;
acquiring a local area image containing the characters and the water surface from the water level gauge scene image;
determining the scene type of the water surface according to the edge characteristics and the saturation components of the character area and the water surface area in the local area image; the scene type includes at least one of a first scene, a second scene, and a third scene; the first scene is a turbid water surface non-reflection scene, the second scene is a clear water surface non-reflection scene with severe water ripples, and the third scene is a clear water surface reflection scene;
matching the water level line detection methods corresponding to the scene types to detect position coordinates of the water level line relative to the characters, wherein different scene types correspond to different water level line detection methods;
and calculating the water level value of the water surface according to the position coordinates.
2. The water level monitoring method according to claim 1, wherein the step of determining the scene type of the water surface according to the local area image comprises:
converting the local area image into an edge characteristic image;
acquiring an average gray-scale value of pixels in a bottom area of the edge feature image, wherein the bottom area at least comprises a part of the water surface;
judging whether the average gray scale value of the bottom area is smaller than or equal to a preset gray scale threshold value or not;
if yes, the scene type of the water surface is the first scene.
3. The water level monitoring method according to claim 2, wherein the step of matching the water level line detection method corresponding to the scene type to detect the position coordinates of the water level line comprises:
if the scene type of the water surface is the first scene, setting a sliding window in the edge feature image;
traversing the edge characteristic image line by the sliding window;
respectively obtaining the difference value of the average gray-scale value of the pixels in the upper half area and the average gray-scale value of the pixels in the lower half area of the sliding window;
and determining the maximum value in the difference values of the average gray scale values so as to obtain the position coordinates of the water level line relative to the characters according to the boundary line of the upper half area and the lower half area corresponding to the maximum value.
4. The water level monitoring method according to claim 2, wherein the step of determining the scene type of the water surface from the local area image further comprises:
converting the local area image into a saturation component image;
acquiring a saturation component difference value of the average saturation component value of the top area and the average saturation component value of the bottom area on the saturation component image;
judging whether the saturation component difference value is greater than or equal to a preset saturation component threshold value or not;
and if so, the scene type of the water surface is the second scene.
5. The water level monitoring method according to claim 4, wherein the step of matching the water level line detection method corresponding to the scene type to detect the position coordinates of the water level line comprises:
if the scene type of the water surface is the second scene, converting the saturation component image into a binary image;
setting a sliding window in the binary image;
traversing the binary image line by the sliding window;
respectively obtaining the difference value of the average binary value of the pixels in the upper half area and the average binary value of the pixels in the lower half area of the sliding window;
and determining the maximum value in the difference value of the average binarization values, so as to obtain the position coordinates of the water level line relative to the characters according to the boundary line of the upper half area and the lower half area corresponding to the maximum value.
6. The water level monitoring method according to claim 5, wherein the step of converting the saturation component image into a binary image comprises:
acquiring a binarization threshold value corresponding to each column of pixels in the saturation component image column by column;
and carrying out binarization processing on each column of pixels based on the binarization threshold corresponding to each column of pixels.
7. The water level monitoring method according to claim 6,
wherein thre_j is the binarization threshold of the j-th column of pixels, M is the number of pixel columns in the saturation component image, N is the number of pixel rows in the saturation component image, p_ij is the gray-scale value of the pixel in the i-th row and j-th column, max(p_j) is the maximum gray-scale value in the j-th column of pixels, and min(p_j) is the minimum gray-scale value in the j-th column of pixels.
8. The water level monitoring method according to claim 4, wherein the step of determining the scene type of the water surface according to the local area image further comprises:
and if the average gray scale value of the pixels in the bottom area is greater than a preset gray scale threshold value and the saturation component difference value is less than a preset saturation component threshold value, the scene type of the water surface is the third scene.
9. The water level monitoring method according to claim 8, wherein the step of matching the water level line detection method corresponding to the scene type to detect the position coordinates of the water level line comprises:
if the scene type of the water surface is the third scene, converting the local area image into a gray image;
acquiring a mirror image of the character;
traversing the mirror image through the gray image line by line;
respectively acquiring the similarity of the mirror image and a gray image area covered by the mirror image;
and determining the maximum value of the similarity so as to obtain the position coordinate of the water level line relative to the character according to the position coordinate of the mirror image corresponding to the maximum value.
10. The water level monitoring method according to claim 1, wherein the step of calculating the water level value of the water surface according to the position coordinates comprises:
calculating the actual distance of the water level line relative to the character in an actual scene according to the position coordinate and the actual scale size of the water level ruler;
and calculating the water level value according to the actual height and the actual distance represented by the characters.
11. The water level monitoring method according to claim 1, wherein the step of identifying the character on the water level ruler closest to the water surface in the water level ruler scene image comprises:
detecting the scene image of the water level gauge by adopting a deep learning model so as to identify the water level gauge;
identifying characters on the water level gauge to determine the character closest to the water surface;
and the character closest to the water surface is the character which is exposed out of the water surface and is completely recognized.
12. A storage medium having program data stored thereon, characterized in that the program data, when being executed by a processor, realize the steps of a method according to any one of the claims 1 to 11.
13. A network device comprising a processor and a memory connected to each other, the memory storing a computer program which, when executed by the processor, performs the steps of the method according to any one of claims 1 to 11.
14. A water level monitoring system, characterized in that the water level monitoring system comprises a network camera and the network device according to claim 13, wherein the network camera is communicatively connected to the network device and is used for acquiring the water level gauge scene image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910727330.3A CN110443243B (en) | 2019-08-07 | 2019-08-07 | Water level monitoring method, storage medium, network device and water level monitoring system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110443243A CN110443243A (en) | 2019-11-12 |
CN110443243B true CN110443243B (en) | 2022-06-07 |
Family
ID=68433954
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910727330.3A Active CN110443243B (en) | 2019-08-07 | 2019-08-07 | Water level monitoring method, storage medium, network device and water level monitoring system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110443243B (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110956172B (en) * | 2019-11-18 | 2023-04-07 | 四创科技有限公司 | Water gauge identification method based on image semantic segmentation |
CN113819972A (en) * | 2020-07-07 | 2021-12-21 | 湖北亿立能科技股份有限公司 | Artificial intelligence surface of water detecting system based on virtual scale |
CN113819971B (en) * | 2020-07-07 | 2024-08-20 | 湖北亿立能科技股份有限公司 | Artificial intelligent water level monitoring system based on semantic segmentation of water, scale and floater |
CN113221898B (en) * | 2021-04-16 | 2022-02-15 | 北京科技大学 | An automatic water gauge reading method |
CN112991342B (en) * | 2021-04-29 | 2021-07-30 | 南京甄视智能科技有限公司 | Water level line detection method, device and system based on water level gauge image |
CN113807709B (en) * | 2021-09-22 | 2024-02-27 | 河海大学 | Multi-target lake area water safety evaluation method based on water condition elements |
CN114067095B (en) * | 2021-11-29 | 2023-11-10 | 黄河勘测规划设计研究院有限公司 | Water level identification method based on water gauge character detection and identification |
CN114387235B (en) * | 2021-12-30 | 2024-02-09 | 重庆知行数联智能科技有限责任公司 | Water environment monitoring method and system |
CN114663811B (en) * | 2022-03-24 | 2022-10-28 | 中国水利水电科学研究院 | Water surface line extraction method using water gauge reflection |
CN114639064B (en) * | 2022-05-18 | 2022-09-02 | 智洋创新科技股份有限公司 | Water level identification method and device |
CN118674934B (en) * | 2024-08-19 | 2024-11-12 | 杭州定川信息技术有限公司 | A method for back-end simple identification of relative historical water levels and water level warnings |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102975826A (en) * | 2012-12-03 | 2013-03-20 | 上海海事大学 | Portable ship water gauge automatic detection and identification method based on machine vision |
US8726841B2 (en) * | 2010-05-21 | 2014-05-20 | II Richard C. Schluter | Aquarium accessory |
WO2017110538A1 (en) * | 2015-12-22 | 2017-06-29 | 株式会社プロドローン | Water level measurement system and water level control system, as well as water level measurement method and water level control method that use same |
CN107506798A (en) * | 2017-08-31 | 2017-12-22 | 福建四创软件有限公司 | A kind of water level monitoring method based on image recognition |
CN108470338A (en) * | 2018-02-12 | 2018-08-31 | 南京邮电大学 | A kind of water level monitoring method |
CN108921165A (en) * | 2018-06-21 | 2018-11-30 | 江苏南水水务科技有限公司 | Water level recognition methods based on water gauge image |
CN109522889A (en) * | 2018-09-03 | 2019-03-26 | 中国人民解放军国防科技大学 | A Method for Recognition and Estimation of Hydrometer Water Level Based on Image Analysis |
CN109543596A (en) * | 2018-11-20 | 2019-03-29 | 浙江大华技术股份有限公司 | A kind of water level monitoring method, apparatus, electronic equipment and storage medium |
2019-08-07: Application CN201910727330.3A filed; patent CN110443243B granted, status Active.
Non-Patent Citations (1)
Title |
---|
M. Kröhnert et al., AUTOMATIC WATERLINE EXTRACTION FROM SMARTPHONE IMAGES, The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 2016. * |
Also Published As
Publication number | Publication date |
---|---|
CN110443243A (en) | 2019-11-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110443243B (en) | Water level monitoring method, storage medium, network device and water level monitoring system | |
CN108759973B (en) | Water level measuring method | |
CN113283439B (en) | Intelligent counting method, device and system based on image recognition | |
CN113850749B (en) | Method for training defect detector | |
KR101272448B1 (en) | Apparatus and method for detecting region of interest, and the recording media storing the program performing the said method | |
CN109376740A (en) | A kind of water gauge reading detection method based on video | |
CN110909640A (en) | Method and device for determining water level line, storage medium and electronic device | |
CN112950540B (en) | Bar code identification method and equipment | |
CN111539938B (en) | Method, system, medium and electronic terminal for detecting curvature of rolled strip steel strip head | |
CN103852034B (en) | A kind of method for measuring perendicular | |
CN113298769B (en) | FPC flexible flat cable appearance defect detection method, system and medium | |
CN115294035A (en) | Bright point positioning method, bright point positioning device, electronic equipment and storage medium | |
CN116071311A (en) | Equipment cleaning detection method, system and storage medium based on image recognition | |
CN116071692A (en) | Morphological image processing-based water gauge water level identification method and system | |
CN118501177B (en) | Appearance defect detection method and system for formed foil | |
CN118261885A (en) | Image definition identification method, intelligent terminal and storage medium | |
JP5424694B2 (en) | Image recognition apparatus and program | |
CN117788433A (en) | Glass defect detection method | |
CN117576378A (en) | Dirt determination method, defect detection method, system and electronic equipment | |
CN109389644A (en) | Parking stall line detecting method based on direction gradient enhancing | |
CN114862761B (en) | Power transformer liquid level detection method, device, equipment and storage medium | |
CN113508395A (en) | Method for detecting an object | |
CN116485874A (en) | Intelligent detection method and system for cutting intervals of die-cutting auxiliary materials | |
CN113554688B (en) | O-shaped sealing ring size measurement method based on monocular vision | |
CN112329770B (en) | Instrument scale identification method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||