
CN110057824B - Ocean plankton optical imaging device and imaging processing method - Google Patents

Ocean plankton optical imaging device and imaging processing method

Info

Publication number
CN110057824B
CN110057824B
Authority
CN
China
Prior art keywords
plankton
flash lamp
camera
image
unit
Prior art date
Legal status
Active
Application number
CN201910370129.4A
Other languages
Chinese (zh)
Other versions
CN110057824A (en)
Inventor
潘俊
陈磊
于非
刁新源
魏传杰
王延清
Current Assignee
Institute of Oceanology of CAS
Original Assignee
Institute of Oceanology of CAS
Priority date
Filing date
Publication date
Application filed by Institute of Oceanology of CAS
Priority to CN201910370129.4A
Publication of CN110057824A
Application granted
Publication of CN110057824B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/85 Investigating moving fluids or granular solids
    • G01N21/8507 Probe photometers, i.e. with optical measuring part dipped into fluid sample
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/85 Investigating moving fluids or granular solids
    • G01N2021/8592 Grain or other flowing solid samples

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Biochemistry (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Immunology (AREA)
  • General Health & Medical Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Pathology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The invention relates to the technical field of marine equipment, and in particular to an ocean plankton optical imaging device and an imaging processing method. Images of the marine organism population are obtained through the image processing program steps, classified, and analysed to obtain the marine organism population distribution. The invention uses machine vision technology to achieve rapid identification and quantification of undersea plankton populations and particulate matter while avoiding disturbance of the plankton populations and particulate matter.

Description

Ocean plankton optical imaging device and imaging processing method
Technical Field
The invention belongs to the technical field of marine equipment, and particularly relates to an optical imaging device and an imaging processing method for marine plankton.
Background
Plankton is a very important group in marine ecosystems, and due to the complexity of plankton, one bottleneck problem faced by current marine plankton observation studies is the difficulty in rapidly measuring the changes in their number and species composition and completing the observation of their fine structure over a large spatiotemporal scale.
Traditional sampling with plankton nets remains the core of current plankton sampling technology and is the basis for many long time-series studies and ocean research projects. As technology has developed, some sampling devices have been updated, such as multiple-net systems that collect samples from different water layers of the same section by pressure sensing. Although these newer techniques have to some extent replaced traditional sampling, they still face the challenge that samples can only be acquired at relatively low spatiotemporal precision, the sample analysis cycle is long, and microscopic examination must be performed manually indoors. Moreover, because of factors affecting sampling and storage, such as fragile individuals being damaged by water flushing during sampling or individuals decomposing during long-term storage in formalin, the sampling results often differ from the actual distribution in the sea area. It is now widely recognized by researchers that many important ecological processes related to plankton occur on finer spatiotemporal scales that are difficult to resolve with traditional sampling methods.
In addition, sampling with net gear often disturbs plankton distribution characteristics (e.g., patch distribution, layered distribution) that are important for better understanding plankton community structure and distribution and that can only be preserved by in situ observation systems; moreover, for ecologically important but fragile groups such as gelatinous plankton, their numbers are easily underestimated when collected with conventional nets.
Disclosure of Invention
The invention aims to provide an optical imaging device and an imaging processing method for marine plankton that use machine vision technology to achieve rapid identification and quantification of marine plankton populations and particulate matter without changing their natural dimensions or hydrologic morphological forms, so that plankton population behaviour can be estimated accurately.
The technical scheme adopted to realize the invention is as follows: an ocean plankton optical imaging device comprises a main frame body, a flash lamp assembly, a camera assembly and a main control unit, wherein the flash lamp assembly and the camera assembly are both arranged on the main frame body. The flash lamp assembly comprises a rotary driving device, a rotary impeller and flash lamps; the rotary impeller is driven to rotate by the rotary driving device, and a plurality of flash lamps are evenly distributed on the rotary impeller along the circumferential direction. The camera assembly comprises a linear driving device, an optical imager and a camera lens base; a lens is arranged on the side of the camera lens base facing the flash lamp assembly, the optical imager is arranged in the camera lens base, and the optical imager is driven to move by the linear driving device so that a suitable camera lens is aligned with the lens. The rotary driving device, the flash lamps, the linear driving device and the optical imager are all controlled by the main control unit.
A flash lamp unit and a camera driving unit are arranged on the main frame body; the flash lamp assembly is arranged at the front end of the flash lamp unit, the camera lens base is arranged at the front end of the camera driving unit, and the linear driving device is arranged in the camera driving unit.
A driving power supply for supplying power to the rotary driving device and the flash lamps is arranged in the flash lamp unit, and the driving power supply is switched on and off by the main control unit.
The main control unit comprises a monitoring controller and an embedded PC module, the linear driving device and the driving power supply are controlled by the embedded PC module, and the optical imager is monitored by the monitoring controller; the main control unit is connected with a data comprehensive processing and analyzing unit through a connecting cable; and the main control unit is provided with a battery assembly and a data storage unit.
A chlorophyll turbidity sensor and a temperature and salt depth sensor are arranged in the main frame body; an upper flow guide wing plate is arranged on the upper side of the main frame body, and a lower flow guide wing plate is arranged on the lower side of the main frame body; the main frame body is in a shuttle shape with a small front part and a big rear part; the main frame body is provided with a reserved mounting hole.
An optical imaging processing method for marine plankton comprises the following steps:
The main control unit receives the instruction of the data comprehensive processing and analyzing unit, controls the flash lamp assembly to illuminate marine organisms, controls the camera assembly to collect original images of marine plankton, controls the chlorophyll turbidity sensor and the temperature and salt depth sensor to collect and store marine data, and obtains and classifies images of the marine organism population through the image processing program steps, which are analysed to obtain the marine organism population distribution.
The main control unit outputs a signal to control the rotary driving device to enable the rotary impeller to rotate, so that a plurality of flash lamps uniformly distributed on the rotary impeller along the circumferential direction irradiate marine organisms; and the main control unit outputs a signal to control the linear driving device to drive the optical imager to switch positions among the lenses, so as to carry out focusing photographing on marine organisms.
The image processing program steps include:
a. Loading an original image of the collected marine plankton;
b. Preprocessing an original image, detecting a focusing object, calculating a segmentation threshold value, and carrying out gradient analysis to preliminarily obtain the outline of marine plankton;
c. Feature vector extraction: constructing an optimal hyperplane in the sample space; calculating the separation distances between different sample sets and the hyperplane; calculating an average value matrix and a distance matrix of the separation distances and normalizing them; and calculating the contrast, correlation and variance of the normalized matrix as feature vectors;
d. multi-environmental factor parameter analysis: analyzing profile biological abundance distribution and hydrologic environmental factor distribution of an observation position according to a temperature value, a conductivity value, a pressure value, an optical chlorophyll concentration, a turbidity value and a longitude and latitude value acquired by each sensor;
e. Machine learning and deep learning: inserting a preset number of feature points into the image after feature vector extraction, screening and comparing them to obtain a plankton-type feature image, and storing it in an image expert database for deep learning; extracting multi-level, multi-angle features from the images stored in the image expert library by nonlinear transformation, selecting structural elements with different size, shape and direction characteristics according to the morphological characteristics of different target types to learn the plankton population types, and obtaining a fast linear classifier with adaptive feature selection;
f. processing the images acquired in situ with the fast linear classifier obtained by adaptive feature selection, classifying the images according to the discriminated population types, and removing misclassified images in combination with manual identification.
And analyzing the distribution of the population according to the abundance data and the environmental factor parameters of the population.
The preprocessing and the focusing object detection comprise gray-scale correction, image segmentation and marking; the segmentation threshold calculation comprises performing binarization according to a set threshold, calculating the gray-level differences of adjacent pixels for the region of interest (ROI), and setting the Sobel parameters.
The beneficial effects of the invention are as follows:
1. The invention uses machine vision technology to achieve rapid identification and quantification of seabed plankton populations and particulate matter. The linear driving device drives the optical imager to move so as to switch between different CCD lenses, which expands the shooting field of view and enables automatic focusing on plankton and particles of different sizes. In addition, during shooting each flash lamp in the flash lamp assembly rotates with the rotary impeller, so that the emitted light is effectively focused, the shooting clarity of the camera is ensured and shadowed regions are avoided; after subsequent data processing, plankton population behaviour can therefore be inferred more accurately.
2. Because the flash lamp assembly and the camera lens base are arranged opposite each other, the plankton population and particulate matter are not disturbed during shooting, and the natural dimensions and hydrologic morphological forms of the seabed plankton population and particulate matter are not changed.
3. After the image processing steps, the collected plankton can be rapidly identified and classified, so that the distribution of plankton populations is obtained; this provides in-situ, refined observation and rapid identification for studying marine ecological phenomena such as diel vertical migration and patch distribution of plankton.
Drawings
Figure 1 is a schematic view of the structure of the present invention,
Figure 2 is an enlarged view at a in figure 1,
Figure 3 is a schematic diagram of the main control unit, the flash unit and the camera driving unit of figure 1,
FIG. 4 is a schematic diagram of the optical data post-processing flow of the present invention,
Figure 5 is a graph of multi-environmental factor parameter analysis-profile biological abundance profile,
Figure 6 is a graph of environmental factor parameter analysis-hydrologic environmental factor distribution,
Fig. 7 is a schematic view of the optical imaging effect of the present invention.
In the drawings: 1, upper flow guide wing plate; 2, chlorophyll turbidity sensor; 3, main frame body; 4, main control unit; 5, battery assembly; 6, temperature and salt depth sensor; 7, lower flow guide wing plate; 8, flash lamp unit; 9, flash lamp assembly; 10, camera lens base; 11, camera driving unit; 12, connecting cable; 13, connecting plate; 14, data comprehensive processing and analyzing unit; 15, data storage unit; 16, monitoring controller; 17, embedded PC module; 18, driving power supply; 19, rotary driving device; 20, rotary impeller; 21, linear driving device; 22, optical imager; 23, lens.
Detailed Description
The following describes in further detail the embodiments of the present invention with reference to the drawings and examples. The following examples are for the purpose of illustrating the invention but are not intended to limit the scope of the invention.
As shown in figs. 1 to 3, the invention comprises a main frame body 3, a flash lamp assembly 9, a camera assembly and a main control unit 4. The flash lamp assembly 9 is arranged at the front end of the main frame body 3 and comprises a rotary driving device 19, a rotary impeller 20 and flash lamps; the rotary impeller 20 is driven to rotate by the rotary driving device 19, and a plurality of flash lamps are evenly distributed on the rotary impeller 20 along the circumferential direction. The camera assembly comprises a linear driving device 21, an optical imager 22 and a camera lens base 10. The camera lens base 10 is arranged at the front end of the main frame body 3, a lens 23 is arranged on the side of the camera lens base 10 facing the flash lamp assembly 9, and the optical imager 22 is arranged in the camera lens base 10. The optical imager 22 is driven to move by the linear driving device 21 so that a suitable camera lens is aligned with the lens 23 on the camera lens base 10, thereby changing the photographing focal length and enlarging the photographing range; during photographing, each flash lamp in the flash lamp assembly 9 rotates with the rotary impeller 20 to compensate for shadow regions. The rotary driving device 19, the flash lamps, the linear driving device 21 and the optical imager 22 are all controlled by the main control unit 4 and are all commercially available products; in this embodiment, the rotary driving device 19 is a rotary motor, the linear driving device 21 is an electric cylinder, the optical imager 22 is a CCD sensor, and the flash lamp frequency is more than twice the photographing frequency of the CCD sensor so as to avoid ineffective shots.
As shown in figs. 1 and 3, a flash lamp unit 8 and a camera driving unit 11 are arranged on the main frame body 3 in parallel, and their rear ends are each connected to the front end of the main control unit 4 through a connecting plate 13; the connecting plate 13 is hollow so that the various lines can pass through it. The flash lamp assembly 9 is arranged at the front end of the flash lamp unit 8, the camera lens base 10 is arranged at the front end of the camera driving unit 11, and the linear driving device 21 is arranged in the camera driving unit 11.
As shown in fig. 3, a driving power supply 18 for supplying power to the rotary driving device 19 and the flash lamps is arranged in the flash lamp unit 8; the driving power supply 18 is switched on and off by the main control unit 4, so that the rotary driving device 19 and the flash lamps in the flash lamp assembly 9 are powered on or off. In addition, in this embodiment a rotary joint is arranged at the rear end of the central axle of the rotary impeller 20; the driving power supply 18 is wired to the rotary joint, which splits the circuit into several branches connected to each flash lamp, so that the flash lamps are powered without affecting rotation. The rotary joint is a commercially available product.
As shown in figs. 1 and 3, the main control unit 4 comprises a monitoring controller 16 and an embedded PC module 17. The embedded PC module 17 controls the linear driving device 21 to extend and retract as required, pushing the optical imager 22 so that different lenses on the optical imager 22 are aligned with the lens 23 on the camera lens base, changing the photographing focal length and enlarging the photographing field of view. The embedded PC module 17 also controls the driving power supply 18 in the flash lamp unit 8 to provide power for the flash lamps and the rotary impeller 20, while the monitoring controller 16 continuously monitors the imaging condition of the optical imager 22. The monitoring controller 16 and the embedded PC module 17 are both well known in the art.
As shown in fig. 1, the main control unit 4 is connected to the data comprehensive processing and analyzing unit 14 on the ship through two connecting cables 12. The embedded PC module 17 is connected to the data comprehensive processing and analyzing unit 14 through one connecting cable 12; a target biological image acquisition program is arranged in the embedded PC module 17, and after the CCD sensor serving as the optical imager 22 photographs, the image is identified using key parameter settings as edge conditions, the type of plankton or particulate matter contained in the image is judged, and the information is fed back and processed by the data comprehensive processing and analyzing unit 14. The monitoring controller 16 is connected to the data comprehensive processing and analyzing unit 14 through another connecting cable 12 for data communication, ensuring real-time monitoring. The data comprehensive processing and analyzing unit 14 is well known in the art.
As shown in fig. 1-2, the main control unit 4 is provided with a battery assembly 5 and a data storage unit 15, the battery assembly 5 supplies power to the whole main control unit 4, and the data storage unit 15 is used for storing collected data information. The battery assembly 5 and the data storage unit 15 are all well known in the art.
As shown in figs. 1-2, a chlorophyll turbidity sensor 2 and a temperature and salt depth sensor 6 are arranged in the main frame body 3. The chlorophyll turbidity sensor 2 measures the chlorophyll concentration and turbidity of the water body, and the temperature and salt depth sensor 6 measures profile temperature, salinity and depth data. The measured data of both sensors are transmitted to the data comprehensive processing and analyzing unit 14 for processing, serving as environmental background factors of plankton distribution and used to verify the influence of hydrologic characteristics on the plankton. The chlorophyll turbidity sensor 2 and the temperature and salt depth sensor 6 are both commercially available products.
As shown in figs. 1-2, an upper flow guide wing plate 1 is arranged on the upper side of the main frame body 3 and a lower flow guide wing plate 7 on the lower side. The upper flow guide wing plate 1 and the lower flow guide wing plate 7 have a dovetail-shaped design, which effectively reduces seawater resistance during towing, saves traction force and improves efficiency. The main frame body 3 is designed as a shuttle shape that is narrow at the front and wide at the rear, suitable for moving through seawater; the whole frame is bolted together, which reduces the loss of service life caused by seawater corrosion of welded joints and improves the maintainability of the equipment. In addition, several reserved mounting holes are designed in the middle of the main frame body 3, so that different hydrological sensors can be carried according to the requirements of environmental element observation.
The working principle of the device of the invention is as follows:
The invention uses machine vision technology to achieve rapid identification and quantification of seabed plankton populations and particulate matter while avoiding changes to their natural dimensions and hydrologic forms, so that plankton population behaviour can be estimated more accurately. The linear driving device 21 drives the CCD sensor serving as the optical imager 22 to move, so that different lenses on the CCD sensor are aligned in turn with the lens 23 on the camera lens base 10; this completes the switching between different CCD lenses, expands the shooting field of view and enables automatic focusing on plankton and particles of different sizes. Each flash lamp in the flash lamp assembly 9 is arranged opposite the camera lens base 10; when the optical imager 22 shoots, the flash lamps rotate with the rotary impeller 20, and this rotation effectively focuses the emitted light within a certain range, ensuring the shooting clarity of the camera and avoiding shadowed regions during shooting.
The embedded PC module 17 and the monitoring controller 16 arranged in the main control unit 4 allow real-time monitoring and control of the flash lamp assembly 9 and the CCD sensor serving as the optical imager 22. A target biological image acquisition program is arranged in the embedded PC module 17; after the CCD photographs, the picture is identified using key parameter settings as edge conditions, the type of plankton or particulate matter contained in the picture is judged, and the information is fed back and processed by the data comprehensive processing and analyzing unit 14. The specific processing flow is shown in fig. 4, and the optical image effect after acquisition and processing is shown in fig. 7.
As shown in fig. 4, the comprehensive processing and analysis of the ocean plankton optical imaging data is implemented as follows. After the original image data is input, the corresponding parameters (such as threshold and sigma) are set; based on black-white binarization, the target organism is outlined with a white bounding box marking the region of interest (ROI), yielding a continuous, closed target contour. The potential target individuals delineated by the bounding boxes are then displayed individually and stored in the relevant folders, and each photo carries its corresponding reference number and the associated environmental factor information.
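As a minimal sketch of this ROI outlining and storage step, the Python/OpenCV fragment below binarizes a frame, draws a white box around each candidate target and saves the crop; the file names, folder layout, threshold and minimum-area values are illustrative assumptions and are not specified by the patent.

    import os
    import cv2

    def extract_and_store_rois(frame_gray, out_dir, threshold=40, min_area=50):
        """Binarize, outline candidate targets with white boxes, save each ROI crop."""
        os.makedirs(out_dir, exist_ok=True)
        _, binary = cv2.threshold(frame_gray, threshold, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        annotated = cv2.cvtColor(frame_gray, cv2.COLOR_GRAY2BGR)
        kept = [c for c in contours if cv2.contourArea(c) >= min_area]
        for idx, cnt in enumerate(kept):
            x, y, w, h = cv2.boundingRect(cnt)
            cv2.rectangle(annotated, (x, y), (x + w, y + h), (255, 255, 255), 1)
            # Each crop gets a sequential reference number; in practice the depth
            # and CTD record for the frame would be stored alongside it.
            cv2.imwrite(os.path.join(out_dir, f"roi_{idx:02d}.png"),
                        frame_gray[y:y + h, x:x + w])
        return annotated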
The main processing steps are as follows: 1. importing the original image data of the plankton optical imaging device; 2. key parameter design; 3. feature vector extraction; 4. multi-environmental-factor parameter analysis; 5. machine learning and deep learning; 6. species identification; 7. data quality control and analysis.
2. Key parameter design. This comprises three main parts: preprocessing and focusing-object detection, segmentation threshold calculation, and gradient analysis.
Preprocessing and focusing-object detection: gray-scale correction, edge detection and edge-threshold setting are performed on the original image in sequence to obtain the region of interest (ROI). Specifically, this includes gray-scale correction (a gray-level transformation stretches the pixel values so that they are distributed over the visible gray range), segmentation (separating unrepairable areas of the background from blurred foreground contour edges), and marking (the segmented regions are labelled for the next analysis step, and water depth and environmental element information are imported synchronously).
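As one illustration of the gray-scale correction just described, the short sketch below (Python/NumPy, an assumed implementation rather than the patented one) stretches the pixel values so that they span the visible gray range; the percentile bounds are arbitrary example choices.

    import numpy as np

    def gray_stretch(img_gray, low_pct=1.0, high_pct=99.0):
        """Linearly stretch gray levels between two percentiles to the 0-255 range."""
        lo, hi = np.percentile(img_gray, (low_pct, high_pct))
        scaled = (img_gray.astype(np.float32) - lo) / max(hi - lo, 1e-6)
        return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)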
Segmentation threshold calculation: binarization is performed according to a set threshold. Specifically, a focus-detection algorithm selects and analyses the detection points, and parameters such as threshold, sigma and Sobel are introduced to set high and low thresholds; the low value is usually set to 0, while the high value has a reference value, with higher settings rendering the image as bright white values and lower settings as dark black values.
Gradient analysis is performed on the region of interest (ROI) to obtain contour lines. Specifically, gray-level differences between adjacent pixels are calculated within the selected area and edge detection is performed, giving a preliminary rough outline of the image; the Sobel parameter is set to facilitate segmentation of gray-gradient images affected by noise.
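A hedged sketch of the segmentation-threshold and gradient-analysis steps follows (Python with OpenCV); the threshold, sigma smoothing and Sobel kernel size are illustrative values, not figures taken from the patent.

    import cv2
    import numpy as np

    def rough_outline(roi_gray, high_thresh=128, sigma=1.0, sobel_ksize=3):
        """Binarize against a set threshold, then detect edges from gray-level gradients."""
        smoothed = cv2.GaussianBlur(roi_gray, (0, 0), sigma)
        _, binary = cv2.threshold(smoothed, high_thresh, 255, cv2.THRESH_BINARY)
        # Gray-level differences of adjacent pixels via Sobel gradients.
        gx = cv2.Sobel(smoothed, cv2.CV_32F, 1, 0, ksize=sobel_ksize)
        gy = cv2.Sobel(smoothed, cv2.CV_32F, 0, 1, ksize=sobel_ksize)
        grad = cv2.magnitude(gx, gy)
        edges = (grad > grad.mean() + 2.0 * grad.std()).astype(np.uint8) * 255
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return binary, contours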
3. Feature vector extraction: combining the image texture, the shape of its components and the spatial relationships among the parts, noise suppression is applied and meaningful target information is extracted, turning the image data into organized data that is easy to process.
In the sample space (the set of image contours detected preliminarily in the previous step), the target individual of the region of interest is obtained by constructing an optimal hyperplane; the maximum generalization capability is achieved by maximizing the separation distance between the different sample sets and the hyperplane. Invariant moments (invariant to translation, rotation and scale) are used as the main mathematical-morphology features, and the average value matrix and distance matrix of the separation distances are computed; normalizing the resulting matrices provides scale invariance, and relative rotation invariance is also obtained. The contrast, correlation and variance of these matrices are calculated and used as feature vectors, so that the same plankton target is still assigned to the same category when it is displaced or changes scale; this avoids false separation and gives low sensitivity to occlusion and projection. For plankton, the extraction of texture features, which carry important information about the arrangement of the surface structure of plankton cells and tissue, plays an important role in recognition; texture features reflect the macroscopic and microscopic structural properties of plankton images better than other classes of features.
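The sketch below illustrates one way the described feature vector (invariant moments plus the contrast, correlation and variance of a normalized co-occurrence matrix) could be computed, using OpenCV and scikit-image; the specific libraries and parameter choices are assumptions and are not prescribed by the patent.

    import cv2
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def roi_feature_vector(roi_gray):
        """Invariant (Hu) moments plus co-occurrence texture features of an 8-bit ROI."""
        hu = cv2.HuMoments(cv2.moments(roi_gray)).flatten()   # translation/rotation/scale robust
        glcm = graycomatrix(roi_gray, distances=[1], angles=[0],
                            levels=256, symmetric=True, normed=True)
        contrast = graycoprops(glcm, "contrast")[0, 0]
        correlation = graycoprops(glcm, "correlation")[0, 0]
        p = glcm[:, :, 0, 0]                                   # normalized co-occurrence matrix
        i = np.arange(256)
        mu = (i * p.sum(axis=1)).sum()
        variance = (((i - mu) ** 2) * p.sum(axis=1)).sum()     # variance of the normalized matrix
        return np.concatenate([hu, [contrast, correlation, variance]])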
4. Multi-environmental-factor parameter analysis: the data measured by the temperature and salt depth sensor and the chlorophyll turbidity sensor are acquired, mainly comprising the temperature value, conductivity value, pressure value, optical chlorophyll concentration, turbidity value, and longitude and latitude. The data format is as follows (a short parsing sketch follows the field descriptions):
Output.Format("ctd%08u:%f,%f,%f,%f,%f,%f,%f,%f,%f\r\n", ms, Conductivity, Temperature, Pressure, Salinity, ChloroRef, ChloroSig, TurbRef, TurbSig, Altitude);
ctd%08u (#): the ROI time, recording the sampling time (ms) of the individual of interest; if several individuals are extracted at the same moment, # is numbered sequentially from 01
Conductivity: 3.573590, the conductivity value recorded by the temperature and salt depth instrument when the image was acquired
Temperature: 9.619100, the temperature value recorded by the temperature and salt depth instrument when the image was acquired
Pressure: 67.970000, the pressure value recorded by the temperature and salt depth instrument when the image was acquired
Salinity: 32.917400, the salinity value recorded by the temperature and salt depth instrument when the image was acquired
ChloroRef: 0.000000, the waveband value (nm) of the mounted optical chlorophyll sensor at acquisition time
ChloroSig: 0.000000, the raw voltage value recorded when chlorophyll is acquired in a single waveband
TurbRef: 0.000000, the waveband value (nm) of the mounted optical turbidity sensor at acquisition time
TurbSig: 0.000000, the raw voltage value recorded when turbidity is acquired in a single waveband
Altitude: 0.000000, the longitude and latitude recorded when a GPS is connected
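For illustration, the short Python sketch below parses one record written in the format above into a timestamp and named fields; the sample record string is fabricated for the example and is not survey data.

    def parse_ctd_record(line):
        """Split a 'ctd%08u:%f,...,%f' record into its ROI time (ms) and named values."""
        head, values = line.strip().split(":")
        ms = int(head[3:])
        names = ["Conductivity", "Temperature", "Pressure", "Salinity",
                 "ChloroRef", "ChloroSig", "TurbRef", "TurbSig", "Altitude"]
        return ms, dict(zip(names, (float(v) for v in values.split(","))))

    ms, fields = parse_ctd_record(
        "ctd00000006:3.573590,9.619100,67.970000,32.917400,0.0,0.0,0.0,0.0,0.0\r\n")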
The profile biological abundance distribution and corresponding data obtained at station 3600-05 in the Yellow Sea through the data comprehensive processing and analysis technique are shown as follows:
As shown in fig. 5, the multi-environmental-factor parameter analysis gives the profile biological abundance distribution. The marked points are the water depths and calculated abundances of zooplankton; the gray region is based on the instantaneously observed abundance values, with the abundance of adjacent water layers estimated by a Savitzky-Golay convolution smoothing algorithm, an improvement of the 5-point moving smoothing algorithm. Line A is the distribution of chlorophyll over the whole profile.
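A minimal sketch of the adjacent-layer smoothing mentioned above, using SciPy's Savitzky-Golay filter; the window length, polynomial order and the synthetic profile values are illustrative assumptions only.

    import numpy as np
    from scipy.signal import savgol_filter

    depth_m = np.arange(0.0, 60.0, 2.0)                            # depth bins of the profile
    abundance = np.random.poisson(5, depth_m.size).astype(float)   # observed counts per bin
    smoothed = savgol_filter(abundance, window_length=7, polyorder=2)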
As shown in fig. 6, the environmental factor parameter analysis gives the hydrologic environmental factor distribution. Line C is the chlorophyll distribution over the whole profile, line B the temperature distribution, and line D the salinity distribution.
5. Machine learning and deep learning: a preset number of feature points are inserted into the result image from step 3; by comparing the feature points, the parts matching them are integrated and summarized, finally yielding the image features used for deep learning. An image expert database is established and continually supplemented; the extracted feature information is analysed, and features consistent with a human expert's subjective understanding of plankton are extracted, so that the identification result agrees with actual visual judgment. In a data-driven manner, a series of nonlinear transformations is used to extract multi-level, multi-angle features from the raw data, and structural elements with different size, shape and direction characteristics are selected according to the morphological features of the different targets, so that the obtained features have stronger generalization and expression capability, meeting the requirements of efficient image processing.
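As a hedged illustration of the "fast linear classifier with adaptive feature selection", the sketch below couples univariate feature selection with a linear SVM in scikit-learn; the patent does not name these particular tools, so this is only one plausible realisation.

    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import LinearSVC

    def train_plankton_classifier(X, y, k_features=8):
        """X: feature vectors from the image expert library; y: plankton taxon labels."""
        clf = make_pipeline(StandardScaler(),
                            SelectKBest(f_classif, k=k_features),
                            LinearSVC())
        return clf.fit(X, y)

    # Field images are then classified with clf.predict(); images flagged as
    # misclassified during manual review are removed, as in step 6 below.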
Fig. 7 shows the optical image data of an individual marine plankter obtained through the machine learning and deep learning processing: a species (a copepod) collected in situ by optical imaging.
6. Species identification: machine classification is combined with manual identification to determine the biological species.
Machine computation allows the biological targets to be identified and classified quickly, effectively and in real time. Target classification is performed by the fast linear classifier obtained through adaptive feature selection, and target consistency matching is carried out against the established image expert database with an automatic statistical algorithm, so that statistics and distribution information of the biological targets are acquired effectively and fed back to the observers in real time. The observers then evaluate the machine-classified images by manual identification, check whether the classification is correct, and delete the erroneous images.
7. Data quality control and analysis: estimating the abundance of representative species and the factors influencing their distribution.
For the classification information of the observed representative species, the water-depth position information marked earlier is combined with the effective sampling volume obtained during photographing to calculate the abundance of the population; at the same time, the association with environmental factors is established in order to analyse the corresponding scientific questions, such as the main environmental factors affecting the species' distribution.
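The sketch below shows, under assumed values, how population abundance per depth layer could be derived from the classified counts and the effective sampled volume per frame as just described; the layer width, frame counts and per-frame volume are placeholders, not quantities from the patent.

    import numpy as np

    def layer_abundance(individual_depths_m, frames_per_layer, volume_per_frame_l,
                        layer_width_m=2.0):
        """Return depth-layer tops and abundance (individuals per litre) per layer."""
        edges = np.arange(0.0, individual_depths_m.max() + layer_width_m, layer_width_m)
        counts, _ = np.histogram(individual_depths_m, bins=edges)
        sampled_volume_l = frames_per_layer * volume_per_frame_l   # effective volume per layer
        return edges[:-1], counts / sampled_volume_l

    depths = np.array([3.2, 3.8, 10.1, 10.4, 10.9])   # depths of identified individuals (m)
    frames = np.full(6, 120)                           # frames captured in each 2 m layer
    layer_tops, abundance = layer_abundance(depths, frames, volume_per_frame_l=0.5)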

Claims (5)

1. An optical imaging device for marine plankton, characterized in that: the device comprises a main frame body (3), a flash lamp assembly (9), a camera assembly and a main control unit (4), wherein the flash lamp assembly (9) and the camera assembly are arranged on the main frame body (3); the flash lamp assembly (9) comprises a rotary driving device (19), a rotary impeller (20) and flash lamps, the rotary impeller (20) is driven to rotate by the rotary driving device (19), and a plurality of flash lamps are evenly distributed on the rotary impeller (20) along the circumferential direction; the camera assembly comprises a linear driving device (21), an optical imager (22) and a camera lens base (10), a lens (23) is arranged on the side of the camera lens base (10) facing the flash lamp assembly (9), the optical imager (22) is arranged in the camera lens base (10), and the optical imager (22) is driven to move by the linear driving device (21) so that a suitable camera lens is aligned with the lens (23); the rotary driving device (19), the flash lamps, the linear driving device (21) and the optical imager (22) are all controlled by the main control unit (4);
A flash lamp unit (8) and a camera driving unit (11) are arranged in the main frame body (3), the flash lamp unit (8) and the camera driving unit (11) are arranged in parallel, the rear ends of the flash lamp unit and the camera driving unit (11) are connected with the front end of the main control unit (4), the flash lamp assembly (9) is arranged at the front end of the flash lamp unit (8), the camera lens base (10) is arranged at the front end of the camera driving unit (11), and the linear driving device (21) is arranged in the camera driving unit (11);
the main control unit (4) comprises a monitoring controller (16) and an embedded PC module (17), the linear driving device (21) and the driving power supply (18) are controlled by the embedded PC module (17), and the optical imager (22) is monitored by the monitoring controller (16); the main control unit (4) is connected with a data comprehensive processing and analyzing unit (14) through a connecting cable (12); the main control unit (4) is provided with a battery assembly (5) and a data storage unit (15);
A chlorophyll turbidity sensor (2) and a temperature and salt depth sensor (6) are arranged in the main frame body (3); an upper flow guide wing plate (1) is arranged on the upper side of the main frame body (3), and a lower flow guide wing plate (7) is arranged on the lower side of the main frame body; the main frame body (3) is in a shuttle shape with a small front part and a big rear part; the main frame body (3) is provided with a reserved mounting hole.
2. The marine plankton optical imaging device of claim 1, wherein: a driving power supply (18) for supplying power to the rotation driving device (19) and the flash lamp is arranged in the flash lamp unit (8), and the driving power supply (18) is controlled to be powered on and powered off through the main control unit (4).
3. The ocean plankton optical imaging processing method is characterized by comprising the following steps of:
step 1, a main control unit (4) receives an instruction of a data comprehensive processing and analyzing unit (14);
Step 2, controlling a flash lamp assembly (9) to irradiate marine organisms: the main control unit (4) outputs signals to control the rotary driving device (19) to enable the rotary impeller (20) to rotate, so that a plurality of flash lamps uniformly distributed on the rotary impeller (20) along the circumferential direction irradiate marine organisms;
Step 3, controlling a camera component to acquire an original image of marine plankton: the main control unit (4) outputs a signal to control the linear driving device (21) to drive the optical imager (22) to switch positions among the lenses (23) so as to perform focusing photographing on marine organisms;
Step 4, controlling a chlorophyll turbidity sensor (2) and a temperature and salt depth sensor (6) to acquire and store ocean data;
Step 5, obtaining and classifying the image types of the marine organism population through the program steps of image processing, and analyzing to obtain the distribution of the marine organism population; the image processing program steps include:
a. Loading an original image of the collected marine plankton;
b. Preprocessing an original image, detecting a focusing object, calculating a segmentation threshold value, and carrying out gradient analysis to preliminarily obtain the outline of marine plankton;
c. Feature vector extraction: constructing an optimal hyperplane in the sample space; calculating the separation distances between different sample sets and the hyperplane; calculating an average value matrix and a distance matrix of the separation distances and normalizing them; and calculating the contrast, correlation and variance of the normalized matrix as feature vectors;
d. multi-environmental factor parameter analysis: analyzing profile biological abundance distribution and hydrologic environmental factor distribution of an observation position according to a temperature value, a conductivity value, a pressure value, an optical chlorophyll concentration, a turbidity value and a longitude and latitude value acquired by each sensor;
e. Machine learning and deep learning: inserting a preset number of feature points into the image after feature vector extraction, screening and comparing them to obtain a plankton-type feature image, and storing it in an image expert database for deep learning; extracting multi-level, multi-angle features from the images stored in the image expert library by nonlinear transformation, selecting structural elements with different size, shape and direction characteristics according to the morphological characteristics of different target types to learn the plankton population types, and obtaining a fast linear classifier with adaptive feature selection;
f. processing the images acquired in situ with the fast linear classifier obtained by adaptive feature selection, classifying the images according to the discriminated population types, and removing misclassified images in combination with manual identification.
4. A method of ocean plankton optical imaging processing according to claim 3, wherein: and analyzing the distribution of the population according to the abundance data and the environmental factor parameters of the population.
5. A method of ocean plankton optical imaging processing according to claim 3, wherein: the preprocessing and the focusing object detection comprise gray-scale correction, image segmentation and marking; the segmentation threshold calculation comprises performing binarization according to a set threshold, calculating the gray-level differences of adjacent pixels for the region of interest (ROI), and setting the Sobel parameters.
CN201910370129.4A 2019-05-06 2019-05-06 Ocean plankton optical imaging device and imaging processing method Active CN110057824B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910370129.4A CN110057824B (en) 2019-05-06 2019-05-06 Ocean plankton optical imaging device and imaging processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910370129.4A CN110057824B (en) 2019-05-06 2019-05-06 Ocean plankton optical imaging device and imaging processing method

Publications (2)

Publication Number Publication Date
CN110057824A CN110057824A (en) 2019-07-26
CN110057824B true CN110057824B (en) 2024-05-14

Family

ID=67322284

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910370129.4A Active CN110057824B (en) 2019-05-06 2019-05-06 Ocean plankton optical imaging device and imaging processing method

Country Status (1)

Country Link
CN (1) CN110057824B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114299332A (en) * 2021-12-22 2022-04-08 苏州热工研究院有限公司 Cold source marine organism intelligent detection method and system for nuclear power plant
CN114298940A (en) * 2021-12-30 2022-04-08 声耕智能科技(西安)研究院有限公司 Hyperspectral image shadow compensation method based on nonlinear pixel analysis
CN118794953B (en) * 2024-09-14 2025-01-21 浙江清盛检测技术有限公司 A plankton multi-viewing angle in-situ detection device and detection method

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06186607A (en) * 1992-12-18 1994-07-08 Gokou Eizo Kagaku Kenkyusho:Kk Camera with zoom lens
KR0131680B1 (en) * 1986-05-12 1998-04-15 마쓰모도 도오루 Finder optical system of the camera
JP2005091061A (en) * 2003-09-16 2005-04-07 Shin Nippon Air Technol Co Ltd Handholding particle visualizing apparatus and assembling method therefor
JP2005221270A (en) * 2004-02-03 2005-08-18 Tsunehiro Yoshida Screw inspection apparatus
JP2005249824A (en) * 2004-03-01 2005-09-15 Casio Comput Co Ltd Imaging device
JP2008009187A (en) * 2006-06-29 2008-01-17 Olympus Corp Automatic focusing device
CN101592619A (en) * 2009-07-15 2009-12-02 长沙楚天科技有限公司 Adopt the automatic lamp inspector picture synchronization tracking and collecting device of end light source
KR20100060503A (en) * 2008-11-27 2010-06-07 삼성전기주식회사 Method for controlling an angle of light radiated by flash and camera having function of adjusting an angle of light radiated by flash
JP2015012375A (en) * 2013-06-27 2015-01-19 貞彦 寺川 Multi-angle image photographing system, rotating table device, photographing device, and multi-angle image photographing method
CN104872083A (en) * 2015-05-29 2015-09-02 中国科学院海洋研究所 Visual multi-connected sampling system of plankton and use method thereof
CN204836320U (en) * 2015-06-29 2015-12-02 吴少波 Can remote control and multi -angle rotating's multi -functional camera
JP2016161597A (en) * 2015-02-26 2016-09-05 株式会社ソシオネクスト Image processing device, integrated circuit, and camera
JP2016224335A (en) * 2015-06-02 2016-12-28 カシオ計算機株式会社 Light source device and projection device
CN206117825U (en) * 2016-09-21 2017-04-19 宿迁琛博信息科技有限公司 But multi -angle rotating has camera of flash light
CN107271371A (en) * 2017-07-28 2017-10-20 中国科学院海洋研究所 A kind of planktonic organism polarization imager
CN209841715U (en) * 2019-05-06 2019-12-24 中国科学院海洋研究所 Optical imaging device for marine plankton

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001209097A (en) * 2000-01-28 2001-08-03 Masashi Saito Camera system for intra-oral photography
CN102893137B (en) * 2010-03-17 2017-01-11 曾海山 Fast multispectral imaging method and device and application for cancer detection and localization
US9019503B2 (en) * 2010-04-19 2015-04-28 The United States Of America, As Represented By The Secretary Of The Navy MEMS microdisplay optical imaging and sensor systems for underwater and other scattering environments
JP5722134B2 (en) * 2011-06-23 2015-05-20 オリンパスイメージング株式会社 Optical equipment
NO335224B1 (en) * 2013-01-28 2014-10-20 Sinvent As Zooplankton counting system and method
WO2016048851A1 (en) * 2014-09-22 2016-03-31 Gallager Scott M Continuous particle imaging and classification system

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR0131680B1 (en) * 1986-05-12 1998-04-15 마쓰모도 도오루 Finder optical system of the camera
JPH06186607A (en) * 1992-12-18 1994-07-08 Gokou Eizo Kagaku Kenkyusho:Kk Camera with zoom lens
JP2005091061A (en) * 2003-09-16 2005-04-07 Shin Nippon Air Technol Co Ltd Handholding particle visualizing apparatus and assembling method therefor
JP2005221270A (en) * 2004-02-03 2005-08-18 Tsunehiro Yoshida Screw inspection apparatus
JP2005249824A (en) * 2004-03-01 2005-09-15 Casio Comput Co Ltd Imaging device
JP2008009187A (en) * 2006-06-29 2008-01-17 Olympus Corp Automatic focusing device
KR20100060503A (en) * 2008-11-27 2010-06-07 삼성전기주식회사 Method for controlling an angle of light radiated by flash and camera having function of adjusting an angle of light radiated by flash
CN101592619A (en) * 2009-07-15 2009-12-02 长沙楚天科技有限公司 Adopt the automatic lamp inspector picture synchronization tracking and collecting device of end light source
JP2015012375A (en) * 2013-06-27 2015-01-19 貞彦 寺川 Multi-angle image photographing system, rotating table device, photographing device, and multi-angle image photographing method
JP2016161597A (en) * 2015-02-26 2016-09-05 株式会社ソシオネクスト Image processing device, integrated circuit, and camera
CN104872083A (en) * 2015-05-29 2015-09-02 中国科学院海洋研究所 Visual multi-connected sampling system of plankton and use method thereof
JP2016224335A (en) * 2015-06-02 2016-12-28 カシオ計算機株式会社 Light source device and projection device
CN204836320U (en) * 2015-06-29 2015-12-02 吴少波 Can remote control and multi -angle rotating's multi -functional camera
CN206117825U (en) * 2016-09-21 2017-04-19 宿迁琛博信息科技有限公司 But multi -angle rotating has camera of flash light
CN107271371A (en) * 2017-07-28 2017-10-20 中国科学院海洋研究所 A kind of planktonic organism polarization imager
CN209841715U (en) * 2019-05-06 2019-12-24 中国科学院海洋研究所 Optical imaging device for marine plankton

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
Generation of tailored pulse trains for efficient material processing by a high power MOPA system with birefringence compensation; Riesbeck, T. et al.; LASER PHYSICS LETTERS; 2008-01-01; Vol. 5, No. 3; pp. 240-245 *
Analysis of summer zooplankton community structure in the southern Yellow Sea based on ZooScan imaging; 代鲁平 et al.; Oceanologia et Limnologia Sinica; 2016-07-15; Vol. 47, No. 4; pp. 764-773 *
Molecular-marker-based study of population dynamics and maintenance of Vallisneria spinulosa in lakes of the middle and lower Yangtze River; 陈磊; China Doctoral Dissertations Full-text Database, Basic Sciences; 2007-11-15; No. 5; p. A006-44 *
Rapidly swept light sources and their applications in optical frequency-domain imaging; 丁志华 et al.; Chinese Journal of Lasers; Vol. 36, No. 10; pp. 2469-2476 *
Laser three-dimensional imaging technology and its main applications; 王昊鹏 et al.; Electronic Design Engineering; 2012-06-30; Vol. 20, No. 12; pp. 160-163 *
Application of ring flash lamps in scene investigation; 王惠斌 et al.; Digital and Micrographic Imaging; 2007-09-15; No. 3; pp. 44-46 *
董仲华. Fundamentals of Photographic Technology. Beijing Broadcasting Institute Press, 1990 (1st ed.), pp. 1-10. *

Also Published As

Publication number Publication date
CN110057824A (en) 2019-07-26

Similar Documents

Publication Publication Date Title
CN111462076B (en) Full-slice digital pathological image fuzzy region detection method and system
Qiao et al. An automatic active contour method for sea cucumber segmentation in natural underwater environments
CN110057824B (en) Ocean plankton optical imaging device and imaging processing method
CN111046880A (en) Infrared target image segmentation method and system, electronic device and storage medium
CN113781455B (en) Cervical cell image anomaly detection method, device, equipment and medium
Campbell et al. The Prince William Sound Plankton Camera: a profiling in situ observatory of plankton and particulates
KR100889997B1 (en) Ship Ballast Water Inspection System Using Image Processing and Its Method
CN115690385A (en) Water quality prediction method, system, equipment and medium based on multispectral image
CN109886170A (en) An intelligent detection, recognition and statistics system for snails
CN112883969B (en) Rainfall intensity detection method based on convolutional neural network
Babalola et al. Soil surface texture classification using RGB images acquired under uncontrolled field conditions
Wang et al. Vision-based in situ monitoring of plankton size spectra via a convolutional neural network
CN114743257A (en) Method for detecting and identifying image target behaviors
CN118351100A (en) Image definition detection and processing method based on deep learning and gradient analysis
Marcos et al. Automated benthic counting of living and non-living components in Ngedarrak Reef, Palau via subsurface underwater video
CN107330472A (en) A kind of automatic identifying method of unmarked model animal individual
CN106991441A (en) Merge the plant specimen sorting technique and system of multiple dimensioned direction textural characteristics
Nazeran et al. Biomedical image processing in pathology: a review
Akiba et al. Design and testing of an underwater microscope and image processing system for the study of zooplankton distribution
CN209841715U (en) Optical imaging device for marine plankton
CN113344987A (en) Infrared and visible light image registration method and system for power equipment under complex background
Shishkin et al. Analysis of image clusterization methods for oceanographical equipment
CN114373118B (en) Underwater target detection method based on improved YOLOV4
CN117036992A (en) Method for obtaining water ecological health condition by image processing of unmanned aerial vehicle acquired image
Geraldes et al. In situ real-time zooplankton detection and classification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant