Disclosure of Invention
The invention provides a method and a system for fast panoramic stitching of microscopic images, which overcome defects of the prior art such as stitching failure caused by too few feature points.
In order to achieve the above object, the present invention provides a method for fast panoramic stitching of microscopic images, comprising:
controlling a microscope objective to acquire a local microscopic image of the pathological section according to a pre-planned acquisition path;
performing optical flow estimation on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position by using a pre-trained end-to-end network to obtain an optical flow feature map;
converting the optical flow feature map by using a pre-trained spatial transformer network to obtain an initial matrix, performing a matrix multiplication of the initial matrix with the transformation matrix obtained at the previous position to obtain the transformation matrix at the current position, and aligning the two local microscopic images according to the transformation matrix at the current position to obtain a first aligned microscopic image;
performing linear fusion on the overlapping region of the first aligned microscopic image in a fade-in/fade-out linear fusion manner to obtain a first fused microscopic image;
controlling the microscope objective or microscope stage to move to the next position according to the pre-planned acquisition path and acquiring a local microscopic image, then performing optical flow estimation and alignment on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position to obtain a second aligned microscopic image, and performing linear fusion on the second aligned microscopic image and the first fused microscopic image to obtain a second fused microscopic image;
repeating the processes of optical flow estimation, alignment and linear fusion until all the local microscopic images are fused, obtaining a panoramic fused microscopic image;
and performing panoramic rendering on the panoramic fused microscopic image by using bilateral filtering to obtain a panoramic stitched microscopic image.
In order to achieve the above object, the present invention further provides a system for fast panoramic stitching of microscopic images, comprising:
an image acquisition module, used for controlling the microscope objective to acquire local microscopic images of the pathological section according to a pre-planned acquisition path;
an optical flow estimation module, used for performing optical flow estimation on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position by using a pre-trained end-to-end network to obtain an optical flow feature map;
an image alignment module, used for converting the optical flow feature map by using a pre-trained spatial transformer network to obtain an initial matrix, performing a matrix multiplication of the initial matrix with the transformation matrix obtained at the previous position to obtain the transformation matrix at the current position, and aligning the two local microscopic images according to the transformation matrix at the current position to obtain a first aligned microscopic image;
a fusion module, used for performing linear fusion on the overlapping region of the first aligned microscopic image in a fade-in/fade-out linear fusion manner to obtain a first fused microscopic image;
a circulation module, used for controlling the microscope objective or microscope stage to move to the next position according to the pre-planned acquisition path and acquiring a local microscopic image, then performing optical flow estimation and alignment on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position to obtain a second aligned microscopic image, and performing linear fusion on the second aligned microscopic image and the first fused microscopic image to obtain a second fused microscopic image; the optical flow estimation, alignment and linear fusion are repeated until all the local microscopic images are fused to obtain a panoramic fused microscopic image;
and a rendering module, used for performing panoramic rendering on the panoramic fused microscopic image by using bilateral filtering to obtain the panoramic stitched microscopic image.
To achieve the above object, the present invention further provides a computer device, which includes a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the method when executing the computer program.
Compared with the prior art, the invention has the beneficial effects that:
The method for fast panoramic stitching of microscopic images provided by the invention uses an end-to-end network to realize coarse-to-fine optical flow estimation; the computation is fast and the estimated optical flow is accurate, fully meeting the requirements of image acquisition and computation. Combined with a spatial transformer network, a transformation matrix can be predicted directly from the optical flow feature map, and the local microscopic images are spatially transformed according to this matrix so that the overlapping areas of adjacent local microscopic images are accurately aligned. Linear fusion in a fade-in/fade-out manner is then applied to the overlapping area of the aligned microscopic images, effectively eliminating seams and ghosting in the fused image and achieving seamless stitching. Finally, bilateral filtering is used to render the panoramic fused microscopic image, removing the fusion seams left by stitching and generating a seamless stitched image. Compared with the prior art, the panoramic stitching method provided by the invention stitches quickly and accurately, and meets the requirements of image acquisition and stitching.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In addition, the technical solutions in the embodiments of the present invention may be combined with each other, provided that the combination can be realized by a person skilled in the art; when the combined technical solutions contradict each other or cannot be realized, the combination should be considered not to exist, and it falls outside the protection scope of the present invention.
The invention provides a method for fast panoramic stitching of microscopic images, comprising the following steps:
101: controlling a microscope objective or a microscope stage to acquire local microscopic images of the pathological section according to a pre-planned acquisition path;
the microscope objective acquires one local microscopic image at each dwell position on the acquisition path.
The acquisition path may be a top-to-bottom serpentine path, a left-to-right serpentine path, or any other path, as long as all the cells in the pathological section are captured.
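As an illustration of such path planning, the following sketch (not from the patent text; the grid size, field-of-view dimensions and 25% overlap are assumed example parameters) generates stage dwell positions for a left-to-right serpentine path:

```python
# A minimal sketch of acquisition-path planning: stage dwell positions for a
# left-to-right serpentine path, reversing direction on every row.
def serpentine_path(rows, cols, fov_w, fov_h, overlap=0.25):
    """Yield (x, y) stage positions row by row, reversing direction each row."""
    step_x = fov_w * (1.0 - overlap)   # each step leaves `overlap` shared area
    step_y = fov_h * (1.0 - overlap)
    for r in range(rows):
        col_order = range(cols) if r % 2 == 0 else reversed(range(cols))
        for c in col_order:
            yield (c * step_x, r * step_y)

# Example: a 4x6 grid of 2048x2048-pixel fields of view.
positions = list(serpentine_path(rows=4, cols=6, fov_w=2048, fov_h=2048))
```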
102: performing optical flow estimation on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position by using a pre-trained end-to-end network to obtain an optical flow feature map;
An end-to-end network refers to the use of a deep neural network to replace a multi-stage process. For example, conventional optical flow estimation typically comprises three stages: feature extraction, feature matching, and optical flow computation. With an end-to-end network, the output optical flow can be predicted directly from the two input images.
Optical flow estimation means estimating, given two frames, the motion vector of each corresponding point between the previous frame and the next frame; these motion vectors are the instantaneous velocities of the pixel motion of a moving object projected onto the imaging plane.
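To make the contrast with the multi-stage pipeline concrete, here is a minimal sketch (assuming PyTorch; `flow_net` stands for any pre-trained end-to-end model and is a placeholder, not an API from the patent):

```python
import torch

# One forward pass maps the image pair directly to a dense flow field;
# there is no separate feature-extraction or matching stage to orchestrate.
def estimate_flow(flow_net, img_prev, img_curr):
    # img_prev, img_curr: (N, 3, H, W) float tensors.
    x = torch.cat([img_prev, img_curr], dim=1)   # (N, 6, H, W) stacked pair
    return flow_net(x)                           # (N, 2, H, W): per-pixel (dx, dy)
```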
103: converting the optical flow characteristic diagram by using a pre-trained space transformation network to obtain an initial matrix, performing matrix multiplication (matrix multiplication) operation on the initial matrix and a transformation matrix obtained at the previous position to obtain a transformation matrix at the current position, and aligning two local microscopic images according to the transformation matrix at the current position to obtain a first aligned microscopic image;
spatial Transformer Networks (STNs) are a convolutional neural network architecture model proposed by Jaderberg et al, and the classification accuracy of the convolutional network model is improved by transforming the input pictures and reducing the influence of the Spatial diversity of data, rather than by changing the network structure. The STNs have good robustness and spatial invariance such as translation, expansion, rotation, disturbance, bending and the like.
The transformation matrix includes parameters of translation, scaling, rotation, etc.
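The chaining of transformations along the acquisition path can be sketched as follows (an illustration, not the patent's exact implementation; 3x3 homogeneous matrices are assumed, so translation, scaling and rotation all compose through one multiplication):

```python
import numpy as np

# Compose the transform predicted for the current pair with the accumulated
# transform of the previous position, so every local image is mapped into a
# single common panorama frame. The multiplication order depends on the
# chosen coordinate convention.
def accumulate_transform(initial_matrix, prev_transform):
    return initial_matrix @ prev_transform       # both 3x3 ndarrays

T_prev = np.eye(3)                               # identity at the first position
T_init = np.array([[1.0, 0.0, 120.0],            # e.g. a 120 px horizontal shift
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])
T_curr = accumulate_transform(T_init, T_prev)
```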
104: performing linear fusion on the overlapping region of the first aligned microscopic image in a fade-in/fade-out linear fusion manner to obtain a first fused microscopic image;
In fade-in/fade-out linear fusion, the pixels of the two images in the overlapping region are assigned linearly varying weights: as the weight of one image rises from 0 to 1 (and the other falls from 1 to 0), the fused pixel values transition smoothly from the left side of the overlapping region to the right side.
105: controlling the microscope objective or microscope stage to move to the next position according to the pre-planned acquisition path and acquiring a local microscopic image I2; then performing the optical flow estimation of step 102 and the alignment of step 103 on the local microscopic image I2 acquired at the current position and the local microscopic image I1 acquired at the previous position to obtain a second aligned microscopic image; and performing the linear fusion of step 104 on the second aligned microscopic image and the first fused microscopic image to obtain a second fused microscopic image;
After the local microscopic image I2 acquired at the current position is fused with the local microscopic image I1 acquired at the previous position, the fused microscopic image is stored; I2 itself is also kept, and is released only after the local microscopic image acquired at the next position has been fused with it.
106: repeating the optical flow estimation, alignment and linear fusion of step 105 until all the local microscopic images are fused, obtaining a panoramic fused microscopic image;
The optical flow estimation, alignment and linear fusion are carried out while the local microscopic images are being acquired, so the panoramic fused microscopic image is obtained as soon as acquisition of the local microscopic images is complete.
107: performing panoramic rendering on the panoramic fused microscopic image by using bilateral filtering to obtain the panoramic stitched microscopic image.
Bilateral filtering is a non-linear filter whose basic idea is to represent the intensity of a pixel by a weighted average of the intensities of the surrounding pixels, the weights following a Gaussian distribution. The weights of bilateral filtering combine the Euclidean distance between pixels and the radiometric difference in the pixel's intensity domain, and both are considered simultaneously when computing the center pixel (ref: Tomasi C., Manduchi R., "Bilateral filtering for gray and color images", ICCV, 1998: 839-846).
The panoramic rendering smooths all the overlapping regions in the panoramic fused microscopic image.
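A minimal rendering sketch using OpenCV's bilateral filter follows; the file names and the diameter/sigma settings are assumed for illustration only, not values given in the patent:

```python
import cv2

# Bilateral filtering smooths the fused panorama while preserving edges:
# each output pixel is a weighted average of its neighbors, with weights
# combining spatial distance and intensity difference.
panorama = cv2.imread("panorama_fused.png")
rendered = cv2.bilateralFilter(panorama, d=9, sigmaColor=75, sigmaSpace=75)
cv2.imwrite("panorama_stitched.png", rendered)
```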
The method for fast panoramic stitching of microscopic images provided by the invention uses an end-to-end network to realize coarse-to-fine optical flow estimation; the computation is fast and the estimated optical flow is accurate, fully meeting the requirements of image acquisition and computation. Combined with a spatial transformer network, a transformation matrix can be predicted directly from the optical flow feature map, and the local microscopic images are spatially transformed according to this matrix so that the overlapping areas of adjacent local microscopic images are accurately aligned. Linear fusion in a fade-in/fade-out manner is then applied to the overlapping area of the aligned microscopic images, effectively eliminating seams and ghosting in the fused image and achieving seamless stitching. Finally, bilateral filtering is used to render the panoramic fused microscopic image, removing the fusion seams left by stitching and generating a seamless stitched image. Compared with the prior art, the panoramic stitching method provided by the invention stitches quickly and accurately, and meets the requirements of image acquisition and stitching.
In one embodiment, before performing step 101, the method further includes the steps of:
001: pre-scanning pathological sections to obtain blank areas and target areas of the pathological sections;
the target region is a region with cells.
002: and planning the acquisition path in the target area.
The purpose of the pre-scanning is to exclude the blank areas of the pathological section, so that the workload is reduced and the working efficiency is improved.
In the next embodiment, for step 101, a high-magnification digital microscope is employed to acquire individual local microscopic images in different fields of view of the pathological section. When acquiring the local microscopic images in different fields of view, it is necessary to ensure that adjacent local microscopic images share an overlapping area, and acquisition continues until the combined area of the individual local microscopic images covers the cell or tissue sample area of the original pathological section.
The microscope objective has a magnification of 20X, 40X or 100X.
The path of this embodiment is a left-to-right serpentine path.
The microscope objective or microscope stage moves along the left-to-right serpentine path to acquire local microscopic images of the pathological section as individual images in different fields of view that overlap one another.
In existing pathological section scanners, both the scanning/stitching speed and the quality of the generated large-size microscopic image are key factors, and a balance must be struck between them to obtain the best stitched panorama at the fastest speed. The overlapping area for each movement of the microscope objective or microscope stage is therefore set to 20-30% of the area of a single local microscopic image.
In another embodiment, for step 102, the step of performing optical flow estimation on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position by using a pre-trained end-to-end network (the structure of the end-to-end network of this embodiment is shown in fig. 2) to obtain an optical flow feature map comprises:
201: local microscopic image I2 (Im) acquired at current position by using pre-trained FlowNet1 network (optical flow estimation network)age2) and a local microscopic Image I1(Image1) acquired at the previous position to carry out rough optical flow estimation to obtain a rough optical flow characteristic diagram F1(Flow1) from the rough optical Flow feature F1Carrying out position transformation on the local microscopic image I2 to obtain a first transformation image W1(Warped1), calculating a local microscopic image I1 and a first transformed image W1Obtaining a first brightness error BE from the brightness difference1(Brightness Error1);
Using a pre-trained FlowNet2 network to perform local microscopic image I1, local microscopic image I2 and rough optical flow characteristic diagram F1The first converted image W1And brightness error BE1Performing fine optical flow estimation to obtain fine optical flow characteristic diagram F2(Flow2) from the fine optical Flow feature map F2The position of the local microscopic image I2 is transformed to obtain a second transformed image W2(Warped2), calculating a local microscopic image I1 and a second transformed image W2Obtaining a second brightness error BE from the brightness difference2(Brightness Error2);
Using a pre-trained FlowNet2 network to perform local microscopic image I1, local microscopic image I2 and fine optical flow characteristic diagram F2The second converted image W2And a second luminance error BE2And performing fine optical flow estimation to obtain an optical flow feature map F (Featuremap).
Considering the inaccuracy of optical flow estimation, the invention uses the optical flow characteristic diagram to transform the first transformed image W obtained after the local microscopic image I2 is transformed1And a second transformed image W2. First transformed image W1And a second transformed image W2Since there is still a certain deviation from the local microscopic image I1, it is necessary to perform the local microscopic image I1 and the first transformed image W1The second converted image W2Is subtracted to obtain a first brightness error BE1And a second luminance error BE2And the method is used for subsequent accurate optical flow estimation.
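The three-stage cascade of steps 201-203 can be sketched as follows (assuming PyTorch; `flownet1`, `flownet2` and `warp` are placeholders for the pre-trained networks and for the flow-based warping function shown later in this text, not APIs from the patent):

```python
import torch

# Coarse-to-fine optical flow estimation: each stage warps I2 by the current
# flow, measures the remaining brightness error against I1, and feeds both
# back into the next, finer estimation stage.
def coarse_to_fine_flow(flownet1, flownet2, warp, img1, img2):
    # Step 201: coarse flow F1 from the 6-channel stacked pair, then W1, BE1.
    flow1 = flownet1(torch.cat([img1, img2], dim=1))
    warped1 = warp(img2, flow1)
    be1 = img1 - warped1                       # first brightness error BE1
    # Step 202: fine flow F2 from (I1, I2, F1, W1, BE1), then W2, BE2.
    flow2 = flownet2(img1, img2, flow1, warped1, be1)
    warped2 = warp(img2, flow2)
    be2 = img1 - warped2                       # second brightness error BE2
    # Step 203: final optical flow feature map F from (I1, I2, F2, W2, BE2).
    return flownet2(img1, img2, flow2, warped2, be2)
```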
In a certain embodiment, the FlowNet1 network comprises, in order, 9 convolutional layers and 1 refinement module (an adjustment module, which may also be called a decoding module);
the 9 convolutional layers perform high-level feature extraction on the pre-stacked local microscopic images to obtain a feature map;
and the refinement module performs upsampling using deconvolution layers to obtain the coarse optical flow feature map.
The network structure of the redefinition module is shown in fig. 4. The network structure comprises 4 deconvolution layers (deconv4, deconv3, deconv2 and deconv1), wherein the input of the deconv4 deconvolution layer is a feature map output by a conv6 convolution layer, the input of the last three deconvolution layers comprises two parts, the first part is a deconvolution output of the previous layer, and the second part is a feature map output of convolution layers (conv5_1, conv4_1 and conv3_1) in an end-to-end network, so that information of the upper layer and the bottom layer is fused, and a mechanism from coarse to fine is also introduced. The input of the FlowNet1 network is two 3-channel images, before the FlowNet1 network is input, the two 3-channel images are stacked to become 6-channel tensors, then the 6-channel tensors are input into the FlowNet1 network, the high-level feature extraction is carried out sequentially through 9 convolutional layers, finally the feature diagram in the previous convolutional layer is continuously up-sampled through a redefinement module, and a 2-channel rough optical flow feature diagram with the same size as the input original diagram is output.
In another embodiment, the FlowNet2 network comprises, in order, 9 convolutional layers, 1 correlation layer, and 1 refinement module;
the first 3 convolutional layers perform feature extraction on each input local microscopic image to obtain first feature maps;
the correlation layer performs a correlation operation on the feature maps of the different local microscopic images, merging their features into a feature fusion map;
the last 6 convolutional layers perform high-level feature extraction on the feature fusion map to obtain a second feature map;
and the refinement module performs upsampling using deconvolution layers to obtain the fine optical flow feature map.
The input of the FlowNet2 network is two 3-channel images. Each image passes through the first 3 convolutional layers for feature extraction, yielding one first feature map per image; the two first feature maps then enter the correlation layer, where a correlation operation merges their features into a feature fusion map. The feature fusion map passes sequentially through the next 6 convolutional layers for higher-level feature extraction to obtain a second feature map, and finally the refinement module repeatedly upsamples the feature maps from the preceding convolutional layers and outputs a 2-channel fine optical flow feature map of the same size as the original images.
In the next embodiment, the formula of the correlation operation is:

c(x1, x2) = Σ_{o ∈ [-k, k] × [-k, k]} <f1(x1 + o), f2(x2 + o)>    (1)

where c(x1, x2) is the correlation value of the two feature maps; x1 and x2 are corresponding points on the two feature maps; f1 and f2 are the two feature maps; k is the boundary value of the region being compared; o is the position offset of any point within the region; and < > denotes the correlation operation, a multiplication of the pixels at corresponding positions. The larger the correlation value, the more similar the feature maps.
The correlation operation compares the correlation between the two input feature maps.
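A naive sketch of formula (1) follows (a loop implementation for clarity; production FlowNet-style layers use an optimized kernel, and the window radius k here is an assumed parameter):

```python
import torch
import torch.nn.functional as F

# For every offset o in a (2k+1) x (2k+1) window, multiply the two feature
# maps element-wise and sum over channels, producing one correlation value
# per offset at every spatial position.
def correlation(f1, f2, k=3):
    n, c, h, w = f1.shape
    f2_pad = F.pad(f2, (k, k, k, k))           # zero-pad so every shift is valid
    out = []
    for dy in range(2 * k + 1):
        for dx in range(2 * k + 1):
            shifted = f2_pad[:, :, dy:dy + h, dx:dx + w]
            # <f1(x1+o), f2(x2+o)>: product of corresponding pixels, summed
            # over the channel dimension.
            out.append((f1 * shifted).sum(dim=1, keepdim=True))
    return torch.cat(out, dim=1)               # (N, (2k+1)^2, H, W)
```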
In a further embodiment, the step of transforming the position of the local microscopic image I2 according to the coarse optical flow feature map F1 to obtain the first transformed image W1 comprises:
obtaining the offset value of each pixel from the coarse optical flow feature map F1 (the value at each position of the flow map is a two-dimensional vector representing the instantaneous velocity of the pixel's motion), and shifting each corresponding pixel in the local microscopic image I2 by its offset value to obtain the first transformed image W1.
In this embodiment, the step of transforming the position of the local microscopic image I2 according to the fine optical flow feature map F2 to obtain the second transformed image W2 comprises:
obtaining the offset value of each pixel from the fine optical flow feature map F2, and shifting each corresponding pixel in the local microscopic image I2 by its offset value to obtain the second transformed image W2.
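Both transformations can be sketched with the same warping routine (assuming PyTorch and float tensors; bilinear sampling over a flow-displaced grid):

```python
import torch
import torch.nn.functional as F

# Warp an image by a dense flow field: each pixel of `img` is fetched from
# the position displaced by the two-dimensional offset stored in `flow`.
# img: (N, C, H, W) float tensor; flow: (N, 2, H, W) per-pixel (dx, dy).
def warp(img, flow):
    n, _, h, w = img.shape
    ys, xs = torch.meshgrid(torch.arange(h, dtype=img.dtype),
                            torch.arange(w, dtype=img.dtype), indexing="ij")
    # Displace the base pixel grid by the flow, then normalize coordinates
    # to [-1, 1] as grid_sample expects.
    x = 2.0 * (xs.unsqueeze(0) + flow[:, 0]) / (w - 1) - 1.0
    y = 2.0 * (ys.unsqueeze(0) + flow[:, 1]) / (h - 1) - 1.0
    grid = torch.stack((x, y), dim=-1)         # (N, H, W, 2)
    return F.grid_sample(img, grid, mode="bilinear", align_corners=True)
```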
In another embodiment, for step 103, the spatial transformer network comprises a localization network, a grid generator, and a sampler; the workflow of the spatial transformer network is shown in fig. 3.
The spatial transformer network first generates a transformation matrix with 9 parameters through a simple regression network, used for transforming the original image; each point of the target image is then mapped to the original image according to the transformation matrix; and finally the sampler samples pixel values from the original image into the target image. The spatial transformer network requires no key-point calibration and can adaptively apply spatial transformations (translation, scaling, rotation and other such transformations) to align the data.
The step of converting the optical flow feature map by using the pre-trained spatial transformer network to obtain an initial matrix, performing a matrix multiplication of the initial matrix with the transformation matrix obtained at the previous position to obtain the transformation matrix at the current position, and aligning the two local microscopic images according to the transformation matrix at the current position to obtain a first aligned microscopic image comprises:
301: performing a multilayer convolution operation on the optical flow feature map using the pre-trained localization network, followed by a fully connected layer whose regression outputs the initial matrix, and performing a matrix multiplication of the initial matrix with the transformation matrix obtained at the previous position to obtain a transformation matrix with 9 parameters;
The localization network is essentially a simple regression network.
The 9 parameters include translation, scaling, rotation, etc.; the parameters of the resulting transformation matrix differ according to the input optical flow feature map.
302: fixing the local microscopic image I1 acquired at the previous position; recording the local microscopic image I2 acquired at the current position as the original image; setting an empty microscopic image as the microscopic image to be aligned with I1 and recording it as the target image; and computing, with the pre-trained grid generator, the coordinate position in the original image of each coordinate position in the target image according to the transformation matrix, obtaining the mapping relation T(G) of the overlapping area between the target image and the original image;
303: finding, with the pre-trained sampler, the coordinate position in the original image corresponding to every coordinate position of the target image according to the mapping relation T(G), and copying the corresponding pixels of the original image into the target image by bilinear interpolation to obtain the first aligned microscopic image.
Bilinear interpolation is, mathematically, the extension of linear interpolation to an interpolation function of two variables; its core idea is to perform linear interpolation in each of the two directions in turn.
Bilinear interpolation is used because the coordinates mapped from the target image onto the original image are generally fractional, and the pixel value at such non-integer coordinates must be obtained by interpolating between the integer-coordinate pixels of the original image.
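Steps 302-303 can be sketched with PyTorch's grid-generator/sampler primitives. The sketch assumes the 9-parameter matrix is affine, so its top two rows parameterize the sampling grid; a projective matrix would additionally require a perspective divide:

```python
import torch
import torch.nn.functional as F

# Grid generator + bilinear sampler: compute, for every target-image
# coordinate, the corresponding (generally fractional) source coordinate
# under the transformation matrix, then copy pixels by bilinear interpolation.
def align(source_img, transform_3x3):
    n, c, h, w = source_img.shape
    theta = transform_3x3[:, :2, :]            # (N, 2, 3) affine part
    # Grid generator: source coordinates for every target coordinate, T(G).
    grid = F.affine_grid(theta, [n, c, h, w], align_corners=False)
    # Sampler: copy pixels by bilinear interpolation at fractional positions.
    return F.grid_sample(source_img, grid, mode="bilinear",
                         align_corners=False, padding_mode="zeros")
```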
In a next embodiment, for step 104, in order to eliminate fusion seams after stitching and generate a seamless stitched image, linear fusion is performed on the overlapping region of the first aligned microscopic image in the fade-in/fade-out manner to obtain the first fused microscopic image; the formula of the linear fusion is:
I(i,j) = α·I1(i,j) + (1 - α)·I2(i,j), 0 ≤ α ≤ 1    (2)

where I is the fused pixel value; I1 and I2 are the original pixel values of the corresponding overlapping regions of the two local microscopic images; (i, j) are pixel coordinates; and α is the weight coefficient, α = dis/w,
where w is the width of the overlapping region; edge1 is the inner edge of image I1 within the overlapping region; and dis is the distance from the pixel position I1(i, j) to the inner edge edge1.
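A NumPy sketch of formula (2) over a horizontal overlap strip follows (it assumes I1 is the left image, so its weight α falls linearly from 1 at the strip's left edge to 0 at its right edge, I1's inner edge):

```python
import numpy as np

# Fade-in/fade-out linear fusion of the aligned overlap regions `img1` and
# `img2` (same shape): the weight of img1 ramps linearly across the strip.
def fade_blend(img1, img2):
    h, w = img1.shape[:2]
    dis = np.arange(w - 1, -1, -1, dtype=np.float32)  # distance to img1's inner edge
    alpha = (dis / (w - 1))[None, :, None]            # (1, w, 1): broadcast over rows/channels
    fused = alpha * img1.astype(np.float32) + (1.0 - alpha) * img2.astype(np.float32)
    return fused.astype(img1.dtype)
```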
In an embodiment, after step 104, automatic color-level processing is further performed on the panoramic stitched microscopic image to remove the influence of background light and white balance; that is, the black background and impurities are removed with a soft-edged airbrush tool.
Automatic color-level processing automatically maps the brightest and darkest pixels of each channel to white and black respectively, and then redistributes the intermediate pixel values proportionally, which enhances the image contrast and makes the tonal levels distinct. An edge smoothing operation can make the background and edges look more natural.
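A minimal sketch of per-channel automatic levels follows (the 0.5% clipping percentile is an assumed choice; the patent text does not specify one):

```python
import numpy as np

# Per channel, map the darkest and brightest pixels (at the clipping
# percentiles) to black and white, and rescale intermediate values
# proportionally, stretching the contrast.
def auto_levels(img, clip=0.5):
    out = np.empty_like(img)
    for ch in range(img.shape[2]):
        lo, hi = np.percentile(img[:, :, ch], (clip, 100.0 - clip))
        scaled = (img[:, :, ch].astype(np.float32) - lo) / max(hi - lo, 1e-6)
        out[:, :, ch] = np.clip(scaled * 255.0, 0, 255).astype(img.dtype)
    return out
```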
The invention further provides a system for fast panoramic stitching of microscopic images, comprising:
an image acquisition module, used for controlling the microscope objective to acquire local microscopic images of the pathological section according to a pre-planned acquisition path;
an optical flow estimation module, used for performing optical flow estimation on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position by using a pre-trained end-to-end network to obtain an optical flow feature map;
an image alignment module, used for converting the optical flow feature map by using a pre-trained spatial transformer network to obtain an initial matrix, performing a matrix multiplication of the initial matrix with the transformation matrix obtained at the previous position to obtain the transformation matrix at the current position, and aligning the two local microscopic images according to the transformation matrix at the current position to obtain a first aligned microscopic image;
a fusion module, used for performing linear fusion on the overlapping region of the first aligned microscopic image in a fade-in/fade-out linear fusion manner to obtain a first fused microscopic image;
a circulation module, used for controlling the microscope objective or microscope stage to move to the next position according to the pre-planned acquisition path and acquiring a local microscopic image, then performing optical flow estimation and alignment on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position to obtain a second aligned microscopic image, and performing linear fusion on the second aligned microscopic image and the first fused microscopic image to obtain a second fused microscopic image; the optical flow estimation, alignment and linear fusion are repeated until all the local microscopic images are fused to obtain a panoramic fused microscopic image;
and a rendering module, used for performing panoramic rendering on the panoramic fused microscopic image by using bilateral filtering to obtain the panoramic stitched microscopic image.
The invention also relates to a computer device comprising a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the method when executing the computer program.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention, and all modifications and equivalents of the present invention, which are made by the contents of the present specification and the accompanying drawings, or directly/indirectly applied to other related technical fields, are included in the scope of the present invention.