Detailed Description
In order to make the objects, technical solutions, and advantages of the present application more apparent, exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application, and it should be understood that the present application is not limited by the exemplary embodiments described herein. All other embodiments obtained by a person of ordinary skill in the art, based on the embodiments described in the present application and without inventive effort, shall fall within the scope of protection of the present application.
Most nerve regions have many blood vessels distributed within or around them (e.g., the axillary brachial plexus), and blood vessels appear anechoic under ultrasound imaging, often with artifacts. Fig. 1 shows a schematic diagram of a vessel and a nerve under ultrasound imaging. As shown in Fig. 1, under ultrasound imaging the vessel features and the nerve features are similar and therefore difficult to distinguish from each other. As a result, a vessel is frequently mistaken for a nerve, by a novice doctor or even an experienced one, so that the vessel is punctured by mistake, causing additional injury to the patient.
In view of this problem, most current solutions turn on the color Doppler imaging function in a suspicious region and determine whether the region is a blood vessel by observing whether there is blood flow. However, during scanning, having the operator or an assistant turn on color Doppler and adjust the position and parameters of the Doppler sampling frame adds steps to the operation flow and affects its smoothness. On the other hand, turning on color Doppler causes the image frame rate to drop instantaneously, giving the doctor a feeling that the image is stuttering and slowing down, and degrading the doctor's operating experience.
Based on this, the present application proposes an ultrasound imaging scheme, which is described below in connection with Figs. 2 to 12.
Fig. 2 shows a schematic block diagram of an ultrasound imaging apparatus 200 according to one embodiment of the application. As shown in Fig. 2, the ultrasound imaging apparatus 200 includes a transmitting-receiving circuit 220, an ultrasound probe 210, a processor 230, and a display 240. The transmitting-receiving circuit 220 is configured to control the ultrasound probe 210, in a gray-scale imaging mode, to transmit a first ultrasonic wave to a site of a target object to be nerve-blocked, to receive an echo of the first ultrasonic wave, and to acquire first ultrasound echo data from the echo of the first ultrasonic wave. The processor 230 is configured to generate an ultrasound image of the site to be nerve-blocked based on the first ultrasound echo data, to identify a neurovascular region in the ultrasound image, and to acquire a position of a sampling frame including the neurovascular region. The transmitting-receiving circuit 220 is further configured to control the ultrasound probe 210, in a Doppler imaging mode, to transmit a second ultrasonic wave to the position of the sampling frame, to receive an echo of the second ultrasonic wave, and to acquire second ultrasound echo data from the echo of the second ultrasonic wave. The processor 230 is further configured to acquire blood flow information based on the second ultrasound echo data, to acquire a blood vessel region in the ultrasound image based on the blood flow information, to acquire a nerve region in the ultrasound image based on the neurovascular region and the blood vessel region, and to control the display 240 to display the ultrasound image, wherein the nerve region and the blood vessel region are distinguishable from each other on the ultrasound image.
In an embodiment of the present application, the ultrasound imaging apparatus 200 first generates a gray-scale image of the site to be nerve-blocked and identifies a neurovascular region in the gray-scale image, as shown in Fig. 3. Fig. 3 shows an exemplary schematic view of the neurovascular region in an ultrasound image of a site to be nerve-blocked generated by the ultrasound imaging apparatus 200 in accordance with an embodiment of the present application. As shown in Fig. 3, the neurovascular region is a region including both a nerve region and a blood vessel region. Since the nerve region and the blood vessel region are difficult to distinguish directly in the gray-scale image, the region including both of them is identified first. Then, based on the neurovascular region, a sampling frame including the neurovascular region is placed (manually or automatically, described exemplarily later in connection with Figs. 4 and 5), and Doppler imaging is performed for the position of the sampling frame to obtain blood flow information, from which the blood vessel region in the gray-scale image can be obtained. Next, from the previously acquired neurovascular region and the blood vessel region, the nerve region can be acquired, thereby distinguishing the nerve region from the blood vessel region (described exemplarily later in connection with Fig. 6). Finally, so that the user (doctor) can clearly distinguish the nerve region from the blood vessel region, the gray-scale image is displayed on the display 240 with the nerve region and the blood vessel region distinguishably highlighted from each other (described exemplarily later in connection with Figs. 7 to 9), thereby enabling the user to intuitively distinguish the nerve from the blood vessel.
Thus, in general, the ultrasound imaging apparatus 200 according to the embodiment of the present application automatically distinguishes the nerve from the blood vessel in the neurovascular region by utilizing the different responses of nerve tissue and flowing blood under the Doppler effect, and applies a different highlighting manner to each of them, thereby giving different prompts to the doctor so that the nerve and the blood vessel can be intuitively distinguished, and reducing the risk of mistakenly puncturing a blood vessel during surgery.
In an embodiment of the present application, after generating an ultrasound image of the site to be nerve-blocked, the processor 230 may automatically identify the neurovascular region in the ultrasound image. The neurovascular region can be obtained by applying algorithms such as segmentation, edge detection, or target tracking to the ultrasound image, and these may be implemented by deep learning, other machine learning, traditional image processing, or similar methods. Several methods are described below by way of example.
In one example, the processor 230 may employ a deep-learning-based image segmentation method to automatically identify the neurovascular region in the ultrasound image. Specifically, feature learning and structure-boundary learning may be performed on a constructed database by stacking convolution layers and deconvolution layers, and for an input ultrasound image the network can directly generate an image output of the same size representing the specific boundary range of the key anatomical structure (the neurovascular region). Common networks include FCN, U-Net, SegNet, DeepLab, Mask R-CNN, and the like.
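As an illustrative sketch only (not part of the claimed apparatus), the idea that a stack of convolution and deconvolution (upsampling) layers maps an input image to a same-size boundary map can be shown with untrained random weights. All function names below are hypothetical, and the network is far smaller than a real FCN or U-Net:

```python
import numpy as np

def conv2d_same(img, kernel):
    """Naive 'same'-padded 2-D convolution (illustration only)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def downsample(img):
    """Encoder step: 2x2 max pooling."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def upsample(img):
    """Decoder step: nearest-neighbour 'deconvolution'."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def tiny_fcn(img, rng):
    """Encoder-decoder pass producing a same-size probability map."""
    k1, k2 = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))
    x = np.maximum(conv2d_same(img, k1), 0)   # conv + ReLU
    x = downsample(x)                         # encode
    x = upsample(x)                           # decode back to input size
    x = conv2d_same(x, k2)                    # 1-channel output head
    return 1.0 / (1.0 + np.exp(-x))           # sigmoid -> boundary map

rng = np.random.default_rng(0)
image = rng.random((32, 32))
mask = tiny_fcn(image, rng)
print(mask.shape)  # same spatial size as the input
```

A practical implementation would of course use a trained network such as U-Net; the point here is only that the encoder-decoder pass preserves the spatial size of the output, as the text above describes.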
In another example, the processor 230 may employ other machine-learning-based image segmentation methods to automatically identify the neurovascular region in the ultrasound image. Specifically, the method includes pre-segmenting the image by threshold segmentation, Snake, level set, GraphCut, ASM, AAM, or similar methods to acquire a group of candidate anatomical-structure (neurovascular region) boundary ranges in the ultrasound image; extracting features of each candidate boundary range, where the features may be traditional features such as PCA, LDA, HOG, Haar, or LBP, or features extracted by a neural network; matching the extracted features with features extracted from the boundary ranges annotated in a database; and classifying with a discriminator such as KNN, SVM, random forest, or a neural network to determine whether the current candidate boundary range contains the key anatomical structure and to acquire its corresponding category.
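The pre-segmentation / feature-extraction / discrimination pipeline described above can be sketched in miniature as follows. The threshold, the hand-crafted features, and the two-entry "database" are invented for illustration and stand in for a real annotated database:

```python
import numpy as np

def presegment(img, thresh=0.5):
    """Threshold pre-segmentation: binary mask of hypoechoic (dark) candidates."""
    return img < thresh

def region_features(img, mask):
    """Simple hand-crafted features for one candidate region:
    mean intensity, intensity spread, and relative area."""
    vals = img[mask]
    return np.array([vals.mean(), vals.std(), mask.mean()])

def knn_classify(feat, db_feats, db_labels):
    """1-nearest-neighbour match against a labelled feature database."""
    dists = np.linalg.norm(db_feats - feat, axis=1)
    return db_labels[int(np.argmin(dists))]

rng = np.random.default_rng(1)
img = rng.random((16, 16))
mask = presegment(img)
feat = region_features(img, mask)

# Toy labelled database (stand-in for features of annotated boundary ranges)
db_feats = np.array([[0.25, 0.15, 0.5],   # typical dark-candidate features
                     [0.80, 0.10, 0.1]])  # typical background features
db_labels = np.array(["neurovascular", "background"])
print(knn_classify(feat, db_feats, db_labels))
```

A real system would use richer features (HOG, LBP, or learned ones) and a trained classifier (SVM, random forest), but the match-then-classify flow is the same.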
In yet another example, the processor 230 may automatically identify the neurovascular region in the ultrasound image based on a traditional image-processing edge detection method. Specifically, the ultrasound image may first be smoothed and denoised by a smoothing filter such as a mean filter, a Gaussian filter, or a bilateral filter, and edges may then be extracted from the denoised image by an edge detection operator such as Roberts, Sobel, Kirsch, Canny, or Laplacian.
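A minimal version of this smooth-then-detect pipeline (a Gaussian filter followed by a Sobel operator) can be shown on a synthetic image containing a dark, roughly anechoic disc. The image and all parameters are made up for illustration:

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def convolve_same(img, kernel):
    """'Same'-padded convolution with edge replication at the borders."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)

def sobel_edges(img):
    gx = convolve_same(img, SOBEL_X)
    gy = convolve_same(img, SOBEL_X.T)
    return np.hypot(gx, gy)

# Synthetic image: dark (anechoic-like) disc of radius 12 on a brighter background
yy, xx = np.mgrid[:64, :64]
img = np.where((yy - 32) ** 2 + (xx - 32) ** 2 < 12 ** 2, 0.1, 0.7)

smoothed = convolve_same(img, gaussian_kernel())   # denoise first
edges = sobel_edges(smoothed)                      # then extract edges
peak = np.unravel_index(np.argmax(edges), edges.shape)
print(peak)  # strongest edge response lies on the disc boundary
```

In practice a library routine (e.g. an OpenCV Canny call) would replace the hand-written loops; the sketch only shows the order of operations named in the text.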
In yet another example, the processor 230 may automatically identify the neurovascular region in the ultrasound image based on a target tracking method. Specifically, the neurovascular region in the first frame of the ultrasound sequence may be obtained by the methods in the above examples or by other methods; features of the target region in the current frame are then extracted by a discriminative-model method (such as Struck, TLD, or DCF) or by a deep-learning network (such as SiamFC++ or SiamMask), and the target region in the next frame is located by means of the constructed model, thereby generating the specific boundary range of the key anatomical structure (the neurovascular region).
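The frame-to-frame tracking step can be illustrated with the simplest possible tracker, normalized cross-correlation template matching over a small search window. This is only a stand-in for the discriminative or Siamese trackers named above, and the frame data are synthetic:

```python
import numpy as np

def ncc_track(prev_frame, box, next_frame, search=5):
    """Track region box=(row, col, h, w) into the next frame by searching
    nearby offsets for the best normalized cross-correlation score."""
    r, c, h, w = box
    tmpl = prev_frame[r:r + h, c:c + w]
    tmpl = (tmpl - tmpl.mean()) / (tmpl.std() + 1e-9)
    best, best_rc = -np.inf, (r, c)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            if rr < 0 or cc < 0 or rr + h > next_frame.shape[0] \
                    or cc + w > next_frame.shape[1]:
                continue
            patch = next_frame[rr:rr + h, cc:cc + w]
            patch = (patch - patch.mean()) / (patch.std() + 1e-9)
            score = np.mean(tmpl * patch)
            if score > best:
                best, best_rc = score, (rr, cc)
    return best_rc + (h, w)

rng = np.random.default_rng(2)
frame1 = rng.random((40, 40))
frame2 = np.roll(frame1, shift=(2, 3), axis=(0, 1))  # content moved by (2, 3)
print(ncc_track(frame1, (10, 10, 8, 8), frame2))     # box follows the motion
```

Real trackers learn an appearance model instead of correlating a fixed template, but the "find the best-matching location in the next frame" loop is the common core.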
In an embodiment of the present application, after identifying the neurovascular region in the ultrasound image, the processor 230 may obtain the position of a sampling frame including the neurovascular region. For example, the processor 230 may automatically generate a sampling frame including the neurovascular region to obtain the position of the sampling frame, or the processor 230 may receive user input to obtain the position of a sampling frame including the neurovascular region. Automatically generating a sampling frame including the neurovascular region may include generating, for each frame of the ultrasound image, a sampling frame that is fixed in size and position and includes at least the neurovascular region, or generating, for each frame of the ultrasound image, a sampling frame including the neurovascular region whose position moves following the movement of the position of the neurovascular region. This is described below in connection with Figs. 4 and 5.
Fig. 4 shows a schematic diagram of one example of a sampling frame including the neurovascular region acquired by the ultrasound imaging apparatus 200 in accordance with an embodiment of the present application. As shown in Fig. 4, the white box is a Doppler sampling frame with fixed size and position. In this example, a sampling frame that is fixed in size and position and includes at least the neurovascular region is generated for each frame of the ultrasound image; besides the neurovascular region, the sampling frame may also include other blood vessel regions. This manner of generating the sampling frame is simple and easy to implement.
Fig. 5 shows a schematic diagram of another example of a sampling frame including the neurovascular region acquired by the ultrasound imaging apparatus 200 in accordance with an embodiment of the present application. As shown in Fig. 5, the small black box surrounding the neurovascular region and moving with its position (across different frames) is the Doppler sampling frame. In this example, a sampling frame including the neurovascular region is generated for each frame of the ultrasound image, and the position of the sampling frame moves following the movement of the position of the neurovascular region. This manner of generating the sampling frame is more precise; the sampling frame generally includes only the neurovascular region and no other blood vessel regions, which reduces the amount of computation when the blood flow information is obtained later.
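The two sampling-frame strategies of Figs. 4 and 5 can be sketched as bounding-box computations over a hypothetical per-frame neurovascular mask: one fixed box covering the region in all frames, versus a per-frame box that follows it. Boxes are (row, col, height, width); the masks and margin are invented for illustration:

```python
import numpy as np

def fixed_box(masks, margin=2):
    """Fixed-size, fixed-position box covering the neurovascular region
    across all frames (the Fig. 4 style)."""
    rows = np.any([m.any(axis=1) for m in masks], axis=0)
    cols = np.any([m.any(axis=0) for m in masks], axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return (max(r0 - margin, 0), max(c0 - margin, 0),
            r1 - r0 + 1 + 2 * margin, c1 - c0 + 1 + 2 * margin)

def following_box(mask, margin=2):
    """Per-frame box that follows the region as it moves (the Fig. 5 style)."""
    r0, r1 = np.where(mask.any(axis=1))[0][[0, -1]]
    c0, c1 = np.where(mask.any(axis=0))[0][[0, -1]]
    return (max(r0 - margin, 0), max(c0 - margin, 0),
            r1 - r0 + 1 + 2 * margin, c1 - c0 + 1 + 2 * margin)

# Synthetic neurovascular region drifting 1 px to the right per frame
masks = []
for t in range(3):
    m = np.zeros((30, 30), dtype=bool)
    m[10:15, 10 + t:15 + t] = True
    masks.append(m)

print(fixed_box(masks))                    # one wider box covering all frames
print([following_box(m) for m in masks])   # per-frame box moves with the region
```

The fixed box is larger (it must cover every frame's position), which matches the text's observation that the following box keeps the Doppler computation smaller.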
In the examples shown in fig. 4 and 5, the sampling frame is shown as rectangular, which is merely exemplary, and in other examples, the sampling frame may be other shapes, such as parallelogram, sector, etc., and may take different shapes adaptively according to the probe.
In an embodiment of the present application, after the sampling frame is acquired, Doppler imaging (such as color Doppler imaging (C mode), pulsed Doppler imaging (PW mode), continuous-wave Doppler imaging (CW mode), energy Doppler imaging (power mode), etc.) may be performed for the position of the sampling frame to acquire blood flow information, from which a blood vessel region in the gray-scale image may be acquired. In general, the Doppler effect is used to acquire the vascular region, the Doppler effect being the frequency shift caused by movement of the source or the receiver relative to the medium. Doppler ultrasound obtains the velocity v of tissue or blood flow according to:

v = (c · f_d) / (2 · f_o · cos θ)

wherein f_o is the transmit frequency, which depends on the transmit parameters of the ultrasound imaging apparatus 200; f_d is the frequency difference, which is extracted by the ultrasound imaging apparatus 200 from the received signal and is related to f_o; c is the speed of sound, which is taken as a fixed value; and θ is the angle between the sound beam and the direction of target motion.
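As a numerical check, the relation v = c · f_d / (2 · f_o · cos θ) can be evaluated directly. The 1540 m/s sound speed is the conventional soft-tissue value, and the example transmit frequency, frequency shift, and angle are illustrative numbers, not parameters of the apparatus:

```python
import math

def doppler_velocity(f_o, f_d, theta_deg, c=1540.0):
    """Blood-flow velocity (m/s) from the Doppler shift:
    v = c * f_d / (2 * f_o * cos(theta)).
    c defaults to the conventional soft-tissue sound speed, 1540 m/s."""
    return c * f_d / (2.0 * f_o * math.cos(math.radians(theta_deg)))

# Example: 5 MHz transmit, 1 kHz measured shift, 60-degree beam-to-flow angle
v = doppler_velocity(f_o=5e6, f_d=1e3, theta_deg=60.0)
print(round(v, 4))  # -> 0.308
```

Note how the cos θ term makes the estimate angle-dependent: as θ approaches 90°, the measurable shift vanishes, which is why the beam steering direction is among the adjustable parameters discussed below.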
According to different requirements, a user can obtain the desired effect by adjusting the Doppler blood flow imaging parameters. The adjustable Doppler imaging parameters may include, for example, the position of the Doppler sampling frame (the region of interest, ROI), the Doppler pulse repetition frequency (PRF), the velocity range of the blood flow (Scale), the Doppler signal gain value (Gain), the ultrasound beam emission direction, the wall filter parameters, and the like.
In the embodiment of the application, the blood flow information within the sampling frame can be acquired by Doppler imaging with these parameters while the frame rate is kept at a high level, so that the user does not perceive image stuttering and the operating experience of the user is improved.
After the blood flow information within the sampling frame is acquired, the blood vessel region within the neurovascular region may be acquired, and the nerve region within the neurovascular region is then acquired based on the neurovascular region and the blood vessel region.
In one embodiment of the present application, when the sampling frame further includes other blood vessel regions in the ultrasound image besides the neurovascular region, the processor 230 acquiring the blood vessel region in the ultrasound image based on the blood flow information and acquiring the nerve region in the ultrasound image based on the neurovascular region and the blood vessel region may include acquiring all the blood vessel regions in the sampling frame based on the blood flow information, determining the blood vessel region within the neurovascular region from all the blood vessel regions, and acquiring the nerve region within the neurovascular region based on that blood vessel region (e.g., subtracting the blood vessel region from the neurovascular region). This embodiment is generally applicable to the case of the sampling frame shown in Fig. 4.
In another embodiment of the present application, when the sampling frame includes only the neurovascular region, the processor 230 acquiring the blood vessel region in the ultrasound image based on the blood flow information and acquiring the nerve region in the ultrasound image based on the neurovascular region and the blood vessel region may include acquiring the blood vessel region within the neurovascular region based on the blood flow information, and acquiring the nerve region within the neurovascular region based on the blood vessel region within the neurovascular region (e.g., subtracting the blood vessel region from the neurovascular region). This embodiment is generally applicable to the case of the sampling frame shown in Fig. 5.
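The subtraction mentioned in both embodiments (nerve region = neurovascular region minus blood vessel region) reduces to boolean mask arithmetic. The masks below are synthetic stand-ins for the regions of Fig. 6:

```python
import numpy as np

h, w = 20, 20
neurovascular = np.zeros((h, w), dtype=bool)
neurovascular[5:15, 5:15] = True        # identified on the gray-scale image

vessel = np.zeros((h, w), dtype=bool)   # recovered from the Doppler flow signal
vessel[8:12, 8:12] = True

# Nerve region = neurovascular region minus the blood vessel region
nerve = neurovascular & ~vessel
print(int(nerve.sum()), int(vessel.sum()))  # -> 84 16
```

The same one-line subtraction applies whether the vessel mask came from the whole sampling frame (Fig. 4 case, after intersecting with the neurovascular region) or from a tight frame containing only the neurovascular region (Fig. 5 case).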
Fig. 6 shows an exemplary schematic diagram of the ultrasound imaging apparatus 200 acquiring the neurovascular region, the blood vessel region, and the nerve region in accordance with an embodiment of the present application. As shown in Fig. 6, the region selected by the white dashed box in the left part of Fig. 6 is the neurovascular region, the region selected by the white dashed box in the middle part of Fig. 6 is the blood vessel region within the neurovascular region, and the region between the two white dashed boxes in the right part of Fig. 6 is the nerve region.
After obtaining the blood vessel region and the nerve region in the neurovascular region, an ultrasound image on which the nerve region and the blood vessel region are distinguishable from each other may be displayed on the display 240. Exemplary schematic diagrams that can distinguishably highlight the nerve region and the blood vessel region from each other are exemplarily described below in connection with fig. 7 to 9.
As shown in Fig. 7, in this example, the nerve region and the blood vessel region are each rendered in a different pseudo color (not shown because the drawings are gray-scale images, but pseudo colors in practical use) so that they can be intuitively distinguished from each other. As shown in Fig. 8, in this example, the nerve region and the blood vessel region are each displayed with an outline tracing so that they can be intuitively distinguished from each other. As shown in Fig. 9, in this example, the nerve region is image-enhanced and the blood vessel region is displayed with a color Doppler effect (not shown because the drawings are gray-scale, but in practice the blood vessel region is displayed in color) so that they can be intuitively distinguished from each other. It should be understood that Figs. 7 to 9 merely illustrate exemplary distinguishing highlighting manners; in other examples, any other suitable display manner may be used, as long as the nerve region and the blood vessel region can be visually distinguished.
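A pseudo-color highlighting of the kind shown in Fig. 7 can be sketched as blending each region mask with a tint color on an RGB copy of the gray-scale image. The yellow/red choice and the 50% blending weight are arbitrary illustrative values, not what the apparatus prescribes:

```python
import numpy as np

def overlay(gray, nerve_mask, vessel_mask):
    """Tint the nerve region yellow and the vessel region red on top of
    a gray-scale image with values in [0, 1]."""
    rgb = np.stack([gray, gray, gray], axis=-1)          # gray -> RGB
    rgb[nerve_mask] = 0.5 * rgb[nerve_mask] + 0.5 * np.array([1.0, 1.0, 0.0])
    rgb[vessel_mask] = 0.5 * rgb[vessel_mask] + 0.5 * np.array([1.0, 0.0, 0.0])
    return rgb

gray = np.full((8, 8), 0.4)
nerve = np.zeros((8, 8), dtype=bool)
nerve[1:3, 1:3] = True
vessel = np.zeros((8, 8), dtype=bool)
vessel[5:7, 5:7] = True

img = overlay(gray, nerve, vessel)
print(img[1, 1], img[5, 5], img[0, 0])  # tinted, tinted, untouched background
```

Outline tracing (Fig. 8) would instead color only the mask boundary pixels, and the Fig. 9 style would replace the vessel tint with the color Doppler flow map itself.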
The above exemplarily illustrates the ultrasound imaging apparatus 200 according to one embodiment of the present application. Based on the above description, the ultrasound imaging apparatus 200 according to the embodiment of the present application automatically distinguishes between the nerve and the blood vessel in the neurovascular region by using the difference between the nerve and the blood flow under the doppler effect, and respectively uses different highlighting modes for the nerve region and the blood vessel region, so as to give different prompts to the doctor, thereby intuitively distinguishing between the nerve and the blood vessel, and reducing the risk of mistakenly puncturing the blood vessel during the operation.
Fig. 10 shows a schematic block diagram of an ultrasound imaging apparatus 1000 according to an embodiment of the present application. As shown in Fig. 10, the ultrasound imaging apparatus 1000 includes a transmitting-receiving circuit 1020, an ultrasound probe 1010, a processor 1030, and a display 1040. The transmitting-receiving circuit 1020 is configured to control the ultrasound probe 1010, in a gray-scale imaging mode, to transmit a first ultrasonic wave to a site of a target object to be nerve-blocked, to receive an echo of the first ultrasonic wave, and to acquire first ultrasound echo data from the echo of the first ultrasonic wave. The processor 1030 is configured to generate an ultrasound image of the site to be nerve-blocked based on the first ultrasound echo data. The transmitting-receiving circuit 1020 is further configured to control the ultrasound probe 1010, in a Doppler imaging mode, to transmit a second ultrasonic wave to the site to be nerve-blocked, to receive an echo of the second ultrasonic wave, and to acquire second ultrasound echo data from the echo of the second ultrasonic wave. The processor 1030 is further configured to acquire a nerve region and a blood vessel region in the ultrasound image according to a deep learning, machine learning, or image processing algorithm, and to control the display 1040 to display the ultrasound image, wherein the nerve region and the blood vessel region are distinguishably highlighted from each other on the ultrasound image, the highlighted result being used to guide puncturing of the site to be nerve-blocked.
The ultrasound imaging apparatus 1000 according to the embodiment of the present application is similar in function to the ultrasound imaging apparatus 200 described above, and likewise achieves distinguishing highlighting of the nerve region and the blood vessel region on the ultrasound image (the display effect is as shown in Figs. 7 to 9 and is not repeated here). The difference is that, rather than first acquiring the neurovascular region and the blood vessel region and deriving the nerve region from them, the ultrasound imaging apparatus 1000 directly acquires both the nerve region and the blood vessel region in the ultrasound image according to a deep learning, machine learning, or conventional image processing algorithm, based on the ultrasound echo data in the gray-scale imaging mode and the ultrasound echo data in the Doppler imaging mode.
Thus, the ultrasound imaging apparatus 1000 according to an embodiment of the present application automatically acquires the nerve region and the blood vessel region by combining the two-dimensional ultrasound image with information of one or more other modalities (ultrasound data in imaging modes such as color Doppler imaging, pulsed Doppler imaging, continuous-wave Doppler imaging, and energy Doppler imaging). In the embodiment of the application, this method of automatically identifying the nerve region and the blood vessel region by combining the two-dimensional ultrasound image with other multimodal information can be realized by applying classification, target detection, or other algorithms to the ultrasound image to identify the image type and the key structure information it contains, and the implementation may use deep learning, other machine learning, traditional image processing, or similar methods. This is described below in connection with several examples.
In one example, both the ultrasound image generated in the gray-scale imaging mode and the ultrasound data in the color Doppler imaging mode for the site of the target object to be nerve-blocked are input to the processor 1030, and the processor 1030 outputs the nerve region and the blood vessel region in the ultrasound image generated in the gray-scale imaging mode. The processor 1030 may include an analysis unit, and the analysis unit may include a pre-segmentation unit, a feature extraction unit, and a discrimination unit. The pre-segmentation unit pre-segments the image by threshold segmentation, Snake, level set, GraphCut, ASM, AAM, or similar methods to acquire a group of candidate anatomical-structure boundary ranges in the ultrasound image. The feature extraction unit extracts features of each candidate boundary range, such as texture, spatial, and gray-level information of the two-dimensional ultrasound image. The discrimination unit matches the extracted features with features extracted from the boundary ranges annotated in a database, classifies them using a discriminator such as KNN, SVM, random forest, or a neural network, determines whether the current candidate boundary range contains the key anatomical structure, and acquires its corresponding category.
The analysis unit may further include a deep-learning neural network, which performs feature learning and structure-boundary learning on the constructed database by stacking convolution layers and deconvolution layers. For the input ultrasound image generated in the gray-scale imaging mode and the ultrasound data generated in the color Doppler imaging mode, the network can directly generate an image output of the same size representing the specific boundary range of the key anatomical structure. Common networks include FCN, U-Net, SegNet, DeepLab, Mask R-CNN, and the like.
In another example, both the ultrasound image generated in the gray-scale imaging mode and the ultrasound image generated in the energy Doppler imaging mode for the site of the target object to be nerve-blocked are input to the processor 1030, and the processor 1030 outputs the nerve region and the blood vessel region in the ultrasound image generated in the gray-scale imaging mode. The processor 1030 may include an analysis unit, and the analysis unit may include a pre-segmentation unit, a feature extraction unit, and a discrimination unit. The pre-segmentation unit pre-segments the image by threshold segmentation, Snake, level set, GraphCut, ASM, AAM, or similar methods to acquire a group of candidate anatomical-structure boundary ranges in the ultrasound image. The feature extraction unit extracts features of each candidate boundary range, including the velocity, energy, and variance of the color Doppler or spectral Doppler data. The discrimination unit matches the extracted features with features extracted from the boundary ranges annotated in a database, classifies them using a discriminator such as KNN, SVM, random forest, or a neural network, determines whether the current candidate boundary range contains the key anatomical structure, and acquires its corresponding category.
The analysis unit may further include a deep-learning neural network, which performs feature learning and structure-boundary learning on the constructed database by stacking convolution layers and deconvolution layers. For the input ultrasound image generated in the gray-scale imaging mode and the ultrasound data generated in the energy Doppler imaging mode, the network can directly generate an image output of the same size representing the specific boundary range of the key anatomical structure. Common networks include FCN, U-Net, SegNet, DeepLab, Mask R-CNN, and the like.
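Feeding both modalities to one network, as in the two examples above, amounts to stacking the gray-scale image and the Doppler map as input channels. The sketch below shows only the data layout (a conventional N×C×H×W tensor), with random arrays standing in for real echo data:

```python
import numpy as np

rng = np.random.default_rng(3)
h, w = 64, 64
bmode = rng.random((h, w))     # gray-scale (B-mode) image
doppler = rng.random((h, w))   # e.g. a color- or energy-Doppler map

# Stack the two modalities as input channels for a segmentation network
x = np.stack([bmode, doppler], axis=0)   # (channels, height, width)
batch = x[np.newaxis]                    # (N, C, H, W) network input layout
print(batch.shape)  # -> (1, 2, 64, 64)
```

The network's first convolution layer then simply learns weights over two input channels instead of one; everything downstream (encoder-decoder layers, same-size output) is unchanged from the single-modality case.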
Based on the above description, the ultrasound imaging apparatus 1000 according to the embodiment of the present application directly acquires the nerve region and the blood vessel region in the ultrasound image according to the deep learning, the machine learning or the conventional image processing algorithm based on the ultrasound echo data in the gray-scale imaging mode and the ultrasound echo data in the doppler imaging mode, and respectively uses different highlighting modes for the nerve region and the blood vessel region, so as to give different prompts to a doctor, so as to intuitively distinguish the nerve from the blood vessel, thereby reducing the risk of mistakenly puncturing the blood vessel in the operation.
An ultrasound imaging method 1100 according to an embodiment of the application is described below in connection with fig. 11, the method 1100 being implemented by the ultrasound imaging apparatus 200 above. In the above description, the imaging method of the ultrasound imaging apparatus 200 has been described, and thus, the method 1100 will be described only briefly below. As shown in fig. 11, the ultrasound imaging method 1100 may include the steps of:
In step S1110, the ultrasound probe is controlled to transmit a first ultrasound wave to a portion of the target object to be nerve-blocked in the grayscale imaging mode, receive an echo of the first ultrasound wave, and acquire first ultrasound echo data from the echo of the first ultrasound wave.
In step S1120, an ultrasound image of a site to be nerve-blocked is generated based on the first ultrasound echo data, a neurovascular region in the ultrasound image is identified, and a position of a sampling frame including the neurovascular region is acquired.
In step S1130, the ultrasound probe is controlled to transmit the second ultrasound wave to the position of the sampling frame in the doppler imaging mode, receive the echo of the second ultrasound wave, and acquire the second ultrasound echo data from the echo of the second ultrasound wave.
In step S1140, blood flow information is acquired based on the second ultrasound echo data, a blood vessel region in the ultrasound image is acquired based on the blood flow information, and a nerve region in the ultrasound image is acquired based on the neurovascular region and the blood vessel region.
In step S1150, an ultrasound image on which a nerve region and a blood vessel region are distinguishably highlighted from each other is displayed.
In an embodiment of the application, distinguishably highlighting the nerve region and the blood vessel region from each other on the ultrasound image includes any one of: adding a different pseudo color to each of the nerve region and the blood vessel region; displaying each of the nerve region and the blood vessel region with an outline tracing; and performing image enhancement on the nerve region while displaying the blood vessel region with a color Doppler effect.
In an embodiment of the application, the Doppler imaging mode comprises any one of a color Doppler imaging mode, a pulsed Doppler imaging mode, a continuous Doppler imaging mode, and an energy Doppler imaging mode.
In an embodiment of the application, obtaining the location of the sampling frame including the neurovascular region includes automatically generating the sampling frame including the neurovascular region to obtain the location of the sampling frame, or receiving user input to obtain the location of the sampling frame including the neurovascular region.
In an embodiment of the application, automatically generating a sampling frame comprising a neurovascular region comprises generating a sampling frame for each frame of ultrasound images that is fixed in size and position and that is capable of comprising at least the neurovascular region, or generating a sampling frame for each frame of ultrasound images that comprises the neurovascular region, the position of the sampling frame moving following the movement of the position of the neurovascular region.
In the embodiment of the application, when the sampling frame further includes other blood vessel regions in the ultrasound image besides the neurovascular region, acquiring the blood vessel region in the ultrasound image based on the blood flow information and acquiring the nerve region in the ultrasound image based on the neurovascular region and the blood vessel region includes acquiring all the blood vessel regions in the sampling frame based on the blood flow information, determining the blood vessel region within the neurovascular region from all the blood vessel regions, and acquiring the nerve region within the neurovascular region based on that blood vessel region.
In the embodiment of the application, when the sampling frame includes only the neurovascular region, acquiring the blood vessel region in the ultrasound image based on the blood flow information and acquiring the nerve region in the ultrasound image based on the neurovascular region and the blood vessel region includes acquiring the blood vessel region within the neurovascular region based on the blood flow information, and acquiring the nerve region within the neurovascular region based on the blood vessel region within the neurovascular region.
In an embodiment of the application, identifying neurovascular regions in an ultrasound image includes segmenting or edge detecting or target tracking the ultrasound image to obtain neurovascular regions.
According to the ultrasound imaging method 1100 provided by the embodiment of the application, the difference between nerve tissue and blood flow under the Doppler effect is used to automatically distinguish nerves from blood vessels within the neurovascular region, and different highlighting modes are applied to the nerve region and the blood vessel region respectively. The doctor is thus given distinct prompts and can intuitively distinguish nerves from blood vessels, reducing the risk of mistakenly puncturing a blood vessel during the operation.
An ultrasound imaging method 1200 according to an embodiment of the application is described below with reference to fig. 12. The method 1200 is implemented by the ultrasound imaging apparatus 1000 described above; since the imaging method of the ultrasound imaging apparatus 1000 has already been described, the method 1200 is only described briefly below. As shown in fig. 12, the ultrasound imaging method 1200 may include the following steps:
In step S1210, in the grayscale imaging mode, the ultrasound probe is controlled to transmit a first ultrasound wave to the site of the target object to be nerve-blocked, receive an echo of the first ultrasound wave, and acquire first ultrasound echo data from that echo.
In step S1220, an ultrasound image of the site to be nerve-blocked is generated based on the first ultrasound echo data.
In step S1230, in the Doppler imaging mode, the ultrasound probe is controlled to transmit a second ultrasound wave to the site to be nerve-blocked, receive an echo of the second ultrasound wave, and acquire second ultrasound echo data from that echo.
In step S1240, based on the first ultrasound echo data and the second ultrasound echo data, the nerve region and the blood vessel region in the ultrasound image are acquired using a deep learning, machine learning, or image processing algorithm.
In step S1250, the ultrasound image is displayed with the nerve region and the blood vessel region highlighted so as to be distinguishable from each other, and the highlighted result is used to guide puncture of the site to be nerve-blocked.
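A toy end-to-end sketch of steps S1240–S1250 follows, under the assumption that the echo data have already been beamformed into a grayscale image (S1220) and a Doppler power map (S1230), and with a simple power threshold standing in for the learning-based algorithm of S1240. All names, thresholds, and colors are illustrative, not the disclosed implementation.

```python
import numpy as np

def method_1200_sketch(gray_image, doppler_power, nv_mask, flow_threshold=0.5):
    """Toy pipeline: classify nerve vs. vessel inside the neurovascular
    mask (S1240) and render a distinguishably highlighted image (S1250).
    gray_image: (H, W) uint8; doppler_power: (H, W) float;
    nv_mask: (H, W) bool."""
    # S1240: inside the neurovascular region, Doppler power above the
    # threshold indicates blood flow -> vessel; the remainder -> nerve.
    vessel = nv_mask & (doppler_power > flow_threshold)
    nerve = nv_mask & ~vessel
    # S1250: distinguishable highlighting -- alpha-blend the vessel
    # region with red and the nerve region with yellow.
    rgb = np.stack([gray_image] * 3, axis=-1).astype(np.float32)
    rgb[vessel] = 0.6 * rgb[vessel] + 0.4 * np.array([255.0, 0.0, 0.0])
    rgb[nerve] = 0.6 * rgb[nerve] + 0.4 * np.array([255.0, 255.0, 0.0])
    return nerve, vessel, rgb.astype(np.uint8)
```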
In an embodiment of the application, distinguishably highlighting the nerve region and the blood vessel region on the ultrasound image includes any one of the following: adding different pseudo-colors to the nerve region and the blood vessel region, displaying outlines (stroking) around the nerve region and the blood vessel region, performing image enhancement on the nerve region, and displaying a color Doppler effect in the blood vessel region.
In an embodiment of the application, the Doppler imaging mode comprises any one of a color Doppler imaging mode, a pulsed-wave Doppler imaging mode, a continuous-wave Doppler imaging mode, and a power Doppler imaging mode.
According to the ultrasound imaging method 1200 provided by the embodiment of the application, the nerve region and the blood vessel region in the ultrasound image are acquired directly from the ultrasound echo data in the grayscale imaging mode and the ultrasound echo data in the Doppler imaging mode, using a deep learning, machine learning, or conventional image processing algorithm, and different highlighting modes are applied to the nerve region and the blood vessel region respectively. The doctor is thus given distinct prompts and can intuitively distinguish nerves from blood vessels, reducing the risk of mistakenly puncturing a blood vessel during the operation.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above illustrative embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be made therein by one of ordinary skill in the art without departing from the scope and spirit of the application. All such changes and modifications are intended to be included within the scope of the present application as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of the elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another device, or some features may be omitted or not performed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in order to streamline the application and aid in understanding one or more of the various inventive aspects, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof in the description of exemplary embodiments of the application. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be combined in any combination, except combinations where the features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some of the modules in a device according to embodiments of the present application may be implemented in practice using a microprocessor or digital signal processor (DSP). The present application may also be embodied as programs (e.g., computer programs and computer program products) for performing part or all of the methods described herein. Such a program embodying the present application may be stored on a computer-readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
The foregoing description is merely illustrative of specific embodiments of the present application, and the scope of the present application is not limited thereto; any person skilled in the art can readily conceive of variations or substitutions within the scope of the present application. The protection scope of the present application is subject to the protection scope of the claims.