CN114587590B - Intraoperative real-time tracking method and intraoperative real-time tracking system - Google Patents
Intraoperative real-time tracking method and intraoperative real-time tracking system
- Publication number: CN114587590B (application CN202210240628.3A)
- Authority: CN (China)
- Prior art keywords: image, bone, dimensional, ultrasonic, preset
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications (all within A—Human Necessities, A61B: Diagnosis; Surgery; Identification)
- A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
- A61B 2034/101: Computer-aided simulation of surgical operations
- A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
- A61B 2034/107: Visualisation of planned trajectories or target regions
- A61B 2034/108: Computer-aided selection or customisation of medical implants or cutting guides
- A61B 2034/2046: Tracking techniques
- A61B 2034/2055: Optical tracking systems
- A61B 2034/2063: Acoustic tracking systems, e.g. using ultrasound
- A61B 2034/2065: Tracking using image or pattern recognition
Abstract
The invention provides an intraoperative real-time tracking method comprising the following steps: S1: acquiring an ultrasonic bone image of a bone at a designated site; S2: acquiring a preset three-dimensional static bone image of the bone at the designated site; S3: registering and fusing the ultrasonic bone image with the preset three-dimensional static bone image to obtain the position information of the bone at the designated site. The intraoperative real-time tracking method improves the accuracy of position detection, is non-invasive and radiation-free, imposes few constraints on acquiring the ultrasonic bone images, and is low in cost. The invention also provides an intraoperative real-time tracking system.
Description
Technical Field
The invention relates to the technical field of image processing, and in particular to an intraoperative real-time tracking method and system.
Background
Advances in digital medical technology have driven the development of innovative medical instruments and devices that improve clinical outcomes. Numerous clinical follow-up studies confirm that accurate surgical treatment improves post-treatment outcomes and reduces the incidence of postoperative complications. Many manufacturers have therefore developed intraoperative navigation devices to achieve high-precision osteotomy, orthopedic correction, and prosthesis implantation or positioning. Current surgical navigation mostly uses optical positioning and tracking technology to follow the spatial position of bones during surgery: the target bone is rigidly connected by bone nails to a reflective-ball-array rigid body so that the bone position can be followed in real time. A coordinate system is established from feature points located intraoperatively, or those feature points are registered with information acquired preoperatively, so as to register the relative positions of the intraoperative bone and the reflective-ball-array rigid body, thereby tracking the bone position in real time.
This intraoperative bone tracking approach is widely applied in surgical navigation devices, including navigation systems based on optical positioning, mechanical-structure positioning, ultrasonic positioning, X-ray positioning, and electromagnetic positioning.
Among known registration schemes, some use a probe to acquire, point by point on the patient's bone surface, the coordinates of the bone surface contour in a spatial coordinate system constructed by an optical positioning system, then register these coordinates with a preoperative MRI, CT, or ultrasound segmentation model of the patient's bone, unifying the bone models into the same optical coordinate system; such schemes typically require acquiring dozens of feature points. Other navigation devices register the optical spatial coordinate system with a three-dimensional CT orthopedic coordinate system by acquiring CT images: a CT scan displays the position markers of the optical sensor and the bone structures in the same CT image.
The above prior art has the following drawbacks:
Additional trauma: the bone nails used during navigation preparation and the sensor-array rigid body fixed to the patient's bone cause additional puncture wounds, which carry risks of infection, postoperative pain, and other problems for the patient.
Complex operation: the intraoperative point-acquisition and registration steps require complicated operation and verification and consume a great deal of the surgeon's time, prolonging the patient's anesthesia time, increasing blood loss, and reducing surgical efficiency.
Therefore, there is a need for a novel intraoperative tracking method and system that solve the above problems in the prior art.
Disclosure of Invention
The invention aims to provide an intraoperative real-time tracking method that is non-invasive and radiation-free, is convenient to operate, and improves the accuracy of position detection.
To achieve the above object, the intraoperative real-time tracking method of the present invention includes the following steps:
S1: acquiring an ultrasonic bone image of a bone at a designated site;
S2: acquiring a preset three-dimensional static bone image of the bone at the designated site;
S3: registering and fusing the ultrasonic bone image with the preset three-dimensional static bone image to obtain the position information of the bone at the designated site.
The intraoperative real-time tracking method has the following beneficial effects: registering and fusing the ultrasonic bone image with the preset three-dimensional static bone image yields the position information of the bone at the designated site, which improves the accuracy of position detection; the method is non-invasive and radiation-free, imposes few constraints on acquiring the ultrasonic bone images, is convenient to operate, and is low in cost.
Further, registering and fusing the ultrasonic bone image with the preset three-dimensional static bone image to obtain the position information of the bone at the designated site includes:
S31: constructing a three-dimensional bone model from the preset three-dimensional static bone image;
S32: continuously adjusting the position of the three-dimensional bone model according to a preset rule;
S33: acquiring ultrasound scan position information, and then obtaining a cross-sectional image of the three-dimensional bone model according to the ultrasound scan position information;
S34: obtaining a cross-sectional image of the ultrasonic bone image;
S35: each time the position of the three-dimensional bone model is adjusted, calculating a similarity measure between the cross-sectional image of the three-dimensional bone model and the corresponding cross-sectional image of the ultrasonic bone image;
S36: calculating, in the order in which the similarity measures were obtained, the difference between the most recently obtained similarity measure and the previously obtained one, comparing the difference with a preset precision requirement value, and, when the difference is smaller than the preset precision requirement value, taking the position of the three-dimensional bone model corresponding to the most recently obtained similarity measure as the sought position of the bone at the designated site.
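The loop of steps S32-S36 can be sketched in code. This is a minimal illustration under stated assumptions, not the patented implementation: normalized cross-correlation stands in for the unspecified similarity measure, `model_slice_at` is a hypothetical helper that re-slices the model at the scan plane (S33), and `step_fn` abstracts the pose-update rule of S32 (e.g. a quasi-Newton step).

```python
import numpy as np

def similarity(a, b):
    """Normalized cross-correlation between two cross-sectional images
    (an illustrative choice; the claims do not fix a particular measure)."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def register(model_slice_at, us_slice, pose0, step_fn, eps=1e-4, max_iter=100):
    """S32-S36 as an iterative loop: adjust the model pose, re-slice the
    model at the ultrasound scan plane, and stop once the change in the
    similarity measure between consecutive iterations falls below the
    preset precision requirement value eps (S36)."""
    pose = pose0
    prev = cur = similarity(model_slice_at(pose), us_slice)
    for _ in range(max_iter):
        pose = step_fn(pose, prev)             # S32: pose update rule
        cur = similarity(model_slice_at(pose), us_slice)
        if abs(cur - prev) < eps:              # S36: convergence test
            break
        prev = cur
    return pose, cur
```

A usage sketch: with a 1-D translational pose and a step rule that walks the pose toward alignment, the loop terminates at the pose where the similarity measure stops changing.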
This registration and fusion of the ultrasonic bone image with the preset three-dimensional static bone image has high accuracy, is insensitive to input errors, and has strong anti-interference capability; moreover, the registration precision can be set according to actual needs.
Further, at least one ultrasonic bone image is acquired.
Further, the ultrasound bone image comprises at least one of a two-dimensional ultrasound image and a three-dimensional ultrasound image.
Further, the preset rule comprises a quasi-Newton algorithm.
Further, the ultrasound scan position information is obtained using optical motion capture, sensors, an electromagnetic field, optical fibers, or a mechanical connection device.
Further, the step of acquiring the preset three-dimensional static bone image of the bone at the designated site is followed by segmenting and dilating the preset three-dimensional static bone image.
Further, the step of acquiring the ultrasonic bone image of the bone at the designated site is followed by preprocessing the ultrasonic bone image.
Further, the method for preprocessing the ultrasonic bone image comprises ultrasonic image filtering, neural-network deep learning, or a broadband minimum-variance algorithm.
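As one illustration of the "ultrasonic image filtering" option, a median filter is a common choice for suppressing speckle noise in ultrasound images before registration. The kernel size and the choice of median filtering are assumptions for this example; the patent does not specify a particular filter.

```python
import numpy as np

def median_filter(img, k=3):
    """k x k median filter: a simple speckle-suppression step for an
    ultrasound image (illustrative; the kernel size k=3 is an assumption).
    Edge pixels are handled by replicating the border."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            # Median over the k x k neighborhood centered at (i, j)
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out
```

An isolated bright speckle pixel is replaced by the median of its neighborhood, while smooth regions are left unchanged.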
Further, the preset three-dimensional static bone image comprises a three-dimensional nuclear magnetic resonance image, a three-dimensional CT image, or a three-dimensional ultrasonic image.
The invention also provides an intraoperative real-time tracking system comprising an image acquisition unit and an image processing unit. The image acquisition unit is used to acquire an ultrasonic bone image of a bone at a designated site and a preset three-dimensional static bone image of that bone; the image processing unit is used to register and fuse the ultrasonic bone image with the preset three-dimensional static bone image to obtain the position information of the bone at the designated site.
The intraoperative real-time tracking system has the following beneficial effects: it improves the accuracy of position detection, is non-invasive and radiation-free, imposes few constraints on acquiring the ultrasonic bone images, and is low in cost.
Further, the image acquisition unit comprises an ultrasonic detection module including an ultrasonic probe and a fixing device; the ultrasonic probe is fixed at the designated site by the fixing device and is used to acquire the ultrasonic bone image of the designated site and transmit it to the image processing unit.
Further, the image processing unit comprises a tracking and positioning module and an image processing module. The tracking and positioning module comprises an optical sensor and a camera; the optical sensor is fixedly connected with the ultrasonic probe, and the camera is connected with the image processing module and is used to acquire the position information of the optical sensor and transmit it to the image processing module. The image processing module processes the information acquired by the image acquisition unit and the tracking and positioning module to obtain the position information of the bone at the designated site.
Further, at least one ultrasonic probe is included.
Drawings
FIG. 1 is a flow chart of the intra-operative real-time tracking method of the present invention;
FIG. 2 is a flow chart of registration fusion of an ultrasound bone image with a preset three-dimensional bone static image in accordance with some embodiments of the present invention;
FIG. 3 is a block diagram of an intraoperative real-time tracking system of the present invention;
FIG. 4 is a schematic structural diagram of an intraoperative real-time tracking system according to some embodiments of the invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention. Unless otherwise defined, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this invention belongs. As used herein, the word "comprising" and the like means that elements or items preceding the word are included in the element or item listed after the word and equivalents thereof without precluding other elements or items.
To address the problems in the prior art, an embodiment of the invention provides an intraoperative real-time tracking method. Referring to FIG. 1, the intraoperative real-time tracking method includes the following steps:
S1: acquiring an ultrasonic bone image of a bone at a designated site;
S2: acquiring a preset three-dimensional static bone image of the bone at the designated site;
S3: registering and fusing the ultrasonic bone image with the preset three-dimensional static bone image to obtain the position information of the bone at the designated site.
Registering and fusing the ultrasonic bone image with the preset three-dimensional static bone image yields the position information of the bone at the designated site, improving the accuracy of position detection; acquiring the ultrasonic bone image is subject to few constraints, convenient, and low in cost.
Referring to FIG. 2, registering and fusing the ultrasonic bone image with the preset three-dimensional static bone image to obtain the position information of the bone at the designated site includes:
S31: constructing a three-dimensional bone model from the preset three-dimensional static bone image;
S32: continuously adjusting the position of the three-dimensional bone model according to a preset rule;
S33: acquiring ultrasound scan position information, and then obtaining a cross-sectional image of the three-dimensional bone model according to the ultrasound scan position information;
S34: obtaining a cross-sectional image of the ultrasonic bone image;
S35: each time the position of the three-dimensional bone model is adjusted, calculating a similarity measure between the cross-sectional image of the three-dimensional bone model and the corresponding cross-sectional image of the ultrasonic bone image;
S36: calculating, in the order in which the similarity measures were obtained, the difference between the most recently obtained similarity measure and the previously obtained one, comparing the difference with a preset precision requirement value, and, when the difference is smaller than the preset precision requirement value, taking the position of the three-dimensional bone model corresponding to the most recently obtained similarity measure as the sought position of the bone at the designated site.
In some embodiments of the invention, a spatial positioning system is established: a point O is selected arbitrarily in space, and three mutually perpendicular coordinate axes Ox, Oy, and Oz are drawn through O, each taking O as origin and sharing the same unit length, forming a spatial rectangular coordinate system O-xyz.
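A candidate pose of the three-dimensional bone model in the O-xyz frame can be represented as a homogeneous 4x4 transform. The sketch below uses an illustrative convention (rotations about the fixed axes composed in Z-Y-X order, then a translation); neither the convention nor the function is part of the claims.

```python
import numpy as np

def pose_matrix(rx, ry, rz, tx, ty, tz):
    """Homogeneous transform for a candidate bone-model pose in O-xyz:
    rotations rx, ry, rz (radians) about Ox, Oy, Oz composed as Rz@Ry@Rx,
    followed by translation (tx, ty, tz). Convention is illustrative."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [tx, ty, tz]
    return T
```

Adjusting the model position (step S32) then amounts to updating the six pose parameters and re-applying the transform to the model's points.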
In a first embodiment of the invention, one ultrasonic bone image is acquired, and it is a two-dimensional image. The registration and fusion of the ultrasonic bone image with the preset three-dimensional static bone image proceeds as follows: a three-dimensional bone model with an initial position is constructed from the preset three-dimensional static bone image, and its position is continuously adjusted according to a preset rule. The ultrasound scan coordinate position is obtained as (45,56,20), and a cross-sectional image of the three-dimensional bone model is obtained from the scan position information by the ultrasound-scan spatial positioning method. Each time the position of the three-dimensional bone model is adjusted, the similarity measure between the cross-sectional image of the model and the corresponding ultrasonic bone image is calculated. In the order in which the similarity measures are obtained, the difference between the most recent similarity measure and the previous one is calculated and compared with a preset precision requirement value; when the difference is smaller than the preset precision requirement value, the position of the three-dimensional bone model corresponding to the most recent similarity measure is taken as the sought position of the bone at the designated site.
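Obtaining the model's cross-sectional image from a scan coordinate such as (45,56,20) can be sketched minimally. Purely for illustration, the scan plane is assumed to be axial and indexed by the z component of the scan coordinate over a voxelized model; `spacing` is a hypothetical voxel-size parameter, and a real system would slice along the tracked probe plane instead.

```python
import numpy as np

def model_cross_section(volume, scan_pos, spacing=1.0):
    """Cross-sectional image of the voxelized 3-D bone model at the
    ultrasound scan position. Assumes (illustratively) an axial plane:
    scan_pos = (x, y, z) -> the z-th slice, e.g. (45, 56, 20) -> slice 20.
    The index is clamped to the volume extent."""
    z = int(round(scan_pos[2] / spacing))
    z = max(0, min(volume.shape[2] - 1, z))
    return volume[:, :, z]
```

The returned two-dimensional slice is what gets compared against the two-dimensional ultrasonic bone image when the similarity measure is evaluated.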
This registration and fusion of the ultrasonic bone image with the preset three-dimensional static bone image has high accuracy, is insensitive to input errors, and has strong anti-interference capability; moreover, the registration precision can be set according to actual needs.
In a second embodiment of the invention, one ultrasonic bone image is acquired, and it is a three-dimensional image. The registration and fusion of the ultrasonic bone image with the preset three-dimensional static bone image proceeds as follows: a three-dimensional bone model with an initial position is constructed from the preset three-dimensional static bone image, and its position is continuously adjusted according to a preset rule. The ultrasound scan coordinate position is obtained as (45,56,20), and a sectional volume image of the three-dimensional bone model is obtained from the scan position information by the ultrasound-scan spatial positioning method. From the sectional volume image, m cross-sectional images are obtained, and m cross-sectional images of the ultrasonic bone image are obtained, where m is an integer greater than 1 and the m cross-sectional images of the sectional volume image correspond one-to-one to the m cross-sectional images of the ultrasonic bone image. Specifically, when m is 3, the first, second, and third section images of the sectional volume image of the three-dimensional bone model are obtained, as are the first, second, and third section images of the ultrasonic bone image.
Each time the position of the three-dimensional bone model is adjusted, the similarity measures between the first, second, and third section images of the sectional volume image of the model and the corresponding first, second, and third section images of the ultrasonic bone image are calculated. In the order in which the similarity measures are obtained, the difference between the most recent similarity measure and the previous one is calculated for each of the three section-image pairs, and each difference is compared with its own preset precision requirement value. When all three differences are smaller than their corresponding preset precision requirement values, the position of the three-dimensional bone model corresponding to the most recent similarity measures is taken as the sought position of the bone at the designated site. Because the three-dimensional ultrasonic bone image provides m two-dimensional images for registration with the preset three-dimensional static bone image, the registration accuracy is higher than with a single two-dimensional ultrasonic image; moreover, the registration precision can be set according to actual needs.
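Computing the similarity measure for each of the m corresponding section-image pairs can be sketched as follows; normalized cross-correlation is again an illustrative stand-in for the unspecified similarity measure.

```python
import numpy as np

def slice_similarities(model_slices, us_slices):
    """Per-pair similarity for m corresponding cross-section pairs
    (m = 3 in the example: first, second, and third section images).
    Normalized cross-correlation is an illustrative choice of measure."""
    scores = []
    for a, b in zip(model_slices, us_slices):
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        scores.append(float((a * b).sum() / denom) if denom > 0 else 0.0)
    return scores
```

The registration loop then tracks the change of each of the m scores separately, rather than a single scalar.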
In a third embodiment of the invention, a first-site ultrasonic bone image and a second-site ultrasonic bone image are acquired; both are two-dimensional images, the first-site image being an ultrasonic bone image of the proximal end of a long bone and the second-site image an ultrasonic bone image of the distal end of the long bone. The registration and fusion with the preset three-dimensional static bone image proceeds as follows: a three-dimensional bone model with an initial position is constructed from the preset three-dimensional static bone image, and its position is continuously adjusted according to a preset rule.
The ultrasound scan coordinate position of the first site is obtained as (45,56,20) and that of the second site as (47,57,53). From the scan position information and the ultrasound-scan spatial positioning method, the first-site and second-site cross-sectional images of the three-dimensional bone model are obtained, together with the corresponding first-site and second-site cross-sectional images of the ultrasonic bone images. Each time the position of the three-dimensional bone model is adjusted, the similarity measure between the first-site cross-sectional image of the model and the corresponding first-site ultrasonic image is calculated, as is that between the second-site cross-sectional image of the model and the corresponding second-site ultrasonic image. In the order in which the similarity measures are obtained, the difference between the most recent similarity measure and the previous one is calculated for each site, and each of the two differences is compared with its own preset precision requirement value. When both differences are smaller than their corresponding preset precision requirement values, the position of the three-dimensional bone model corresponding to the most recent similarity measures is taken as the sought position of the bone at the designated site.
Compared with registering ultrasonic bone images of a single site, this embodiment registers ultrasonic bone images of two different sites with the preset three-dimensional static bone image and is therefore more accurate. For long bones such as the femur and tibia, whose bone morphological feature points change little along the shaft, registering the two ends separately improves accuracy markedly over registering one end alone; moreover, the registration precision can be set according to actual needs.
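The multi-threshold stopping rule described above, where the change in every tracked similarity measure (here one per site) must fall below its own preset precision requirement value before the pose is accepted, can be sketched as:

```python
def converged(prev_scores, cur_scores, eps_values):
    """Multi-site/multi-slice stop criterion: registration terminates only
    when, for every tracked cross-section pair, the change in its similarity
    measure is below that pair's own precision requirement value."""
    return all(abs(c - p) < e
               for p, c, e in zip(prev_scores, cur_scores, eps_values))
```

With two sites this takes two previous scores, two current scores, and two thresholds; the same function covers the m-slice and 2m-slice cases of the other embodiments.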
In a fourth embodiment of the invention, a first-site ultrasonic bone image and a second-site ultrasonic bone image are acquired; both are three-dimensional images, the first-site image being an ultrasonic bone image of the proximal end of a long bone and the second-site image an ultrasonic bone image of the distal end of the long bone. The registration and fusion with the preset three-dimensional static bone image proceeds as follows: a three-dimensional bone model with an initial position is constructed from the preset three-dimensional static bone image, and its position is continuously adjusted according to a preset rule.
The coordinate position information of the ultrasonic scan of the first part is obtained as (45, 56, 20), and that of the second part as (47, 57, 53). According to this position information and an ultrasonic scanning spatial positioning method, a first-part sectional volume image and a second-part sectional volume image of the three-dimensional bone model are obtained, and m cross-sectional images are extracted from each; likewise, m cross-sectional images of the first-part three-dimensional ultrasonic bone image and m cross-sectional images of the second-part three-dimensional ultrasonic bone image are obtained, where m is an integer greater than 1. The 2m cross-sectional images of the three-dimensional bone model correspond one-to-one to the 2m cross-sectional images of the ultrasonic bone images. Specifically, when m is 3, the first, second, and third cross-sectional images of the first part and of the second part of the three-dimensional bone model are obtained, and the first, second, and third cross-sectional images of the first-part three-dimensional ultrasonic bone image and of the second-part three-dimensional ultrasonic bone image are obtained.
Each time the position of the three-dimensional bone model is adjusted, the similarity measures between the 6 cross-sectional images of the three-dimensional bone model and the corresponding 6 cross-sectional images of the ultrasonic bone images are calculated. According to the acquisition order of the similarity measures, the difference between the most recently obtained and the previously obtained similarity measure is calculated for each of the 6 cross-sectional images, and the differences are compared with the preset precision requirement values. When the 6 differences are each smaller than the corresponding 6 preset precision requirement values, the position of the three-dimensional bone model corresponding to the most recently obtained similarity measures is taken as the position of the bone of the designated part. Compared with registration of the ultrasonic bone image of a single part, this embodiment registers the ultrasonic bone images of 2 different parts with the preset three-dimensional bone static image and achieves higher accuracy. For long bones such as the femur and tibia, whose morphological feature points vary only slightly, registering the two ends separately improves accuracy markedly compared with registering one end alone; each three-dimensional ultrasonic bone image supplies m two-dimensional cross-sections, giving higher registration accuracy than a single two-dimensional ultrasonic image; and the registration accuracy can be set according to actual needs.
In a fifth embodiment of the present invention, the registration fusion of the ultrasonic bone image with the preset three-dimensional bone static image includes obtaining a first-part three-dimensional ultrasonic bone image and a second-part two-dimensional ultrasonic bone image, where the first-part three-dimensional ultrasonic bone image is of the proximal end of a long bone and the second-part two-dimensional ultrasonic bone image is of the distal end of the long bone. The registration fusion process comprises the following steps: constructing a three-dimensional bone model according to the preset three-dimensional bone static image, the model having an initial position, and continuously adjusting the position of the three-dimensional bone model according to a preset rule.
The coordinate position information of the ultrasonic scan of the first part is obtained as (45, 56, 20), and that of the second part as (47, 57, 53). According to this position information and an ultrasonic scanning spatial positioning method, m cross-sectional images of the first-part sectional volume image of the three-dimensional bone model and one second-part cross-sectional image of the three-dimensional bone model are obtained; correspondingly, m cross-sectional images of the first-part three-dimensional ultrasonic bone image and the single second-part two-dimensional ultrasonic bone image are obtained, where m is an integer greater than 1. The m cross-sectional images of the first-part sectional volume image of the three-dimensional bone model correspond one-to-one to the m cross-sectional images of the first-part three-dimensional ultrasonic bone image. Specifically, when m is 3, the first, second, and third cross-sectional images of the first part and the cross-sectional image of the second part of the three-dimensional bone model are obtained, together with the first, second, and third cross-sectional images of the first-part three-dimensional ultrasonic bone image and the second-part two-dimensional ultrasonic bone image.
Each time the position of the three-dimensional bone model is adjusted, the similarity measures between the 4 cross-sectional images of the three-dimensional bone model and the corresponding 4 ultrasonic cross-sectional images are calculated. According to the acquisition order of the similarity measures, the 4 differences between the most recently obtained and the previously obtained similarity measures of the 4 cross-sectional images are calculated and compared with the preset precision requirement values. When the 4 differences are each smaller than the corresponding preset precision requirement values, the position of the three-dimensional bone model corresponding to the most recently obtained similarity measures is taken as the position of the bone of the designated part. Compared with registration of the ultrasonic bone image of a single part, this embodiment registers the ultrasonic bone images of 2 different parts with the preset three-dimensional bone static image and achieves higher accuracy. For long bones such as the femur and tibia, whose morphological feature points vary only slightly, registering the two ends separately improves accuracy markedly compared with registering one end alone; the three-dimensional ultrasonic bone image supplies m two-dimensional cross-sections, giving higher registration accuracy than a single two-dimensional ultrasonic image; and the registration accuracy can be set according to actual needs.
In some embodiments of the present invention, the average value of the preset precision requirement values of the cross-sectional images of the three-dimensional bone model is the precision requirement value of the registration fusion between the ultrasonic bone image and the preset three-dimensional bone static image.
In some embodiments of the present invention, the method for calculating the similarity measure includes:
S3501: obtaining gradient differences obtained by calculation in the row direction between the cross-sectional image of the three-dimensional bone model and the cross-sectional image corresponding to the ultrasonic bone image, and obtaining first data;
s3502: obtaining the square of the first data to obtain second data;
s3503: obtaining gray variance of the first data to obtain third data;
s3504: the second data and the third data are summed to obtain fourth data;
S3505: dividing the third data and the fourth data to obtain fifth data;
s3506: obtaining the sum of the fifth data in the row direction and the column direction to obtain sixth data;
s3507: obtaining gradient differences calculated in the column direction between the cross-sectional image of the three-dimensional bone model and the cross-sectional image corresponding to the ultrasonic bone image to obtain seventh data;
s3508: obtaining the square of the seventh data to obtain eighth data;
s3509: obtaining gray variance of the seventh data to obtain ninth data;
s3510: the eighth data and the ninth data are summed to obtain tenth data;
S3511: dividing the ninth data and the tenth data to obtain eleventh data;
S3512: obtaining the sum of the eleventh data in the row direction and the column direction to obtain twelfth data;
s3513: the sixth data and the twelfth data are summed to obtain thirteenth data;
s3514: negative numbers of the thirteenth data are obtained as similarity measures between cross-sectional images of the three-dimensional bone model and cross-sectional images corresponding to the ultrasound bone images.
The method for calculating the similarity measure in some embodiments of the present invention is shown in formula (1), where the cross-sectional image of the three-dimensional bone model is the cross-sectional image of the model at the ultrasonic scanning position, F represents the similarity measure between the cross-sectional image of the three-dimensional bone model and the cross-sectional image of the ultrasonic bone image, u is the image size in the row direction, v is the image size in the column direction, AU is the gray-scale variance of I_dU(u, v), AV is the gray-scale variance of I_dV(u, v), I_dU(u, v) is the gradient difference in the row direction between the cross-sectional image of the three-dimensional bone model and the cross-sectional image of the ultrasonic bone image, and I_dV(u, v) is the corresponding gradient difference in the column direction; I_dU(u, v) is calculated as in formula (2) and I_dV(u, v) as in formula (3):

F = −[ Σ_u Σ_v AU / (I_dU(u, v)² + AU) + Σ_u Σ_v AV / (I_dV(u, v)² + AV) ]    (1)

I_dU(u, v) = [M(u+1, v) − M(u, v)] − [U(u+1, v) − U(u, v)]    (2)

I_dV(u, v) = [M(u, v+1) − M(u, v)] − [U(u, v+1) − U(u, v)]    (3)

where M(u, v) denotes the cross-sectional image of the three-dimensional bone model and U(u, v) the cross-sectional image of the ultrasonic bone image.
In some embodiments of the present invention, the predetermined rule includes a quasi-newton algorithm.
Specifically, in some embodiments, the preset rule comprises the following steps. Input the objective function F and let g(x) = ∇F(x), where x is the unknown position variable. Initialize the iteration position x(0) and a symmetric positive definite matrix B_0, where n denotes the iteration count with initial value 0. Compute g_n = g(x(n)); if ‖g_n‖ < ε, where ε is the algorithm precision requirement, output x = x(n) as the new conversion position; otherwise proceed to the following steps. Compute the descent direction p_n = −B_n⁻¹ g_n. Perform a one-dimensional search to obtain the optimal step length λ_n from F(x(n) + λ_n p_n) = min_{λ≥0} F(x(n) + λ p_n). Step forward: x(n+1) = x(n) + λ_n p_n. Compute g_{n+1} = g(x(n+1)); if ‖g_{n+1}‖ < ε, the calculation terminates and x = x(n+1) is output as the new conversion position. Otherwise, with δ_n = x(n+1) − x(n), γ_n = g_{n+1} − g_n, and T denoting transposition, update the iteration matrix

B_{n+1} = B_n + (γ_n γ_n^T) / (γ_n^T δ_n) − (B_n δ_n δ_n^T B_n) / (δ_n^T B_n δ_n),

set n = n + 1, and return to the step of computing the descent direction. The conversion position x that finally satisfies the termination condition of the quasi-Newton algorithm is the required optimal conversion position.
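A minimal sketch of the quasi-Newton (BFGS-type) iteration described above; the coarse candidate-step line search standing in for the one-dimensional search, and the curvature guard on the matrix update, are illustrative simplifications rather than the patent's exact procedure.

```python
import numpy as np

def quasi_newton(f, grad, x0, eps=1e-6, max_iter=100):
    """Quasi-Newton minimization with a BFGS update of the iteration matrix."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                    # initial symmetric positive definite B_0
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < eps:       # ||g_n|| < epsilon: terminate
            break
        p = -np.linalg.solve(B, g)        # descent direction p_n = -B_n^{-1} g_n
        # crude one-dimensional search over a few candidate step lengths
        lam, best = 1.0, f(x + p)
        for cand in (0.5, 0.25, 0.1, 0.01):
            val = f(x + cand * p)
            if val < best:
                lam, best = cand, val
        x_new = x + lam * p               # x(n+1) = x(n) + lambda_n p_n
        g_new = grad(x_new)
        delta = x_new - x                 # delta_n
        gamma = g_new - g                 # gamma_n
        if gamma @ delta > 1e-12:         # BFGS update of the iteration matrix
            B = (B + np.outer(gamma, gamma) / (gamma @ delta)
                   - (B @ np.outer(delta, delta) @ B) / (delta @ B @ delta))
        x, g = x_new, g_new
    return x
```

On a simple quadratic objective the first search direction is already the negative gradient, and the loop terminates once the gradient norm drops below ε.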
An optimal conversion position is thus obtained through this series of calculations; at this conversion position the cross-sectional images of the ultrasonic bone image of the bone of the designated part match the three-dimensional bone model, so that the real-time position of the bone of the designated part in each frame of the dynamic image is obtained and motion tracking of the bone of the designated part is realized.
In some embodiments of the invention, the obtaining positional information for the ultrasound scan includes obtaining using optical motion capture technology, sensors, electromagnetic fields, optical fibers, or mechanical connection devices.
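For the optical-tracking variant, one plausible way to turn a tracked sensor pose into the scan-position coordinates used above is a rigid transform; the calibration offset `probe_offset` (scan-plane origin in sensor coordinates) is an assumed quantity for illustration.

```python
import numpy as np

def scan_position(sensor_rotation, sensor_translation, probe_offset):
    """Map the tracked optical-sensor pose to the ultrasonic scan position.

    sensor_rotation:    3x3 rotation of the sensor in camera coordinates
    sensor_translation: sensor origin in camera coordinates
    probe_offset:       scan-plane origin in sensor coordinates (calibration)
    """
    return sensor_rotation @ probe_offset + sensor_translation
```

With an identity rotation and a zero calibration offset this reduces to the sensor translation itself, e.g. the (45, 56, 20) coordinates quoted in the embodiments.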
In some embodiments of the invention, cross-sectional images of the three-dimensional bone model at the ultrasound scan locations are obtained by a cubic spline interpolation algorithm.
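Sampling a model cross-section at the scan plane with cubic-spline interpolation might look like the sketch below, using SciPy's `map_coordinates` with `order=3`; the parameterization of the plane by an origin and two in-plane direction vectors is an assumption for the sketch.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def cross_section(volume, origin, u_dir, v_dir, shape):
    """Sample a planar cross-section from a 3-D volume via cubic splines.

    origin, u_dir, v_dir: plane origin and in-plane step vectors, in voxel
    coordinates; shape: (rows, cols) of the output cross-sectional image.
    """
    h, w = shape
    uu, vv = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # (3, h, w) array of voxel coordinates covering the scan plane
    pts = (origin[:, None, None]
           + uu[None] * u_dir[:, None, None]
           + vv[None] * v_dir[:, None, None])
    return map_coordinates(volume, pts, order=3, mode="nearest")
```

For an axis-aligned plane at integer coordinates the spline reproduces the voxel values exactly, which makes the routine easy to sanity-check.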
In some embodiments of the present invention, the step of obtaining the preset three-dimensional bone static image of the bone of the specified portion further includes a step of performing segmentation and dilation processing on the preset three-dimensional bone static image.
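A hedged sketch of segmentation followed by morphological dilation; the simple intensity threshold stands in for the unspecified segmentation step, and SciPy's `binary_dilation` performs the dilation.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def segment_and_dilate(volume, threshold, iterations=2):
    """Threshold-segment bone voxels, then dilate the binary mask.

    The intensity threshold is an illustrative stand-in for whatever
    segmentation the static CT/MR image actually receives.
    """
    mask = volume > threshold                         # crude bone segmentation
    return binary_dilation(mask, iterations=iterations)
```

Dilating the mask slightly enlarges the bone region, which is a common way to keep the full bone surface inside the registration region of interest.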
In some embodiments of the present invention, the step of acquiring an ultrasonic bone image of the bone of the specified region further comprises a step of preprocessing the ultrasonic bone image.
In some embodiments of the present invention, the method for preprocessing the ultrasonic bone image includes an ultrasonic image filtering method, a neural network deep learning method or a wideband minimum variance algorithm.
In some embodiments of the present invention, the method for preprocessing an ultrasonic bone image comprises the steps of:
transmitting ultrasonic waves by utilizing a single ultrasonic array element of an ultrasonic array, receiving ultrasonic echo data by utilizing all the single ultrasonic array elements, and obtaining a low-resolution ultrasonic echo image signal based on a delay superposition algorithm;
Decomposing the broadband, coherent low-resolution ultrasonic echo image signal into a plurality of narrowband, incoherent wavelet-domain echo signals;
Processing the plurality of narrowband, incoherent wavelet-domain echo signals respectively with a minimum variance algorithm, retaining the desired signal of each frequency band and suppressing the interference and noise signals therein;
The desired signal is converted from the wavelet domain to the time domain based on an inverse wavelet transform to obtain a high quality ultrasound image signal.
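The decompose–process-per-band–invert chain above can be illustrated with a Haar wavelet decomposition; note that the per-band soft thresholding below is only a stand-in for the minimum variance step, which operates on per-channel echo data not modeled here, so this is a structural sketch, not the patented beamformer.

```python
import numpy as np

def haar_decompose(sig):
    """One level of the Haar analysis filter bank (even-length input)."""
    a = (sig[0::2] + sig[1::2]) / np.sqrt(2)   # approximation (low band)
    d = (sig[0::2] - sig[1::2]) / np.sqrt(2)   # detail (high band)
    return a, d

def haar_reconstruct(a, d):
    """Inverse of haar_decompose."""
    out = np.empty(a.size * 2)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def wavelet_band_suppress(echo, levels=3, k=1.0):
    """Split echo into narrow bands, attenuate noisy detail bands, invert."""
    approx, details = np.asarray(echo, dtype=float), []
    for _ in range(levels):
        approx, d = haar_decompose(approx)
        thr = k * np.median(np.abs(d)) / 0.6745           # robust noise estimate
        details.append(np.sign(d) * np.maximum(np.abs(d) - thr, 0.0))
    for d in reversed(details):                           # inverse transform
        approx = haar_reconstruct(approx, d)
    return approx
```

With the suppression disabled (k = 0) the chain is a perfect-reconstruction filter bank, which gives a direct correctness check on the transform pair.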
In some embodiments of the present invention, the preset three-dimensional bone static image includes a three-dimensional nuclear magnetic resonance image, a three-dimensional CT image, or a three-dimensional ultrasound image.
Fig. 3 is a block diagram of the intraoperative real-time tracking system of the present invention. Referring to Fig. 3, the present invention further provides an intraoperative real-time tracking system 1, which includes an image acquisition unit 11 and an image processing unit 12. The image acquisition unit 11 is configured to acquire an ultrasonic bone image of a bone of a designated part and a preset three-dimensional bone static image of the bone of the designated part, and the image processing unit 12 is configured to register and fuse the ultrasonic bone image with the preset three-dimensional bone static image to obtain position information of the bone of the designated part.
In some embodiments of the present invention, referring to fig. 3 and 4, the image acquisition unit 11 includes an ultrasound detection module 111, the ultrasound detection module 111 includes an ultrasound probe 1111 and a fixing device 1112, and the ultrasound probe 1111 is fixed to a designated site by the fixing device 1112, and is used for acquiring an ultrasound image of the designated site and transmitting the acquired ultrasound image to the image processing unit 12.
In some embodiments of the present invention, referring to fig. 3 and 4, the image processing unit 12 includes a tracking and positioning module 121 and an image processing module 122, the tracking and positioning module 121 includes an optical sensor 1211 and a camera 1212, the optical sensor 1211 is fixedly connected to the ultrasound probe 1111, the camera 1212 is connected to the image processing module 122, and is configured to acquire position information of the optical sensor 1211 and transmit the position information to the image processing module 122, and the image processing module 122 is configured to process information acquired by the image acquiring unit 11 and the tracking and positioning module 121, so as to obtain position information of the bone of the designated part.
Fig. 4 is a schematic structural diagram of an intraoperative real-time tracking system according to some embodiments of the present invention; referring to Fig. 4, more than one ultrasound probe 1111 is included.
In other embodiments of the present invention, one ultrasound probe 1111 is included.
While embodiments of the present invention have been described in detail hereinabove, it will be apparent to those skilled in the art that various modifications and variations can be made to these embodiments. It is to be understood that such modifications and variations are within the scope and spirit of the present invention as set forth in the following claims. Moreover, the invention described herein is capable of other embodiments and of being practiced or of being carried out in various ways.
Claims (13)
1. An intraoperative real-time tracking method is characterized by comprising the following steps:
S1: acquiring an ultrasonic bone image of a bone of a designated part;
S2: acquiring a preset three-dimensional skeleton static image of the skeleton of the appointed position;
S3: registering and fusing the ultrasonic bone image and the preset three-dimensional bone image to obtain the position information of the bone of the appointed position;
The step S3 specifically includes:
S31: constructing a three-dimensional skeleton model according to the preset three-dimensional skeleton static image;
S32: continuously adjusting the position of the three-dimensional skeleton model according to a preset rule;
S33: acquiring ultrasonic scanning position information, and then acquiring a cross-sectional image of the three-dimensional skeleton model according to the ultrasonic scanning position information;
S34: obtaining a cross-sectional image of the ultrasound bone image;
S35: calculating a similarity measure between the cross-sectional image of the three-dimensional bone model and the corresponding cross-sectional image of the ultrasonic bone image every time the position of the three-dimensional bone model is adjusted;
S36: calculating a difference value between the most recently obtained similarity measure and the previously obtained similarity measure according to the acquisition order of the similarity measures, comparing the difference value with a preset precision requirement value, and, when the difference value is smaller than the preset precision requirement value, taking the position of the three-dimensional skeleton model corresponding to the most recently obtained similarity measure as the position of the skeleton of the appointed part.
2. The intraoperative real-time tracking method of claim 1, wherein the ultrasound bone image comprises at least one image.
3. The intraoperative real-time tracking method of claim 1, wherein the ultrasound bone image comprises at least one of a two-dimensional ultrasound image, a three-dimensional ultrasound image.
4. The intraoperative real-time tracking method according to any one of claims 1, 2, or 3, wherein the preset rule comprises a quasi-Newton algorithm.
5. The intraoperative real-time tracking method of claim 1 wherein obtaining positional information for an ultrasound scan comprises using optical motion capture technology, sensors, electromagnetic fields, optical fibers, or mechanical connection devices.
6. The method according to claim 1, wherein the step S2 further comprises the step of segmenting and dilating the preset three-dimensional static bone image.
7. The intraoperative real-time tracking method according to claim 1, wherein the step S1 further comprises a method of preprocessing the ultrasonic bone image.
8. The intraoperative real-time tracking method of claim 7, wherein the method of preprocessing the ultrasonic bone image comprises an ultrasonic image filtering method, a neural network deep learning method or a broadband minimum variance algorithm.
9. The intraoperative real-time tracking method of claim 1, wherein the preset three-dimensional bone static image comprises a three-dimensional nuclear magnetic resonance image, a three-dimensional CT image, or a three-dimensional ultrasound image.
10. The intraoperative real-time tracking system is characterized by comprising an image acquisition unit and an image processing unit, wherein the image acquisition unit is used for acquiring an ultrasonic bone image of a bone at a specified position and a preset three-dimensional bone static image of the bone at the specified position, and the image processing unit is used for constructing a three-dimensional bone model according to the preset three-dimensional bone static image; continuously adjusting the position of the three-dimensional bone model according to a preset rule; acquiring ultrasonic scanning position information, and then acquiring a cross-sectional image of the three-dimensional bone model according to the ultrasonic scanning position information; obtaining a cross-sectional image of the ultrasonic bone image; calculating a similarity measure between the cross-sectional image of the three-dimensional bone model and the corresponding cross-sectional image of the ultrasonic bone image every time the position of the three-dimensional bone model is adjusted; and calculating a difference value between the most recently obtained similarity measure and the previously obtained similarity measure according to the acquisition order of the similarity measures, comparing the difference value with a preset precision requirement value, and, when the difference value is smaller than the preset precision requirement value, taking the position of the three-dimensional bone model corresponding to the most recently obtained similarity measure as the position of the bone of the specified position.
11. The intraoperative real-time tracking system of claim 10, wherein the image acquisition unit comprises an ultrasound detection module comprising an ultrasound probe and a fixture, the ultrasound probe being secured to a designated site by the fixture for acquiring an ultrasound bone image of the designated site and transmitting the acquired ultrasound bone image to the image processing unit.
12. The intraoperative real-time tracking system according to claim 11, wherein the image processing unit comprises a tracking and positioning module and an image processing module, the tracking and positioning module comprises an optical sensor and a camera, the optical sensor is fixedly connected with the ultrasonic probe, the camera is connected with the image processing module and is used for acquiring the position information of the optical sensor and transmitting the position information to the image processing module, and the image processing module is used for processing the information acquired by the image acquisition unit and the tracking and positioning module so as to acquire the position information of the bone of the designated part.
13. An intraoperative real-time tracking system according to claim 12, wherein at least one ultrasound probe is included.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210240628.3A CN114587590B (en) | 2022-03-10 | 2022-03-10 | Intraoperative real-time tracking method and intraoperative real-time tracking system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114587590A CN114587590A (en) | 2022-06-07 |
CN114587590B true CN114587590B (en) | 2024-10-18 |
Family
ID=81808664
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210240628.3A Active CN114587590B (en) | 2022-03-10 | 2022-03-10 | Intraoperative real-time tracking method and intraoperative real-time tracking system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114587590B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117717406B (en) * | 2024-01-08 | 2024-10-25 | 中国人民解放军空军军医大学 | Accurate positioning's wound orthopedics is with fixed pincers that reset |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108420529A (en) * | 2018-03-26 | 2018-08-21 | 上海交通大学 | The surgical navigational emulation mode guided based on image in magnetic tracking and art |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8771188B2 (en) * | 2007-06-20 | 2014-07-08 | Perception Raisonnement Action En Medecine | Ultrasonic bone motion tracking system |
EP2203892B1 (en) * | 2007-10-26 | 2017-04-12 | Koninklijke Philips N.V. | Closed loop registration control for multi-modality soft tissue imaging |
CN102512246B (en) * | 2011-12-22 | 2014-03-26 | 中国科学院深圳先进技术研究院 | Surgery guiding system and method |
WO2016044830A1 (en) * | 2014-09-19 | 2016-03-24 | Think Surgical, Inc. | System and process for ultrasonic determination of long bone orientation |
US10646201B2 (en) * | 2014-11-18 | 2020-05-12 | C. R. Bard, Inc. | Ultrasound imaging system having automatic image presentation |
CN106373108A (en) * | 2016-08-29 | 2017-02-01 | 王磊 | Method and device for fusing real-time ultrasonic image and preoperative magnetic resonance image |
CN109745074B (en) * | 2019-01-21 | 2024-04-26 | 上海益超医疗器械有限公司 | Three-dimensional ultrasonic imaging system and method |
CN110025379B (en) * | 2019-05-07 | 2024-08-20 | 新博医疗技术有限公司 | Ultrasonic image and CT image fusion real-time navigation system and method |
CN110537961B (en) * | 2019-08-01 | 2021-09-28 | 中国人民解放军总医院 | Minimally invasive intervention guiding system and method for CT and ultrasonic image fusion |
US11364082B2 (en) * | 2019-10-17 | 2022-06-21 | Chang Bing Show Chwan Memorial Hospital | Fusion-imaging method for radio frequency ablation |
CN111938700B (en) * | 2020-08-21 | 2021-11-09 | 电子科技大学 | Ultrasonic probe guiding system and method based on real-time matching of human anatomy structure |
CN112245004A (en) * | 2020-10-20 | 2021-01-22 | 哈尔滨医科大学 | An ablation planning verification method based on preoperative model and intraoperative ultrasound images |
CN112381750A (en) * | 2020-12-15 | 2021-02-19 | 山东威高医疗科技有限公司 | Multi-mode registration fusion method for ultrasonic image and CT/MRI image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||