
CN109091228B - Multi-instrument optical positioning method and system - Google Patents

Multi-instrument optical positioning method and system

Info

Publication number
CN109091228B
CN109091228B (application CN201810724839.8A)
Authority
CN
China
Prior art keywords
instrument
marker
instruments
frame
markers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810724839.8A
Other languages
Chinese (zh)
Other versions
CN109091228A (en
Inventor
张楠
武博
王宇
张梦诗
叶灿
贾博奇
梁楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Capital Medical University
Original Assignee
Capital Medical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Capital Medical University filed Critical Capital Medical University
Priority to CN201810724839.8A priority Critical patent/CN109091228B/en
Publication of CN109091228A publication Critical patent/CN109091228A/en
Application granted granted Critical
Publication of CN109091228B publication Critical patent/CN109091228B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2055: Optical tracking systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a multi-instrument optical positioning method and system, comprising: a multi-instrument discrimination step, which judges whether an instrument has entered or exited from the increase or decrease of markers between consecutive frames; a multi-instrument identification step, which identifies each instrument based on the geometric shape formed by the markers on the instrument and, when a new instrument enters, identifies that new instrument; and a multi-instrument tracking step, which tracks the motion of the multiple instruments with a motion vector tracking algorithm. The invention can track multiple instruments in real time, reduces frame loss and cases in which individual instruments cannot be tracked, and thereby meets the real-time requirement of multi-instrument tracking. It proposes identifying instruments by side-length ratio and/or perimeter, which relaxes the restrictions on instrument shape, so that even instruments of similar shape can still be distinguished. The invention can detect the addition or removal of instruments, and still correctly locates the positions and directions of all instruments when their number changes.

Description

Multi-instrument optical positioning method and system
Technical Field
The invention relates to the technical field of medical operation navigation, in particular to a multi-instrument optical positioning method and system.
Background
At present, surgical navigation systems are widely applied in neurosurgery, orthopaedics, otorhinolaryngology and other fields; they greatly shorten operation time, relieve patient suffering, and improve surgical precision. Among them, optical positioning systems are widely used because of advantages such as high positioning accuracy and high flexibility. An optical positioning system tracks the actively luminous near-infrared LED lamps or passive reflective spheres on an instrument with a camera, so that the relative position between the instrument tip and the lesion is displayed in real time. In actual clinical procedures, most current systems are required to identify and track two or more instruments simultaneously.
Currently, the most advanced commercial technology for optical instrument positioning comes from Northern Digital Inc. (NDI) of Canada: the Hybrid Polaris Spectra and Passive Polaris Spectra can track up to 15 wireless instruments (up to 6 of them active wireless), the Polaris Vicra can track up to 6 instruments (up to 1 active wireless), and the Polaris Vega can track up to 25 instruments (up to 6 active wireless).
Unlike passive and active wireless instruments, NDI tracks active wired instruments with a controller that illuminates the markers individually; this approach is expensive and difficult to popularize in optical positioning systems. In patent publication CN1199054C, NDI proposes a method of tracking multiple objects that requires each object to have a unique segment length among its marker pairs.
In patent publication CN101750607A of Tsinghua University, the distances between marker points on the same instrument differ, and the distances between marker points on different instruments also differ significantly. However, this constraint is restrictive: it rules out instruments of similar shape, since a marker pair on one instrument may then have the same spacing as a pair on another instrument.
Some prior art proposes Kalman filtering to track multiple objects, which can mitigate the delay problem to some extent, but when an object moves too fast the target cannot be tracked well.
Disclosure of Invention
The purpose of the invention is realized by the following technical scheme.
The invention achieves real-time positioning of multiple instruments by exploiting the distinct geometric shapes formed by the optical markers on each instrument together with a motion vector tracking method. It proposes identifying differently shaped instruments by their side-length ratios and, when instrument shapes are similar, distinguishing them by their actual perimeters; tracking multiple moving instruments with a motion vector tracking method; and monitoring changes in the number of instruments from the increase or decrease of markers between consecutive frames.
Specifically, according to one aspect of the present invention, there is provided a multi-instrument optical positioning method, comprising:
a multi-instrument judging step, namely judging whether an instrument has entered or exited from the increase or decrease of markers between consecutive frames;
a multi-instrument identification step, wherein each instrument is identified based on the geometric shape of a marker on the instrument, and when a new instrument enters, the new instrument is identified;
a multi-instrument tracking step of tracking the motion of a plurality of said instruments using a motion vector tracking algorithm.
Preferably, the markers are actively and/or passively luminous.
Preferably, the geometric shape comprises a side length ratio and/or a perimeter between the markers.
Preferably, the multi-instrument recognition step specifically includes: firstly, identifying instruments in different shapes by using the side length ratio between the markers, and if the instruments are similar in shape, distinguishing the different instruments by using the circumferences between the markers.
Preferably, the multi-instrument recognition step specifically includes:
acquiring left and right views of one or more instruments newly entering the system, identifying each instrument by the distinct side-length ratios and/or perimeters of its marker pairs to obtain the center pixel coordinates of each instrument's markers in the left and right views, and calculating the spatial position and direction of each instrument's tip point from the identified pixel coordinates.
preferably, the central pixel coordinate of the marker is determined based on a region growing method and a gray scale centroid method.
Preferably, the multi-instrument tracking step specifically includes:
in frame t (t > 1), the center pixel coordinates of all markers are calculated first; a motion vector tracking method then searches frame t for the marker pixel coordinates with the smallest motion change relative to each marker of the instruments identified in frame t-1, from which the spatial position and direction of each instrument's tip point in frame t are calculated.
According to another aspect of the present invention, there is also provided a multi-instrument optical positioning system, comprising:
a multi-instrument judging module, which judges whether an instrument has entered or exited from the increase or decrease of markers between consecutive frames;
the multi-instrument recognition module is used for recognizing each instrument based on the geometric shape of the marker on the instrument, and recognizing a new instrument when the new instrument enters;
a multi-instrument tracking module that tracks the motion of a plurality of said instruments using a motion vector tracking algorithm.
The invention has the following advantages. By tracking instruments with a motion vector tracking method, it can track multiple instruments in real time, reduces frame loss and cases where an individual instrument is not tracked, and thus meets the real-time requirement of multi-instrument tracking. It proposes identifying instruments by side-length ratios and/or perimeters, which relaxes the restrictions on instrument shape, so that even similarly shaped instruments can still be distinguished. By using the increase or decrease in the number of markers between consecutive frames, the invention can judge the addition or removal of instruments, and it still correctly locates the positions and directions of all instruments when their number changes.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 illustrates a hardware infrastructure architecture diagram according to an embodiment of the invention;
FIG. 2 illustrates a schematic block diagram of a multi-instrument tracking optical localization method according to an embodiment of the present invention;
FIG. 3 shows a schematic representation of the construction of two differently sized instruments used in the present invention;
FIG. 4 illustrates the multi-instrument recognition algorithm process of the present invention;
FIG. 5 is a flow chart of a method for real-time tracking of an instrument using a time-series motion vector tracking method according to the present invention;
FIG. 6 is a schematic diagram illustrating the operation of a multi-instrument optical positioning system of the present invention;
FIG. 7 is a schematic diagram showing the effect of the MSI and TSI tracking the track simultaneously.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
FIG. 1 illustrates a hardware infrastructure architecture diagram according to an embodiment of the invention. The invention provides a multi-instrument optical positioning system comprising several instruments 1 (four are shown in fig. 1, though fewer or more are possible), a binocular camera 3, a host 4 and a display 5. Each instrument 1 carries three markers 2, which may be actively or passively luminous, e.g. actively luminous near-infrared LED lamps or passive reflective spheres. Using the binocular vision principle, the binocular camera 3 captures images of the instruments 1 and tracks the near-infrared LEDs or reflective spheres on them; the host 4 receives and processes the data, which the display 5 shows, so that the relative position between each instrument tip and the lesion is displayed in real time.
As shown in FIG. 2, the technical scheme of the invention has three parts: judging changes in the number of instruments from the increase or decrease of markers between consecutive frames, identifying the multiple instruments from their different shapes and the actual perimeters of their markers, and tracking the motion of the multiple instruments with a motion vector tracking algorithm. First the system is initialized: the marker coordinate tables of all instruments are cleared and the marker count is set to 0. When an increase in the number of markers is detected in frame 1, a new instrument has entered; the instruments in frame 1 are identified by the distinct side-length ratios and/or perimeters of their marker pairs, the center pixel coordinates of each instrument's markers in the left and right images are obtained, and the spatial position and direction of each instrument's tip point are calculated from the identified pixel coordinates. In frame t (t > 1), the center pixel coordinates of all markers are computed first; a motion vector tracking method then searches frame t for the marker pixel coordinates with the smallest motion change relative to each marker of the instruments identified in frame t-1, from which the spatial position and direction of each instrument tip point in frame t are calculated. In FIG. 2, the points connected by solid lines in frame t are the markers obtained in frame t, the points connected by solid lines in frame t-1 are the markers obtained in frame t-1, and the arrows between them are the motion vectors.
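The per-frame control flow just described (initialize, then identify when the marker count changes, otherwise track) can be sketched as a simple dispatch; the function and callback names below are hypothetical, not from the patent.

```python
def process_frame(t, marker_count, prev_count, identify, track):
    """One iteration of the FIG. 2 pipeline (sketch; identify/track are callbacks).

    Returns the action taken for this frame.
    """
    if t == 1 or marker_count != prev_count:
        identify()          # count changed: (re)identify instruments by shape
        return "identify"
    track()                 # count unchanged: motion-vector tracking
    return "track"

log = []
actions = [
    process_frame(1, 3, 0, lambda: log.append("id"), lambda: log.append("tr")),
    process_frame(2, 3, 3, lambda: log.append("id"), lambda: log.append("tr")),
    process_frame(3, 6, 3, lambda: log.append("id"), lambda: log.append("tr")),
]
print(actions)  # ['identify', 'track', 'identify']
```

Frame 1 and any frame where the marker count changes trigger re-identification; all other frames fall through to the cheaper motion-vector tracker.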
The method principle and the process for implementing the present invention are described in detail below.
Example 1
Step S1: judging increase or decrease of number of instruments based on increase or decrease of number of markers
In practical applications, a new instrument is often added, or an instrument removed, while tracking is in progress. To keep such changes in the number of instruments from affecting the tracking of the remaining instruments, the invention judges whether an instrument has entered or exited by detecting the increase or decrease of markers between consecutive frames. First the system is initialized: the marker coordinate tables of all instruments are cleared and the marker count is set to 0.
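A minimal sketch of this enter/exit decision, assuming every instrument carries exactly three markers as in the embodiment; the function name is hypothetical.

```python
def instrument_change(prev_count: int, curr_count: int,
                      markers_per_instrument: int = 3) -> int:
    """Net change in instrument count between consecutive frames.

    Positive: instruments entered; negative: exited; zero: no change.
    Assumes every instrument carries `markers_per_instrument` markers.
    """
    delta = curr_count - prev_count
    if delta % markers_per_instrument != 0:
        raise ValueError("marker count not a multiple: partial instrument visible")
    return delta // markers_per_instrument

print(instrument_change(3, 6))   # one instrument entered
print(instrument_change(6, 3))   # one instrument exited
print(instrument_change(6, 6))   # no change
```

The guard on partial multiples is an added safeguard for occlusion (an instrument with only some markers visible), a case the patent does not elaborate on.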
Step S2: marker geometry based instrument recognition
To facilitate tracking and identification, markers can be placed at different positions on each instrument to form different geometric shapes, so that multiple instruments can be distinguished. Under the binocular vision principle, the distances between markers are fixed once an instrument is calibrated; however, when the binocular camera captures the infrared markers of several instruments through the filters, the poses of the instruments differ, so the instruments cannot be distinguished simply by tracking inter-marker distances in real time. The invention therefore has each instrument squarely face the camera once at the start and, based on the fixed geometry of its markers, distinguishes different instruments by side-length ratio and/or perimeter.
Taking the tracking of two instruments as an example, as shown in FIG. 3, the small instrument on the left is denoted MSI and the large instrument on the right is denoted TSI. Each instrument carries markers that are actively luminous near-infrared light-emitting diodes. Line segments are formed between the luminous markers, and it can be seen that the geometric structures of the two instruments differ: the three markers of the small instrument approximately form a right triangle, and the three markers of the large instrument approximately form an isosceles triangle.
Suppose there are $L$ instruments, each carrying at least three markers $\{A, B, C\}$, denoted $\{A_1, B_1, C_1\}, \dots, \{A_l, B_l, C_l\}, \dots, \{A_L, B_L, C_L\}$, where $l = 1, 2, \dots, L$ indexes the instruments involved in optical localization and $L$ is their total number.
Denote the Euclidean distances between $A_l$ and $B_l$ and between $A_l$ and $C_l$ as $d_{A_lB_l}$ and $d_{A_lC_l}$, respectively. The ratio $\delta_l = d_{A_lB_l}/d_{A_lC_l}$ differs from instrument to instrument, which enables the registered identification of multiple instruments. The distances are computed as

$$d_{A_lB_l}=\sqrt{(x_{A_l}-x_{B_l})^2+(y_{A_l}-y_{B_l})^2},\qquad d_{A_lC_l}=\sqrt{(x_{A_l}-x_{C_l})^2+(y_{A_l}-y_{C_l})^2}\tag{1}$$

where $(x_{A_l},y_{A_l})$, $(x_{B_l},y_{B_l})$ and $(x_{C_l},y_{C_l})$ are the center pixel coordinates of the three markers $A_l$, $B_l$, $C_l$ on the $l$-th instrument. The center pixel coordinates of the markers are computed with a region growing method and a gray-scale centroid method.
FIG. 4 shows the multi-instrument recognition algorithm of the invention. First, the instruments are placed squarely facing the camera, the left and right views of frame 1 are acquired, and the center pixel coordinates of the markers in each view are computed. The leftmost point $A_l$ ($l\in[1,L]$) is found in the left view, and the Euclidean distance from $A_l$ to each remaining point (excluding $A_l$ itself) is computed as in formula (2):

$$d_k=\sqrt{(x_{A_l}-x_{P_k})^2+(y_{A_l}-y_{P_k})^2}\tag{2}$$

where $P_k$ ranges over the other detected marker centers. The pairwise ratios of these distances are then computed and denoted $\eta$. When $\eta\in[\delta_l-\varepsilon,\delta_l+\varepsilon]$, the three markers involved belong to the $l$-th instrument, where $[\delta_l-\varepsilon,\delta_l+\varepsilon]$ is the ratio range set for the $l$-th instrument, $\delta_l$ is the ratio measured when the $l$-th instrument squarely faces the camera, and $\varepsilon$ is an empirical tolerance that absorbs the deviation caused by manually placed instruments not facing the camera exactly. The markers in the right view are processed in the same way, which completes the registration or pairing of each instrument; the position of each instrument's tip point is then calculated using the stereo matching principle.
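A minimal sketch of this leftmost-point ratio test, assuming one instrument's three markers are visible in the view and taking the ratio as larger-to-smaller distance (the text does not fix the order); the perimeter fallback for similarly shaped instruments is omitted. The MSI/TSI ranges reuse the values reported later in Table 1, and the function name is hypothetical.

```python
import math

def identify_instrument(points, delta_ranges):
    """Match three markers to an instrument by the side-length ratio test.

    points: list of (x, y) marker centers in one view (one instrument, facing camera).
    delta_ranges: {instrument_id: (lo, hi)} calibrated ranges [delta-eps, delta+eps].
    Returns the matching instrument id, or None.
    """
    a = min(points)  # leftmost point A_l (smallest x, ties broken by y)
    dists = sorted(math.dist(a, p) for p in points if p != a)
    eta = dists[1] / dists[0]  # ratio of the two distances from A_l
    for inst, (lo, hi) in delta_ranges.items():
        if lo <= eta <= hi:
            return inst
    return None

# Right-triangle layout: A=(0,0), B=(4,0), C=(0,3) -> eta = 4/3 ~ 1.333,
# inside the MSI range [1.332, 1.432] reported in the patent's Table 1.
ranges = {"MSI": (1.332, 1.432), "TSI": (1.212, 1.312)}
print(identify_instrument([(0, 0), (4, 0), (0, 3)], ranges))  # MSI
```

With more instruments in view, the real algorithm repeats this test over all distance-pair ratios from the leftmost point, removing identified markers as it goes.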
Step S3: motion vector tracking
In actual procedures, the pose of an instrument often has to be adjusted in real time according to the tracked target, and it is difficult to keep the instrument squarely facing the camera. When the instrument rotates, the shape of the markers captured by the camera changes, and when the rotation angle is too large the ratios fall outside the set side-length-ratio range. To achieve more accurate positioning, the invention therefore tracks the instruments in real time with a motion vector tracking method over the time series, as shown in fig. 5.
To calculate the position of each instrument tip in the current frame $t$ ($t\ge 2$), the marker positions in the left and right views of frame $t$ are tracked first: the center pixel coordinates of the markers in the left and right views of frames $t-1$ and $t$ are computed. Since the relative motion between adjacent frames is small, i.e. the motion vectors between adjacent frames are small, the magnitude of the motion vector of a marker center point across frames is defined as

$$I_{V,W}(t)=\left\|\vec{p}_W(t)-\vec{p}_V(t-1)\right\|\tag{3}$$

where $I_{V,W}(t)$ is the magnitude of the motion vector from point $V$ in frame $t-1$ to point $W$ in frame $t$, and $\vec{p}_V(t-1)$ and $\vec{p}_W(t)$ are the position vectors of $V$ in frame $t-1$ and of $W$ in frame $t$, respectively. Then, using the recognition result of each instrument from step S2, let $A_l, B_l, C_l$ ($1\le l\le L$) be the three marker points of the registered $l$-th instrument identified in frame $t-1$. Starting from the center pixel coordinates of each instrument's markers in the left view of frame $t-1$, the markers in the left view of frame $t$ that minimize the motion vector magnitude are assigned to the $l$-th instrument:

$$A_l(t)=\arg\min_W I_{A_l,W}(t),\quad B_l(t)=\arg\min_W I_{B_l,W}(t),\quad C_l(t)=\arg\min_W I_{C_l,W}(t)\tag{4}$$

where $W$ ranges over the marker center points detected in the left view of frame $t$; the marker achieving the minimum motion vector magnitude with respect to $A_l$, $B_l$ or $C_l$ of frame $t-1$ is taken to belong to the $l$-th instrument in frame $t$. The right view is tracked with the same algorithm.
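A minimal sketch of the per-marker minimum-motion-vector assignment of formulas (3) and (4). It assigns each previous marker independently and greedily, without enforcing mutual exclusion between markers; that simplification, and all names and coordinates, are assumptions for illustration.

```python
import math

def track_markers(prev_markers, curr_points):
    """Assign each previously identified marker to the current-frame point
    with the smallest motion-vector magnitude.

    prev_markers: {name: (x, y)} marker centers identified in frame t-1.
    curr_points: list of (x, y) centers detected in frame t.
    Returns {name: (x, y)} for frame t.
    """
    assigned = {}
    for name, p in prev_markers.items():
        # I_{V,W}(t) = ||p_W(t) - p_V(t-1)||; take the argmin over W.
        assigned[name] = min(curr_points, key=lambda q: math.dist(p, q))
    return assigned

prev = {"A": (10.0, 10.0), "B": (50.0, 10.0), "C": (10.0, 40.0)}
# Frame t: all markers shifted by roughly 2 px.
curr = [(12.1, 10.2), (52.0, 9.8), (11.9, 40.1)]
print(track_markers(prev, curr))
```

Because adjacent frames move little, the nearest candidate is almost always the same physical marker, which is what makes this cheaper than re-running the shape-based identification every frame.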
Thus, based on motion vectors, the method registers the multi-instrument markers across consecutive frames and between the left and right views, computes the tip coordinates of each instrument with the binocular vision principle, and so tracks each instrument in real time.
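The text invokes the binocular vision principle without giving formulas; below is a minimal depth-from-disparity sketch for a rectified pinhole stereo pair. The focal length, baseline and principal point are illustrative assumptions, not the parameters of the camera used in the experiments.

```python
def triangulate(xl, xr, y, f=800.0, baseline=120.0, cx=320.0, cy=240.0):
    """Depth from disparity for a rectified stereo pair (all intrinsics assumed).

    xl, xr: marker center x-coordinates (pixels) in the left/right images.
    f: focal length in pixels; baseline: camera separation in mm.
    Returns (X, Y, Z) in the left-camera frame, in millimeters.
    """
    disparity = xl - xr          # rectified pair: same row, shifted column
    Z = f * baseline / disparity
    X = (xl - cx) * Z / f
    Y = (y - cy) * Z / f
    return X, Y, Z

X, Y, Z = triangulate(xl=400.0, xr=280.0, y=240.0)
print(round(X, 1), round(Y, 1), round(Z, 1))  # 80.0 0.0 800.0
```

Applying this to each matched marker pair gives the 3-D marker positions from which the tip point and instrument direction are derived.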
According to another aspect of the present invention, there is also provided a multi-instrument tracking optical localization system, comprising: a multi-instrument judging module, which judges whether an instrument has entered or exited from the increase or decrease of markers between consecutive frames; a multi-instrument recognition module, which identifies each instrument based on the geometric shape of its markers and identifies a new instrument when one enters; and a multi-instrument tracking module, which tracks the motion of the multiple instruments with a motion vector tracking algorithm. Fig. 6 illustrates the operation of this system. First the system is initialized: the marker coordinate tables of all instruments are cleared and the marker count is set to 0. The camera captures left and right images of all markers to be tracked, and the center pixel coordinates of each marker are calculated. The multi-instrument judging module then decides whether an instrument has entered or exited; if so, the multi-instrument recognition module identifies the new instrument; if not, each instrument is tracked by motion vectors to obtain the center pixel coordinates of its markers in the left and right images. Finally, stereo matching and three-dimensional reconstruction are performed for each instrument, and the spatial position coordinates and direction of each instrument's tip point are calculated.
In order to verify the technical effect of the invention, a large number of experiments were also performed, and the following are experimental results.
Results of the experiment
1 marker matching assay
In the experiment, multi-instrument multi-marker images were acquired with a Bumblebee2 camera from Point Grey at a resolution of 640 x 480. Near-infrared LED lamps with a peak wavelength of 850 nm were used as markers, and two near-infrared filters (passband 850-1000 nm) were mounted in front of the camera to filter out interference from natural light. The MSI and TSI described above were used as the instruments.
A large number of experiments show that setting the empirical error ε to 0.05 handles well the deviation caused by manually placing instruments facing the camera.
100 images were acquired with the MSI squarely facing the camera and 100 with the TSI, and the ratio $\delta=d_{AB}/d_{AC}$ was calculated for each image. Table 1 lists the first 10 ratios and the 100-image average for the MSI and the TSI.

Table 1. $\delta$ values of the MSI and TSI (rendered as an image in the source; individual values not recoverable).

From the data in Table 1, the $\delta$ range of the MSI is set to [1.332, 1.432] and that of the TSI to [1.212, 1.312]. Following step S2, if ratios $\eta_1\in[1.332, 1.432]$ and $\eta_2\in[1.212, 1.312]$ are found, the registered identification of the MSI and TSI is completed.
To verify the real-time effectiveness of the algorithm, the two instruments were placed approximately 620 mm from the camera and then moved simultaneously, the MSI circling away from the camera and the TSI circling toward it. The algorithm displays the positions and motion trajectories of the two instrument tip points in real time. FIG. 7 shows the tracked trajectories after the two instruments moved, where the solid line is the trajectory of the MSI tip point and the dashed line that of the TSI tip point.
2 experiment of positioning accuracy
To measure the tracking accuracy of the algorithm, an experiment was designed to measure the distance accuracy of the two instruments: the large and small instruments were fixed on a grating ruler and positioned and tracked. The grating ruler has a resolution of 0.005 mm; the distance moved by its slider is measured and displayed digitally and is taken as the true value. The distance computed by the algorithm is then compared with this true value to obtain the algorithm's distance accuracy.
First, the grating-ruler display is zeroed and recorded as the zero point; 100 images are acquired and the mean tip-point coordinate of the instrument is computed. The slider is then moved to some position, and the distance shown by the grating ruler is recorded as the end point. Another 100 images are acquired at the end point and the mean tip coordinate is computed. Finally, the distance between the zero point and the end point is calculated and compared with the true value.
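A minimal sketch of this accuracy protocol: average the tip coordinates at each station, then compare the station-to-station distance with the grating-ruler reading. The sample coordinates below are invented for illustration, not measured data from the paper.

```python
import math

def measured_distance(zero_samples, end_samples):
    """Mean 3-D tip coordinate at each station, then their Euclidean distance."""
    mean = lambda pts: tuple(sum(c) / len(pts) for c in zip(*pts))
    return math.dist(mean(zero_samples), mean(end_samples))

# Hypothetical tip coordinates (mm): noisy samples around two stations 50 mm apart.
zero = [(0.02, -0.01, 620.1), (-0.02, 0.01, 619.9)]
end = [(50.03, 0.02, 620.0), (49.97, -0.02, 620.0)]
truth = 50.0  # grating-ruler reading, mm
d = measured_distance(zero, end)
print(round(abs(d - truth), 4))  # absolute error vs. the ruler
```

Averaging 100 frames per station, as in the experiment, suppresses per-frame localization noise before the distance comparison.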
The instrument measurement results are shown in Tables 2 and 3: Table 2 gives the MSI distance measurements and Table 3 the TSI measurements.

Table 2. MSI measurement accuracy (rendered as an image in the source; values not recoverable).

Table 3. TSI measurement accuracy (rendered as an image in the source; values not recoverable).
As can be seen from Tables 2 and 3, the mean absolute errors between the distances measured by the grating ruler and those calculated by the algorithm are 0.065 mm for the MSI and 0.031 mm for the TSI, and the corresponding mean RMSEs are 0.041 mm and 0.102 mm, so the algorithm achieves high accuracy. In addition, to show that the distance between the two instruments remains essentially unchanged, the Euclidean distance between the two instrument tip points was computed by the algorithm for verification, as shown in Table 4.

Table 4. Euclidean distance between the MSI and TSI tip points (rendered as an image in the source; values not recoverable).

The mean standard deviation of the Euclidean distance in Table 4 is 0.053 mm, showing that the distance between the two instrument tips is stable.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (8)

1. A multi-instrument optical positioning method, characterized by comprising:
a multi-instrument discrimination step: judging whether an instrument has entered or exited according to the increase or decrease of markers between preceding and succeeding frames;
a multi-instrument identification step: identifying each instrument based on the geometry of the markers on the instruments, and identifying a new instrument when one enters, the geometry comprising the side-length ratios between the markers and/or their perimeter;
a multi-instrument tracking step: tracking the motion of the plurality of instruments with a motion-vector tracking algorithm; at frame t, t > 1, first computing the center pixel coordinates of each marker, then using motion-vector tracking to find, in frame t, the marker pixel coordinates whose motion change relative to each marker of the instruments identified in frame t-1 is smallest, and thereby computing the spatial position and orientation of each instrument tip at frame t.

2. The method according to claim 1, wherein the markers are actively luminescent and/or passively luminescent.

3. The method according to claim 1, wherein the multi-instrument identification step is specifically: first using the side-length ratios between the markers to identify instruments of different shapes and, when instrument shapes are similar, using the perimeters of the markers to distinguish the instruments.

4. The method according to claim 3, wherein the multi-instrument identification step is specifically: acquiring left and right views of one or more entering instruments, identifying each instrument by the distinct side-length ratios and/or perimeters of its marker pairs, obtaining the center pixel coordinates of each instrument's markers in the left and right views, and computing the spatial position and orientation of each instrument tip from the identified pixel coordinates.

5. The method according to claim 4, wherein the center pixel coordinates of the markers are computed by the region-growing method and the gray-level centroid method.

6. The method according to claim 4, wherein the multi-instrument identification step is specifically:
(1) acquiring left and right views of one or more instruments entering the system, and computing the center pixel coordinates of each marker in the left and right views, respectively;
(2) finding the leftmost point in the left view, computing the Euclidean distance between the leftmost point and each remaining point in the left view, computing the ratios between pairs of those distances, and taking the difference between each ratio and the side-length ratio of a calibrated instrument; if the difference falls within a given range, the markers associated with that ratio are deemed to be markers of the calibrated instrument;
(3) deleting the identified markers and checking whether any markers remain; if so, repeating step (2); if not, ending the identification of markers in the left view; performing the same operations on the markers of the right view, thereby registering or pairing each instrument; and then computing the position of each instrument tip by the principle of stereo matching.

7. The method according to claim 1, wherein the multi-instrument tracking step is specifically:
(1) computing the center pixel coordinates of the markers in the left and right views of frame t-1 and frame t, the instrument to which each marker belongs having already been identified in the left and right views of frame t-1;
(2) selecting a marker of an instrument identified in the left view of frame t-1, and computing the magnitude of the motion vector between that marker's center pixel coordinates and the coordinates of the markers in frame t; the frame-t marker corresponding to the smallest magnitude corresponds to the frame-(t-1) marker, the two being images of the same marker at different times;
(3) repeating step (2) for every marker on every instrument until no markers remain;
(4) performing on the right view the same tracking as on the left view.

8. A multi-instrument optical positioning system, characterized by comprising:
a multi-instrument discrimination module, which judges whether an instrument has entered or exited according to the increase or decrease of markers between preceding and succeeding frames;
a multi-instrument identification module, which identifies each instrument based on the geometry of the markers on the instruments and identifies a new instrument when one enters, the geometry comprising the side-length ratios between the markers and/or their perimeter;
a multi-instrument tracking module, which tracks the motion of the plurality of instruments with a motion-vector tracking algorithm; at frame t, t > 1, first computing the center pixel coordinates of each marker, then using motion-vector tracking to find, in frame t, the marker pixel coordinates whose motion change relative to each marker of the instruments identified in frame t-1 is smallest, and thereby computing the spatial position and orientation of each instrument tip at frame t.
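The ratio test in claim 6, step (2) — anchoring on the leftmost marker and comparing pairwise-distance ratios against calibrated values — can be sketched as follows. This is not the patent's reference implementation: the tolerance value, the data layout, and the function name are illustrative assumptions, and only the left-view matching step is shown.

```python
import numpy as np

def identify_instrument(points, calibrated_ratios, tol=0.05):
    """Check whether detected marker centers `points` (N x 2 pixel
    coordinates) match a calibrated instrument described by its sorted
    pairwise-distance ratios. Returns True on a match within `tol`.
    Distance ratios are scale-invariant, so the test is independent of
    how far the instrument sits from the camera."""
    pts = np.asarray(points, dtype=float)
    anchor_idx = np.argmin(pts[:, 0])              # leftmost point
    rest = np.delete(pts, anchor_idx, axis=0)
    # Euclidean distances from the leftmost point to every other marker.
    dists = np.sort(np.linalg.norm(rest - pts[anchor_idx], axis=1))
    # Ratios between pairs of those distances (ascending, so <= 1).
    ratios = np.sort([dists[i] / dists[j]
                      for i in range(len(dists))
                      for j in range(len(dists)) if i < j])
    cal = np.sort(np.asarray(calibrated_ratios, dtype=float))
    return bool(len(ratios) == len(cal)
                and np.all(np.abs(ratios - cal) < tol))
```

Per claim 6, step (3), a full identifier would delete matched markers and repeat this test against each calibrated instrument until no markers remain, then run the same loop over the right view.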
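The per-frame steps recited in claims 5 and 7 — gray-level centroid extraction of a marker's center and frame-to-frame association by minimum motion-vector magnitude — can be sketched as below. This is a minimal illustration assuming NumPy and pre-segmented marker blobs, not the patent's reference implementation; the region-growing segmentation of claim 5 is omitted.

```python
import numpy as np

def gray_centroid(patch):
    """Intensity-weighted center of a marker blob (the gray-level
    centroid of claim 5). `patch` is a 2-D array of pixel intensities;
    returns (x, y) in patch coordinates."""
    patch = np.asarray(patch, dtype=float)
    ys, xs = np.mgrid[:patch.shape[0], :patch.shape[1]]
    total = patch.sum()
    return (xs * patch).sum() / total, (ys * patch).sum() / total

def track_markers(prev_centers, curr_centers):
    """Associate frame t-1 markers with frame t markers by the smallest
    motion-vector magnitude (claims 1 and 7). Returns, for each previous
    marker, the index of its best match in the current frame."""
    prev = np.asarray(prev_centers, dtype=float)
    curr = np.asarray(curr_centers, dtype=float)
    # Displacement magnitudes between every (t-1, t) marker pair.
    d = np.linalg.norm(prev[:, None, :] - curr[None, :, :], axis=2)
    return d.argmin(axis=1)
```

With centers matched this way in both stereo views at every frame, the tip position and orientation of each instrument would then follow from stereo triangulation, as claim 6 describes.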
CN201810724839.8A 2018-07-04 2018-07-04 Multi-instrument optical positioning method and system Active CN109091228B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810724839.8A CN109091228B (en) 2018-07-04 2018-07-04 Multi-instrument optical positioning method and system

Publications (2)

Publication Number Publication Date
CN109091228A CN109091228A (en) 2018-12-28
CN109091228B true CN109091228B (en) 2020-05-12

Family

ID=64845703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810724839.8A Active CN109091228B (en) 2018-07-04 2018-07-04 Multi-instrument optical positioning method and system

Country Status (1)

Country Link
CN (1) CN109091228B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111461049B (en) * 2020-04-13 2023-08-22 武汉联影智融医疗科技有限公司 Space registration identification method, device, equipment and computer readable storage medium
CN116650119B (en) * 2023-07-24 2024-03-01 北京维卓致远医疗科技发展有限责任公司 Calibration reference frame for adjustable operation reference frame

Citations (3)

Publication number Priority date Publication date Assignee Title
DE102009024339B3 (en) * 2009-06-09 2010-09-16 Atlas Elektronik Gmbh Bearing method and direction finding system for detecting and tracking temporally successive bearing angles
CN101980229A (en) * 2010-10-12 2011-02-23 武汉大学 Space Tracking and Localization Method Based on Single Camera and Mirror Reflection
CN108053491A (en) * 2017-12-12 2018-05-18 重庆邮电大学 Method for realizing three-dimensional tracking of planar targets and augmented reality under dynamic viewing angles

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
US7346381B2 (en) * 2002-11-01 2008-03-18 Ge Medical Systems Global Technology Company Llc Method and apparatus for medical intervention procedure planning
US7778686B2 (en) * 2002-06-04 2010-08-17 General Electric Company Method and apparatus for medical intervention procedure planning and location and navigation of an intervention tool
JP5268433B2 (en) * 2008-06-02 2013-08-21 キヤノン株式会社 IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD
CN101327148A (en) * 2008-07-25 2008-12-24 清华大学 Instrument identification method for passive optical surgical navigation
CN101750607B (en) * 2008-07-25 2012-11-14 清华大学 Instrument identification method for a passive optical positioning and navigation system
US8657809B2 (en) * 2010-09-29 2014-02-25 Stryker Leibinger Gmbh & Co., Kg Surgical navigation system
US10228428B2 (en) * 2012-03-22 2019-03-12 Stylaero Ab Method and device for pose tracking using vector magnetometers
EP3318213B1 (en) * 2016-11-04 2022-10-12 Globus Medical, Inc System for measuring depth of instrumentation

Also Published As

Publication number Publication date
CN109091228A (en) 2018-12-28

Similar Documents

Publication Publication Date Title
US20220395159A1 (en) Device and method for assisting laparoscopic surgery - directing and maneuvering articulating tool
Gsaxner et al. Inside-out instrument tracking for surgical navigation in augmented reality
CN110223355B (en) Feature mark point matching method based on dual epipolar constraint
CN106344153B (en) Automatic tracking device and method for a flexible puncture needle tip
US5792147A (en) Video-based systems for computer assisted surgery and localisation
CN105919595B (en) System and method for the micro device in pursuit movement object body with magnetic signal
CN112184653B (en) Binocular endoscope-based focus three-dimensional size measuring and displaying method
JP2019535467A (en) Medical imaging jig and method of use thereof
US12167941B2 (en) System and method for spatial positioning of magnetometers
JP2014211404A (en) Motion capture method
CN109091228B (en) Multi-instrument optical positioning method and system
CN211178436U (en) System for magnetometer spatial localization
CN106236264A (en) The gastrointestinal procedures air navigation aid of optically-based tracking and images match and system
TWI594208B (en) The Method Of Complete Endoscopic MIS Instrument 3D Position Estimation Using A Single 2D Image
JP5714951B2 (en) Binocular pupil examination device
CN117073526A (en) Calibration and distance calculation method and device for laser tracker tracking measurement recovery
CN105559809B (en) Scanning method and device
CN117481755A (en) Puncture path determining system, method, device and medium for brain focus
CN107330936B (en) Monocular vision-based double-circular marker positioning method and system
CN112638251A (en) Method for measuring position
Zhang et al. Realtime robust shape estimation of deformable linear object
CN113066126A (en) Locating method of puncture needle point
CN106484124B (en) A kind of sight control method
CN107714175B (en) Surgical navigation positioning method and device
García et al. Calibration of a surgical microscope with automated zoom lenses using an active optical tracker

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant