
CN115211963B - Trajectory display method, surgical robot and computer-readable storage medium - Google Patents


Info

Publication number
CN115211963B
CN115211963B (application CN202210851098.6A)
Authority
CN
China
Prior art keywords
coordinate system
target
image
coordinates
guiding tool
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210851098.6A
Other languages
Chinese (zh)
Other versions
CN115211963A (en)
Inventor
Name withheld at the inventor's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiehang Robot Co ltd
Original Assignee
Shanghai Jiehang Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiehang Robot Co ltd
Priority to CN202210851098.6A
Publication of CN115211963A
Application granted
Publication of CN115211963B
Legal status: Active

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00Instruments for taking body samples for diagnostic purposes; Other methods or instruments for diagnosis, e.g. for vaccination diagnosis, sex determination or ovulation-period determination; Throat striking implements
    • A61B10/02Instruments for taking cell samples or for biopsy
    • A61B10/0233Pointed or sharp biopsy instruments
    • A61B10/0241Pointed or sharp biopsy instruments for prostate
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B17/34Trocars; Puncturing needles
    • A61B17/3403Needle locating or guiding means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70Manipulators specially adapted for use in surgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Clinical applications
    • A61B8/0833Clinical applications involving detecting or locating foreign bodies or organic structures
    • A61B8/0841Clinical applications involving detecting or locating foreign bodies or organic structures for locating instruments
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Clinical applications
    • A61B8/0833Clinical applications involving detecting or locating foreign bodies or organic structures
    • A61B8/085Clinical applications involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B17/34Trocars; Puncturing needles
    • A61B17/3403Needle locating or guiding means
    • A61B2017/3405Needle locating or guiding means using mechanical guide means
    • A61B2017/3409Needle locating or guiding means using mechanical guide means including needle or instrument drives
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B17/34Trocars; Puncturing needles
    • A61B17/3403Needle locating or guiding means
    • A61B2017/3413Needle locating or guiding means guided by ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/105Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/108Computer aided selection or customisation of medical implants or cutting guides

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Robotics (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Vascular Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract


The present application relates to a trajectory display method, a surgical robot, and a computer-readable storage medium. The method rotates an ultrasonic probe so that a medical instrument mounted in a guiding tool lies in the sagittal plane scanned by the probe, and then displays the predicted motion trajectory of the medical instrument in the ultrasound image obtained by scanning that sagittal plane, thereby solving the problem that the predicted motion trajectory of a medical instrument cannot be displayed in an ultrasound image and improving the sampling success rate.

Description

Trajectory display method, surgical robot, and computer-readable storage medium
Technical Field
The present application relates to the technical field of medical devices, and in particular, to a trajectory display method, a surgical robot, and a computer-readable storage medium.
Background
Biopsy is a technique in which lesion tissue is taken from a patient's body for pathological examination, for purposes such as diagnosis or treatment. Needle biopsy is one biopsy technique that uses a medical instrument, such as a biopsy needle, to sample a patient's diseased tissue percutaneously. Currently, doctors perform needle biopsies (e.g., prostate needle biopsies) by manually pushing the medical instrument into the patient under the guidance of a guiding tool, such as a needle guide, for sampling. During sampling, after the guiding tool is positioned, the medical instrument passes through the guiding tool; in theory, the center of the needle's sampling section can then be placed on a sampling point (or target point) in the patient's lesion area, and after the medical instrument fully penetrates the patient's prostate, the image of the instrument tip can be seen on an ultrasound image obtained by the ultrasonic probe scanning the cross section.
However, because current ultrasonic probes scan only the cross section and therefore produce only cross-sectional ultrasound images, the expected motion trajectory of the medical instrument cannot be displayed, so the doctor receives no guidance for the puncture. If the patient moves after the guiding tool has been positioned, the expected initial point will deviate from the target point; continuing the puncture in this situation leads to a high sampling failure rate. Moreover, after a failed sampling, the puncture must be repeated, causing secondary injury to the patient, and the surgical precision is low.
Disclosure of Invention
In view of the above, it is desirable to provide a trajectory display method, a surgical robot, and a computer-readable storage medium that can solve the problem that the expected motion trajectory of a medical instrument cannot be displayed in an ultrasound image.
A trajectory display method, the method comprising:
calculating coordinates of the guiding tool in an image coordinate system corresponding to the ultrasonic probe;
controlling the rotation of the ultrasonic probe according to the coordinates of the guiding tool in the image coordinate system and the first sagittal plane currently scanned by the ultrasonic probe;
acquiring an ultrasonic image scanned by the ultrasonic probe after rotation;
displaying, in the ultrasound image, an expected motion trajectory of the medical instrument based on the coordinates of the guiding tool in the image coordinate system and the length of the medical instrument mounted in the guiding tool.

In one embodiment, the method further comprises:
Acquiring the position of a target point;
Controlling an initial point of the guiding tool to be aligned with the target point, and positioning the guiding tool based on the position of the target point to obtain the coordinate of the guiding tool in the robot coordinate system;
the calculating the coordinates of the guiding tool in the image coordinate system corresponding to the ultrasonic probe comprises the following steps:
acquiring a coordinate conversion relation between a robot coordinate system and an image coordinate system;
and converting the coordinates of the guiding tool in the robot coordinate system into the coordinates in the image coordinate system based on the coordinate conversion relation.
In one embodiment, the position of the target point includes coordinates of the target point in the robot coordinate system, the controlling the initial point of the guiding tool to aim at the target point, positioning the guiding tool based on the position of the target point to obtain the coordinates of the guiding tool in the robot coordinate system includes:
acquiring coordinates of an initial point of the guiding tool in the robot coordinate system;
Obtaining a vertical angle and a horizontal angle of an initial point of the guidance tool relative to the target point based on coordinates of the target point in the robot coordinate system and coordinates of the initial point of the guidance tool in the robot coordinate system;
And controlling an initial point of the guiding tool to aim at the target point according to the vertical angle and the horizontal angle, and positioning the guiding tool based on the position of the target point to obtain the coordinates of the guiding tool in the robot coordinate system.
In one embodiment, the controlling the rotation of the ultrasound probe according to the coordinates of the guiding tool in the image coordinate system and the first sagittal plane currently scanned by the ultrasound probe includes:
Based on the coordinates of the guiding tool in the image coordinate system and the first sagittal plane currently scanned by the ultrasonic probe, a rotation angle of the ultrasonic probe is obtained, and the ultrasonic probe is controlled to rotate by the rotation angle, so that the medical instrument installed in the guiding tool is positioned in the second sagittal plane scanned by the ultrasonic probe.
In one embodiment, the obtaining the rotation angle of the ultrasound probe based on the coordinates of the guiding tool in the image coordinate system and the first sagittal plane currently scanned by the ultrasound probe includes:
Acquiring coordinates of the ultrasonic probe in the image coordinate system and a first sagittal plane currently scanned by the ultrasonic probe;
obtaining coordinates in the image coordinate system of a medical instrument mounted in the guiding tool based on the coordinates in the image coordinate system of the guiding tool;
Obtaining a second sagittal plane based on the coordinates of the ultrasonic probe in the image coordinate system and the coordinates of the medical instrument in the image coordinate system, wherein the second sagittal plane is a plane where the ultrasonic probe and the medical instrument are located;
Obtaining an included angle between the first sagittal plane and the second sagittal plane, wherein the included angle is a rotation angle of the ultrasonic probe.
In one embodiment, the method further comprises:
Acquiring a region of at least one target in the ultrasound image;
And outputting prompt information based on the distance between the predicted motion trajectory and the region of the target in the ultrasound image.
In one embodiment, the at least one object includes a first object, and the acquiring the area of the at least one object in the ultrasound image includes:
acquiring the position of the first target in the ultrasonic image;
A region of the first object in the ultrasound image is obtained in the ultrasound image based on a preset three-dimensional shape and a position of the first object in the ultrasound image.
In one embodiment, the at least one object further includes a second object, and the acquiring the area of the at least one object in the ultrasound image includes:
Taking the set starting point as a center, carrying out boundary searching outwards according to the size of the pixel value to obtain a plurality of first boundary pixel points, generating a first curved surface by the plurality of first boundary pixel points, wherein the value of the first boundary pixel points is larger than or equal to a preset first threshold value;
carrying out subdivision processing on the first curved surface by using a subdivision model to obtain a plurality of pixel points after subdivision processing;
And taking the subdivided pixel points as starting points, performing a boundary search back toward the original starting point according to pixel values to obtain a plurality of second boundary pixel points, and generating a second curved surface from the second boundary pixel points, wherein the value of each second boundary pixel point is smaller than or equal to a preset second threshold value, and the second curved surface is used for representing the region of the second target in the ultrasound image.
In one embodiment, the performing the boundary search outwards with the set starting point as the center according to the pixel value size to obtain a plurality of first boundary pixels includes:
And performing a boundary search outward from the starting point as the center according to a spherical coordinate system; if the pixel value of a searched pixel point is greater than or equal to a preset first threshold value, the searched pixel point is taken as a first boundary pixel point.
In one embodiment, the prompt information includes first prompt information and second prompt information, and the outputting of prompt information based on the distance between the predicted motion trajectory and the region of the target in the ultrasound image includes:
selecting a plurality of track points from the predicted motion track;
Obtaining the distance between each track point and the target;
If the distance between any one of the trajectory points and the region of the target in the ultrasound image is smaller than or equal to a preset distance threshold value, outputting the first prompt information, wherein the first prompt information is used for indicating that the predicted motion trajectory threatens the target;
and if the distances between all the trajectory points and the region of the target in the ultrasound image are greater than the preset distance threshold, outputting the second prompt information, wherein the second prompt information is used for indicating that the predicted motion trajectory poses no threat to the target.
A surgical robot, comprising:
an ultrasonic probe for acquiring ultrasound images;
a guiding tool having a medical instrument mounted therein;
a control device comprising a memory storing a computer program and a processor that implements the steps of the method described in any one of the embodiments above when executing the computer program; and
a display for displaying, in the ultrasound image, the expected motion trajectory of the medical instrument and/or the region of at least one target in the ultrasound image.
A computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method described in any of the embodiments above.
According to the trajectory display method, the surgical robot, and the computer-readable storage medium above, the ultrasonic probe is rotated so that the medical instrument mounted on the guiding tool lies in the sagittal plane scanned by the probe, and the expected motion trajectory of the medical instrument is then displayed in the ultrasound image obtained by scanning that sagittal plane. This solves the problem that the expected motion trajectory of the medical instrument cannot be displayed in an ultrasound image and improves the sampling success rate.
Drawings
FIG. 1 is an application environment diagram of a track display method in one embodiment;
FIG. 2 is a flow chart of a track display method according to an embodiment;
FIG. 3 is a schematic diagram of coordinate transformation of a robot coordinate system and an image coordinate system in a trajectory display method according to an embodiment;
FIG. 4 is a schematic diagram of coordinates of positioning a guiding tool in a track display method according to an embodiment;
FIG. 5 is a schematic diagram of controlling rotation of an ultrasound probe to a second sagittal plane in a trajectory display method in one embodiment;
FIG. 6 is a flowchart of a track display method according to another embodiment;
FIG. 7 is a schematic representation of a cross-sectional ultrasound image and a sagittal ultrasound image in one embodiment;
FIG. 8 is a schematic diagram showing predicted motion trajectories and organ structure in sagittal ultrasound images, in one embodiment;
FIG. 9 is a flow chart of segmentation of ultrasound images in a trajectory display method in one embodiment;
FIG. 10 is a flowchart of a method for displaying a track according to an embodiment;
FIG. 11 is a flowchart of a method for displaying a track according to another embodiment;
FIG. 12 is a flow diagram of a method of ultrasound image segmentation in one embodiment;
FIG. 13 is a block diagram showing a track display device in one embodiment;
FIG. 14 is a block diagram showing the structure of an ultrasound image segmentation apparatus in one embodiment;
FIG. 15 is a block diagram of a surgical robot in one embodiment;
FIG. 16 is a hardware configuration diagram of a surgical robot in another embodiment;
FIG. 17 is a schematic diagram of a software serial port read-write control motor of the surgical robot in one embodiment;
FIG. 18 is an internal structural view of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
As described above, because current ultrasonic probes scan only the cross section and thus produce only cross-sectional ultrasound images, the expected motion trajectory of a medical instrument such as a puncture needle (for example, the expected needle insertion track) cannot be displayed, so the doctor receives no guidance for operations such as puncturing. If the patient moves after the guiding tool has been positioned, the expected initial point will deviate from the target point; continuing the puncture in that situation results in a high sampling failure rate, and after a failed sampling the puncture must be repeated, causing secondary injury to the patient and lowering the surgical precision.
Therefore, the present application provides a trajectory display method, an ultrasound image segmentation device, a surgical robot, a computer device, a computer-readable storage medium, and a computer program product, which can display the expected motion trajectory of a medical instrument in the ultrasound image obtained by an ultrasonic probe scanning a sagittal plane and give the doctor guidance for the puncture, thereby solving the problem that the expected motion trajectory cannot be displayed in an ultrasound image and improving the sampling success rate. In other words, before the doctor performs the puncture, if the expected initial point deviates from the target point because the patient has moved, the displayed expected motion trajectory of the medical instrument will show the deviation, or the trajectory will fall outside the region in which needle insertion is allowed; the doctor can then re-plan the target point, avoiding secondary injury to the patient and improving the success rate of the operation.
Taking a prostate puncture biopsy as an example, fig. 1 is a sagittal schematic view of the surgical robot performing a prostate puncture biopsy according to this embodiment. As shown in fig. 1, the surgical robot 900 may include: an ultrasonic probe 910 for acquiring ultrasound images (in alternative embodiments the probe may be replaced with other means of acquiring ultrasound images); a motor 920, such as a stepper motor or a rotating motor, for driving the ultrasonic probe to move or rotate so that it scans through the patient's rectum to obtain ultrasound images of the prostate; a guiding tool 930 with a medical instrument 931 mounted therein; and a robotic arm 940 for moving and positioning the guiding tool 930 to assist a physician in performing a puncture operation, or for driving the guiding tool 930 to perform an operation such as puncturing. Fig. 1 also depicts the medical instrument 931 in the guiding tool 930 puncturing from an initial point 902 on the surface of the prostate 901 to a target point 903 inside the prostate 901 for biopsy sampling. It should be noted that the guiding tool 930 may be a needle guide, the needle guide may be a needle guide sheath, and the medical instrument 931 may be a puncture needle.
Further, before performing the above puncture operation, the surgical robot may execute the trajectory display method provided by the embodiments of the present application to display the predicted motion trajectory of the medical instrument in the ultrasound image obtained by the ultrasonic probe 910 scanning the sagittal plane, giving the doctor guidance for the puncture, thereby solving the problem that the predicted motion trajectory of the medical instrument cannot be displayed in an ultrasound image and improving the sampling success rate.
In one embodiment, as shown in fig. 2, a trajectory display method is provided. The method is described as applied to the prostate puncture biopsy of fig. 1 by way of example, and includes the following steps:
S101, calculating coordinates of the guiding tool in an image coordinate system corresponding to the ultrasonic probe.
Wherein the calculation of the coordinates of the guiding tool in the image coordinate system corresponding to the ultrasonic probe may include obtaining a coordinate conversion relationship between the robot coordinate system and the image coordinate system, and converting the coordinates of the guiding tool in the robot coordinate system into the coordinates in the image coordinate system based on the coordinate conversion relationship.
In this embodiment, the robot coordinate system is referenced to the surgical robot. Specifically, the robot coordinate system may be represented as a robot arm coordinate system or a global coordinate system of the surgical robot, for example, the robot arm coordinate system uses a base of a robot arm as an origin coordinate. The image coordinate system is based on the ultrasonic probe, namely the image coordinate system can be expressed as an ultrasonic image coordinate system acquired by the ultrasonic probe.
In step S101, the coordinates of the guiding tool in the robot coordinate system may be determined according to the distance or angle between a specific position on the guiding tool and the origin of the robot-arm coordinate system. The specific position may be chosen as needed, for example the center point of the guiding tool or the connection point between the guiding tool and the medical instrument; it is not specifically limited here and may be determined by a person skilled in the art as required. In particular, the coordinates of the guiding tool in the robot coordinate system may be represented as three-dimensional coordinates. Further, since the medical instrument is mounted on the guiding tool, the relative positions of the medical instrument and the guiding tool are fixed, so the coordinates of the medical instrument in the robot coordinate system, the needle insertion direction or length direction of the medical instrument, and so on can be determined from the coordinates of the guiding tool in the robot coordinate system and the rotation angle of the guiding tool.
Fig. 3 is a schematic diagram showing the coordinate transformation relationship between the robot coordinate system T1 and the image coordinate system T2. Specifically, the coordinate conversion relation between the robot coordinate system and the image coordinate system can be expressed by the following formula:

(x, y, z)ᵀ = λ · M · (x₀, y₀, z₀)ᵀ + (x_t0, y_t0, z_t0)ᵀ

where (x, y, z) are the coordinates of point A in the robot coordinate system, (x₀, y₀, z₀) are the coordinates of point A in the image coordinate system, and M is the rotation matrix (or coordinate conversion matrix) from the image coordinate system to the robot coordinate system. (x_t0, y_t0, z_t0) is the compensation offset from the image coordinate system to the robot coordinate system; it can be determined from the length fProbeLength of the ultrasonic probe and the calibration deviation of the robot. In general, x_t0 and y_t0 are small and can be set to 0, and z_t0 = fProbeLength. λ is a scaling parameter of the image coordinate system, adjusted according to user operation, with a default value of 1.
More specifically, the rotation matrix M may be determined when the surgical robot performs coordinate calibration, which may be expressed as performing coordinate calibration according to the relative distance or rotation angle between the ultrasonic probe and the robotic arm. For example, the rotation matrices around the x, y, and z axes of a three-dimensional rectangular coordinate system are:

R_x(α) = [ 1, 0, 0; 0, cos α, −sin α; 0, sin α, cos α ]
R_y(θ) = [ cos θ, 0, sin θ; 0, 1, 0; −sin θ, 0, cos θ ]
R_z(γ) = [ cos γ, −sin γ, 0; sin γ, cos γ, 0; 0, 0, 1 ]

where R_x(α) is the rotation matrix around the x axis, R_y(θ) is the rotation matrix around the y axis, R_z(γ) is the rotation matrix around the z axis, and α, θ, and γ are rotation angles obtained by calibration from the actual coordinates of the surgical robot.
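To make the conversion concrete, the following is a minimal numpy sketch of the formulas above. The composition order of the axis rotations (R_z · R_y · R_x) and the function names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def rotation_matrix(alpha, theta, gamma):
    # Calibrated rotation matrix M; the composition order Rz @ Ry @ Rx is an assumption.
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(alpha), -np.sin(alpha)],
                   [0, np.sin(alpha),  np.cos(alpha)]])
    Ry = np.array([[ np.cos(theta), 0, np.sin(theta)],
                   [0, 1, 0],
                   [-np.sin(theta), 0, np.cos(theta)]])
    Rz = np.array([[np.cos(gamma), -np.sin(gamma), 0],
                   [np.sin(gamma),  np.cos(gamma), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

def image_to_robot(p_image, M, probe_length, lam=1.0):
    # Convert a point from the image coordinate system to the robot coordinate
    # system: robot = lam * M @ image + offset, with x_t0 = y_t0 = 0 and
    # z_t0 = probe_length, as stated in the text.
    offset = np.array([0.0, 0.0, probe_length])
    return lam * M @ np.asarray(p_image, dtype=float) + offset
```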
In one embodiment, considering that the guiding tool needs to be positioned before the puncture operation is performed, the initial point of the guiding tool is aimed at a target point, which may be a lesion point, target area, etc. marked by the physician in the patient (e.g., in the prostate). The initial point may be the needle insertion point of the puncture operation. Specifically, the method further comprises:
S201, acquiring the position of a target point;
S202, controlling an initial point of the guiding tool to be aligned with the target point, and positioning the guiding tool based on the position of the target point to obtain the coordinates of the guiding tool in the robot coordinate system.
In step S201, the position of the target point may include the coordinates of the target point in the robot coordinate system or in the image coordinate system. For example, the coordinates of the target point in the image coordinate system may be obtained by the doctor marking the ultrasound image, and the coordinates of the target point in the robot coordinate system may then be obtained by converting those coordinates according to the coordinate conversion relation described above.
In step S202, the initial point of the guiding tool may be used to indicate the needle insertion point, angle, or direction at which the medical instrument mounted in the guiding tool enters the patient (e.g., the prostate). Aligning the initial point of the guiding tool with the target point means aligning the needle insertion direction or angle of the medical instrument with the target point, so that the medical instrument can be pushed in a straight line to the target point position in the patient, completing the sampling.
In one embodiment, step S202 may include:
S203, acquiring the coordinates of the initial point of the guiding tool in the robot coordinate system.
S204, obtaining a vertical angle and a horizontal angle of the initial point of the guiding tool relative to the target point, based on the coordinates of the target point and of the initial point of the guiding tool in the robot coordinate system. The vertical and horizontal angles here are angles in the robot coordinate system: the vertical angle may correspond to the z-axis direction and the horizontal angle to the y-axis direction; see below for the specific calculation.
S205, controlling the initial point of the guiding tool to be aligned with the target point according to the vertical angle and the horizontal angle, and positioning the guiding tool based on the position of the target point to obtain the coordinates of the guiding tool in the robot coordinate system.
As shown in fig. 4, in step S204, the horizontal angle H and the vertical angle V are angles in the robot coordinate system and can be calculated by the following formulas:

H = tan⁻¹( −(x_t − x_p) / (z_t − z_p) )
V = tan⁻¹( −(y_t − y_p) / (z_t − z_p) )

where H is the horizontal angle, V is the vertical angle, P(x_p, y_p, z_p) are the coordinates of the initial point in the robot coordinate system, and T(x_t, y_t, z_t) are the coordinates of the target point in the robot coordinate system.
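As a hedged illustration, the following sketch computes these two alignment angles with numpy; the function name and the use of np.arctan are assumptions for illustration.

```python
import numpy as np

def aim_angles(p_init, p_target):
    # Horizontal and vertical angles (radians) needed to aim the guiding
    # tool's initial point P at the target point T, per the formulas above.
    xp, yp, zp = p_init
    xt, yt, zt = p_target
    H = np.arctan(-(xt - xp) / (zt - zp))  # horizontal angle
    V = np.arctan(-(yt - yp) / (zt - zp))  # vertical angle
    return H, V
```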
In this embodiment, in step S205, the robot is controlled to drive the guiding tool to move or rotate according to the horizontal and vertical angles, so that the initial point of the guiding tool is aligned with the target point.
It should be noted that, to ensure the accuracy of the coordinate conversion of the guiding tool, the above steps S201 to S202 are performed to position the guiding tool so that the initial point of the guiding tool is aligned with the target point, and then the step S101 is performed to perform the coordinate conversion of the guiding tool to obtain the coordinates of the guiding tool in the image coordinate system.
S102, controlling the rotation of the ultrasonic probe according to the coordinates of the guiding tool in the image coordinate system and the first sagittal plane of the current scanning of the ultrasonic probe.
Controlling the rotation of the ultrasonic probe according to the coordinates of the guiding tool in the image coordinate system and the first sagittal plane currently scanned by the ultrasonic probe may include: obtaining a rotation angle of the ultrasonic probe based on the coordinates of the guiding tool in the image coordinate system and the first sagittal plane currently scanned by the ultrasonic probe, and controlling the ultrasonic probe to rotate by that angle so that the medical instrument mounted in the guiding tool lies in the second sagittal plane scanned by the ultrasonic probe.
That is, in step S102, the ultrasonic probe is in a state of scanning a sagittal plane (such as a sagittal plane of the prostate), and the sagittal plane currently scanned by the probe is the first sagittal plane. If the probe is currently scanning a cross section (e.g., a cross section of the prostate), it is first adjusted to scan the sagittal plane of the prostate.
Since the medical instrument mounted in the guiding tool may not lie in the first sagittal plane currently scanned by the ultrasonic probe, in step S102 the rotation angle of the ultrasonic probe indicates the relative rotation needed to bring the probe from the first sagittal plane onto the plane of the medical instrument. That is, after the probe rotates by this angle, the target point, the medical instrument, and its needle insertion direction (or length direction) all lie in the sagittal plane scanned after rotation (i.e., the second sagittal plane).
In one embodiment, step S102 may include:
S301, acquiring the coordinates of the ultrasonic probe in the image coordinate system and the first sagittal plane currently scanned by the ultrasonic probe;
S302, obtaining the coordinates in the image coordinate system of the medical instrument mounted in the guiding tool, based on the coordinates of the guiding tool in the image coordinate system;
S303, obtaining a second sagittal plane based on the coordinates of the ultrasonic probe in the image coordinate system and the coordinates of the medical instrument in the image coordinate system, wherein the second sagittal plane is the plane in which the ultrasonic probe and the medical instrument are located;
S304, obtaining the included angle between the first sagittal plane and the second sagittal plane, wherein the included angle is the rotation angle of the ultrasonic probe.
It will be appreciated that, since the coordinates of the ultrasonic probe in the image coordinate system remain fixed, rotating the probe rotates the sagittal plane it currently scans (i.e., the first sagittal plane described above) about the probe until the scanned sagittal plane (i.e., the second sagittal plane described above) coincides with the medical instrument and its needle insertion or length direction.
In step S302, the coordinates in the image coordinate system of the medical instrument mounted in the guiding tool may be calculated from the coordinates of the guiding tool in the image coordinate system, the length of the medical instrument, and the position or angle of the medical instrument relative to the guiding tool. The length of the medical instrument and its position or angle relative to the guiding tool can be measured; once mounted, the instrument is fixed relative to the guiding tool, so these values can be measured and stored, remain unchanged throughout the operation, and can therefore be directly retrieved and used.
In step S303, the coordinates of the medical instrument in the image coordinate system may be represented as line-segment coordinates, and the ultrasonic probe and the medical instrument define a plane; this plane is the second sagittal plane, i.e., the ultrasonic probe, the medical instrument, and its needle insertion or length direction all lie in the second sagittal plane.
In step S304, since the ultrasonic probe lies on the intersection line of the first and second sagittal planes, the included angle between the two planes can be taken as the rotation angle of the ultrasonic probe; by controlling the probe to rotate by this angle, the probe is rotated to the position for scanning the second sagittal plane, so as to scan it and obtain the ultrasound image. As shown in fig. 5, the left view is a schematic diagram of the positional relationship between the ultrasonic probe 910 and the medical instrument 931 when scanning the first sagittal plane S1, and the right view shows the ultrasonic probe 910 and the medical instrument 931 scanning the second sagittal plane S2 after the probe has been rotated by the rotation angle a.
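A minimal numpy sketch of steps S301 to S304, under the assumption that both sagittal planes contain the probe's long axis, so the rotation angle equals the angle between the two plane normals; all names are illustrative assumptions.

```python
import numpy as np

def probe_rotation_angle(probe_axis, plane1_normal, instrument_dir):
    # Rotation angle (radians) from the first sagittal plane to the second
    # sagittal plane spanned by the probe axis and the instrument's
    # needle-insertion direction.
    n2 = np.cross(np.asarray(probe_axis, dtype=float),
                  np.asarray(instrument_dir, dtype=float))  # normal of the second plane
    n1 = np.asarray(plane1_normal, dtype=float)
    cos_a = np.dot(n1, n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))
```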
S103, acquiring an ultrasonic image scanned by the ultrasonic probe after rotation.
It should be noted that before puncture sampling is performed, the medical instrument has not yet penetrated the patient (e.g., the prostate), and the sagittal plane of the prostate can be displayed in the ultrasound image obtained by the probe scanning the second sagittal plane. Therefore, in order to give the doctor guidance for the puncture and so on, the expected motion trajectory of the medical instrument needs to be displayed in that ultrasound image.
S104, displaying the expected motion trajectory of the medical instrument in the ultrasound image based on the coordinates of the guiding tool in the image coordinate system and the length of the medical instrument mounted in the guiding tool.
That is, from the coordinates of the guiding tool in the image coordinate system, the rotation angle of the guiding tool, and the position or angle of the medical instrument relative to the guiding tool, the coordinates of the medical instrument in the image coordinate system and its needle insertion direction can be determined; then, from those coordinates, the needle insertion direction, and the length of the medical instrument, the expected motion trajectory of the medical instrument can be drawn in the ultrasound image.
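For instance, treating the expected trajectory as a straight segment along the needle insertion direction (a simplifying assumption; the patent does not restrict the trajectory shape), the points to draw could be sampled as follows; the function name and parameters are illustrative.

```python
import numpy as np

def expected_trajectory(instrument_tip, insertion_dir, instrument_length, n_points=50):
    # Sample points of a straight-line expected motion trajectory in image
    # coordinates, starting at the instrument tip and extending along the
    # needle insertion direction for the instrument's length.
    d = np.asarray(insertion_dir, dtype=float)
    d /= np.linalg.norm(d)
    t = np.linspace(0.0, instrument_length, n_points)
    return np.asarray(instrument_tip, dtype=float) + t[:, None] * d  # (n_points, 3)
```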
Further, as shown in fig. 6, the track display method in the embodiment of the application has the following flow:
S1, marking the lesion point or target;
S2, positioning the guiding tool so that the initial point of the guiding tool is aligned with the lesion point;
S3, acquiring the coordinates of the guiding tool in the image coordinate system;
S4, calculating the rotation angle of the ultrasonic probe from the first sagittal plane to the second sagittal plane;
S5, rotating the ultrasonic probe and scanning the second sagittal plane to obtain an ultrasound image;
S6, displaying the expected motion trajectory in the ultrasound image.
Therefore, the trajectory display method provided by the embodiments of the present application can display the predicted motion trajectory of the medical instrument in the ultrasound image obtained by the ultrasonic probe scanning a sagittal plane of the prostate (e.g., the second sagittal plane) and give the doctor guidance for the puncture, thereby solving the problem that the predicted motion trajectory cannot be displayed in an ultrasound image and improving the sampling success rate.
Furthermore, the target point can be displayed in the ultrasound image, so that the doctor can observe whether there is an obvious deviation between the predicted motion trajectory of the medical instrument and the target point, and then decide whether to perform the puncture operation. If the predicted trajectory deviates from the target point, for example because the patient has moved, the doctor can re-plan the target point, avoiding secondary injury to the patient and improving the success rate of the operation.
In other alternative embodiments, the method may further include determining whether the target point deviates from the predicted motion trajectory according to the distance between them. For example, if the distance between the target point and the predicted trajectory is smaller than or equal to a deviation threshold, the doctor may proceed with the puncture; if it is greater than the deviation threshold, the doctor may re-plan the target point or re-plan the motion trajectory of the medical instrument, thereby avoiding secondary injury to the patient and improving the success rate of the operation.
In addition, as shown in fig. 7, the left image is an ultrasound image obtained by the ultrasonic probe scanning a cross section: the black area is the cross section of the prostate, and the white point inside it is the medical instrument. The right image is an ultrasound image obtained by scanning a sagittal plane: the black area is the sagittal plane of the prostate, and the white track inside it is the actual motion trajectory of the medical instrument. That is, while the doctor performs the puncture, the actual motion trajectory of the medical instrument can be displayed in real time in the ultrasound image obtained by the probe scanning the second sagittal plane, guiding the puncture operation; if the actual trajectory deviates noticeably, the doctor can adjust it or stop the puncture, avoiding injury to the patient and improving the success rate of the operation.
In one embodiment, the method further comprises:
S401, acquiring a region of at least one target in the ultrasound image;
S402, outputting prompt information based on the distance between the expected motion trajectory and the region of the target in the ultrasound image.
In this embodiment, a target may be an organ of the patient; for example, when scanning a sagittal plane of the prostate to obtain an ultrasound image, the targets may include organs such as the rectum, urethra, and bladder. Taking prostate puncture sampling as an example, in step S401 the at least one target may include a first target, which may be an approximately regular-shaped organ such as the rectum or urethra, and a second target, which may be an irregularly shaped organ such as the bladder. The region of a target in the ultrasound image may include three-dimensional coordinates in the ultrasound image. Fig. 8 shows a sagittal view of the predicted motion trajectory and organs in the ultrasound image, where L is the predicted motion trajectory, 901 is the prostate, 904 is the rectum, 905 is the urethra, and 906 is the bladder.
To avoid the medical instrument threatening or damaging any organ during puncture sampling, in step S402, if the distance between the predicted motion trajectory and any organ is too small, or if the predicted trajectory passes through any organ, a prompt is output indicating that the predicted trajectory threatens or will damage that organ, so as to prompt the doctor to re-plan the target point or adjust the motion trajectory of the medical instrument, avoiding unnecessary injury to the patient's organs.
In one embodiment, since the first object, such as rectum, urethra, etc., may be represented as an approximately regular three-dimensional shape, such as a cylinder, etc., to facilitate determining the region of the first object (e.g., a regular-shaped organ, etc.) in the ultrasound image, step S401 may include:
S501, acquiring the position of the first target in the ultrasound image;
S502, obtaining the region of the first target in the ultrasound image based on a preset three-dimensional shape and the position of the first target in the ultrasound image.
Alternatively, in step S501, the position of the first target in the ultrasound image may be determined according to a preset distance. The preset distance may be determined from the relative position of the first target in the ultrasound image, or from the relative distance between the first target and the ultrasonic probe. Taking the rectum as the first target: with the patient lying down, the rectum is located below the prostate, and the ultrasonic probe scans the prostate upward through the rectum, so the preset distance may be chosen as, for example, 5 mm (only an example, to be determined according to the patient's condition), and the rectum can then be represented at a position 5 mm from the bottom of the ultrasound image.
Alternatively, in step S501, the position of the first target in the ultrasound image may be marked in the ultrasound image by a user (e.g., a physician). Taking the first object as the urethra as an example, the physician may mark a planar contour of the urethra on the urethra area shown in the ultrasound image, which planar contour represents the position of the urethra in the ultrasound image.
In step S502, the preset three-dimensional shape may be determined according to the three-dimensional shape of the organ; for example, the rectum and the urethra are both approximately cylindrical. Further, in order to show the organ at a reasonable size in the ultrasound image, the preset three-dimensional shape can be scaled according to a preset scaling ratio, and the scaled shape is then generated at the position of the first target in the ultrasound image, yielding the region (i.e., the three-dimensional spatial region) of the first target in the ultrasound image. The preset scaling ratio may be determined based on the proportional size of the organ relative to the ultrasound image.
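A hedged sketch of steps S501 and S502, assuming the preset three-dimensional shape is a cylinder (as the text suggests for the rectum and urethra); the function name, parameters, and point-sampling strategy are illustrative assumptions.

```python
import numpy as np

def cylinder_region(center, axis_dir, radius, height, scale=1.0, n=200):
    # Sample points of a scaled cylinder placed at the first target's
    # position in the ultrasound image coordinate system.
    axis = np.asarray(axis_dir, dtype=float)
    axis /= np.linalg.norm(axis)
    # Two unit vectors orthogonal to the cylinder axis.
    ref = np.array([1.0, 0.0, 0.0]) if abs(axis[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, ref)
    u /= np.linalg.norm(u)
    v = np.cross(axis, u)
    theta = np.random.uniform(0.0, 2.0 * np.pi, n)
    h = np.random.uniform(-height / 2.0, height / 2.0, n)
    return (np.asarray(center, dtype=float)
            + scale * radius * (np.cos(theta)[:, None] * u + np.sin(theta)[:, None] * v)
            + scale * h[:, None] * axis)
```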
In one embodiment, for an irregularly shaped organ such as the bladder, steps S501 to S502 are not suitable for determining its three-dimensional region in the ultrasound image. To solve the problem of determining the three-dimensional region in which an irregularly shaped organ is located, in step S401 the ultrasound image is segmented using an ultrasound image segmentation method to obtain the region of the second target (e.g., an irregularly shaped organ) in the ultrasound image. Specifically, step S401 may further include:
S601, carrying out boundary searching outwards by taking a set starting point as a center according to the size of a pixel value to obtain a plurality of first boundary pixel points, and generating a first curved surface by the plurality of first boundary pixel points, wherein the value of the first boundary pixel points is larger than or equal to a preset first threshold value;
S602, carrying out subdivision processing on the first curved surface by using a subdivision model to obtain a plurality of pixel points after subdivision processing;
And S603, carrying out boundary search on the starting points by taking a plurality of pixel points subjected to subdivision processing as the starting points according to the pixel value, obtaining a plurality of second boundary pixel points, generating a second curved surface by the second boundary pixel points, wherein the value of the second boundary pixel points is smaller than or equal to a preset second threshold value, and the second curved surface is used for representing the region of the second target in the ultrasonic image.
In step S601, the position of the second target in the ultrasound image may be acquired, and a starting point selected from that position. The position of the second target may be obtained by a user such as a doctor marking the region in which the second target (e.g., the bladder) is displayed in the ultrasound image; the starting point may then be the center point or center of gravity of that position. In some alternative embodiments, the starting point may also be marked by the user directly on the region of the ultrasound image in which the second target is located.
In consideration of a certain difference between the pixel value of the organ and the pixel value of the peripheral pixel point of the boundary in the ultrasound image, in step S601, a boundary search is performed by using the pixel value size to obtain a plurality of first boundary pixel points. Specifically, step S601 may include:
Performing a boundary search outward from the starting point as the center according to a spherical coordinate system; if the pixel value of a searched pixel point is greater than or equal to the preset first threshold value, the searched pixel point is taken as a first boundary pixel point.
That is, starting from the set starting point as the center, the search proceeds outward with a preset step size according to a spherical coordinate system until the pixel value of a searched pixel point is greater than or equal to the preset first threshold; that pixel point is a first boundary pixel point. A first curved surface is then generated from the point cloud formed by the plurality of first boundary pixel points, i.e., the first curved surface comprises the plurality of first boundary pixel points.
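A minimal sketch of this outward search over a 3D ultrasound volume, assuming the volume is a numpy array indexed (z, y, x); the ray discretization, step size, and helper name are illustrative assumptions.

```python
import numpy as np

def outward_boundary_search(volume, start, threshold,
                            step=1.0, n_theta=18, n_phi=36, max_r=200.0):
    # March outward from `start` along rays of a spherical coordinate grid;
    # the first voxel with value >= threshold on each ray is a boundary point.
    boundary = []
    for theta in np.linspace(0.0, np.pi, n_theta):                        # polar angle
        for phi in np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False):  # azimuth
            d = np.array([np.cos(theta),
                          np.sin(theta) * np.sin(phi),
                          np.sin(theta) * np.cos(phi)])                   # unit ray direction
            r = step
            while r <= max_r:
                p = np.asarray(start, dtype=float) + r * d
                idx = tuple(np.round(p).astype(int))
                if any(i < 0 or i >= s for i, s in zip(idx, volume.shape)):
                    break  # the ray left the volume without finding a boundary
                if volume[idx] >= threshold:
                    boundary.append(p)  # first boundary pixel point on this ray
                    break
                r += step
    return np.array(boundary)
```

The inward search of step S603 follows the same pattern with the ray direction reversed and the stop condition changed to the pixel value falling to or below the second threshold.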
When the ultrasonic probe acquires the image, boundary pixel points of the organ may be lost or disturbed by environmental noise and the like, so the first curved surface obtained in step S601 may deviate; outputting it directly as the segmentation result would clearly yield low segmentation accuracy. Therefore, to improve segmentation accuracy, in step S602 the subdivision model is used to subdivide the first curved surface into a plurality of subdivided pixel points, and in step S603 a boundary search is performed back toward the starting point, using the subdivided pixel points as starting points, to obtain a plurality of second boundary pixel points from which the second curved surface is generated. The second curved surface thus represents the organ's three-dimensional region in the ultrasound image more accurately, and outputting it as the segmentation result clearly improves the accuracy of image segmentation.
In this embodiment, the subdivision model may be a filter based on the Loop subdivision algorithm, such as VTK's vtkLoopSubdivisionFilter, which inserts edge points and updates the original pixel points for each pixel point in the first curved surface, continuously subdividing to increase the number of faces of the triangle mesh and yielding newly added vertices; these vertices are the subdivided pixel points. That is, the subdivision model subdivides the plurality of first boundary pixel points in the first curved surface to obtain a plurality of subdivided pixel points. For example, in the subdivision model, let P1 be the pixel set after each subdivision, Eps the newly inserted pixel set, and P2 the original pixel set after its positions are updated; then P1 = P2 ∪ Eps. In other alternative embodiments, the subdivision model may be a filter based on a linear subdivision algorithm, a butterfly subdivision algorithm, or the like, which is not limited here.
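A hedged sketch of applying Loop subdivision with VTK's Python bindings, assuming `surface1` is already a triangulated vtkPolyData built from the first boundary pixel points (the surface-construction step itself is omitted here).

```python
import vtk

def subdivide_surface(surface1, levels=2):
    # Subdivide a triangulated surface with the Loop scheme; the vertices of
    # the output mesh are the subdivided pixel points described above.
    subdiv = vtk.vtkLoopSubdivisionFilter()
    subdiv.SetNumberOfSubdivisions(levels)  # each level splits every triangle into four
    subdiv.SetInputData(surface1)
    subdiv.Update()
    return subdiv.GetOutput()
```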
In other words, the positions and pixel values of the subdivided pixel points obtained in step S602 differ from those of the first boundary pixel points in the first curved surface. Therefore, in step S603, taking the subdivided pixel points as starting points, a boundary search is performed toward the original starting point (i.e., inward) with a preset step size according to the spherical coordinate system, until the pixel value of a searched pixel point is smaller than the preset second threshold, yielding a plurality of second boundary pixel points (i.e., pixel points whose value is smaller than or equal to the preset second threshold); the point cloud composed of these second boundary pixel points is then used to generate the second curved surface. As shown in fig. 9, the starting point is O, a first boundary pixel point is X1, the first curved surface is Y1, a subdivided pixel point is X2, a second boundary pixel point is X3, and the second curved surface is Y2.
The preset first threshold and the preset second threshold may be determined according to the pixel value of the organ boundary, and specifically, the preset first threshold and the preset second threshold may be equal.
In one embodiment, as shown in fig. 10, the prompt information includes a first prompt information and a second prompt information, and step S402 may include:
S701, selecting a plurality of trajectory points from the predicted motion trajectory;
S702, obtaining the distance between each trajectory point and the target;
S703, outputting the first prompt information if the distance between any one of the trajectory points and the region of the target in the ultrasound image is smaller than or equal to a preset distance threshold;
S704, outputting the second prompt information if the distances between all the trajectory points and the region of the target in the ultrasound image are greater than the preset distance threshold.
Specifically, the first prompt information is used for indicating that the predicted motion trail threatens the target, and the second prompt information is used for indicating that the predicted motion trail does not threat the target.
In step S701, the predicted motion trajectory may be divided equally into a plurality of segments along its length, and one trajectory point selected from each segment, thereby obtaining a plurality of trajectory points. Alternatively, a plurality of trajectory points may be selected sequentially from the tail end to the head end of the predicted trajectory at a preset interval.
In step S702, the distance between a trajectory point and the target's region in the ultrasound image may be represented as the shortest distance between the trajectory point and the target's three-dimensional region. If this shortest distance is smaller than or equal to the preset distance threshold, the first prompt information is output, indicating that the predicted motion trajectory threatens or would damage the organ, thereby prompting the doctor to re-plan the target point or adjust the motion trajectory of the medical instrument and avoid unnecessary injury to the patient's organs. If the shortest distances between all trajectory points and the target's region are greater than the preset distance threshold, the second prompt information is output, indicating that the predicted trajectory poses no threat to the organ, and the doctor can perform the puncture to complete the biopsy sampling.
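A minimal sketch of steps S701 to S704, assuming the target region is available as a point cloud and using scipy's KD-tree for the nearest-distance queries; the function name and prompt strings are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def trajectory_threat_prompt(trajectory_points, region_points, dist_threshold):
    # Return the first prompt if any sampled trajectory point comes within
    # dist_threshold of the target region, otherwise the second prompt.
    tree = cKDTree(np.asarray(region_points, dtype=float))
    shortest, _ = tree.query(np.asarray(trajectory_points, dtype=float))
    if np.any(shortest <= dist_threshold):
        return "First prompt: the predicted motion trajectory threatens the target"
    return "Second prompt: the predicted motion trajectory poses no threat to the target"
```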
As shown in fig. 11, an embodiment of the application provides a flow for outputting prompt information based on the predicted motion track, as follows:
S11, dividing the organs into regular-shaped organs and irregular-shaped organs;
S12, determining the region of each regular-shaped organ in the ultrasonic image based on the preset three-dimensional shape;
S13, segmenting the region of each irregular-shaped organ in the ultrasonic image by using an ultrasonic image segmentation method;
S14, calculating the distance between the predicted motion track and each organ;
S15, outputting prompt information.
Further, as shown in fig. 12, an embodiment of the present application also provides an ultrasound image segmentation method, including:
S801, acquiring an ultrasonic image captured by an ultrasonic probe;
S802, taking a starting point set in the ultrasonic image as a center, performing a boundary search outwards according to the pixel values to obtain a plurality of first boundary pixel points, and generating a first curved surface from the plurality of first boundary pixel points, wherein the value of each first boundary pixel point is greater than or equal to a preset first threshold;
S803, performing subdivision processing on the first curved surface by using a subdivision model to obtain a plurality of subdivided pixel points;
S804, taking the plurality of subdivided pixel points as starting positions, performing a boundary search towards the starting point according to the pixel values to obtain a plurality of second boundary pixel points, and generating a second curved surface from the plurality of second boundary pixel points, wherein the value of each second boundary pixel point is smaller than or equal to a preset second threshold, the second curved surface representing the region of the second target in the ultrasonic image.
In step S801, the ultrasound image may be obtained by scanning a sagittal plane of the patient's body part (such as the prostate) with the ultrasound probe.
The second target may be an organ, for example an irregularly shaped organ such as the bladder. The principles and functions of steps S802 to S804 correspond to those of steps S601 to S603 in the track display method described above, and are not repeated here.
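For completeness, the outward search of step S802 (corresponding to step S601) might look as follows under the same assumed NumPy volume representation; the angular sampling densities `n_theta` and `n_phi` are illustrative choices, since the embodiment only specifies rays cast in a sphere coordinate system with a preset step size.

```python
import numpy as np

def search_outward(volume, seed, first_threshold, step=1.0, n_theta=32, n_phi=16):
    """Cast rays outwards from the seed in a sphere coordinate system and keep,
    per ray, the first pixel whose value reaches the preset first threshold."""
    seed = np.asarray(seed, dtype=float)
    boundary = []
    for theta in np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False):
        for phi in np.linspace(0.0, np.pi, n_phi):
            d = np.array([np.sin(phi) * np.cos(theta),      # unit ray direction
                          np.sin(phi) * np.sin(theta),
                          np.cos(phi)])
            p = seed.copy()
            while True:
                idx = np.round(p).astype(int)
                if not all(0 <= i < s for i, s in zip(idx, volume.shape)):
                    break                                   # ray left the volume without a hit
                if volume[tuple(idx)] >= first_threshold:   # first boundary pixel found
                    boundary.append(p.copy())
                    break
                p += step * d                               # march outwards by the preset step
    return np.array(boundary)                               # point cloud of surface Y1
```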
It should be understood that, although the steps in the flowcharts of the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in these flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the application also provides a track display device for implementing the track display method described above. The implementation of the solution provided by the device is similar to that described for the method, so for the specific limitations of the one or more embodiments of the track display device provided below, reference may be made to the limitations of the track display method above, which are not repeated here.
In one embodiment, as shown in fig. 13, there is provided a trajectory display device 100 including:
a coordinate conversion module 110, configured to calculate coordinates of the guiding tool in an image coordinate system corresponding to the ultrasound probe;
a rotation module 120, configured to control rotation of the ultrasound probe according to the coordinates of the guiding tool in the image coordinate system and a first sagittal plane currently scanned by the ultrasound probe;
an image acquisition module 130, configured to acquire an ultrasonic image scanned by the ultrasound probe after rotation; and
a trajectory display module 140, configured to display a predicted motion trajectory of the medical instrument in the ultrasound image based on the coordinates of the guiding tool in the image coordinate system and the length of the medical instrument mounted in the guiding tool.
In one embodiment, the track display device 100 is further configured to:
acquiring the position of a target point; and
controlling the initial point of the guiding tool to aim at the target point, and positioning the guiding tool based on the position of the target point to obtain the coordinates of the guiding tool in the robot coordinate system.
The coordinate conversion module 110 is configured to obtain a coordinate conversion relation between the robot coordinate system and the image coordinate system, and to convert the coordinates of the guiding tool in the robot coordinate system into coordinates in the image coordinate system based on the coordinate conversion relation.
In one embodiment, the position of the target point comprises coordinates of the target point in the robot coordinate system, and the trajectory display device 100 is further configured to:
acquiring the coordinates of the initial point of the guiding tool in the robot coordinate system;
obtaining a vertical angle and a horizontal angle of the initial point of the guiding tool relative to the target point based on the coordinates of the target point in the robot coordinate system and the coordinates of the initial point of the guiding tool in the robot coordinate system; and
controlling the initial point of the guiding tool to aim at the target point according to the vertical angle and the horizontal angle, and positioning the guiding tool based on the position of the target point to obtain the coordinates of the guiding tool in the robot coordinate system.
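By way of illustration, the aiming angles might be computed as follows, assuming the robot coordinate system has a vertical z-axis (an axis convention the embodiment does not fix):

```python
import numpy as np

def aiming_angles(initial_point, target_point):
    """Vertical and horizontal angles of the line from the guiding tool's
    initial point to the target point, both given in the robot coordinate system."""
    v = np.asarray(target_point, dtype=float) - np.asarray(initial_point, dtype=float)
    horizontal = np.degrees(np.arctan2(v[1], v[0]))                  # rotation about the H-axis
    vertical = np.degrees(np.arctan2(v[2], np.hypot(v[0], v[1])))    # rotation about the V-axis
    return vertical, horizontal
```

These two angles correspond to the V-axis and H-axis rotations of the robotic arm described in the hardware embodiment below.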
In one embodiment, the rotation module 120 is further configured to obtain a rotation angle of the ultrasound probe based on the coordinates of the guiding tool in the image coordinate system and the first sagittal plane currently scanned by the ultrasound probe, and to control the ultrasound probe to rotate by that angle so that the medical instrument mounted in the guiding tool lies in the second sagittal plane scanned by the ultrasound probe.
In one embodiment, the rotation module 120 is further configured to:
acquiring the coordinates of the ultrasound probe in the image coordinate system and the first sagittal plane currently scanned by the ultrasound probe;
obtaining the coordinates in the image coordinate system of the medical instrument mounted in the guiding tool based on the coordinates of the guiding tool in the image coordinate system;
obtaining a second sagittal plane based on the coordinates of the ultrasound probe in the image coordinate system and the coordinates of the medical instrument in the image coordinate system, the second sagittal plane being the plane in which the ultrasound probe and the medical instrument are located; and
obtaining the included angle between the first sagittal plane and the second sagittal plane, the included angle being the rotation angle of the ultrasound probe.
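A minimal sketch of the included-angle computation follows, assuming each sagittal plane is represented by a normal vector in the image coordinate system; the second plane's normal could, for instance, be taken as the cross product of the probe axis and the probe-to-instrument vector, though the embodiment does not prescribe a representation.

```python
import numpy as np

def probe_rotation_angle(normal_first, normal_second):
    """Included angle between the first and second sagittal planes, i.e. the
    angle through which the ultrasound probe is rotated."""
    n1 = np.asarray(normal_first, dtype=float)
    n2 = np.asarray(normal_second, dtype=float)
    cos_angle = abs(np.dot(n1, n2)) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
```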
In one embodiment, the track display device 100 is further configured to:
acquiring a region of at least one target in the ultrasound image; and
outputting prompt information based on the distance between the predicted motion trajectory and the region of the target in the ultrasound image.
In one embodiment, the at least one target includes a first target, and the track display device 100 is further configured to:
acquiring the position of the first target in the ultrasound image; and
obtaining the region of the first target in the ultrasound image based on a preset three-dimensional shape and the position of the first target in the ultrasound image.
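As an illustration of this regular-organ case, the sketch below rasterizes a scaled ellipsoid at the first target's position; treating the preset three-dimensional shape as an ellipsoid, and the particular scale factor, are demonstration assumptions only.

```python
import numpy as np

def preset_shape_region(volume_shape, center, semi_axes, scale=1.0):
    """Boolean mask of a scaled ellipsoid placed at the first target's position,
    approximating the region of a regular-shaped organ in the ultrasound volume."""
    az, ay, ax = np.asarray(semi_axes, dtype=float) * scale    # semi-axes in (z, y, x) order
    z, y, x = np.ogrid[:volume_shape[0], :volume_shape[1], :volume_shape[2]]
    cz, cy, cx = center
    return (((z - cz) / az) ** 2 + ((y - cy) / ay) ** 2 + ((x - cx) / ax) ** 2) <= 1.0
```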
In one embodiment, the at least one target further includes a second target, and the track display device 100 is further configured to:
taking a set starting point as a center, performing a boundary search outwards according to the pixel values to obtain a plurality of first boundary pixel points, and generating a first curved surface from the plurality of first boundary pixel points, the value of each first boundary pixel point being greater than or equal to a preset first threshold;
performing subdivision processing on the first curved surface by using a subdivision model to obtain a plurality of subdivided pixel points; and
taking the plurality of subdivided pixel points as starting positions, performing a boundary search towards the starting point according to the pixel values to obtain a plurality of second boundary pixel points, and generating a second curved surface from the plurality of second boundary pixel points, the value of each second boundary pixel point being smaller than or equal to a preset second threshold, the second curved surface representing the region of the second target in the ultrasound image.
In one embodiment, the prompt information includes first prompt information and second prompt information, and the track display device 100 is further configured to:
selecting a plurality of track points from the predicted motion track;
outputting the first prompt information if the distance between any one of the track points and the region of the target in the ultrasonic image is smaller than or equal to a preset distance threshold, the first prompt information indicating that the predicted motion track threatens the target; and
outputting the second prompt information if the distances between all the track points and the region of the target in the ultrasonic image are greater than the preset distance threshold, the second prompt information indicating that the predicted motion track poses no threat to the target.
Each of the modules in the track display device 100 described above may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in, or independent of, a processor in the computer device in hardware form, or may be stored in a memory of the computer device in software form, so that the processor can call them and execute the operations corresponding to the above modules.
Based on the same inventive concept, an embodiment of the application also provides an ultrasound image segmentation device for implementing the ultrasound image segmentation method described above. The implementation of the solution provided by the device is similar to that described for the method, so for the specific limitations of the one or more embodiments of the ultrasound image segmentation device provided below, reference may be made to the limitations of the ultrasound image segmentation method above, which are not repeated here.
In one embodiment, as shown in fig. 14, there is provided an ultrasound image segmentation apparatus 200 including:
an image acquisition module 210, configured to acquire an ultrasound image captured by the ultrasound probe;
a first searching module 220, configured to perform a boundary search outwards according to the pixel values, with the starting point set in the ultrasound image as a center, to obtain a plurality of first boundary pixel points, and to generate a first curved surface from the plurality of first boundary pixel points, the value of each first boundary pixel point being greater than or equal to a preset first threshold;
a subdivision module 230, configured to perform subdivision processing on the first curved surface by using a subdivision model to obtain a plurality of subdivided pixel points; and
a second searching module 240, configured to perform a boundary search towards the starting point according to the pixel values, taking the subdivided pixel points as starting positions, to obtain a plurality of second boundary pixel points, and to generate a second curved surface from the plurality of second boundary pixel points, the value of each second boundary pixel point being smaller than or equal to a preset second threshold, the second curved surface representing the region of the second target in the ultrasound image.
In one embodiment, the first search module 220 is further configured to:
carrying out a boundary search outwards in the sphere coordinate system with a preset step size, taking the starting point as the center; if the pixel value of a searched pixel point is greater than or equal to the preset first threshold, the searched pixel point is taken as a first boundary pixel point.
Each of the modules in the above-described ultrasound image segmentation apparatus 200 may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in, or independent of, a processor in the computer device in hardware form, or may be stored in a memory of the computer device in software form, so that the processor can call them and execute the operations corresponding to the above modules.
Further, as shown in fig. 15, in another embodiment, the present application provides a surgical robot 300, the surgical robot 300 may include:
an ultrasound probe 310, configured to acquire ultrasound images;
a guiding tool 320, in which a medical instrument 321 is mounted;
a control device 330, comprising a memory and a processor, the memory storing a computer program, the processor implementing the steps of the trajectory display method and/or the ultrasound image segmentation method described above when executing the computer program; and
a display 340, configured to display, in the ultrasound image, the predicted motion trajectory of the medical instrument and/or the region of at least one target in the ultrasound image.
In other embodiments, the surgical robot 300 may further include a motor controlled by the control device 330 for driving the ultrasound probe to move or rotate, such as a stepping motor or a rotating motor, which for example drives the ultrasound probe to scan through the patient's rectum to obtain an ultrasound image of the prostate. The surgical robot 300 may further comprise a robotic arm controlled by the control device 330 for moving and positioning the guiding tool; the robotic arm drives the guiding tool to assist the doctor in performing a puncture operation, in which a medical instrument in the guiding tool punctures from an initial point on the surface of the prostate to a target point inside the prostate for biopsy sampling. In other alternative embodiments, the surgical robot 300 may also include other mechanisms controlled by the control device to perform corresponding operations.
Fig. 16 is a hardware configuration diagram of a surgical robot according to an embodiment. The surgical robot may comprise a power supply, a circuit board, a medical computer, an image grabber, an ultrasound device, a plurality of drivers and a plurality of motors. The power supply provides the electric energy required for operation to each piece of hardware in the surgical robot. The ultrasound device comprises an ultrasound probe and is used for acquiring ultrasound images. The image grabber captures the ultrasound images acquired by the ultrasound device and transmits them to the medical computer, which displays them. The circuit board controls each driver to drive the corresponding motor to execute moving or rotating actions. The motors include, for example, a stepping motor, a rotating motor, a V-axis motor and an H-axis motor: the stepping motor drives the ultrasound probe to move stepwise, the rotating motor drives the ultrasound probe to rotate, the V-axis motor drives the robotic arm to rotate the guiding tool about the V-axis (e.g. the vertical angle), and the H-axis motor drives the robotic arm to rotate the guiding tool about the H-axis (e.g. the horizontal angle).
As shown in fig. 17, the flow by which the software of the surgical robot 300 reads and writes the serial port to control a motor is as follows:
S21, selecting the motor to be controlled from the plurality of motors;
S22, acquiring a motor control instruction, the motor control instruction representing how the motor is to be controlled;
S23, writing the motor control instruction to the serial port;
S24, creating a thread and starting to execute the motor control instruction;
S25, reading the motor return value and the motor position;
S26, if the return value is true, the motor read instruction is normal;
S27, if the return value is false, the motor read instruction is abnormal; the abnormality is written to a log and the serial port is closed.
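A hedged sketch of this flow using the third-party pyserial package is given below; the port name, baud rate and command encoding are assumptions, since the embodiment does not specify the motor protocol.

```python
import logging
import threading
import serial  # pyserial

def run_motor_command(port_name, command):
    """Steps S23-S27: write a motor control instruction to the serial port,
    read back the return value in a worker thread, and handle abnormal replies."""
    port = serial.Serial(port_name, baudrate=115200, timeout=1.0)   # assumed baud rate

    def read_back():
        reply = port.readline().decode(errors="replace").strip()    # S25: return value and position
        if reply.startswith("true"):                                # S26: read instruction normal
            logging.info("motor reply: %s", reply)
        else:                                                       # S27: abnormal, log and close
            logging.error("motor read failed: %s", reply)
            port.close()

    port.write(command.encode())                                    # S23: write the instruction
    worker = threading.Thread(target=read_back)                     # S24: create a thread
    worker.start()
    worker.join()
    if port.is_open:
        port.close()
```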
It should be noted that the implementation of the solution provided by the surgical robot 300 is similar to that described for the trajectory display method and/or the ultrasound image segmentation method above; for the specific principles and functions, reference may be made to those methods, which are not repeated here.
In another embodiment, a computer device is provided, which may be a terminal, and an internal structure diagram thereof may be as shown in fig. 18. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless mode can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by the processor to implement a trajectory display method and/or an ultrasound image segmentation method.
It will be appreciated by those skilled in the art that the structure shown in FIG. 18 is merely a block diagram of some of the structures associated with the present inventive arrangements and is not limiting of the computer device to which the present inventive arrangements may be applied, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having stored therein a computer program, which processor, when executing the computer program, implements the steps of the trajectory display method and/or the ultrasound image segmentation method described above.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, the steps of the trajectory display method and/or the ultrasound image segmentation method described above are implemented.
In one embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the trajectory display method and/or the ultrasound image segmentation method described above.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium which, when executed, may include the flows of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory may include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take various forms such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be, but are not limited to, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, or data processing logic units based on quantum computing.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction between the combinations of these technical features, they should be considered to be within the scope of this description.
The foregoing examples represent only a few embodiments of the application and are described in detail, but are not thereby to be construed as limiting the scope of the application. It should be noted that several variations and modifications can be made by those skilled in the art without departing from the spirit of the application, all of which fall within the protection scope of the application. Accordingly, the protection scope of the application shall be subject to the appended claims.

Claims (10)

1. A trajectory display method, characterized in that the method comprises:
calculating coordinates of a guiding tool in an image coordinate system corresponding to an ultrasound probe;
controlling rotation of the ultrasound probe according to the coordinates of the guiding tool in the image coordinate system and a first sagittal plane currently scanned by the ultrasound probe;
acquiring an ultrasound image scanned by the ultrasound probe after rotation;
displaying a predicted motion trajectory of a medical instrument mounted in the guiding tool in the ultrasound image, based on the coordinates of the guiding tool in the image coordinate system and the length of the medical instrument;
acquiring a region of at least one target in the ultrasound image; and
outputting prompt information based on the distance between the predicted motion trajectory and the region of the target in the ultrasound image;
wherein the at least one target includes a first target, and acquiring the region of the at least one target in the ultrasound image comprises:
acquiring the position of the first target in the ultrasound image; and
obtaining the region of the first target in the ultrasound image based on a preset three-dimensional shape and the position of the first target in the ultrasound image, including: scaling the preset three-dimensional shape according to a preset scaling ratio, and generating the scaled preset three-dimensional shape at the position of the first target in the ultrasound image to obtain the region of the first target in the ultrasound image.

2. The method according to claim 1, characterized in that the method further comprises:
acquiring the position of a target point; and
controlling an initial point of the guiding tool to aim at the target point, and positioning the guiding tool based on the position of the target point to obtain coordinates of the guiding tool in a robot coordinate system;
wherein calculating the coordinates of the guiding tool in the image coordinate system corresponding to the ultrasound probe comprises:
obtaining a coordinate conversion relationship between the robot coordinate system and the image coordinate system; and
converting the coordinates of the guiding tool in the robot coordinate system into coordinates in the image coordinate system based on the coordinate conversion relationship.

3. The method according to claim 2, characterized in that the position of the target point includes coordinates of the target point in the robot coordinate system, and controlling the initial point of the guiding tool to aim at the target point and positioning the guiding tool based on the position of the target point to obtain the coordinates of the guiding tool in the robot coordinate system comprises:
acquiring coordinates of the initial point of the guiding tool in the robot coordinate system;
obtaining a vertical angle and a horizontal angle of the initial point of the guiding tool relative to the target point based on the coordinates of the target point in the robot coordinate system and the coordinates of the initial point of the guiding tool in the robot coordinate system; and
controlling the initial point of the guiding tool to aim at the target point according to the vertical angle and the horizontal angle, and positioning the guiding tool based on the position of the target point to obtain the coordinates of the guiding tool in the robot coordinate system.

4. The method according to claim 1, characterized in that controlling the rotation of the ultrasound probe according to the coordinates of the guiding tool in the image coordinate system and the first sagittal plane currently scanned by the ultrasound probe comprises:
obtaining a rotation angle of the ultrasound probe based on the coordinates of the guiding tool in the image coordinate system and the first sagittal plane currently scanned by the ultrasound probe, and controlling the ultrasound probe to rotate by the rotation angle so that the medical instrument mounted in the guiding tool lies in a second sagittal plane scanned by the ultrasound probe.

5. The method according to claim 4, characterized in that obtaining the rotation angle of the ultrasound probe based on the coordinates of the guiding tool in the image coordinate system and the first sagittal plane currently scanned by the ultrasound probe comprises:
acquiring coordinates of the ultrasound probe in the image coordinate system and the first sagittal plane currently scanned by the ultrasound probe;
obtaining coordinates in the image coordinate system of the medical instrument mounted in the guiding tool based on the coordinates of the guiding tool in the image coordinate system;
obtaining a second sagittal plane based on the coordinates of the ultrasound probe in the image coordinate system and the coordinates of the medical instrument in the image coordinate system, the second sagittal plane being the plane in which the ultrasound probe and the medical instrument are located; and
obtaining an included angle between the first sagittal plane and the second sagittal plane, the included angle being the rotation angle of the ultrasound probe.

6. The method according to claim 1, characterized in that the at least one target further includes a second target, and acquiring the region of the at least one target in the ultrasound image comprises:
taking a set starting point as a center, performing a boundary search outwards according to the pixel values to obtain a plurality of first boundary pixel points, and generating a first curved surface from the plurality of first boundary pixel points, the value of each first boundary pixel point being greater than or equal to a preset first threshold;
performing subdivision processing on the first curved surface by using a subdivision model to obtain a plurality of subdivided pixel points; and
taking the plurality of subdivided pixel points as starting positions, performing a boundary search towards the starting point according to the pixel values to obtain a plurality of second boundary pixel points, and generating a second curved surface from the plurality of second boundary pixel points, the value of each second boundary pixel point being smaller than or equal to a preset second threshold, the second curved surface representing the region of the second target in the ultrasound image.

7. The method according to claim 1, characterized in that taking the set starting point as a center and performing a boundary search outwards according to the pixel values to obtain a plurality of first boundary pixel points comprises:
taking the starting point as the center, performing a boundary search outwards in a sphere coordinate system with a preset step size; if the pixel value of a searched pixel point is greater than or equal to the preset first threshold, the searched pixel point is a first boundary pixel point.

8. The method according to claim 1, characterized in that the prompt information includes first prompt information and second prompt information, and outputting the prompt information based on the distance between the predicted motion trajectory and the region of the target in the ultrasound image comprises:
selecting a plurality of trajectory points from the predicted motion trajectory;
obtaining the distance between each trajectory point and the target;
outputting the first prompt information if the distance between any one of the trajectory points and the region of the target in the ultrasound image is smaller than or equal to a preset distance threshold, the first prompt information indicating that the predicted motion trajectory threatens the target; and
outputting the second prompt information if the distances between all the trajectory points and the region of the target in the ultrasound image are greater than the preset distance threshold, the second prompt information indicating that the predicted motion trajectory poses no threat to the target.

9. A surgical robot, characterized by comprising:
an ultrasound probe for acquiring ultrasound images;
a guiding tool, in which a medical instrument is mounted;
a control device comprising a memory and a processor, the memory storing a computer program, the processor implementing the steps of the method of any one of claims 1 to 8 when executing the computer program; and
a display for displaying, in the ultrasound image, the predicted motion trajectory of the medical instrument and/or the region of at least one target in the ultrasound image.

10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 8.
CN202210851098.6A 2022-07-20 2022-07-20 Trajectory display method, surgical robot and computer-readable storage medium Active CN115211963B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210851098.6A CN115211963B (en) 2022-07-20 2022-07-20 Trajectory display method, surgical robot and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN115211963A CN115211963A (en) 2022-10-21
CN115211963B (en) 2024-12-27

Family

ID=83611560

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210851098.6A Active CN115211963B (en) 2022-07-20 2022-07-20 Trajectory display method, surgical robot and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN115211963B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108210024A (en) * 2017-12-29 2018-06-29 威朋(苏州)医疗器械有限公司 Operation piloting method and system
CN108289714A (en) * 2015-12-04 2018-07-17 皇家飞利浦有限公司 System and workflow for the intervention of mesh free Perineal approach prostate
CN114052848A (en) * 2020-07-30 2022-02-18 深圳市理邦精密仪器股份有限公司 Image puncture guiding method, medical device, and storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6379302B1 (en) * 1999-10-28 2002-04-30 Surgical Navigation Technologies Inc. Navigation information overlay onto ultrasound imagery
US9498182B2 (en) * 2012-05-22 2016-11-22 Covidien Lp Systems and methods for planning and navigation
CN112292710A (en) * 2018-05-18 2021-01-29 皇家飞利浦有限公司 Multimodal Image Registration
CN109063740A (en) * 2018-07-05 2018-12-21 高镜尧 The detection model of ultrasonic image common-denominator target constructs and detection method, device
CN109044400A (en) * 2018-08-31 2018-12-21 上海联影医疗科技有限公司 Ultrasound image mask method, device, processor and readable storage medium storing program for executing
CN109259822B (en) * 2018-11-07 2019-12-24 西安交通大学 Three-dimensional guidance and dynamic real-time monitoring system and method for focused ultrasound therapy based on passive cavitation detection imaging
KR102247072B1 (en) * 2019-04-04 2021-04-29 경북대학교 산학협력단 Shape restoration device and method using ultrasonic probe
EP3804630A1 (en) * 2019-10-10 2021-04-14 Koninklijke Philips N.V. Ultrasound object zoom tracking
CN112472139B (en) * 2020-12-18 2023-01-03 深圳市德力凯医疗设备股份有限公司 Imaging parameter configuration method, storage medium and ultrasonic equipment
CN113133814A (en) * 2021-04-01 2021-07-20 上海复拓知达医疗科技有限公司 Augmented reality-based puncture surgery navigation device and computer-readable storage medium

Also Published As

Publication number Publication date
CN115211963A (en) 2022-10-21

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant