
CN113081268A - AR and IoT based surgical guidance system - Google Patents

AR and IoT based surgical guidance system

Info

Publication number
CN113081268A
CN113081268A
Authority
CN
China
Prior art keywords
mixed reality glasses
point cloud
data
guidance system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110349330.1A
Other languages
Chinese (zh)
Inventor
徐欣
钱广璞
陈罡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Electric Group Corp
Original Assignee
Shanghai Electric Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Electric Group Corp filed Critical Shanghai Electric Group Corp
Priority to CN202110349330.1A priority Critical patent/CN113081268A/en
Publication of CN113081268A publication Critical patent/CN113081268A/en
Pending legal-status Critical Current

Classifications

    • A – HUMAN NECESSITIES
    • A61 – MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B – DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 – Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 – Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 – Tracking techniques
    • A61B 2034/2065 – Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a surgical guidance system based on AR and IoT, comprising mixed reality glasses and a communication module. The mixed reality glasses are communicatively connected to a medical data center through the communication module; the glasses capture hover gestures and, according to those gestures, retrieve and display the corresponding medical data from the medical data center. By combining AR and IoT, the invention allows the operating surgeon to query medical data through hover-gesture operations, including freely rotating, moving, enlarging, and shrinking medical images, all without touching a mouse or keyboard. This preserves aseptic technique and offers genuine convenience to the surgeon.

Description

AR and IoT based surgical guidance system
Technical Field
The invention belongs to the technical field of augmented reality, and in particular relates to a surgical guidance system based on AR (Augmented Reality) and IoT (Internet of Things).
Background
In conventional surgical practice, procedures rely largely on the operating surgeon's experience and surgical openings are large; meanwhile, the patient's actual anatomy can deviate from the medical images the surgeon holds in memory, and consulting CT (Computed Tomography), X-ray, and other images mid-procedure is highly inconvenient. Surgical schemes that rely on navigation equipment typically spread the navigated content across several separate displays, whose viewing angle differs from the real physical position of the surgical site; the surgeon must repeatedly look up at a screen and down at the operative field, which quickly causes fatigue. Navigation equipment is also bulky and heavy and demands considerable space, and because aseptic technique forbids the surgeon from touching it, an assistant must be provided to operate it, which is a significant limitation.
Disclosure of Invention
The technical problem to be solved by the invention is to overcome the prior-art inconvenience of consulting medical images during surgery by providing a surgical guidance system based on AR and IoT.
The invention solves the technical problems through the following technical scheme:
the invention provides an operation guiding system based on AR and IoT, which comprises mixed reality glasses and a communication module;
the mixed reality glasses are in communication connection with the medical data center through the communication module;
the mixed reality glasses are used for acquiring the suspension gesture, and acquiring and displaying corresponding medical data from the medical data center according to the suspension gesture.
Preferably, the medical data includes at least one of medical record data, two-dimensional medical image data, and three-dimensional medical image data.
Preferably, the medical data includes medical image data; the mixed reality glasses display a corresponding virtual image according to the medical image data, and further enlarge or shrink the virtual image according to the hover gesture.
Preferably, the medical data includes three-dimensional medical image data; the mixed reality glasses display a corresponding three-dimensional virtual image according to the three-dimensional medical image data, and further adjust the viewing angle of the three-dimensional virtual image according to the hover gesture.
Preferably, the mixed reality glasses include HoloLens mixed reality glasses.
Preferably, the surgical guidance system further comprises a first positioner, the first positioner is provided with a first marker, and the first positioner is rigidly connected to the surgical target so that the two together form a first object;
the mixed reality glasses are further used for acquiring, in real time, a first spatial coordinate corresponding to the first marker, and for displaying a first three-dimensional virtual image corresponding to the first object according to the first spatial coordinate and first point cloud coordinate data, where the first point cloud coordinate data is the point cloud coordinate data corresponding to the first object.
Preferably, the mixed reality glasses obtain a coordinate adjustment value from the first spatial coordinate and first marker point cloud coordinates, the first marker point cloud coordinates being the portion of the first object's point cloud coordinate data that corresponds to the first marker;
the mixed reality glasses adjust the point cloud coordinates of the surgical target according to the coordinate adjustment value to obtain display coordinates of the surgical target, the surgical target's point cloud coordinates being the portion of the first object's point cloud coordinate data that corresponds to the surgical target;
the mixed reality glasses generate the first three-dimensional virtual image from the first spatial coordinate and the display coordinates of the surgical target.
Preferably, the point cloud coordinate data corresponding to the first object is obtained by CT-scanning the first object.
Preferably, the surgical guidance system further comprises a second positioner, the second positioner is provided with a second marker, and the second positioner is rigidly connected to the surgical instrument so that the two together form a second object;
the mixed reality glasses are further used for acquiring, in real time, a second spatial coordinate corresponding to the second marker, and for displaying a second three-dimensional virtual image corresponding to the second object according to the second spatial coordinate and second point cloud coordinate data, where the second point cloud coordinate data is the point cloud coordinate data corresponding to the second object.
Preferably, the mixed reality glasses display the first three-dimensional virtual image with a see-through effect.
Preferably, the surgical instrument comprises an operating part that contacts the surgical target to perform the surgery, and the mixed reality glasses display the CT images corresponding to the contact point between the operating part and the surgical target.
Preferably, the CT images include a first CT image and a second CT image, and the section plane of the first CT image is orthogonal to the section plane of the second CT image.
The positive effects of the invention are as follows: by combining AR and IoT, the invention allows the operating surgeon to query medical data through hover-gesture operations, including freely rotating, moving, enlarging, and shrinking medical images, all without touching a mouse or keyboard; this preserves aseptic technique and offers genuine convenience to the surgeon.
Drawings
Fig. 1 is a schematic diagram of an AR and IoT based surgical guidance system of embodiment 1 of the present invention;
fig. 2 is a schematic diagram of an AR and IoT based surgical guidance system of embodiment 2 of the present invention;
fig. 3 is a schematic diagram of a first object of the AR and IoT based surgical guidance system of embodiment 2 of the present invention;
fig. 4 is a schematic diagram of a three-dimensional model of a first object of an AR and IoT based surgical guidance system of embodiment 2 of the present invention;
fig. 5 is a schematic diagram of an operational scenario of an AR and IoT based surgical guidance system of embodiment 2 of the present invention;
fig. 6 is a schematic view of the field-of-view image of an AR and IoT based surgical guidance system of embodiment 2 of the present invention.
Detailed Description
The invention is further illustrated by the following examples, which are not intended to limit the scope of the invention.
Example 1
This embodiment provides an AR and IoT based surgical guidance system. Referring to fig. 1, the system includes mixed reality glasses 101 and a communication module 102. The mixed reality glasses 101 are communicatively connected to the medical data center 2 through the communication module 102; the glasses capture hover gestures and, according to those gestures, acquire and display the corresponding medical data from the medical data center 2.
In a specific implementation, the mixed reality glasses 101 are HoloLens mixed reality glasses (a mixed reality head-mounted display developed by Microsoft Corporation). The surgeon wears the mixed reality glasses 101, which show a selection menu on their display interface; the surgeon selects the medical data to be viewed by hover gesture, and the glasses run a preset IoT program to recognize the user's operating gestures.
For example, when the surgeon selects medical records by hover gesture, the mixed reality glasses 101 recognize that gesture, acquire the corresponding medical record information from the medical data center, and show it on the display interface.
Likewise, when the surgeon selects a two-dimensional medical image by hover gesture, the mixed reality glasses 101 recognize the gesture, acquire the corresponding two-dimensional medical image from the medical data center, and show it on the display interface. Two-dimensional medical images include, but are not limited to, CT images and X-ray images. The mixed reality glasses 101 also recognize filtering gestures, so that the surgeon can pick out a target two-dimensional image from a set of images; zoom gestures, which enlarge or shrink the image on the display interface; and drag gestures, which move the image to any desired position on the display interface.
Similarly, when the surgeon selects a three-dimensional medical image by hover gesture, the mixed reality glasses 101 recognize the gesture, acquire the corresponding three-dimensional medical data from the medical data center, and show the corresponding three-dimensional image on the display interface. Three-dimensional medical data includes, but is not limited to, point cloud data of the affected part (surgical target 3) obtained from a CT scan. As with two-dimensional images, the glasses recognize filtering, zoom, and drag gestures; in addition, they recognize rotation gestures, which rotate the displayed three-dimensional image to whatever angle is most convenient to observe.
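The rotate, drag, and zoom operations described above amount to simple geometric transforms applied to the displayed model once a gesture has been recognized. The patent does not disclose an implementation; the following is a minimal sketch in Python with NumPy, where the function name and gesture labels are illustrative assumptions and gesture recognition itself is assumed to happen upstream in the glasses' IoT program:

```python
import numpy as np

def apply_gesture(model_points, gesture, amount):
    """Apply a recognized hover gesture to a displayed 3-D medical model.

    model_points: (N, 3) array of model vertex coordinates.
    gesture: one of "zoom", "drag", "rotate" (hypothetical labels).
    amount: scalar scale factor, (3,) translation vector, or angle in
            radians about the vertical axis, depending on the gesture.
    """
    pts = np.asarray(model_points, dtype=float)
    centroid = pts.mean(axis=0)
    if gesture == "zoom":          # enlarge/shrink about the model centroid
        return centroid + (pts - centroid) * amount
    if gesture == "drag":          # move the model to a new position
        return pts + np.asarray(amount, dtype=float)
    if gesture == "rotate":        # rotate about the vertical (y) axis
        c, s = np.cos(amount), np.sin(amount)
        rot = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
        return centroid + (pts - centroid) @ rot.T
    raise ValueError(f"unknown gesture: {gesture}")
```

In a real HoloLens application these transforms would be applied to the scene object's pose rather than to raw vertex arrays, but the geometry is the same.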
The mixed reality glasses 101 are configured through the IoT program so that their camera captures and recognizes hover gestures and performs the corresponding medical-data display operation for each gesture; this can be implemented by those skilled in the art from the above description together with the working principle of the mixed reality glasses 101, and is not repeated here.
It is understood that the communication module 102 is a wireless communication module. In a specific implementation, the communication module 102 is a Wi-Fi module, a Bluetooth module, a 4G (fourth-generation mobile communication) module, a 5G (fifth-generation mobile communication) module, or the like.
By combining AR and IoT, this surgical guidance system allows the operating surgeon to query medical data through hover-gesture operations, including freely rotating, moving, enlarging, and shrinking medical images, all without touching a mouse or keyboard; this preserves aseptic technique and offers genuine convenience to the surgeon.
Example 2
On the basis of embodiment 1, this embodiment provides an AR and IoT based surgical guidance system. Referring to fig. 2, the surgical guidance system further includes a first positioner 103. The first positioner 103 is provided with a first marker 131 and is rigidly connected to the surgical target 3, so that positioner and target together form a first object.
The mixed reality glasses 101 are further configured to obtain a first space coordinate corresponding to the first marker 131 in real time, and display a first three-dimensional virtual image corresponding to the first object according to the first space coordinate and the first point cloud coordinate data, where the first point cloud coordinate data is point cloud coordinate data corresponding to the first object.
In a specific implementation, take hip replacement surgery as an example. Before performing the surgery (see fig. 3), the first positioner 103 is rigidly connected to the surgical target 3 so that the two form a first object; in hip replacement surgery, the first positioner 103 is rigidly attached to the pelvis of the hip joint to be replaced. Until the operation is complete, the first positioner 103 and the surgical target 3 keep this original rigid connection with no relative displacement.
Then a CT scan is performed on the pelvic region where the first positioner 103 is installed, yielding the point cloud coordinate data of the first object. Referring to fig. 4, a three-dimensional model of the first object can be obtained by fitting this point cloud data; the model comprises the pelvis portion and the first positioner 103 portion.
Next, referring to figs. 5 and 6, when performing the operation the surgeon wears the mixed reality glasses 101 and holds the surgical instrument 4. For convenience of explanation, fig. 5 shows the image displayed in the field of view of the mixed reality glasses 101, i.e., the image the surgeon observes through the glasses, as the field-of-view schematic image 7.
The mixed reality glasses 101 acquire, in real time, the first spatial coordinate corresponding to the first marker 131 on the first positioner 103. The glasses also receive the point cloud coordinate data of the first object and identify the first marker point cloud coordinates (i.e., the portion of the first object's point cloud data that corresponds to the first marker 131). From the first spatial coordinate and the first marker point cloud coordinates, the glasses compute a coordinate adjustment value: the correction needed to convert the CT-derived point cloud coordinates of the first object into coordinate values in the glasses' own coordinate system. The glasses then adjust the point cloud coordinates of the surgical target 3 (the portion of the first object's point cloud data corresponding to the surgical target 3) by this adjustment value to obtain the display coordinates of the surgical target 3, and generate the first three-dimensional virtual image from the first spatial coordinate and those display coordinates. In other words, the mixed reality glasses 101 convert the first object's point cloud data into their own coordinate system, so that the first three-dimensional virtual image of the first object is displayed in the glasses' field of view in the glasses' coordinate frame.
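The "coordinate adjustment value" is, in effect, the rigid transform that maps CT point cloud coordinates into the glasses' coordinate system, estimated from the correspondence between the marker's CT point cloud coordinates and its observed spatial coordinates. The patent does not specify the estimation method; a standard choice is a Kabsch-style least-squares fit, sketched here in Python with NumPy (function names are illustrative):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch method) mapping src -> dst.

    src, dst: (N, 3) corresponding points, e.g. marker points from the CT
    point cloud (src) and the same markers as observed by the glasses'
    camera (dst). Returns (R, t) such that dst ~= src @ R.T + t.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the fitted rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

def to_display_coords(target_cloud, R, t):
    """Map the surgical target's CT point cloud into glasses coordinates."""
    return np.asarray(target_cloud, dtype=float) @ R.T + t
```

Applying `to_display_coords` to the surgical target's portion of the first object's point cloud yields the display coordinates from which the first three-dimensional virtual image is rendered.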
During the operation, the surgeon adjusts the orientation of the mixed reality glasses 101 appropriately to keep the first marker 131 within the capture range of the camera of the mixed reality glasses 101.
Based on the above adjustment, the coordinates of the virtual pelvis model are calibrated and matched to the pelvis coordinates in real space, so that the surgeon can see accurately, within the field of view of the mixed reality glasses 101, where the surgical target 3 lies, and can operate precisely.
During the operation, the HoloLens camera captures the movement of the first marker 131 rigidly connected to the pelvis, and the captured coordinate data drives the virtual model in the AR program to move in real time, keeping the virtual pelvis model registered to the real patient's pelvis.
The surgical guidance system further comprises a second positioner 104, which is provided with a second marker and is rigidly connected to the surgical instrument 4 so that positioner and instrument together form a second object. The mixed reality glasses 101 are further configured to acquire, in real time, a second spatial coordinate corresponding to the second marker, and to display a second three-dimensional virtual image of the second object according to the second spatial coordinate and the second point cloud coordinate data, the latter being the point cloud coordinate data corresponding to the second object.
After the second positioner 104 is rigidly connected to the surgical instrument 4 to form the second object, the second point cloud coordinate data of the second object is obtained by CT scanning, and a three-dimensional image of the second object can be obtained by fitting that data.
While the surgery is performed, the mixed reality glasses 101 acquire, in real time, the second spatial coordinate corresponding to the second marker. The glasses receive the point cloud coordinate data of the second object and identify the second marker point cloud coordinates (i.e., the portion of the second object's point cloud data corresponding to the second marker). From the second spatial coordinate and the second marker point cloud coordinates, the glasses compute a coordinate adjustment value: the correction needed to convert the CT-derived point cloud coordinates of the second object into the glasses' coordinate system. The glasses then adjust the point cloud coordinates of the surgical instrument 4 (the portion of the second object's point cloud data corresponding to the instrument) by this value to obtain the instrument's display coordinates, and generate the second three-dimensional virtual image from the second spatial coordinate and those display coordinates. In this way the second three-dimensional virtual image of the second object is displayed in the glasses' field of view in the glasses' coordinate frame.
The first and second three-dimensional virtual images displayed in the field of view of the mixed reality glasses 101 show the real-time positional relationship between the surgical instrument 4 and the surgical target, which the surgeon can observe conveniently and accurately while operating.
While the surgical instrument 4 is being operated, the camera on the HoloLens mixed reality glasses 101 collects the three-dimensional coordinates of the second marker on the instrument in real time, and the AR program calibrates and matches the virtual model coordinates of the surgical instrument 4 to the coordinates of the real instrument in the doctor's hand. During guidance, the second marker on the surgical instrument 4 must be kept within the image-recognition capture range of the HoloLens camera.
When the HoloLens camera detects the second marker on the surgical instrument 4, the virtual model of the instrument appears in the HoloLens display field of view, and the captured coordinate data of the second marker drives the virtual model in the AR program to move in real time, keeping it registered to the real instrument in the doctor's hand.
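This per-frame update reduces to re-posing a rigid model from the marker's latest captured pose. A minimal sketch, assuming the instrument model has been expressed once in the marker's local frame at calibration (the function name and frame convention are illustrative assumptions):

```python
import numpy as np

def update_instrument_model(model_in_marker_frame, marker_R, marker_p):
    """Per-frame pose update for the virtual surgical instrument.

    model_in_marker_frame: (N, 3) instrument vertices expressed in the
    second marker's local frame (fixed, since marker and instrument are
    rigidly connected).
    marker_R: (3, 3) marker orientation captured this frame.
    marker_p: (3,) marker position captured this frame.
    Returns the vertices in the glasses' world frame, including points
    that are subcutaneous and invisible to the surgeon.
    """
    pts = np.asarray(model_in_marker_frame, dtype=float)
    R = np.asarray(marker_R, dtype=float)
    p = np.asarray(marker_p, dtype=float)
    return pts @ R.T + p
```

Because the transform is applied to the whole model, the rendered virtual instrument tracks the real one even when its tip is hidden under the skin.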
When the surgical instrument 4 moves, its virtual counterpart in the HoloLens moves in real time, so the position of the real instrument can be read from the virtual image; this matters most at positions the surgeon cannot see, such as under the skin. Even when the tip of the surgical instrument 4 lies in such a blind spot, the surgeon can still know its current position from the virtual image.
In an alternative embodiment, the mixed reality glasses 101 display the first three-dimensional virtual image with a see-through effect, i.e., the surgical target appears transparent in the glasses' field of view. This way the surgical target does not occlude the surgical instrument 4, and the surgeon can clearly observe the instrument's operating part (the part of the surgical instrument 4 that contacts the surgical target to perform the surgery), eliminating blind spots in the field of view and helping the surgeon operate accurately.
In an optional embodiment, a temperature and humidity information display area 8 is also arranged in the field-of-view schematic image 7 to show real-time temperature and humidity data, so that the surgeon can keep track of it conveniently.
As an alternative embodiment, the operating part contacts the surgical target 3 to perform the surgery, and the mixed reality glasses 101 display the CT image 109 corresponding to the contact point between the operating part and the surgical target 3. That is, the glasses' display field shows not only the first and second three-dimensional virtual images but also the CT image 109 for the current contact point, so the operating surgeon can consult the relevant CT image for the point of contact, understand the state of the surgical target clearly, and perform the surgery accurately and efficiently without extra look-up actions.
To let the surgeon assess the surgical target from more angles, the CT image 109 includes a first CT image and a second CT image whose section planes are orthogonal. That is, CT scan images of the first object are acquired in two orthogonal directions before the operation; as the operation progresses, the first CT image in the first direction and the second CT image in the second direction corresponding to the current contact point between the operating part and the surgical target 3 are displayed. The surgeon can thus grasp the situation at the current surgical position comprehensively from multiple angles, again without extra look-up actions. Based on the relationship between the three-dimensional coordinates of the pelvis model and those of the surgical instrument 4, the instrument's current position is presented in real time in the two orthogonal CT sectional views, so the doctor sees the position of the instrument relative to the pelvis more clearly and is guided even at visual blind spots.
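Selecting the two orthogonal CT sections through the contact point is, at its simplest, an index lookup into the CT volume. A minimal sketch, assuming an axis-aligned voxel volume (real DICOM data carries orientation and spacing metadata that a production system must honor; the function name is illustrative):

```python
import numpy as np

def orthogonal_slices(ct_volume, contact_voxel):
    """Extract two orthogonal CT cross-sections through a contact point.

    ct_volume: 3-D array of CT intensities, assumed indexed along three
    orthogonal anatomical axes.
    contact_voxel: (i, j, k) voxel index of the instrument/target contact.
    Returns two 2-D slices whose section planes are orthogonal.
    """
    i, j, k = contact_voxel
    first = ct_volume[i, :, :]   # section plane normal to the first axis
    second = ct_volume[:, j, :]  # section plane normal to the second axis
    return first, second
```

As the contact point moves during the operation, re-running the lookup with the new voxel index yields the pair of sectional views to display.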
While specific embodiments of the invention have been described above, it will be appreciated by those skilled in the art that this is by way of example only, and that the scope of the invention is defined by the appended claims. Various changes and modifications to these embodiments may be made by those skilled in the art without departing from the spirit and scope of the invention, and these changes and modifications are within the scope of the invention.

Claims (12)

1. An AR and IoT based surgical guidance system comprising mixed reality glasses, a communication module;
the mixed reality glasses are in communication connection with the medical data center through the communication module;
the mixed reality glasses are used for capturing a hover gesture, and for acquiring and displaying corresponding medical data from the medical data center according to the hover gesture.
2. The AR and IoT based surgical guidance system according to claim 1, wherein the medical data comprises at least one of medical record data, two-dimensional medical image data, and three-dimensional medical image data.
3. The AR and IoT based surgical guidance system according to claim 1, wherein the medical data comprises medical image data, the mixed reality glasses to display a corresponding virtual image according to the medical image data, the mixed reality glasses to further zoom in or zoom out on the virtual image according to the hover gesture.
4. The AR and IoT based surgical guidance system according to claim 1, wherein the medical data comprises three-dimensional medical image data, the mixed reality glasses to display a corresponding three-dimensional virtual image according to the three-dimensional medical image data, the mixed reality glasses to further adjust an angle of the three-dimensional virtual image according to the hover gesture.
5. The AR and IoT based surgical guidance system recited in claim 1, wherein the mixed reality glasses comprise HoloLens mixed reality glasses.
6. The AR and IoT based surgical guidance system of claim 1, further comprising a first positioner, the first positioner being provided with a first marker, the first positioner rigidly connected to a surgical target to form a first object with the surgical target;
the mixed reality glasses are further used for acquiring a first space coordinate corresponding to the first marker body in real time, and displaying a first three-dimensional virtual image corresponding to the first object according to the first space coordinate and first point cloud coordinate data, wherein the first point cloud coordinate data is point cloud coordinate data corresponding to the first object.
7. The AR and IoT based surgical guidance system according to claim 6, wherein the mixed reality glasses derive coordinate adjustment values from the first spatial coordinates and first marker point cloud coordinates, the first marker point cloud coordinates being the point cloud coordinate data corresponding to the first marker body within the point cloud coordinate data corresponding to the first object;
the mixed reality glasses adjust surgical target point cloud coordinates according to the coordinate adjustment values to obtain surgical target display coordinates, the surgical target point cloud coordinates being the point cloud coordinate data corresponding to the surgical target within the point cloud coordinate data corresponding to the first object;
and the mixed reality glasses generate the first three-dimensional virtual image according to the first spatial coordinates and the surgical target display coordinates.
8. The AR and IoT-based surgical guidance system of claim 6, wherein the point cloud coordinate data corresponding to the first object is point cloud coordinate data obtained by CT scanning of the first object.
9. The AR and IoT-based surgical guidance system of claim 6, further comprising a second positioner, the second positioner being provided with a second marker, the second positioner rigidly connected to a surgical instrument to form a second object with the surgical instrument;
the mixed reality glasses are further used for acquiring a second space coordinate corresponding to the second marker body in real time, and displaying a second three-dimensional virtual image corresponding to the second object according to the second space coordinate and second point cloud coordinate data, wherein the second point cloud coordinate data is the point cloud coordinate data corresponding to the second object.
10. The AR and IoT based surgical guidance system of claim 9, wherein the mixed reality glasses display the first three-dimensional virtual imagery in a perspective effect.
11. The AR and IoT based surgical guidance system according to claim 9, wherein the surgical instrument comprises an operating portion that contacts the surgical target to perform the procedure, and the mixed reality glasses are configured to display, according to a contact point of the operating portion with the surgical target, a CT image corresponding to the contact point.
12. The AR and IoT-based surgical guidance system of claim 11, wherein the CT images comprise a first CT image and a second CT image, and the cross-sectional direction corresponding to the first CT image is orthogonal to the cross-sectional direction corresponding to the second CT image.
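The coordinate adjustment of claim 7 can be illustrated with a minimal sketch. The claim does not specify the transform model, so this sketch assumes the simplest case of a pure translation (a full registration would also estimate rotation); all function and variable names are hypothetical.

```python
def registration_offset(marker_spatial, marker_pointcloud):
    """Coordinate adjustment value as a pure translation: the difference
    between the marker body's measured spatial coordinate and its
    coordinate in the preoperative point cloud. Rotation is ignored in
    this sketch (an assumption, not stated in the claims)."""
    return tuple(s - p for s, p in zip(marker_spatial, marker_pointcloud))

def adjust_target(target_points, offset):
    """Shift every surgical-target point cloud coordinate by the offset to
    obtain display coordinates in the glasses' spatial frame."""
    dx, dy, dz = offset
    return [(x + dx, y + dy, z + dz) for x, y, z in target_points]

# Marker measured at (120, 45, 310) in the glasses' frame but located at
# (20, 5, 10) in the CT point cloud: the offset is (100, 40, 300).
offset = registration_offset((120.0, 45.0, 310.0), (20.0, 5.0, 10.0))
print(offset)                                    # (100.0, 40.0, 300.0)
print(adjust_target([(21.0, 6.0, 11.0)], offset))  # [(121.0, 46.0, 311.0)]
```

Because the positioner is rigidly connected to the surgical target, a single offset derived from the marker body can be applied to the entire target point cloud, which is what lets the glasses overlay the first three-dimensional virtual image in the correct place.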
CN202110349330.1A 2021-03-31 2021-03-31 AR and IoT based surgical guidance system Pending CN113081268A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110349330.1A CN113081268A (en) 2021-03-31 2021-03-31 AR and IoT based surgical guidance system

Publications (1)

Publication Number Publication Date
CN113081268A 2021-07-09

Family ID: 76672128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110349330.1A Pending CN113081268A (en) 2021-03-31 2021-03-31 AR and IoT based surgical guidance system

Country Status (1)

Country Link
CN (1) CN113081268A (en)

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6450978B1 (en) * 1998-05-28 2002-09-17 Orthosoft, Inc. Interactive computer-assisted surgical system and method thereof
JP2007029232A (en) * 2005-07-25 2007-02-08 Hitachi Medical Corp System for supporting endoscopic operation
CN102470016A (en) * 2009-07-15 2012-05-23 皇家飞利浦电子股份有限公司 Visualizing surgical trajectories
CN103211655A (en) * 2013-04-11 2013-07-24 深圳先进技术研究院 Navigation system and navigation method of orthopedic operation
CN103519895A (en) * 2013-10-18 2014-01-22 江苏艾迪尔医疗科技股份有限公司 Orthopedic operation auxiliary guide method
CN104586505A (en) * 2014-09-19 2015-05-06 张巍 Navigating system and method for orthopedic operation
CN106456125A (en) * 2014-05-02 2017-02-22 皇家飞利浦有限公司 Systems for linking features in medical images to anatomical models and methods of operation thereof
CN107296650A (en) * 2017-06-01 2017-10-27 西安电子科技大学 Intelligent operation accessory system based on virtual reality and augmented reality
US20180085173A1 (en) * 2016-09-27 2018-03-29 Covidien Lp Systems and methods for performing a surgical navigation procedure
CN108170259A (en) * 2016-12-07 2018-06-15 上海西门子医疗器械有限公司 Medical system auxiliary treating apparatus, medical system and aid in treatment method
CN108294814A (en) * 2018-04-13 2018-07-20 首都医科大学宣武医院 Intracranial puncture positioning method based on mixed reality
CN109106448A (en) * 2018-08-30 2019-01-01 上海霖晏医疗科技有限公司 A kind of operation piloting method and device
CN109223121A (en) * 2018-07-31 2019-01-18 广州狄卡视觉科技有限公司 Based on medical image Model Reconstruction, the cerebral hemorrhage puncturing operation navigation system of positioning
CN109512514A (en) * 2018-12-07 2019-03-26 陈玩君 A kind of mixed reality orthopaedics minimally invasive operation navigating system and application method
CN208974135U (en) * 2018-09-07 2019-06-14 上海霖晏医疗科技有限公司 A kind of surgical instrument that can be tracked by depth camera
CN110353774A (en) * 2018-12-15 2019-10-22 深圳铭杰医疗科技有限公司 Assist Needle-driven Robot and its control method, computer equipment, storage medium
CN110478040A (en) * 2019-08-19 2019-11-22 王小丽 Obtain the method and device of alimentary stent implantation navigation image
CN111184577A (en) * 2014-03-28 2020-05-22 直观外科手术操作公司 Quantitative three-dimensional visualization of an instrument in a field of view
CN111658065A (en) * 2020-05-12 2020-09-15 北京航空航天大学 Digital guide system for mandible cutting operation
CN111772792A (en) * 2020-08-05 2020-10-16 山东省肿瘤防治研究院(山东省肿瘤医院) Endoscopic surgery navigation method, system and readable storage medium based on augmented reality and deep learning

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113633376A (en) * 2021-08-06 2021-11-12 吉林大学 A naked eye three-dimensional virtual replacement method for total hip joint
CN113633376B (en) * 2021-08-06 2024-03-15 吉林大学 Naked eye three-dimensional virtual replacement method for total hip joint
CN113648057A (en) * 2021-08-18 2021-11-16 上海电气集团股份有限公司 Surgical navigation system and surgical navigation method
CN113764093A (en) * 2021-08-18 2021-12-07 上海电气集团股份有限公司 Mixed reality display device, operation information processing method thereof and storage medium
CN114627271A (en) * 2022-03-15 2022-06-14 上海诠视传感技术有限公司 Method for viewing 3D visual data, AR glasses, system and storage medium

Similar Documents

Publication Publication Date Title
CN113081268A (en) AR and IoT based surgical guidance system
JP2575586B2 (en) Surgical device positioning system
CN110944595B (en) System for mapping an endoscopic image dataset onto a three-dimensional volume
US20200375546A1 (en) Machine-guided imaging techniques
WO2019181632A1 (en) Surgical assistance apparatus, surgical method, non-transitory computer readable medium and surgical assistance system
JP2022512420A (en) Surgical system with a combination of sensor-based navigation and endoscopy
CN111067468B (en) Method, apparatus, and storage medium for controlling endoscope system
CN113453606A (en) Endoscope with dual image sensor
JP2004254899A (en) Operation support system and operation support method
JP6493885B2 (en) Image alignment apparatus, method of operating image alignment apparatus, and image alignment program
CN114565724B (en) Method, electronic device and medium for osteotomy planning based on visualized image
JP2021122743A (en) Extended Reality Instrument Interaction Zone for Navigated Robot Surgery
CN114631886A (en) Robot arm positioning method, readable storage medium and surgical robot system
JP2024508126A (en) Method and system for suggesting spinal rods for orthopedic surgery using augmented reality
CN109745074B (en) Three-dimensional ultrasonic imaging system and method
CN116564149A (en) An operation training method for lumbar intervertebral foramen puncture
CN118436438A (en) Method and system for projecting incision markings onto a patient
WO2020236814A1 (en) Systems and methods for generating workspace volumes and identifying reachable workspaces of surgical instruments
CN108778135B (en) Optical camera selection in multi-modal X-ray imaging
CN114418960A (en) Image processing method, system, computer equipment and storage medium
WO2013114994A1 (en) X-ray ct device
CN209847228U (en) Three-dimensional ultrasonic imaging device
EP3703012A1 (en) Map of body cavity
CN107260305A (en) Area of computer aided minimally invasive surgery system
JP6476125B2 (en) Image processing apparatus and surgical microscope system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210709