
CN112190331A - Method, device and system for determining surgical navigation information and electronic device - Google Patents


Info

Publication number
CN112190331A
CN112190331A (application CN202011106421.4A)
Authority
CN
China
Prior art keywords
information
augmented reality
surgical
image
spatial position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011106421.4A
Other languages
Chinese (zh)
Inventor
申一君
庞博
田梦泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing AK Medical Co Ltd
Original Assignee
Beijing AK Medical Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing AK Medical Co Ltd filed Critical Beijing AK Medical Co Ltd
Priority to CN202011106421.4A
Publication of CN112190331A
Legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2068 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Gynecology & Obstetrics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Robotics (AREA)
  • Processing Or Creating Images (AREA)

Abstract



The invention discloses a method, a device, a system and an electronic device for determining surgical navigation information. The system for determining surgical navigation information includes: a surgical end device for acquiring spatial position information of the site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range that includes the site to be operated on; and a remote device, connected to the surgical end device, for determining a virtual image from the spatial position information and the movement trajectory information, performing augmented reality processing on the virtual image to obtain an augmented reality image, and determining navigation information for the surgical instrument based on the augmented reality image. The invention solves the technical problem in the prior art that the navigation information produced by surgical navigation schemes is separated from the operation scene and is therefore difficult to understand.


Description

Method, device and system for determining surgical navigation information and electronic device
Technical Field
The invention relates to the technical field of medical treatment, in particular to a method, a device and a system for determining surgical navigation information and an electronic device.
Background
In the related art, a surgical navigation system generally installs an active or passive marking device near the surgical site of the patient and on the surgical instrument, tracks the position of the patient's bone and the position and motion track of the surgical instrument using the transmitted signals, and requires point-by-point registration of the patient's surgical site during surgery.
In order to enable the doctor to clearly know the position of a surgical instrument relative to the patient's anatomy, the navigation technology commonly adopted in the related art is computer-aided navigation. This technology suffers from the technical problem that the navigation information is separated from the operation scene, so the navigation information is not easy to understand.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a method, a device and a system for determining surgical navigation information and an electronic device, which are used for at least solving the technical problem that navigation information is difficult to understand because the navigation information is separated from a surgical scene in a surgical navigation scheme in the prior art.
According to an aspect of an embodiment of the present invention, there is provided a surgical navigation information determining system including: the surgical end equipment is used for acquiring spatial position information of a to-be-operated part and movement track information of a surgical instrument, wherein the surgical instrument is positioned in a target range containing the to-be-operated part; and the remote equipment is connected with the operation end equipment and used for determining a virtual image according to the spatial position information and the movement track information, performing augmented reality processing on the virtual image to obtain an augmented reality image, and determining navigation information of the surgical instrument based on the augmented reality image.
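The division of labour between the two devices can be sketched as follows. This is an illustrative sketch only, not part of the patented disclosure; all class names, stubbed values and the offset calculation are assumptions.

```python
class SurgicalEndDevice:
    """Acquires raw tracking data in the operating room (stubbed values)."""
    def acquire(self):
        spatial_position = (120.5, 84.2, 30.1)  # site to be operated on
        trajectory = [(118.0, 85.0, 32.0), (119.2, 84.6, 31.0)]  # instrument path
        return spatial_position, trajectory

class RemoteDevice:
    """Turns tracking data into an AR image and navigation information."""
    def determine_navigation(self, spatial_position, trajectory):
        # Virtual image determined from the tracking data.
        virtual_image = {"site": spatial_position, "path": trajectory}
        # Augmented reality processing: mark the image for overlay on the live view.
        ar_image = dict(virtual_image, overlay="live-view")
        # Navigation info derived from the AR image: remaining offset
        # between the instrument tip and the target site.
        tip = trajectory[-1]
        offset = tuple(round(t - p, 3) for t, p in zip(spatial_position, tip))
        return ar_image, {"offset_to_target": offset}

pos, track = SurgicalEndDevice().acquire()
ar_image, nav_info = RemoteDevice().determine_navigation(pos, track)
```

The point of the split is that the surgical end only captures data, while all image synthesis and navigation computation happens on the remote side.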
Optionally, the system further includes: a fifth-generation mobile communication (5G) module, configured to establish a communication connection between the surgical end device and the remote device, where the surgical end device sends the spatial position information and the movement track information to the remote device through the 5G module.
Optionally, the surgical end device includes: the first positioning module is arranged in the target range and used for acquiring the spatial position information of the part to be operated in real time; and the second positioning module is connected with the surgical instrument and is used for acquiring the movement track information of the surgical instrument in real time.
Optionally, the system further includes: and the display auxiliary module is connected with the first positioning module and the second positioning module, and is used for receiving the spatial position information and the movement track information and sending the spatial position information and the movement track information to the remote equipment.
Optionally, the remote device includes: an image processing module, connected to the display auxiliary module, for acquiring an X-ray fluoroscopic image of the to-be-operated site, and synthesizing the spatial position information, the movement trajectory information, and the X-ray fluoroscopic image to obtain the virtual image; performing augmented reality processing on the virtual image to obtain an augmented reality image, and selecting the navigation information matched with the spatial position information from the augmented reality image; and a memory connected to the image processing module and configured to store the virtual image, the augmented reality image, and the navigation information.
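The image processing module's pipeline described above (synthesize, augment, select) can be sketched as below. The function names and data shapes are assumptions, and the synthesis step is stubbed rather than performing real image registration.

```python
def synthesize_virtual_image(spatial_position, trajectory, xray_image):
    """Combine the tracking data with the X-ray fluoroscopic image.

    Stand-in for the synthesis step: a real system would register the
    X-ray to the tracked coordinate frame and render a 3D scene.
    """
    return {"xray": xray_image, "site": spatial_position,
            "instrument_path": trajectory}

def augment(virtual_image):
    """Augmented reality processing: flag the image for live-view overlay."""
    return dict(virtual_image, overlay=True)

def select_navigation(ar_image):
    """Select the navigation info matched to the site's spatial position."""
    return {"target": ar_image["site"], "path": ar_image["instrument_path"]}

virtual = synthesize_virtual_image((10, 20, 5), [(9, 19, 6)], "xray.dcm")
ar = augment(virtual)
nav = select_navigation(ar)
```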
Optionally, the display auxiliary module is further configured to receive the augmented reality image and the navigation information returned by the image processing module; the above system further comprises: and the AR display end is connected with the display auxiliary module and is used for displaying the augmented reality image and the navigation information.
According to another aspect of the embodiments of the present invention, there is also provided a method for determining surgical navigation information, including: acquiring spatial position information of a part to be operated and movement track information of a surgical instrument, wherein the surgical instrument is positioned in a target range including the part to be operated; sending the spatial position information and the movement track information to a remote device, and receiving an augmented reality image and navigation information of the surgical instrument returned by the remote device, wherein the remote device is used for determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain the augmented reality image; and determining the navigation information based on the augmented reality image.
According to another aspect of the embodiments of the present invention, there is also provided a method for determining surgical navigation information, including: receiving spatial position information of a to-be-operated part and movement track information of a surgical instrument from surgical end equipment, wherein the surgical instrument is positioned in a target range containing the to-be-operated part; determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain an augmented reality image; determining navigation information of the surgical instrument based on the augmented reality image; and returning the augmented reality image and the navigation information to the operation end equipment.
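The remote-side method above (receive, determine virtual image, augment, derive navigation, return) can be sketched as a single handler. The request/reply format is an assumption, not the disclosed protocol.

```python
def remote_handler(request):
    """Remote-device side of the exchange described above.

    `request` carries the surgical end's spatial position and movement
    trajectory; the reply returns the AR image and navigation information.
    """
    pos, track = request["spatial_position"], request["trajectory"]
    virtual = {"site": pos, "path": track}        # determine the virtual image
    ar = dict(virtual, overlay=True)              # augmented reality processing
    nav = {"target": pos, "last_tip": track[-1]}  # derive navigation info
    return {"ar_image": ar, "navigation": nav}

reply = remote_handler({"spatial_position": (1, 2, 3),
                        "trajectory": [(0, 0, 0), (1, 1, 2)]})
```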
According to another aspect of the embodiments of the present invention, there is also provided a surgical navigation information determining apparatus, including: the surgical instrument comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring spatial position information of a to-be-operated part and movement track information of a surgical instrument, and the surgical instrument is positioned in a target range containing the to-be-operated part; a processing module, configured to send the spatial position information and the movement trajectory information to a remote device, and receive an augmented reality image and navigation information of the surgical instrument returned by the remote device, where the remote device is configured to determine a virtual image according to the spatial position information and the movement trajectory information, and perform augmented reality processing on the virtual image to obtain the augmented reality image; and determining the navigation information based on the augmented reality image.
According to another aspect of the embodiments of the present invention, there is also provided a surgical navigation information determining apparatus, including: a receiving unit, configured to receive spatial position information of a to-be-operated portion from an operation end device and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range including the to-be-operated portion; the processing unit is used for determining a virtual image according to the spatial position information and the movement track information and performing augmented reality processing on the virtual image to obtain an augmented reality image; a determination unit configured to determine navigation information of the surgical instrument based on the augmented reality image; and the return unit is used for returning the augmented reality image and the navigation information to the operation end equipment.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor and to execute the method for determining surgical navigation information.
According to another aspect of the embodiments of the present invention, there is also provided a processor for executing a program, wherein the program is configured to execute the method for determining surgical navigation information.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method for determining surgical navigation information.
In the embodiment of the invention, the spatial position information of a part to be operated and the movement track information of a surgical instrument are acquired through surgical end equipment, wherein the surgical instrument is positioned in a target range containing the part to be operated; and the remote end equipment is connected with the operation end equipment, a virtual image is determined according to the spatial position information and the movement track information, the virtual image is subjected to augmented reality processing to obtain an augmented reality image, and the navigation information of the surgical instrument is determined based on the augmented reality image, so that the purpose of fusing the navigation information and a real operation scene is achieved, the technical effect of improving the practicability of the navigation information is realized, and the technical problem that the navigation information is separated from the operation scene in an operation navigation scheme in the prior art and is difficult to understand is solved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a schematic structural diagram of a surgical navigation information determination system according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an alternative surgical navigation information determination system according to an embodiment of the present invention;
FIG. 3 is a flow chart of a method of determining surgical navigation information in accordance with an embodiment of the present invention;
FIG. 4 is a flow chart of another method of determining surgical navigational information, in accordance with an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a surgical navigation information determining apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of another surgical navigation information determining apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, in order to facilitate understanding of the embodiments of the present invention, some terms or nouns referred to in the present invention will be explained as follows:
Augmented Reality (AR) is a technology that computes the position and angle of the camera image in real time and adds corresponding images, videos and 3D models, with the aim of overlaying the virtual world on the real world on a screen and enabling interaction between the two. The technique was proposed in 1990. As the CPU power of portable electronic products improves, the applications of augmented reality are expected to become wider and wider.
Example 1
According to an embodiment of the present invention, an embodiment of a system for determining surgical navigation information is provided. Fig. 1 is a schematic structural diagram of a system for determining surgical navigation information according to an embodiment of the present invention. As shown in fig. 1, the system includes a surgical end device 100 connected to a remote device 120, wherein:
the surgical end device 100 is configured to acquire spatial position information of a site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range including the site to be operated on; and the remote device 120, connected to the surgical end device 100, is configured to determine a virtual image according to the spatial position information and the movement trajectory information, perform augmented reality processing on the virtual image to obtain an augmented reality image, and determine navigation information of the surgical instrument based on the augmented reality image.
In the embodiment of the invention, the spatial position information of a part to be operated and the movement track information of a surgical instrument are acquired through surgical end equipment, wherein the surgical instrument is positioned in a target range containing the part to be operated; and the remote end equipment is connected with the operation end equipment, a virtual image is determined according to the spatial position information and the movement track information, the virtual image is subjected to augmented reality processing to obtain an augmented reality image, and the navigation information of the surgical instrument is determined based on the augmented reality image, so that the purpose of fusing the navigation information and a real operation scene is achieved, the technical effect of improving the practicability of the navigation information is realized, and the technical problem that the navigation information is separated from the operation scene in an operation navigation scheme in the prior art and is difficult to understand is solved.
Optionally, the remote device may be a remote cloud server, and the surgical end device may be an AR navigation system. The site to be operated on is a surgical site of the patient (human or animal), for example a position awaiting prosthesis replacement; the surgical instrument may be any surgical instrument selected for the operation to be performed, which is not described in detail here, provided the embodiment of the present application can be realized.
In the embodiment of the present application, an augmented reality image is obtained by performing augmented reality processing on the virtual image, and the navigation information of the surgical instrument is determined based on the augmented reality image. By integrating the AR display function into the surgical end device, the virtual image can be accurately superimposed on the real image, preventing the doctor from switching back and forth between the surgical navigation display and the patient during the operation and allowing the doctor to focus on the operation itself.
In an alternative embodiment, fig. 2 is a schematic structural diagram of an alternative surgical navigation information determining system according to an embodiment of the present invention, and as shown in fig. 2, the system further includes: a fifth generation mobile communication 5G module 140, configured to establish a communication connection between the surgical end device and the remote end device, where the surgical end device sends the spatial position information and the movement track information to the remote end device through the 5G module.
In this optional embodiment, the 5G module establishes the communication connection between the surgical end device and the remote device, so that the computing power of the remote cloud server can be brought to bear on the local AR navigation system, compensating for the limitations of the local system and making the navigation information determined for the surgical scene more accurate.
In addition, in the embodiment of the present application, establishing the communication connection between the surgical end device and the remote device through the 5G module can effectively reduce the response delay of the AR navigation system and provide real-time navigation information for the surgical instrument in the surgical scene during the operation.
In an alternative embodiment, the surgical end device comprises: the first positioning module is arranged in the target range and used for acquiring the spatial position information of the part to be operated in real time; and the second positioning module is connected with the surgical instrument and is used for acquiring the movement track information of the surgical instrument in real time.
In the above alternative embodiment, active or passive positioning modules may be (but are not limited to being) installed near the surgical site of the patient and on the surgical instrument. Using infrared light as the transmission source and a CCD (charge-coupled device) camera as the receiver, the emitted signals are used to obtain the spatial position information of the patient's bone and to track the position and motion trajectory of the surgical instrument to obtain the movement trajectory information.
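The infrared/CCD tracking described here ultimately reduces to recovering a marker's 3D position from camera observations. A textbook stereo-triangulation formula with assumed camera parameters illustrates the principle; this is not the patent's specific method.

```python
def stereo_depth(baseline_m, focal_px, x_left, x_right):
    """Depth of a marker from a rectified stereo camera pair:
    z = f * b / (x_left - x_right), where the denominator is the disparity."""
    disparity = x_left - x_right
    return focal_px * baseline_m / disparity

# Marker observed at pixel x=420 in the left image and x=380 in the right,
# cameras 0.2 m apart with an 800-pixel focal length (assumed values):
depth = stereo_depth(baseline_m=0.2, focal_px=800, x_left=420, x_right=380)
```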
In an alternative embodiment, as also shown in fig. 2, the system further comprises: the display assisting module 130 is connected to the surgical end device 100 (i.e., connected to the first positioning module and the second positioning module, respectively), and configured to receive the spatial position information and the movement track information, and send the spatial position information and the movement track information to the remote end device 120 (i.e., the display assisting module 130 sends the spatial position information and the movement track information to the remote end device 120 through the 5G module 140).
Optionally, the display auxiliary module is mainly configured to process the spatial position information and the movement track information, that is, send the received spatial position information and the movement track information to the remote device; and receiving the augmented reality image and the navigation information returned by the far-end equipment, and sending the augmented reality image and the navigation information to an AR display end in the operation end equipment for displaying.
In an alternative embodiment, as shown in fig. 2, the remote device 120 comprises: an image processing module 122, connected to the display auxiliary module, for acquiring an X-ray fluoroscopic image of the to-be-operated region, and synthesizing the spatial position information, the movement trajectory information, and the X-ray fluoroscopic image to obtain the virtual image; performing augmented reality processing on the virtual image to obtain an augmented reality image, and selecting the navigation information matched with the spatial position information from the augmented reality image; a memory 124 connected to the image processing module 122 for storing the virtual image, the augmented reality image and the navigation information.
Optionally, the image processing module is further configured to acquire an X-ray fluoroscopic image of the site to be operated on, synthesize the spatial position information, the movement trajectory information and the X-ray fluoroscopic image, and render the synthesized result to obtain the virtual image; perform augmented reality processing on the virtual image to obtain the augmented reality image (i.e., the AR image); and determine, by means of a path optimization algorithm, the navigation information matched with the spatial position information, based on the spatial position information of the site to be operated on and the movement trajectory information of the surgical instrument.
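The passage mentions a path optimization algorithm without specifying it. A deliberately simple stand-in, choosing the trajectory point nearest the target site, shows the kind of computation involved; the function name and output format are assumptions.

```python
def navigation_hint(site, trajectory):
    """From the instrument's recorded trajectory, pick the point closest
    to the site to be operated on and report the remaining distance.

    A minimal stand-in for the unspecified path-optimization algorithm.
    """
    def dist(p, q):
        # Euclidean distance between two 3D points.
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    closest = min(trajectory, key=lambda p: dist(p, site))
    return {"closest_point": closest,
            "distance": round(dist(closest, site), 3)}

hint = navigation_hint(site=(0.0, 0.0, 0.0),
                       trajectory=[(3.0, 4.0, 0.0), (1.0, 1.0, 1.0)])
```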
In an optional embodiment, the display assisting module 130 is further configured to receive the augmented reality image and the navigation information returned by the image processing module; as also shown in fig. 2, the above-mentioned surgical end apparatus comprises: an AR display terminal 150 connected to the display auxiliary module 130 for displaying the augmented reality image and the navigation information.
In the embodiment of the present application, integrating an AR display end into the surgical end device provides the AR display function, so that the virtual image can be accurately superimposed on and displayed over the real image. This prevents the doctor from switching back and forth between the surgical navigation display and the patient during the operation and allows the doctor to focus on the operation itself.
As another optional embodiment, the image processing module may be further configured to render the virtual image into a virtual image file and transmit it through the 5G module to the display auxiliary module, for display on the AR display end in the surgical end device.
Optionally, the image processing module includes a cloud rendering server and rendering node machines: the cloud rendering server segments the file to be rendered into one or more subfiles and distributes them to the rendering node machines; each rendering node machine processes its assigned sub-task to generate a picture subfile and returns it to the cloud rendering server, which then combines the subfiles rendered by the node machines.
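The split/render/merge cycle of the cloud rendering server can be sketched as follows, with the actual rendering stubbed out; the file naming and the interleaved split are assumptions.

```python
def split(frames, n_nodes):
    """Cloud rendering server: divide the frames into one sub-task per node."""
    return [frames[i::n_nodes] for i in range(n_nodes)]

def render_node(sub_frames):
    """Rendering node machine: turn its sub-task into picture sub-files
    (stubbed as file names)."""
    return [f"frame_{i}.png" for i in sub_frames]

def merge(rendered_parts):
    """Cloud rendering server: combine the nodes' sub-files in frame order."""
    combined = [name for part in rendered_parts for name in part]
    return sorted(combined,
                  key=lambda n: int(n.split("_")[1].split(".")[0]))

tasks = split(list(range(6)), n_nodes=3)
frames = merge([render_node(t) for t in tasks])
```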
It should be noted that the specific structure of the surgical navigation information determination system shown in fig. 1 in the present application is merely an illustration, and the surgical navigation information determination system in the present application may have more or less structures than the surgical navigation information determination system shown in fig. 1 in specific applications.
Example 2
In accordance with an embodiment of the present invention, an embodiment of a method for determining surgical navigation information is provided. It should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from that presented herein.
Fig. 3 is a flowchart of a method for determining surgical navigation information according to an embodiment of the present invention, as shown in fig. 3, the method includes the following steps:
step S102, acquiring spatial position information of a part to be operated and movement track information of a surgical instrument, wherein the surgical instrument is positioned in a target range including the part to be operated;
step S104, sending the spatial position information and the movement track information to a remote device, and receiving an augmented reality image and navigation information of the surgical instrument returned by the remote device, wherein the remote device is used for determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain the augmented reality image; and determining the navigation information based on the augmented reality image.
In this embodiment of the invention, the surgical end device acquires the spatial position information of a to-be-operated part and the movement track information of a surgical instrument, where the surgical instrument is located within a target range containing the to-be-operated part. The remote device, connected to the surgical end device, determines a virtual image according to the spatial position information and the movement track information, performs augmented reality processing on the virtual image to obtain an augmented reality image, and determines the navigation information of the surgical instrument based on the augmented reality image. This fuses the navigation information with the real surgical scene, achieving the technical effect of making the navigation information more practical, and solves the technical problem of prior-art surgical navigation schemes in which the navigation information is divorced from the surgical scene and therefore difficult to understand.
Optionally, the remote device may be a remote cloud server, and the surgical end device may be an AR navigation system. The to-be-operated part is the part of the patient (human or animal) on which surgery is to be performed, for example a site awaiting prosthesis replacement; the surgical instrument may be any type of surgical instrument, specifically the instrument selected for the operation to be performed, which is not described in detail here, provided that the embodiments of the present application can be implemented.
In this embodiment of the present application, the virtual image is subjected to augmented reality processing to obtain the augmented reality image, and the navigation information of the surgical instrument is determined based on the augmented reality image. By integrating the AR display function into the surgical end device, the virtual image can be accurately superimposed on the real image, which prevents the doctor from switching back and forth between the surgical navigation display and the patient during the operation and allows the doctor to focus more on the operation itself.
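Steps S102–S104 on the surgical end device can be sketched as below. This is an assumption-laden mock, not the patented implementation: the remote device is stood in for by a local function, and the coordinate values, dictionary field names, and the toy depth-offset navigation rule are all invented for illustration.

```python
def acquire_tracking_data():
    """S102: spatial position of the to-be-operated part and movement
    trajectory of the surgical instrument (mock values)."""
    spatial_position = {"x": 12.0, "y": 4.5, "z": 30.2}
    trajectory = [(12.0, 4.5, 31.0), (12.0, 4.5, 30.6)]
    return spatial_position, trajectory

def remote_device(spatial_position, trajectory):
    """Stand-in for the remote device: builds a 'virtual image', applies
    placeholder AR processing, and derives navigation information."""
    virtual_image = {"position": spatial_position, "trajectory": trajectory}
    ar_image = {"type": "ar", "overlay": virtual_image}
    last = trajectory[-1]
    # Toy navigation rule: remaining depth along z to the target position.
    navigation = {"remaining_depth": abs(last[2] - spatial_position["z"])}
    return ar_image, navigation

def determine_navigation():
    """S104: send the acquired data to the remote device and receive the
    augmented reality image plus navigation information in return."""
    position, trajectory = acquire_tracking_data()
    return remote_device(position, trajectory)
```

In the actual system the call to `remote_device` would be a network round trip (per the description, over a 5G link); factoring it out as a callable keeps the surgical-end logic independent of the transport.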
According to an embodiment of the present invention, another embodiment of a method for determining surgical navigation information is provided, and fig. 4 is a flowchart of another method for determining surgical navigation information according to an embodiment of the present invention, as shown in fig. 4, the method includes the following steps:
step S202, receiving spatial position information of a to-be-operated part and movement track information of a surgical instrument from a surgical end device, wherein the surgical instrument is positioned in a target range containing the to-be-operated part;
step S204, determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain an augmented reality image;
step S206, determining navigation information of the surgical instrument based on the augmented reality image;
and step S208, returning the augmented reality image and the navigation information to the operation terminal equipment.
In this embodiment of the invention, the surgical end device acquires the spatial position information of a to-be-operated part and the movement track information of a surgical instrument, where the surgical instrument is located within a target range containing the to-be-operated part. The remote device, connected to the surgical end device, determines a virtual image according to the spatial position information and the movement track information, performs augmented reality processing on the virtual image to obtain an augmented reality image, and determines the navigation information of the surgical instrument based on the augmented reality image. This fuses the navigation information with the real surgical scene, achieving the technical effect of making the navigation information more practical, and solves the technical problem of prior-art surgical navigation schemes in which the navigation information is divorced from the surgical scene and therefore difficult to understand.
Optionally, the remote device may be a remote cloud server, and the surgical end device may be an AR navigation system. The to-be-operated part is the part of the patient (human or animal) on which surgery is to be performed, for example a site awaiting prosthesis replacement; the surgical instrument may be any type of surgical instrument, specifically the instrument selected for the operation to be performed, which is not described in detail here, provided that the embodiments of the present application can be implemented.
In this embodiment of the present application, the virtual image is subjected to augmented reality processing to obtain the augmented reality image, and the navigation information of the surgical instrument is determined based on the augmented reality image. By integrating the AR display function into the surgical end device, the virtual image can be accurately superimposed on the real image, which prevents the doctor from switching back and forth between the surgical navigation display and the patient during the operation and allows the doctor to focus more on the operation itself.
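The remote-device pipeline (steps S202–S208) can likewise be sketched end to end. All data structures here are assumptions: images are plain dictionaries, the compositing with the X-ray fluoroscopic image mentioned in the optional embodiment is reduced to bundling fields, the AR registration step is a placeholder tag, and the offset-based navigation rule is invented for illustration.

```python
def determine_virtual_image(spatial_position, trajectory, xray_image):
    """S204 (first half): synthesize the spatial position, the movement
    trajectory, and the X-ray fluoroscopic image into a virtual image."""
    return {"base": xray_image, "position": spatial_position,
            "trajectory": trajectory}

def augmented_reality_processing(virtual_image):
    """S204 (second half): register the virtual image for overlay on the
    real scene (placeholder: mark it as registered)."""
    return dict(virtual_image, registered=True)

def determine_navigation(ar_image):
    """S206: derive navigation information matching the spatial position
    (here, the offset from the instrument tip to the target)."""
    x, y, z = ar_image["trajectory"][-1]
    p = ar_image["position"]
    return {"offset": (p["x"] - x, p["y"] - y, p["z"] - z)}

def handle_request(spatial_position, trajectory, xray_image):
    """S202 receive -> S204 virtual image + AR -> S206 navigation
    -> S208 return both results to the surgical end device."""
    virtual = determine_virtual_image(spatial_position, trajectory, xray_image)
    ar_image = augmented_reality_processing(virtual)
    return ar_image, determine_navigation(ar_image)
```

Keeping S204 and S206 as separate functions mirrors the claim language, where AR processing and navigation determination are distinct operations on the remote device.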
It should be noted that any optional or preferred method for determining surgical navigation information in this embodiment may be executed in the system for determining surgical navigation information provided in embodiment 1.
In addition, it should be noted that, for alternative or preferred embodiments of the present embodiment, reference may be made to the relevant description in embodiment 1, and details are not described herein again.
Example 3
According to an embodiment of the present invention, there is further provided an apparatus embodiment for implementing the method for determining surgical navigation information, fig. 5 is a schematic structural diagram of an apparatus for determining surgical navigation information according to an embodiment of the present invention, and as shown in fig. 5, the apparatus for determining surgical navigation information includes: an acquisition module 400 and a processing module 420, wherein:
an obtaining module 400, configured to obtain spatial position information of a to-be-operated portion and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range including the to-be-operated portion; a processing module 420, configured to send the spatial position information and the movement track information to a remote device, and receive an augmented reality image and navigation information of the surgical instrument returned by the remote device, where the remote device is configured to determine a virtual image according to the spatial position information and the movement track information, and perform augmented reality processing on the virtual image to obtain the augmented reality image; and determining the navigation information based on the augmented reality image.
It should be noted that the above modules may be implemented by software or by hardware. In the latter case, the modules may all be located in the same processor, or may be distributed across different processors in any combination.
It should be noted here that the above-mentioned acquiring module 400 and the processing module 420 correspond to steps S102 to S104 in embodiment 2, and the above-mentioned modules are the same as the examples and application scenarios realized by the corresponding steps, but are not limited to the disclosure of embodiment 2. It should be noted that the modules described above may be implemented in a computer terminal as part of an apparatus.
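The two-module split above (acquisition vs. processing) can be sketched as plain classes wired together. The class names and the injected `remote` callable are illustrative assumptions, chosen so that the remote device can be swapped out (e.g., mocked) without changing the apparatus.

```python
class AcquisitionModule:
    """Corresponds to the obtaining module 400: obtains the spatial
    position and movement trajectory (mock values here)."""
    def acquire(self):
        return {"x": 0.0, "y": 0.0, "z": 10.0}, [(0.0, 0.0, 12.0)]

class ProcessingModule:
    """Corresponds to the processing module 420: sends the acquired data
    to the remote device and returns its reply."""
    def __init__(self, remote):
        # remote: callable (position, trajectory) -> (ar_image, navigation)
        self.remote = remote
    def process(self, position, trajectory):
        return self.remote(position, trajectory)

class NavigationApparatus:
    """The determination apparatus: acquisition feeding processing."""
    def __init__(self, remote):
        self.acquisition = AcquisitionModule()
        self.processing = ProcessingModule(remote)
    def run(self):
        position, trajectory = self.acquisition.acquire()
        return self.processing.process(position, trajectory)
```

Injecting the remote endpoint as a constructor argument is one way of realizing the note that the modules may live in the same processor or in different processors: only the callable's implementation changes.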
According to an embodiment of the present invention, there is provided another apparatus embodiment for implementing the method for determining surgical navigation information, fig. 6 is a schematic structural diagram of another apparatus for determining surgical navigation information according to an embodiment of the present invention, and as shown in fig. 6, the apparatus for determining surgical navigation information includes: a receiving unit 500, a processing unit 520, a determining unit 540 and a returning unit 560, wherein:
a receiving unit 500, configured to receive spatial position information of a to-be-operated portion from a surgical end device and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range including the to-be-operated portion; a processing unit 520, configured to determine a virtual image according to the spatial position information and the movement track information, and perform augmented reality processing on the virtual image to obtain an augmented reality image; a determining unit 540, configured to determine navigation information of the surgical instrument based on the augmented reality image; a returning unit 560, configured to return the augmented reality image and the navigation information to the operation end device.
It should be noted that the above modules may be implemented by software or by hardware. In the latter case, the modules may all be located in the same processor, or may be distributed across different processors in any combination.
It should be noted here that the receiving unit 500, the processing unit 520, the determining unit 540, and the returning unit 560 correspond to steps S202 to S208 in embodiment 2, and the modules are the same as the examples and application scenarios realized by the corresponding steps, but are not limited to the disclosure in embodiment 2. It should be noted that the modules described above may be implemented in a computer terminal as part of an apparatus.
It should be noted that, reference may be made to the relevant description in embodiments 1 and 2 for alternative or preferred embodiments of this embodiment, and details are not described here again.
The above-mentioned device for determining surgical navigation information may further include a processor and a memory, where the above-mentioned obtaining module 400, the processing module 420, the receiving unit 500, the processing unit 520, the determining unit 540, the returning unit 560, and the like are all stored in the memory as program units, and the processor executes the program units stored in the memory to implement corresponding functions.
The processor includes one or more cores, and a core calls the corresponding program unit from the memory. The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM), and the memory includes at least one memory chip.
According to an embodiment of the present application, an embodiment of a nonvolatile storage medium is further provided. Optionally, in this embodiment, the nonvolatile storage medium includes a stored program, and when the program runs, the device in which the nonvolatile storage medium is located is controlled to execute any one of the above methods for determining surgical navigation information.
Optionally, in this embodiment, the nonvolatile storage medium may be located in any one of a group of computer terminals in a computer network, or in any one of a group of mobile terminals, and the nonvolatile storage medium includes a stored program.
Optionally, the apparatus in which the non-volatile storage medium is controlled to perform the following functions when the program is executed: acquiring spatial position information of a part to be operated and movement track information of a surgical instrument, wherein the surgical instrument is positioned in a target range including the part to be operated; sending the spatial position information and the movement track information to a remote device, and receiving an augmented reality image and navigation information of the surgical instrument returned by the remote device, wherein the remote device is used for determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain the augmented reality image; and determining the navigation information based on the augmented reality image.
Optionally, the apparatus in which the non-volatile storage medium is controlled to perform the following functions when the program is executed: receiving spatial position information of a to-be-operated part and movement track information of a surgical instrument from surgical end equipment, wherein the surgical instrument is positioned in a target range containing the to-be-operated part; determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain an augmented reality image; determining navigation information of the surgical instrument based on the augmented reality image; and returning the augmented reality image and the navigation information to the operation end equipment.
According to an embodiment of the present application, an embodiment of a processor is further provided. Optionally, in this embodiment, the processor is configured to run a program, where the program, when run, executes any one of the above methods for determining surgical navigation information.
An embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform any one of the above methods for determining surgical navigation information.
The present application further provides a computer program product which, when executed on a data processing device, is adapted to execute a program that initializes the steps of any one of the above methods for determining surgical navigation information.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable nonvolatile storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a nonvolatile storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned nonvolatile storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, and a magnetic or optical disk.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and these modifications and improvements shall also fall within the protection scope of the present invention.

Claims (13)

1. A system for determining surgical navigation information, characterized by comprising: a surgical end device, configured to acquire spatial position information of a to-be-operated part and movement track information of a surgical instrument, wherein the surgical instrument is located within a target range containing the to-be-operated part; and a remote device, connected to the surgical end device, configured to determine a virtual image according to the spatial position information and the movement track information, perform augmented reality processing on the virtual image to obtain an augmented reality image, and determine navigation information of the surgical instrument based on the augmented reality image.

2. The system according to claim 1, characterized in that the system further comprises: a fifth-generation mobile communication (5G) module, configured to establish a communication connection between the surgical end device and the remote device, wherein the surgical end device sends the spatial position information and the movement track information to the remote device via the 5G module.

3. The system according to claim 1, characterized in that the surgical end device comprises: a first positioning module, arranged within the target range, configured to acquire the spatial position information of the to-be-operated part in real time; and a second positioning module, connected to the surgical instrument, configured to acquire the movement track information of the surgical instrument in real time.

4. The system according to claim 3, characterized in that the system further comprises: a display auxiliary module, connected to the first positioning module and the second positioning module, configured to receive the spatial position information and the movement track information and send the spatial position information and the movement track information to the remote device.

5. The system according to claim 4, characterized in that the remote device comprises: an image processing module, connected to the display auxiliary module, configured to acquire an X-ray fluoroscopic image of the to-be-operated part, synthesize the spatial position information, the movement track information and the X-ray fluoroscopic image to obtain the virtual image, perform augmented reality processing on the virtual image to obtain the augmented reality image, and select, from the augmented reality image, the navigation information matching the spatial position information; and a memory, connected to the image processing module, configured to store the virtual image, the augmented reality image and the navigation information.

6. The system according to claim 5, characterized in that the display auxiliary module is further configured to receive the augmented reality image and the navigation information returned by the image processing module; and the system further comprises: an AR display end, connected to the display auxiliary module, configured to display the augmented reality image and the navigation information.

7. A method for determining surgical navigation information, characterized by comprising: acquiring spatial position information of a to-be-operated part and movement track information of a surgical instrument, wherein the surgical instrument is located within a target range containing the to-be-operated part; and sending the spatial position information and the movement track information to a remote device, and receiving an augmented reality image and navigation information of the surgical instrument returned by the remote device, wherein the remote device is configured to determine a virtual image according to the spatial position information and the movement track information, perform augmented reality processing on the virtual image to obtain the augmented reality image, and determine the navigation information based on the augmented reality image.

8. A method for determining surgical navigation information, characterized by comprising: receiving, from a surgical end device, spatial position information of a to-be-operated part and movement track information of a surgical instrument, wherein the surgical instrument is located within a target range containing the to-be-operated part; determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain an augmented reality image; determining navigation information of the surgical instrument based on the augmented reality image; and returning the augmented reality image and the navigation information to the surgical end device.

9. An apparatus for determining surgical navigation information, characterized by comprising: an acquisition module, configured to acquire spatial position information of a to-be-operated part and movement track information of a surgical instrument, wherein the surgical instrument is located within a target range containing the to-be-operated part; and a processing module, configured to send the spatial position information and the movement track information to a remote device and receive an augmented reality image and navigation information of the surgical instrument returned by the remote device, wherein the remote device is configured to determine a virtual image according to the spatial position information and the movement track information, perform augmented reality processing on the virtual image to obtain the augmented reality image, and determine the navigation information based on the augmented reality image.

10. An apparatus for determining surgical navigation information, characterized by comprising: a receiving unit, configured to receive, from a surgical end device, spatial position information of a to-be-operated part and movement track information of a surgical instrument, wherein the surgical instrument is located within a target range containing the to-be-operated part; a processing unit, configured to determine a virtual image according to the spatial position information and the movement track information, and perform augmented reality processing on the virtual image to obtain an augmented reality image; a determining unit, configured to determine navigation information of the surgical instrument based on the augmented reality image; and a returning unit, configured to return the augmented reality image and the navigation information to the surgical end device.

11. A nonvolatile storage medium, characterized in that the nonvolatile storage medium stores a plurality of instructions adapted to be loaded by a processor to execute the method for determining surgical navigation information according to claim 7 or 8.

12. A processor, characterized in that the processor is configured to run a program, wherein the program is configured to execute, when run, the method for determining surgical navigation information according to claim 7 or 8.

13. An electronic device, comprising a memory and a processor, characterized in that a computer program is stored in the memory, and the processor is configured to run the computer program to execute the method for determining surgical navigation information according to claim 7 or 8.
CN202011106421.4A 2020-10-15 2020-10-15 Method, device and system for determining surgical navigation information and electronic device Pending CN112190331A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011106421.4A CN112190331A (en) 2020-10-15 2020-10-15 Method, device and system for determining surgical navigation information and electronic device


Publications (1)

Publication Number Publication Date
CN112190331A true CN112190331A (en) 2021-01-08

Family

ID=74009202



Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113509265A (en) * 2021-04-01 2021-10-19 上海复拓知达医疗科技有限公司 Dynamic position identification prompting system and method thereof
CN115909035A (en) * 2022-11-29 2023-04-04 武汉联影智融医疗科技有限公司 Method, system, device and medium for counting surgical instruments
CN118502584A (en) * 2024-04-30 2024-08-16 广州影华科技有限公司 A collaborative information interaction method and system between a three-degree-of-freedom mechanism and an AR device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103211655A (en) * 2013-04-11 2013-07-24 深圳先进技术研究院 Navigation system and navigation method of orthopedic operation
CN107374729A (en) * 2017-08-21 2017-11-24 上海霖晏医疗科技有限公司 Operation guiding system and method based on AR technologies
US20180293802A1 (en) * 2017-04-07 2018-10-11 Unveil, LLC Systems and methods for mixed reality medical training
US20190053855A1 (en) * 2017-08-15 2019-02-21 Holo Surgical Inc. Graphical user interface for a surgical navigation system and method for providing an augmented reality image during operation
CN109730771A (en) * 2019-03-19 2019-05-10 安徽紫薇帝星数字科技有限公司 A kind of operation guiding system based on AR technology



Similar Documents

Publication Publication Date Title
US11025889B2 (en) Systems and methods for determining three dimensional measurements in telemedicine application
EP1883052B1 (en) Generating images combining real and virtual images
EP3786890B1 (en) Method and apparatus for determining pose of image capture device, and storage medium therefor
KR102018565B1 (en) Method, apparatus and program for constructing surgical simulation information
JP3992629B2 (en) Image generation system, image generation apparatus, and image generation method
Clarkson et al. The NifTK software platform for image-guided interventions: platform overview and NiftyLink messaging
CN103841894B (en) The image segmentation of organ and anatomical structure
CN110866977B (en) Augmented reality processing method and device, system, storage medium and electronic equipment
CA3105356A1 (en) Synthesizing an image from a virtual perspective using pixels from a physical imager array
CN108335353A (en) Three-dimensional rebuilding method, device and system, server, the medium of dynamic scene
CN109887077B (en) Method and apparatus for generating three-dimensional model
CN112190331A (en) Method, device and system for determining surgical navigation information and electronic device
JP2018026064A (en) Image processor, image processing method, system
WO2025001689A1 (en) Image registration method, electronic device, and computer readable storage medium
CN116012586A (en) Image processing method, storage medium and computer terminal
CN117174249A (en) AR-assisted method for remote surgery based on 5G communication
CN112216376A (en) Remote guidance system, method, computer device, and readable storage medium
CN114388145B (en) Online inquiry method and device
Ahmad et al. Automatic feature‐based markerless calibration and navigation method for augmented reality assisted dental treatment
CN119052516A (en) Virtual remote live broadcast method, device, equipment and storage medium
CN118298267A (en) Multisource information fusion method, device, equipment and medium
CN119941803A (en) Mixed reality navigation registration method, device, computer equipment and storage medium
CN113763467B (en) Image processing method, device, computing equipment and medium
CN107491631A (en) Transmission of a control object for dual-energy CT image data to a client device for control
WO2018222181A1 (en) Systems and methods for determining three dimensional measurements in telemedicine application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20210108)