CN114550876A - Operation decision support system and method based on augmented reality - Google Patents
Operation decision support system and method based on augmented reality
- Publication number: CN114550876A
- Application number: CN202011295770.5A
- Authority
- CN
- China
- Prior art keywords
- surgical
- decision support
- augmented reality
- training
- current
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/20—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Theoretical Computer Science (AREA)
- Epidemiology (AREA)
- General Business, Economics & Management (AREA)
- Software Systems (AREA)
- Business, Economics & Management (AREA)
- Public Health (AREA)
- Primary Health Care (AREA)
- General Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Surgery (AREA)
- Urology & Nephrology (AREA)
- Mathematical Physics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Biomedical Technology (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention discloses an augmented reality-based surgical decision support system and method. Before an operation, the surgeon establishes through training the optimal surgical operation that he or she can personally achieve; during the operation, that optimized operation is demonstrated through augmented reality while the current surgical operation is sensed and compared against it, and when the difference exceeds an allowable range, a difference message is displayed to provide surgical decision support, thereby improving surgical efficiency and the success rate.
Description
Technical Field
The invention relates to a surgical decision system and method, and in particular to an augmented reality-based surgical decision support system and method.
Background
In recent years, with the popularization and rapid development of augmented reality, applications based on augmented reality, such as surgery, navigation, and games, have been springing up.
Generally, in the conventional method of applying augmented reality to surgery, a virtual-real combined scene is constructed with a camera device and a display: medical data, images, organ models, and the like are shown on the display as virtual objects alongside the real images captured by the camera device, so that the surgeon browsing the display sees both at once for reference during surgery. However, this is a passive way of providing information; it cannot actively provide surgical decision support in response to the surgeon's actual operation. If the surgeon's experience is insufficient, the method may supply a large amount of relevant data and images in real time yet still fail to help. Instead, it may distract the surgeon, increase stress, and even contribute to operating errors, leaving unsolved the problem that inexperienced or stressed surgeons are prone to error.
In view of the above, manufacturers have proposed simulating an organ in augmented reality and teaching surgeons with demonstration operations: the organ is modeled as a virtual object in advance, and a demonstration of the surgical operation is played back for the surgeon to learn from, indirectly deepening the surgeon's experience. However, there is a large gap between an actual surgical procedure and such a learning procedure. Even after learning and practicing many times in advance, an inexperienced surgeon will often feel great stress when actually wielding the scalpel, especially when unexpected situations arise during the operation. The aforementioned method therefore still fails to solve the problem that surgeons are prone to error due to inexperience or stress.
From the above, the prior art suffers from the problem that surgeons are prone to error due to inexperience or stress, and an improved technical means is needed to solve it.
Disclosure of Invention
The invention discloses an operation decision support system based on augmented reality and a method thereof.
First, the present invention discloses an augmented reality-based surgical decision support system comprising a surgical database, a training module, a sensing module, and a decision support module. The surgical database stores surgical plans; each plan includes an organ model, an operation procedure, surgical instrument usage time points, and physiological data, and each plan is allowed to be presented in augmented reality. The training module is connected to the surgical database and is used, before the operation is performed, to select and load the surgical plan corresponding to the operation for training, present the loaded plan in augmented reality, and enable the sensors to continuously sense the free motion of the surgical instruments in three-dimensional space during training so as to establish an optimized surgical operation. The sensing module enables the sensors to continuously sense the free motion of the surgical instruments in three-dimensional space during the operation as the current surgical operation. The decision support module is connected to the training module and the sensing module; it compares the optimized surgical operation with the current surgical operation and, when the difference exceeds an allowable range, outputs a difference message to provide surgical decision support.
In addition, the invention also discloses an augmented reality-based surgical decision support method comprising the following steps: providing surgical plans, each comprising an organ model, an operation procedure, surgical instrument usage time points, and physiological data, each plan being allowed to be presented in augmented reality; before performing the operation, selecting and loading the surgical plan corresponding to the operation for training, presenting the loaded plan in augmented reality, and enabling the sensors to continuously sense the free motion of the surgical instruments in three-dimensional space during training to establish an optimized surgical operation; during the operation, enabling the sensors to continuously sense the free motion of the surgical instruments in three-dimensional space as the current surgical operation; and comparing the optimized surgical operation with the current surgical operation and, when the difference exceeds an allowable range, outputting a difference message to provide surgical decision support.
The system and method disclosed herein differ from the prior art in that the surgeon establishes the optimized surgical operation personally before the operation is performed, so that it can be demonstrated through augmented reality during the operation while the current surgical operation is sensed and compared with it; when the difference exceeds the allowable range, a difference message is displayed to provide surgical decision support.
Through the technical means, the invention can achieve the technical effects of improving the operation efficiency and success rate.
Drawings
Fig. 1 is a system block diagram of an augmented reality based surgical decision support system of the present invention.
Fig. 2A to 2C are flow charts of the augmented reality-based surgical decision support method of the present invention.
Fig. 3A and 3B are schematic views of a surgical procedure and decision support provided by the present invention at different times.
Fig. 4 is a schematic diagram of a setting scheme according to the present invention.
Description of reference numerals:
110 surgical database
120 training module
130 sensing module
140 decision support module
300 display interface
301, 302 display block
312a, 312b surgical instrument
320 motion sensor
331, 332 dashed lines
400 setting window
410 display block
420 input block
421 pause element
422 sound recording element
430 validation element
Detailed Description
The embodiments of the present invention will be described in detail with reference to the drawings and examples, so that how to implement the technical means for solving the technical problems and achieving the technical effects of the present invention can be fully understood and implemented.
First, before describing the disclosed augmented reality-based surgical decision support system and method, the environment in which the invention is applied is explained. The invention is applied to augmented reality, a technology in which images are captured by a camera device, a calculation is performed according to the position and angle of those images, and, with the aid of image analysis, virtual objects and the real-world scene are presented simultaneously on a display device, allowing interaction. In practice, the display device may include a head-mounted display, a heads-up display, a touch screen, and so on.
Referring to fig. 1, fig. 1 is a system block diagram of the augmented reality-based surgical decision support system of the present invention, comprising: a surgical database 110, a training module 120, a sensing module 130, and a decision support module 140. The surgical database 110 stores a plurality of surgical plans, each including an organ model, an operation procedure, surgical instrument usage time points, and physiological data, and each plan is allowed to be presented in augmented reality. For example, suppose the surgical plan is for a hysteromyomectomy: it includes organ models of the uterus and surrounding organs, an operation procedure giving the execution steps of the surgery, the various surgical instruments with their usage time points, and the various items of physiological data that require attention during the surgery.
The training module 120 is connected to the surgical database 110 and is used, before the operation is performed, to select and load the surgical plan corresponding to the operation for training, present the loaded plan in augmented reality, and enable the sensors to continuously sense the free motion of the surgical instruments in three-dimensional space during training so as to establish an optimized surgical operation. In other words, the optimized surgical operation is established by the surgeon himself or herself during training and serves as the basis for subsequent comparison, rather than being a conventional operation based on textbooks or renowned physicians. Because the established optimized operation is necessarily one the surgeon can personally achieve, the problem of an inexperienced surgeon being unable to reproduce another person's operation is avoided. In addition, in practice, the three-dimensional free motion of the surgical instruments continuously sensed during training can be used as training data input to a machine learning model (Machine Learning Model), so as to train a machine learning model corresponding to the surgery.
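The patent leaves the choice of the optimized surgical operation to the surgeon's own judgment after multiple training runs (see the description of fig. 3A below). As a minimal sketch, assuming each training run records a sampled 3D instrument trajectory and the surgeon rates each run, the most satisfactory run could be selected as follows (all names and the rating field are illustrative, not part of the patent):

```python
from dataclasses import dataclass

@dataclass
class TrainingRun:
    # Time-ordered (x, y, z) samples from the motion sensor for one instrument.
    trajectory: list
    # Surgeon's own satisfaction score for this run (higher is better).
    rating: int

def select_optimized_operation(runs):
    """Return the surgeon's most satisfactory training run."""
    if not runs:
        raise ValueError("no training runs recorded")
    return max(runs, key=lambda r: r.rating)

runs = [
    TrainingRun(trajectory=[(0, 0, 0), (1, 0, 0), (2, 0, 1)], rating=3),
    TrainingRun(trajectory=[(0, 0, 0), (1, 0, 1), (2, 0, 1)], rating=5),
]
optimized = select_optimized_operation(runs)
```

The chosen run's trajectory then serves as the baseline against which the current surgical operation is compared.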
The sensing module 130 enables the sensors to continuously sense the free motion of the surgical instruments in three-dimensional space during the operation as the current surgical operation. In practice, once the training module 120 has completed the training of the machine learning model, the three-dimensional free motion continuously sensed during the operation is allowed to be input into the model to recognize whether the current surgical operation is similar to the optimized surgical operation.
The decision support module 140 is connected to the training module 120 and the sensing module 130; it compares the optimized surgical operation with the current surgical operation and, when the difference exceeds an allowable range, outputs a difference message to provide surgical decision support. For example, an optimized surgical operation may include data for a particular organ and a movement path of a surgical instrument, and the current surgical operation may include data for the same organ and the current instrument's movement path. If the organ data match, for example the type, position, and size of a uterine myoma are the same, then when the current movement path deviates from the optimized movement path by more than the allowable range, the offset is output as a difference message. In addition, assuming the machine learning model has been trained and is used to recognize whether the current operation is close to the optimized operation, the allowable range can be dynamically adjusted according to the recognition result.
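The patent does not fix a particular distance measure for the path comparison. A minimal sketch, assuming both paths are resampled to the same number of points and using the maximum pointwise Euclidean deviation, might look like this (function names and the tolerance value are illustrative):

```python
import math

def max_path_deviation(optimized_path, current_path):
    """Largest pointwise Euclidean distance between two equally sampled 3D paths."""
    return max(math.dist(p, q) for p, q in zip(optimized_path, current_path))

def difference_message(optimized_path, current_path, allowable_range):
    """Return a difference message when the deviation exceeds the allowable range,
    or None when the current operation stays within tolerance."""
    deviation = max_path_deviation(optimized_path, current_path)
    if deviation > allowable_range:
        return f"path deviation {deviation:.2f} exceeds allowable range {allowable_range:.2f}"
    return None

optimized = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
current = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0), (2.0, 2.0, 0.0)]
msg = difference_message(optimized, current, allowable_range=1.0)
```

Here the largest deviation is 2.0, which exceeds the tolerance of 1.0, so a difference message is produced; when the paths agree within tolerance, no message is output.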
In practice, all modules described in the present invention can be implemented in various ways, including software, hardware, or any combination thereof; for example, one or more modules in the system may be implemented partially or completely in hardware, such as an integrated circuit chip, a System on Chip (SoC), a Complex Programmable Logic Device (CPLD), a Field Programmable Gate Array (FPGA), or the like. The present invention may be a system, a method, and/or a computer program. The computer program may include a computer-readable storage medium bearing computer-readable program instructions for causing a processor to implement various aspects of the present invention. The computer-readable storage medium may be a tangible device that can hold and store the instructions for use by an instruction execution device, and may be, but is not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) include: a hard disk, random access memory, read-only memory, flash memory, a compact disc, a floppy disk, and any suitable combination of the foregoing. A computer-readable storage medium as used herein is not to be construed as a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., an optical signal through a fiber-optic cable), or an electrical signal transmitted through a wire.
Additionally, the computer-readable program instructions described herein may be downloaded to the various computing/processing devices from a computer-readable storage medium, or over a network, for example the Internet, a local area network, a wide area network, and/or a wireless network, to an external computer device or an external storage device. The network may include copper transmission cables, optical fiber transmission, wireless transmission, routers, firewalls, switches, hubs, and/or gateways. A network card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards them for storage in a computer-readable storage medium in the respective device. The computer program instructions that perform the operations of the present invention may be assembly language instructions, instruction set architecture instructions, machine-dependent instructions, microinstructions, firmware instructions, or object code (Object Code) written in any combination of one or more programming languages, including object-oriented languages such as Common Lisp, Python, C++, Objective-C, Smalltalk, Delphi, Java, Swift, C#, Perl, Ruby, and PHP, as well as conventional procedural (Procedural) languages such as C or similar. The instructions may execute entirely on the computer, partly on the computer, as stand-alone software, partly on a client computer and partly on a remote computer, or entirely on a remote computer or server.
Referring to fig. 2A to 2C, which are flow charts of the augmented reality-based surgical decision support method of the present invention, the method includes the steps of: providing surgical plans, each including an organ model, an operation procedure, surgical instrument usage time points, and physiological data, each plan being allowed to be presented in augmented reality (step 210); before performing the operation, selecting and loading the surgical plan corresponding to the operation for training, presenting the loaded plan in augmented reality, and enabling the sensors to continuously sense the three-dimensional free motion of the surgical instruments during training to establish an optimized surgical operation (step 220); during the operation, enabling the sensors to continuously sense the free motion of the surgical instruments in three-dimensional space as the current surgical operation (step 230); and comparing the optimized surgical operation with the current surgical operation and, when the difference exceeds an allowable range, outputting a difference message to provide surgical decision support (step 240). Through these steps, the surgeon establishes before the operation the optimal surgical operation that he or she can personally achieve, so that it can be demonstrated through augmented reality during the operation while the current surgical operation is sensed and compared with it; when the difference exceeds the allowable range, a difference message is displayed to provide surgical decision support.
In addition, as illustrated in fig. 2B, after step 240 the three-dimensional free motion of the surgical instruments continuously sensed during training may be used as training data input to a machine learning model so as to train a model corresponding to the surgery; after the training of the model is completed, the three-dimensional free motion continuously sensed during the operation is allowed to be input into the model to recognize the current surgical operation, and the allowable range is dynamically adjusted according to the recognition result (step 250). Alternatively, as shown in fig. 2C, after step 240 the surgical operation behavior is continuously detected, and when it is abnormally suspended or delayed, the organ model, operation procedure, surgical instrument usage time points, and physiological data of the loaded surgical plan are synchronously displayed for auxiliary support and guidance (step 260). It should be noted that if the abnormal suspension or delay is detected after, say, 6 minutes of operation time, then the operation procedure, instrument usage time points, physiological data, and so on of the loaded plan from that time point (i.e., 6 minutes) onward can be synchronously displayed; the complete plan is not replayed from beginning to end, and only the portion not yet performed is shown.
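The "display only what has not yet been performed" behavior of step 260 can be sketched as a simple filter over time-stamped plan steps (the step names and timings below are illustrative, not from the patent):

```python
def remaining_plan(plan_steps, elapsed_minutes):
    """Keep only the plan steps scheduled at or after the elapsed operation time."""
    return [step for start_minute, step in plan_steps if start_minute >= elapsed_minutes]

plan = [
    (0, "make incision"),
    (4, "locate myoma"),
    (6, "excise myoma"),
    (9, "suture"),
]
# An abnormal pause detected at minute 6 shows only the unfinished portion.
to_display = remaining_plan(plan, elapsed_minutes=6)
```

The same filter applies to the instrument usage time points and physiological data, since each is keyed to a time point in the loaded plan.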
Fig. 3A and 3B are schematic views of the surgical operation and decision support provided by the present invention at different times, described below by way of example with reference to fig. 3A to 4. First, before performing an operation, as illustrated in fig. 3A, a surgeon may load the surgical plan corresponding to the operation from the surgical database and display it in a display block 301 of a display interface 300; the display interface may also show images of organs, tissues, and the like corresponding to the plan, so that the surgeon performs pre-operative training according to the plan. During the pre-operative training, the image sensing of the display interface 300 or a motion sensor 320 disposed on the operating table continuously detects the three-dimensional free motion of the surgical instruments (312a, 312b) to record each training operation. After the surgeon has completed multiple pre-operative training runs in augmented reality, the most satisfactory run can be selected as the optimized surgical operation; for example, the movement of the surgical instrument 312b along the dashed line 331 may be taken as the optimized surgical operation. In practice, the surgical plan may also include coping schemes for various emergency situations at different time points, such as an emergency surgical operation. Then, as shown in fig. 3B, during the actual operation the motion sensor 320 also continuously detects the three-dimensional free motion of the surgical instruments (312a, 312b) to record the current surgical operation; for example, the movement of the surgical instrument 312b along the dashed line 332 is taken as the current surgical operation and compared with the optimized one. If the difference between them exceeds the allowable range, it may indicate that the procedure has not been going smoothly from that point onward, so the coping schemes for that time can be shown in display block 302 for the surgeon to decide among, allowing the surgeon to handle the emergency. For example, if the difference between the movement paths of instrument 312b during training and during the actual operation (e.g., dashed lines 331 and 332) is detected to exceed the allowable range in the middle stage of the operation, the possible situations at that stage and their coping schemes can be displayed in block 302 so that the surgeon can select an appropriate one, whose operation can then be displayed on the augmented reality display interface 300.
Referring to fig. 4, fig. 4 is a schematic diagram of setting a coping scheme according to the present invention. In practice, a coping scheme can be set from a remote device: when an inexperienced surgeon encounters an emergency during training or surgery, an experienced surgeon at the remote end can enter text into the input block 420, write or draw in the input block 420 with a stylus, or click the pause element 421 and the recording element 422 to control a voice recording, so that text, pictures, and voice serve as the coping scheme; clicking the confirmation element 430 then transmits the scheme to the surgical database 110 for storage. In this way, when the inexperienced surgeon encounters an emergency, the display and speaker together output the coping scheme corresponding to that time point, allowing the surgeon to handle the situation easily.
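Storing and retrieving coping schemes keyed by the time point of the emergency can be sketched as follows (the dictionary layout, field names, and lookup rule are assumptions for illustration; the patent specifies only that text, picture, and voice are stored and output together):

```python
# A coping scheme bundles the remote expert's text, picture, and voice recording.
schemes = {}  # keyed by the operation time point in minutes

def store_scheme(time_point, text, picture=None, voice=None):
    """Save a coping scheme for a given time point, as clicking the confirmation
    element 430 would transmit it to the surgical database for storage."""
    schemes[time_point] = {"text": text, "picture": picture, "voice": voice}

def scheme_at(time_point):
    """Return the stored coping scheme nearest at or before the given time point,
    or None when nothing applies yet."""
    candidates = [t for t in schemes if t <= time_point]
    return schemes[max(candidates)] if candidates else None

store_scheme(6, "Myoma larger than expected: switch to the staged excision plan.")
current = scheme_at(7)
```

At minute 7 the scheme stored for minute 6 is retrieved and can be output through the display and speaker; before any scheme's time point, the lookup returns nothing.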
In summary, the difference between the present invention and the prior art lies in that the surgeon personally establishes the best surgical operation before the operation is performed, so that it can be demonstrated through augmented reality during the operation while the current surgical operation is sensed and compared with it; when the difference exceeds the allowable range, a difference message is displayed to provide surgical decision support.
Although the present invention has been described with reference to the foregoing embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention.
Claims (10)
1. An augmented reality based surgical decision support system, the system comprising:
a surgical database for storing a plurality of surgical protocols, the surgical protocols including organ models, operational procedures, surgical instrument usage time points, and physiological data, each surgical protocol allowing presentation in augmented reality;
a training module connected to the surgical database for selecting and loading one of the surgical plans corresponding to the surgery for training before performing the surgery, presenting the selected and loaded surgical plan in an augmented reality, and enabling a plurality of sensors to continuously sense the three-dimensional free motion of a surgical instrument during training to establish an optimized surgical operation;
a sensing module for enabling the sensors to continuously sense the free motion of the surgical instrument in three-dimensional space during the operation as a current surgical operation; and
a decision support module connected to the training module and the sensing module for comparing the optimized surgical operation with the current surgical operation and outputting a difference message to provide surgical decision support when the comparison difference exceeds an allowable range.
2. The augmented reality-based surgical decision support system according to claim 1, wherein the training module takes free motion of the surgical instrument in three-dimensional space continuously sensed in training as training data input to a machine learning model to train the machine learning model corresponding to the surgery, and after the training of the machine learning model is completed, allows free motion of the surgical instrument in three-dimensional space continuously sensed by the sensing module to be input to the machine learning model to recognize the current surgical operation, and dynamically adjusts the allowable range according to the recognition result.
3. The augmented reality-based surgical decision support system of claim 1, wherein the optimized surgical operation and the current surgical operation both include an operation step sequence and a movement path range of the surgical instrument at different time points, and when the decision support module detects that, at the same time point, a difference in at least one of the operation step sequence and the movement path range of the optimized surgical operation and the current surgical operation exceeds the allowable range, the difference is prominently marked and embedded in the difference message.
4. The augmented reality-based surgical decision support system of claim 1, wherein the surgical plan further comprises at least one coping plan at different time points, and when the decision support module outputs the difference message, the coping plan at the corresponding time point is loaded from the surgical database to be output, the coping plan being allowed to be established by a remote device and comprising text, voice and image for output in cooperation with a display and a speaker.
5. The augmented reality-based surgical decision support system of claim 1, wherein the decision support module further comprises a continuous detection of surgical operation behavior, and when the surgical operation behavior is abnormally suspended or delayed, the loaded organ model, operation flow, operation tool usage time point and physiological data of the surgical plan are synchronously displayed for auxiliary support and guidance.
6. An augmented reality-based surgical decision support method, comprising the steps of:
providing a plurality of surgical protocols, the surgical protocols including organ models, operational procedures, surgical instrument use points and physiological data, each surgical protocol allowing presentation in augmented reality;
before performing an operation, selecting and loading one of the surgical plans corresponding to the operation for training, presenting the selected and loaded surgical plan in augmented reality, and enabling a plurality of sensors to continuously sense the free motion of a surgical instrument in three-dimensional space during training to establish an optimized surgical operation;
enabling the sensor to continuously sense the free movement of the surgical instrument in three-dimensional space during the operation as the current operation; and
comparing the optimized operation with the current operation, and outputting a difference message to provide operation decision support when the comparison difference exceeds an allowable range.
7. The augmented reality-based surgical decision support method according to claim 6, wherein the method further comprises a step of inputting the three-dimensional free motion of the surgical instrument continuously sensed in training as training data of a machine learning model to train the machine learning model corresponding to the surgery, and after the training of the machine learning model is completed, allowing the three-dimensional free motion of the surgical instrument continuously sensed in the course of the surgery to be input into the machine learning model to recognize the current surgical operation, and dynamically adjusting the allowable range according to the recognition result.
8. The augmented reality-based surgical decision support method of claim 6, wherein the optimized surgical operation and the current surgical operation each include an operation step sequence and a movement path range of the surgical instrument at different time points, and when, at the same time point, the difference in at least one of the operation step sequence and the movement path range exceeds the allowable range, the difference is prominently marked and embedded in the difference message.
9. The augmented reality-based surgical decision support method according to claim 6, wherein the surgical plan further includes at least one coping plan for different time points, and when the difference message is outputted, the coping plan for the corresponding time point is loaded for output, the coping plan allowing establishment by a remote device and including text, voice and images for output via a display and a speaker.
10. The augmented reality-based surgical decision support method according to claim 6, further comprising the step of continuously detecting the surgical operation behavior, and, when the surgical operation behavior is abnormally suspended or delayed, synchronously displaying the organ model, operation flow, surgical instrument usage time points and physiological data of the loaded surgical plan for auxiliary support and guidance.
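Claims 6 and 8 above describe comparing an optimized surgical operation (established during training) with the current surgical operation at matching time points, and outputting a difference message when the step sequence or the tool's movement path deviates beyond an allowable range. The comparison logic can be sketched as follows; all names (`OperationSample`, `compare_operations`) and the 20 mm allowable range are illustrative assumptions, not values specified by the claims:

```python
from dataclasses import dataclass

@dataclass
class OperationSample:
    """One sensed sample: time point, operation step index, and tool tip position."""
    t: float                        # seconds from start of procedure
    step: int                       # index into the operation step sequence
    pos: tuple[float, float, float] # (x, y, z) tool tip position in metres

def path_deviation(a: OperationSample, b: OperationSample) -> float:
    """Euclidean distance between the two tool tip positions."""
    return sum((p - q) ** 2 for p, q in zip(a.pos, b.pos)) ** 0.5

def compare_operations(optimized, current, allowable_range=0.02):
    """Compare samples at matching time points; collect a difference message
    whenever the step order differs or the path deviation exceeds the range."""
    messages = []
    for opt, cur in zip(optimized, current):
        if opt.step != cur.step:
            messages.append(
                f"t={cur.t:.1f}s: step {cur.step} performed, expected step {opt.step}")
        elif (d := path_deviation(opt, cur)) > allowable_range:
            messages.append(
                f"t={cur.t:.1f}s: tool path deviates {d * 1000:.0f} mm "
                f"(limit {allowable_range * 1000:.0f} mm)")
    return messages

# Hypothetical usage: the second current sample drifts 40 mm off the trained path.
optimized = [OperationSample(0.0, 0, (0.00, 0.00, 0.0)),
             OperationSample(1.0, 1, (0.05, 0.00, 0.0))]
current   = [OperationSample(0.0, 0, (0.00, 0.00, 0.0)),
             OperationSample(1.0, 1, (0.05, 0.04, 0.0))]
print(compare_operations(optimized, current))
```

Claim 7's machine learning model would replace the fixed `allowable_range` with a value adjusted per recognized operation, but the comparison structure stays the same.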
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202011295770.5A CN114550876A (en) | 2020-11-18 | 2020-11-18 | Operation decision support system and method based on augmented reality |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN114550876A true CN114550876A (en) | 2022-05-27 |
Family
ID=81659290
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202011295770.5A Pending CN114550876A (en) | 2020-11-18 | 2020-11-18 | Operation decision support system and method based on augmented reality |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN114550876A (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107296650A (en) * | 2017-06-01 | 2017-10-27 | Xidian University | Intelligent surgery auxiliary system based on virtual reality and augmented reality |
| CN107847289A (en) * | 2015-03-01 | 2018-03-27 | Aris MD, Inc. | Reality-enhanced morphological surgery |
| CN110603002A (en) * | 2017-03-10 | 2019-12-20 | Biomet Manufacturing, LLC | Augmented reality supported knee surgery |
| CN110796739A (en) * | 2019-09-27 | 2020-02-14 | Harley Medical (Guangzhou) Intelligent Technology Co., Ltd. | Virtual reality simulation method and system for craniocerebral surgery |
| CN111772792A (en) * | 2020-08-05 | 2020-10-16 | Shandong Cancer Hospital and Institute | Endoscopic surgery navigation method, system and readable storage medium based on augmented reality and deep learning |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11087518B2 (en) | Second-person avatars | |
| CN110555426A (en) | Sight line detection method, device, equipment and storage medium | |
| JP7442444B2 (en) | Augmented reality activation of the device | |
| KR20190100011A (en) | Method and apparatus for providing surgical information using surgical video | |
| Laraba et al. | Dance performance evaluation using hidden Markov models | |
| US20230334998A1 (en) | Surgical teaching auxiliary system using virtual reality and method thereof | |
| CN115768370A (en) | Systems and methods for video and audio analysis | |
| US20230329806A1 (en) | Surgical decision support system based on augmented reality (ar) and method thereof | |
| US11836839B2 (en) | Method for generating animation figure, electronic device and storage medium | |
| CN110443826A (en) | A virtual-real fusion simulation experiment error assistance method and system | |
| US20230410499A1 (en) | Visibility metrics in multi-view medical activity recognition systems and methods | |
| CN108498102B (en) | Rehabilitation training method and device, storage medium and electronic equipment | |
| JP2020135141A (en) | Training device, training method, and prediction device | |
| CN110062290A (en) | Video interactive content generating method, device, equipment and medium | |
| CN114550876A (en) | Operation decision support system and method based on augmented reality | |
| CN114520040A (en) | Surgical teaching assistant system and method using virtual reality | |
| CN117563111B (en) | Training method and device for magnetic guide wire control model, electronic equipment and medium | |
| TWI765369B (en) | Decision support system for surgical based on augmented reality (ar) and method thereof | |
| US11688295B2 (en) | Network learning system and method thereof | |
| US12189869B1 (en) | Mixed reality production method using inertial measurement data and electronic device performing the method thereof | |
| CN113253837A (en) | Air writing method and device, online live broadcast system and computer equipment | |
| CN117733848A (en) | Surgical robot control system and method | |
| CN118946326A (en) | Simulation system, information processing device and information processing method | |
| US11751972B2 (en) | Methods and systems for remote augmented reality communication for guided surgery | |
| KR20190111305A (en) | System and method for language learning for the deaf |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 2022-05-27 |