CN116343538B - Method and device for surgical teaching based on wearable devices - Google Patents
Method and device for surgical teaching based on wearable devices
- Publication number
- CN116343538B CN116343538B CN202310273331.1A CN202310273331A CN116343538B CN 116343538 B CN116343538 B CN 116343538B CN 202310273331 A CN202310273331 A CN 202310273331A CN 116343538 B CN116343538 B CN 116343538B
- Authority
- CN
- China
- Prior art keywords
- information
- target object
- target
- teaching
- surgical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/08—Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
Abstract
The invention discloses a method and a device for implementing surgical teaching based on a wearable device. The method comprises: acquiring real-time surgical operation information of a target object, the target object being an object wearing the wearable device; determining first surgical operation data of the target object based on the real-time surgical operation information, and further determining second surgical operation data of the target object; judging whether the target object meets a preset surgical teaching guidance condition and, if so, generating surgical teaching guidance information according to the first and second surgical operation data; and determining target surgical teaching guidance information from the surgical teaching guidance information and transmitting it to the wearable device corresponding to the target object, so that the target object receives the target surgical teaching guidance information through the wearable device and executes the target surgical operation. The method and device can thereby improve the intelligence of surgical teaching and, in turn, its teaching effect.
Description
Technical Field
The invention relates to the technical field of intelligent teaching, in particular to a surgical teaching realization method and device based on wearable equipment.
Background
With the continuous progress of technology, society has gradually entered an intelligent age, in which human-computer interaction is an indispensable part. At present, most surgical teaching is carried out at an operating test bed: a teacher operates and explains at the test bed while students stand around it, observe the teacher's operations, and learn by combining their observation with the teacher's explanation. However, this mode of study cannot guarantee that every student sees the teacher's operations clearly, so the teacher must demonstrate or explain repeatedly, which not only lowers teaching efficiency but also weakens the students' grasp of the surgical process. It is therefore important to provide a new surgical teaching method that improves the efficiency and convenience of surgical teaching and, in turn, its teaching effect.
Disclosure of Invention
The invention aims to solve the technical problem of providing a method and a device for realizing operation teaching based on wearable equipment, which can improve the convenience and efficiency of operation teaching of teachers, improve the definition of the learning operation process of students and further improve the teaching effect of operation teaching.
In order to solve the technical problems, the first aspect of the invention discloses a surgical teaching implementation method based on wearable equipment, which comprises the following steps:
Acquiring real-time operation information of a target object, wherein the target object is an object wearing the wearable device; the real-time surgical operation information comprises real-time operation image information of the target object;
Determining first surgical operation data of the target object based on the real-time surgical operation information; the first surgical operation data comprises one or more of surgical operation progress data of the target object, surgical operation safety data of the target object, surgical operation duration data of the target object and surgical operation action data of the target object;
determining second surgical operation data of the target object according to the first surgical operation data; the second surgical operation data comprises the performed surgical operation data of the target object and the surgical operation data to be performed of the target object;
judging whether the target object meets a preset operation teaching guide condition according to the second operation data;
When the target object is judged to meet the preset surgical teaching guidance condition, surgical teaching guidance information is generated according to the first surgical operation data and the second surgical operation data, wherein the surgical teaching guidance information comprises teaching guidance visual information and teaching guidance voice information, and the teaching guidance visual information comprises teaching guidance image information and teaching guidance video information;
And determining target operation teaching guide information from the operation teaching guide information, and transmitting the target operation teaching guide information to wearable equipment corresponding to the target object, so that the target object receives the target operation teaching guide information through the wearable equipment and executes target operation.
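The patent does not include an implementation; the following is a hypothetical Python sketch of the claimed control flow — acquire real-time operation information, derive first and second operation data, check the guidance condition, then generate and transmit guidance. All function names, data shapes, and the 0.5 threshold are illustrative assumptions, not taken from the patent.

```python
def determine_first_operation_data(info):
    # First surgical operation data: e.g. progress and duration derived
    # from the real-time operation image information (shape assumed).
    return {"progress": info["step"] / info["total_steps"],
            "duration_s": info["elapsed_s"]}

def determine_second_operation_data(first):
    # Second surgical operation data: performed vs. to-be-performed share.
    return {"performed": first["progress"],
            "to_perform": 1.0 - first["progress"]}

def needs_guidance(second, threshold=0.5):
    # Placeholder guidance condition: guide while most work remains.
    return second["to_perform"] >= threshold

def teach_step(real_time_info, transmit):
    first = determine_first_operation_data(real_time_info)
    second = determine_second_operation_data(first)
    if not needs_guidance(second):
        return None  # condition not met: no guidance transmitted
    guidance = {"video": f"step {real_time_info['step'] + 1} demo",
                "voice": f"proceed to step {real_time_info['step'] + 1}"}
    transmit(guidance)  # send to the wearer's device
    return guidance
```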
As an optional implementation manner, in a first aspect of the present invention, the generating surgical teaching guidance information according to the first surgical operation data and the second surgical operation data includes:
determining target operation information of the target object according to the first operation data and the second operation data, wherein the target operation information comprises one or more of target operation appliance information, target operation area information and target operation category information;
Generating target mark information according to the target operation information, wherein the target mark information comprises operation mark information of an operation object corresponding to the target object, and the operation mark information is used for assisting the target object to complete corresponding operation on the operation object;
Generating physical operation marker information according to the operation marker point and the real-time surgical operation information in combination with a preset physical space conversion model, wherein the physical operation marker information comprises image guidance marker information of the operation marker point within the current field of view of the target object and voice guidance marker information of the operation marker point within the current field of view of the target object, and the voice guidance marker information is used for guiding the target object to locate the operation marker point and perform the corresponding operation on the operation marker point;
and generating the surgical teaching guidance information according to the physical operation marker information.
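The "preset physical space conversion model" is not specified in the patent. A minimal illustrative assumption is a translation of a marker point into the wearer's view-relative coordinates, followed by a check that the point falls inside the visible frame (head rotation is omitted for brevity; a full model would also apply orientation).

```python
def marker_in_view(marker_xyz, view_origin, half_extent=1.0):
    # Translate the marker into view-relative coordinates.
    x = marker_xyz[0] - view_origin[0]
    y = marker_xyz[1] - view_origin[1]
    z = marker_xyz[2] - view_origin[2]
    # Visible if within the lateral frame and in front of the wearer.
    visible = abs(x) <= half_extent and abs(y) <= half_extent and z > 0
    return visible, (x, y, z)
```

A marker behind the wearer (negative view-relative depth) would then receive voice guidance only, since no image marker can be drawn for it.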
As an optional implementation manner, in the first aspect of the present invention, the wearable device comprises smart glasses and a smart earphone, both worn by the target object; the target surgical teaching guidance information comprises at least target teaching visual information and target teaching voice information;
The transmitting the target surgical teaching guiding information to the wearable device corresponding to the target object, so that the target object receives the target surgical teaching guiding information through the wearable device and performs a target surgical operation, including:
Mapping the target teaching visual information to intelligent glasses corresponding to the target object, so that the target object views the target teaching visual information through the intelligent glasses, and executing target operation matched with the target teaching visual information; and
Transmitting the target teaching voice information to the smart earphone corresponding to the target object, so that the target object listens to the target teaching voice information through the smart earphone and executes the target surgical operation matched with the target teaching voice information;
The matching degree between the target teaching visual information and the target teaching voice information is greater than or equal to a preset matching threshold.
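The patent does not define how the matching degree between the visual and voice channels is computed. The sketch below routes guidance to the two wearables and enforces the threshold using a stand-in measure (word overlap between the two channels' text); the measure and the 0.5 threshold are assumptions for illustration.

```python
def dispatch_guidance(visual_text, voice_text, glasses, earphone, threshold=0.5):
    # Stand-in matching degree: Jaccard overlap of the two channels' words.
    visual_words, voice_words = set(visual_text.split()), set(voice_text.split())
    match = len(visual_words & voice_words) / max(len(visual_words | voice_words), 1)
    if match < threshold:
        raise ValueError("visual/voice guidance below matching threshold")
    glasses.append(visual_text)   # map visual information to the smart glasses
    earphone.append(voice_text)   # send voice information to the smart earphone
    return match
```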
In an optional implementation manner, in a first aspect of the present invention, the determining, according to the second surgical operation data, whether the target object meets a preset surgical teaching guiding condition includes:
Predicting a first-stage surgical result of the target object based on the performed surgical operation data of the target object, the first-stage surgical result being the predicted stage result of the target object upon completion of the performed surgical operation data;
Determining a second-stage operation result matched with the executed operation data in a preset teaching-stage operation result set according to the executed operation data of the target object, wherein an operation stage corresponding to the second-stage operation result is matched with an operation stage corresponding to the first-stage operation result of the target object; each teaching stage operation result included in the preset teaching stage operation result set meets the preset operation teaching result condition;
judging whether the first-stage operation result is matched with the second-stage operation result;
When the first-stage operation result is judged to be matched with the second-stage operation result, determining that the target object does not meet the preset operation teaching guide condition;
and when the first-stage operation result is not matched with the second-stage operation result, determining that the target object meets the preset operation teaching guide condition.
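The condition above reduces to a per-stage comparison: guidance is triggered only when the predicted stage result does not match the reference teaching-stage result for the same stage. A minimal sketch, assuming stage results are encoded as labels (an illustration; the patent does not specify the encoding):

```python
def needs_teaching_guidance(performed_stage, predicted_outcome, reference_outcomes):
    # reference_outcomes: preset teaching-stage result set, keyed by stage.
    reference = reference_outcomes[performed_stage]  # second-stage result
    # Mismatch with the teaching reference means guidance is required.
    return predicted_outcome != reference
```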
As an optional implementation manner, in a first aspect of the present invention, the determining, according to the first operation data, second operation data of the target object includes:
Acquiring target operation information corresponding to the target object, wherein the target operation information corresponding to the target object comprises one or more of operation type information, operation duration information, operation object information and operation safety information of the target object;
determining operation data to be executed of the target object according to the first operation data of the target object and the target operation information corresponding to the target object; the surgical operation data to be executed of the target object comprises one or more of surgical operation duration data to be executed of the target object and surgical operation tool data to be executed;
And generating second surgical operation data of the target object based on the target surgical information corresponding to the target object and the surgical operation data to be executed of the target object.
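One plausible reading of the steps above — the patent leaves the data shapes open — is that the to-be-performed operation data is the planned procedure minus the already-performed steps. A hypothetical sketch with assumed names:

```python
def to_be_performed(plan_steps, performed_steps):
    # Remaining steps, preserving the order of the planned procedure.
    done = set(performed_steps)
    return [s for s in plan_steps if s not in done]

def second_operation_data(plan_steps, performed_steps):
    # Second surgical operation data: performed plus to-be-performed parts.
    return {"performed": list(performed_steps),
            "to_perform": to_be_performed(plan_steps, performed_steps)}
```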
As an optional implementation manner, in the first aspect of the present invention, before the generating the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data, the method further includes:
Judging whether wearable equipment corresponding to the target object receives request help information from the target object or not; the request help information comprises one or more of body posture information, gesture information and voice information;
When the wearable equipment corresponding to the target object is judged to receive the request help information from the target object, analyzing the request help information to obtain a request help result;
determining a first solution matching the request help result based on the request help result;
Acquiring real-time environment information corresponding to the target object, determining a data transmission type matched with the target object according to the real-time environment information, and executing data type conversion operation on the first solution to obtain a second solution so as to enable the data type of the second solution to be matched with the data transmission type matched with the target object;
the generating the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data includes:
And generating operation teaching guide information according to the first operation data, the second operation data and the second solution.
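The help-request branch above can be sketched as: parse the request, look up a matching solution, then convert it to the transmission type suited to the current environment. The solution table and the noise rule (visual-only delivery in a loud room) are assumptions for illustration, not taken from the patent.

```python
# Hypothetical first-solution lookup table (assumed contents).
SOLUTIONS = {"bleeding": "apply pressure and clamp the vessel"}

def handle_help_request(request_kind, noise_db):
    solution = SOLUTIONS.get(request_kind, "consult supervising teacher")
    # Assumed environment rule: a noisy room forces visual transmission,
    # otherwise the voice channel is preferred.
    transmission = "visual" if noise_db > 70 else "voice"
    # Second solution: the first solution converted to the matched data type.
    return {"type": transmission, "content": solution}
```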
As an optional implementation manner, in the first aspect of the present invention, after the target surgical teaching guidance information is transmitted to the wearable device corresponding to the target object so that the target object executes, based on the target surgical teaching guidance information received in the wearable device, the surgical operation matched with it, the method further comprises:
Collecting the operation execution result of the target object;
Calculating a surgical result matching parameter between the surgical execution result and a preset surgical target result; the preset operation target result is an operation target teaching result;
Judging whether the surgical result matching parameters meet preset surgical teaching conditions or not;
when the operation result matching parameters are judged to not meet the preset operation teaching conditions, analyzing target reasons that the operation result matching parameters do not meet the preset operation teaching conditions, and generating an optimized teaching scheme of the target object based on the target reasons.
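The post-operation evaluation above can be illustrated with a small sketch: score the trainee's result against the target teaching result, and when the matching parameter falls short, record the unmet criteria as target causes and derive an optimized teaching scheme from them. The scoring (fraction of target criteria met) and the 0.8 pass ratio are assumptions.

```python
def evaluate_result(achieved, target_criteria, pass_ratio=0.8):
    # Surgical-result matching parameter: share of target criteria met.
    met = [c for c in target_criteria if c in achieved]
    ratio = len(met) / len(target_criteria)
    if ratio >= pass_ratio:
        return {"passed": True, "ratio": ratio}
    missed = [c for c in target_criteria if c not in achieved]
    return {"passed": False, "ratio": ratio,
            "causes": missed,                            # target reasons
            "plan": [f"practice: {c}" for c in missed]}  # optimized scheme
```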
A second aspect of the invention discloses a surgical teaching implementation device based on a wearable device, the device comprising:
the acquisition module is used for acquiring real-time operation information of a target object, wherein the target object is an object wearing the wearable equipment; the real-time surgical operation information comprises real-time operation image information of the target object;
A determining module for determining first surgical operation data of the target object based on the real-time surgical operation information; the first surgical operation data comprises one or more of surgical operation progress data of the target object, surgical operation safety data of the target object, surgical operation duration data of the target object and surgical operation action data of the target object;
The determining module is further configured to determine second surgical operation data of the target object according to the first surgical operation data; the second surgical operation data comprises the performed surgical operation data of the target object and the surgical operation data to be performed of the target object;
The judging module is used for judging whether the target object meets the preset operation teaching guide condition according to the second operation data;
The generation module is used for generating surgical teaching guide information according to the first surgical operation data and the second surgical operation data when the judgment module judges that the target object meets the preset surgical teaching guide condition, wherein the surgical teaching guide information comprises teaching guide visual information and teaching guide voice information, and the teaching guide visual information comprises teaching guide image information and teaching guide video information;
the determining module is further used for determining target operation teaching guide information from the operation teaching guide information;
The transmission module is used for transmitting the target operation teaching guide information to wearable equipment corresponding to the target object, so that the target object receives the target operation teaching guide information through the wearable equipment and executes target operation.
In a second aspect of the present invention, as an optional implementation manner, the generating module generates the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data, where a specific manner includes:
determining target operation information of the target object according to the first operation data and the second operation data, wherein the target operation information comprises one or more of target operation appliance information, target operation area information and target operation category information;
Generating target mark information according to the target operation information, wherein the target mark information comprises operation mark information of an operation object corresponding to the target object, and the operation mark information is used for assisting the target object to complete corresponding operation on the operation object;
Generating physical operation marking information according to the operation marking point and the real-time operation information by combining a preset physical space conversion model, wherein the physical operation marking information comprises image guiding marking information of the operation marking point in the current field of view of the target object and voice guiding marking information of the operation marking point in the current field of view of the target object, the voice guiding marking information comprises voice guiding information of the operation marking point in the current field of view of the target object, and the voice guiding marking information is used for guiding the target object to search the operation marking point and executing corresponding operation on the operation marking point;
and generating operation teaching guide information according to the entity operation marking information.
As an optional implementation manner, in the second aspect of the present invention, the wearable device comprises smart glasses and a smart earphone, both worn by the target object; the target surgical teaching guidance information comprises at least target teaching visual information and target teaching voice information;
The transmission module transmits the target surgical teaching guide information to the wearable device corresponding to the target object, so that the specific modes of the target object receiving the target surgical teaching guide information through the wearable device and executing the target surgical operation include:
Mapping the target teaching visual information to intelligent glasses corresponding to the target object, so that the target object views the target teaching visual information through the intelligent glasses, and executing target operation matched with the target teaching visual information; and
Transmitting the target teaching voice information to the smart earphone corresponding to the target object, so that the target object listens to the target teaching voice information through the smart earphone and executes the target surgical operation matched with the target teaching voice information;
The matching degree between the target teaching visual information and the target teaching voice information is greater than or equal to a preset matching threshold.
In a second aspect of the present invention, as an optional implementation manner, the determining module determines, according to the second surgical operation data, whether the target object meets a preset surgical teaching guidance condition, where a specific manner includes:
Predicting a first-stage surgical result of the target object based on the performed surgical operation data of the target object, the first-stage surgical result being the predicted stage result of the target object upon completion of the performed surgical operation data;
Determining a second-stage operation result matched with the executed operation data in a preset teaching-stage operation result set according to the executed operation data of the target object, wherein an operation stage corresponding to the second-stage operation result is matched with an operation stage corresponding to the first-stage operation result of the target object; each teaching stage operation result included in the preset teaching stage operation result set meets the preset operation teaching result condition;
judging whether the first-stage operation result is matched with the second-stage operation result;
When the first-stage operation result is judged to be matched with the second-stage operation result, determining that the target object does not meet the preset operation teaching guide condition;
and when the first-stage operation result is not matched with the second-stage operation result, determining that the target object meets the preset operation teaching guide condition.
In a second aspect of the present invention, as an optional implementation manner, the determining module determines, according to the first operation data, a second operation data of the target object, where the specific manner includes:
Acquiring target operation information corresponding to the target object, wherein the target operation information corresponding to the target object comprises one or more of operation type information, operation duration information, operation object information and operation safety information of the target object;
determining operation data to be executed of the target object according to the first operation data of the target object and the target operation information corresponding to the target object; the surgical operation data to be executed of the target object comprises one or more of surgical operation duration data to be executed of the target object and surgical operation tool data to be executed;
And generating second surgical operation data of the target object based on the target surgical information corresponding to the target object and the surgical operation data to be executed of the target object.
In a second aspect of the present invention, the determining module is further configured to determine, before the generating module generates the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data, whether the wearable device corresponding to the target object receives the request help information from the target object; the request help information comprises one or more of body posture information, gesture information and voice information;
the apparatus further comprises:
the analysis module is used for analyzing the request help information to obtain a request help result when the judgment module judges that the wearable equipment corresponding to the target object receives the request help information from the target object;
The determining module is further configured to determine, based on the request help result, a first solution that matches the request help result;
the acquisition module is further used for acquiring real-time environment information corresponding to the target object;
The determining module is further used for determining a data transmission type matched with the target object according to the real-time environment information;
The conversion module is used for executing data type conversion operation on the first solution to obtain a second solution so as to enable the data type of the second solution to be matched with the data transmission type matched with the target object;
the specific mode of generating the operation teaching guide information by the generating module according to the first operation data and the second operation data comprises the following steps:
And generating operation teaching guide information according to the first operation data, the second operation data and the second solution.
As an alternative embodiment, in the second aspect of the present invention, the apparatus further includes:
The acquisition module is configured to collect the surgical execution result of the target object after the transmission module transmits the target surgical teaching guidance information to the wearable device corresponding to the target object so that the target object executes, based on the target surgical teaching guidance information received in the wearable device, the surgical operation matched with it;
the calculation module is used for calculating a surgical result matching parameter between the surgical execution result and a preset surgical target result; the preset operation target result is an operation target teaching result;
The judging module is also used for judging whether the operation result matching parameters meet preset operation teaching conditions or not;
The analysis module is further used for analyzing a target reason that the operation result matching parameter does not meet the preset operation teaching condition when the judgment module judges that the operation result matching parameter does not meet the preset operation teaching condition;
the generating module is further used for generating an optimized teaching scheme of the target object based on the target reason.
The third aspect of the invention discloses another surgical teaching implementation device based on a wearable device, which comprises:
A memory storing executable program code;
A processor coupled to the memory;
the processor invokes the executable program code stored in the memory to execute the surgical teaching implementation method based on the wearable device disclosed in the first aspect of the present invention.
A fourth aspect of the present invention discloses a computer storage medium storing computer instructions for executing the method for implementing the wearable device-based surgical teaching disclosed in the first aspect of the present invention when the computer instructions are called.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
In the embodiment of the invention, real-time surgical operation information of a target object is acquired, the target object being an object wearing a wearable device; first surgical operation data of the target object is determined based on the real-time surgical operation information, and second surgical operation data is determined in turn; whether the target object meets a preset surgical teaching guidance condition is then judged and, if so, surgical teaching guidance information is generated according to the first and second surgical operation data; target surgical teaching guidance information is determined from the surgical teaching guidance information and transmitted to the wearable device corresponding to the target object, so that the target object receives it through the wearable device and executes the target surgical operation. The embodiment can therefore improve the intelligence of surgical teaching, the convenience and efficiency with which teachers conduct surgical teaching, and the clarity with which students observe the surgical process, and thereby improve the teaching effect of surgical teaching.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow diagram of a method for implementing surgical teaching based on a wearable device according to an embodiment of the present invention;
Fig. 2 is a schematic flow chart of another implementation method of surgical teaching based on a wearable device according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a surgical teaching implementation device based on a wearable device according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of another surgical teaching implementation device based on a wearable device according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of yet another surgical teaching implementation device based on a wearable device according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
The terms first, second and the like in the description, the claims, and the above-described figures are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, apparatus, product, or device that comprises a list of steps or elements is not limited to those listed, but may optionally include other steps or elements not listed or inherent to such a process, method, product, or device.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The invention discloses a method and a device for realizing surgical teaching based on a wearable device, which can improve the intelligence of surgical teaching, make it more convenient and efficient for teachers to conduct surgical teaching, make the surgical process clearer for students to learn, and thereby improve the teaching effect of surgical teaching. Details are described below.
Example 1
Referring to fig. 1, fig. 1 is a schematic flow chart of a method for implementing surgical teaching based on a wearable device according to an embodiment of the present invention. The method described in fig. 1 can be applied to a surgical teaching implementation device based on a wearable device, to a cloud server or a local server for realizing surgical teaching based on the wearable device, or to the wearable device itself; the embodiment of the invention is not limited in this respect. As shown in fig. 1, the surgical teaching implementation method based on the wearable device may include the following operations:
101. Acquiring real-time surgical operation information of a target object.
In the embodiment of the invention, the target object is an object wearing wearable equipment; the real-time surgical operation information includes real-time operation image information of the target object.
In the embodiment of the present invention, optionally, the real-time surgical operation information of the target object may be obtained in real time, or may be obtained at fixed time according to a preset time period, and the embodiment of the present invention is not limited specifically.
In the embodiment of the present invention, optionally, the real-time surgical operation information of the target object may be acquired through a wearable device. Further, the wearable device may include devices that can be worn on a human body such as smart glasses, smart headphones, and the like.
In an embodiment of the present invention, optionally, the real-time operation information further includes one or more of usage information of the surgical instrument used by the target object in real time, environmental information of a surgical environment in which the target object is located, and duration information of a procedure performed by the target object.
102. First surgical operation data of the target object is determined based on the real-time surgical operation information.
In the embodiment of the invention, the first surgical operation data comprises one or more of surgical operation progress data of the target object, surgical operation safety data of the target object, surgical operation duration data of the target object and surgical operation action data of the target object.
In the embodiment of the present invention, optionally, the surgical operation progress data of the target object includes the executed surgical duration of the target object and the surgical processing progress of the target object. The surgical operation safety data of the target object includes one or more of the blood loss of the surgical object corresponding to the target object and the vital sign data of that surgical object, where the vital sign data includes human vital sign data such as blood pressure, heart rate, blood oxygen saturation, and body temperature. The surgical operation action data of the target object includes one or more of the surgical operation step data of the target object, the action data of the target object in each surgical operation step, and the duration data of the target object in each surgical operation step.
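To make the data items above concrete, the first surgical operation data can be modeled as a simple record. This is an illustrative sketch only; the class and field names are assumptions and are not part of the claimed method:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class VitalSigns:
    # Human vital sign data of the surgical object (patient).
    blood_pressure: str              # e.g. "118/76" mmHg
    heart_rate: int                  # beats per minute
    blood_oxygen_saturation: float   # fraction, e.g. 0.97
    body_temperature: float          # degrees Celsius

@dataclass
class FirstSurgicalOperationData:
    # Progress data: executed duration and processing progress.
    elapsed_seconds: int
    progress_ratio: float            # 0.0 .. 1.0
    # Safety data: blood loss plus the patient's vital signs.
    blood_loss_ml: float
    vital_signs: VitalSigns
    # Action data: the steps performed and per-step durations (seconds).
    steps_performed: List[str] = field(default_factory=list)
    step_durations: Dict[str, int] = field(default_factory=dict)

sample = FirstSurgicalOperationData(
    elapsed_seconds=1800,
    progress_ratio=0.4,
    blood_loss_ml=120.0,
    vital_signs=VitalSigns("118/76", 72, 0.97, 36.8),
    steps_performed=["incision"],
    step_durations={"incision": 300},
)
```

A record of this shape would be populated from the real-time surgical operation information acquired in step 101 and consumed by the later judgment and guidance-generation steps.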
103. Determining second surgical operation data of the target object according to the first surgical operation data.
In the embodiment of the invention, the second surgical operation data includes the performed surgical operation data of the target object and the surgical operation data to be performed of the target object.
104. Judging, according to the second surgical operation data, whether the target object meets a preset surgical teaching guidance condition.
105. When the target object is judged to meet the preset operation teaching guide condition, operation teaching guide information is generated according to the first operation data and the second operation data.
In the embodiment of the invention, the surgical teaching guidance information includes teaching guidance visual information and teaching guidance voice information, wherein the teaching guidance visual information includes teaching guidance image information and teaching guidance video information.
In the embodiment of the present invention, optionally, when it is determined that the target object does not meet the preset operation teaching guiding condition, the process may be ended, and the operation evaluation information of the target object may also be generated according to the second operation data.
106. Determining target surgical teaching guidance information from the surgical teaching guidance information, and transmitting it to the wearable device corresponding to the target object, so that the target object receives the target surgical teaching guidance information through the wearable device and performs the target surgical operation.
In the embodiment of the invention, optionally, the wearable device corresponding to the target object includes smart glasses and smart headphones.
As can be seen, implementing the method for realizing wearable-device-based surgical teaching described in fig. 1 can acquire real-time surgical operation information of a target object, the target object being an object wearing a wearable device; determine first surgical operation data of the target object based on the real-time surgical operation information and further determine second surgical operation data; and judge whether the target object meets a preset surgical teaching guidance condition. If so, surgical teaching guidance information is generated according to the first and second surgical operation data, target surgical teaching guidance information is determined from it and transmitted to the wearable device corresponding to the target object, so that the target object receives it through the wearable device and performs the target surgical operation. This improves the convenience and efficiency with which teachers conduct surgical teaching, makes the surgical process clearer for students to learn, and thereby improves the teaching effect of surgical teaching.
In an alternative embodiment, generating surgical teaching guidance information from the first surgical operation data and the second surgical operation data includes:
Determining target operation information of a target object according to the first operation data and the second operation data, wherein the target operation information comprises one or more of target operation tool information, target operation area information and target operation category information;
Generating target mark information according to the target operation information, wherein the target mark information comprises operation mark information of an operation object corresponding to the target object, and the operation mark information is used for assisting the target object to complete corresponding operation on the operation object;
Generating entity surgical marking information according to the surgical marking point and the real-time surgical operation information in combination with a preset entity space conversion model, wherein the entity surgical marking information includes image guidance marking information of the surgical marking point in the current field of view of the target object and voice guidance marking information of the surgical marking point in the current field of view of the target object; the voice guidance marking information includes voice guidance information of the surgical marking point in the current field of view of the target object and is used for guiding the target object to find the surgical marking point and to perform the corresponding surgical operation on it;
and generating operation teaching guide information according to the entity operation marking information.
In this optional embodiment, optionally, the surgical marker information includes one or more of a surgical marker point, a surgical marker line, and a surgical marker arrow, and the embodiment of the present invention is not specifically limited.
In this optional embodiment, optionally, according to the surgical marker point and the real-time surgical operation information, generating the physical surgical marker information by combining with a preset physical space conversion model includes:
according to the surgical marking points and the real-time operation image information, and in combination with the preset entity space conversion model, merging the surgical marking points into the real-time operation image information to generate entity surgical marking information corresponding to the current field of view of the target object.
Therefore, according to the surgical marking points and the real-time operation image information, and in combination with the preset entity space conversion model, the surgical marking points can be merged into the real-time operation image information to generate entity surgical marking information corresponding to the current field of view of the target object. This improves the accuracy, reliability, and intelligence of generating the entity surgical marking information, makes it more convenient for the target object to operate with its guidance, and is thus beneficial to the teaching effect of surgical teaching.
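The merging of a surgical marking point into the wearer's current field of view can be sketched with a standard pinhole projection. The patent does not disclose the internals of the preset entity space conversion model, so the projection model, function names, and camera parameters below are illustrative assumptions:

```python
def project_marker_to_view(marker_xyz, fx, fy, cx, cy):
    """Project a surgical marking point, given in the camera coordinate
    frame of the wearer's smart glasses (metres), onto pixel coordinates
    of the real-time operation image using a pinhole camera model with
    focal lengths (fx, fy) and principal point (cx, cy)."""
    x, y, z = marker_xyz
    if z <= 0:
        return None  # marker lies behind the wearer's current field of view
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u, v)

def overlay_marker(image_size, marker_xyz, fx, fy, cx, cy):
    """Return the integer pixel position of the marker if it falls inside
    the current field of view, so the glasses can draw a guidance marker
    there; return None otherwise."""
    w, h = image_size
    pix = project_marker_to_view(marker_xyz, fx, fy, cx, cy)
    if pix is None:
        return None
    u, v = pix
    if 0 <= u < w and 0 <= v < h:
        return (round(u), round(v))
    return None
```

In a real system the marking point would first be transformed from the surgical-scene frame into the glasses' camera frame using the head pose tracked by the wearable device; that transform is what the entity space conversion model would supply.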
In this optional embodiment, optionally, the target operation area information includes operation area information of an operation object corresponding to the target object, for example: a surgical operation region corresponding to a target object such as a heart region or a stomach region; the target operation category information includes operation categories of incision, suture, resection, and the like.
In this alternative embodiment, the image-guided marking information of the surgical marking point in the current field of view of the target object includes physical surgical marking information corresponding to the current field of view of the target object.
It can be seen that implementing this alternative embodiment can determine target operation information of the target object according to the first and second surgical operation data, generate target marking information according to the target operation information, generate entity surgical marking information corresponding to the current field of view of the target object according to the surgical marking point and the real-time surgical operation information in combination with the preset entity space conversion model, and generate surgical teaching guidance information according to the entity surgical marking information. The entity surgical marking information includes image guidance marking information and voice guidance marking information of the surgical marking point in the current field of view of the target object, the latter being used to guide the target object to find the surgical marking point and perform the corresponding surgical operation on it. Guiding the operation of the target object through both image and voice guidance in its current field of view improves the convenience and efficiency with which teachers conduct surgical teaching, makes the surgical process clearer for students, and thereby improves the teaching effect of surgical teaching.
In another alternative embodiment, the wearable device includes smart glasses and a smart earphone, each worn by the target object; the target surgical teaching guidance information includes at least target teaching visual information and target teaching voice information.
Transmitting the target surgical teaching guidance information to the wearable device corresponding to the target object, so that the target object receives it through the wearable device and performs the target surgical operation, includes:
mapping the target teaching visual information to the smart glasses corresponding to the target object, so that the target object views the target teaching visual information through the smart glasses and performs the target surgical operation matched with it; and
transmitting the target teaching voice information to the smart earphone corresponding to the target object, so that the target object listens to the target teaching voice information through the smart earphone and performs the target surgical operation matched with it;
wherein the matching degree between the target teaching visual information and the target teaching voice information is greater than or equal to a preset matching threshold.
In this optional embodiment, optionally, the target teaching visual information includes target teaching image information and target teaching video information; further optionally, the target teaching video information may be pre-recorded surgical teaching video information, and the target teaching image information may be pre-recorded surgical teaching picture information. For example, the pre-recorded surgical teaching video information may be a video of the standard procedure of the current surgery pre-recorded by a teacher, and the pre-recorded surgical teaching picture information may be picture information of the standard procedure of the current surgery pre-recorded by a teacher.
In this optional embodiment, optionally, mapping the target teaching visual information to the smart glasses corresponding to the target object so that the target object views it through the smart glasses may include: transmitting the target teaching visual information to the smart glasses corresponding to the target object in a wireless communication mode, thereby mapping it to the smart glasses so that the target object can view it; the wireless communication mode includes one or more of Wi-Fi (IEEE 802.11), Mesh, Bluetooth, ZigBee, Thread, Z-Wave, NFC, UWB, and LiFi.
In this optional embodiment, optionally, transmitting the target teaching voice information to the smart earphone corresponding to the target object so that the target object listens to it through the smart earphone may include: transmitting the target teaching voice information to the smart earphone corresponding to the target object in a wireless communication mode, so that the target object can listen to it through the smart earphone; the wireless communication mode includes one or more of Wi-Fi (IEEE 802.11), Mesh, Bluetooth, ZigBee, Thread, Z-Wave, NFC, UWB, and LiFi.
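The dispatch step described above — visual information to the smart glasses and voice information to the smart earphone — can be sketched as follows. The device identifiers, payload format, and the `send` callback standing in for the actual wireless transport (e.g. Wi-Fi or Bluetooth) are all assumptions for illustration:

```python
def dispatch_teaching_info(visual_info, voice_info, send):
    """Route target surgical teaching guidance information to the
    wearer's devices: visual information is mapped to the smart glasses
    and voice information is transmitted to the smart earphone.
    `send(device_id, message)` stands in for the wireless transport."""
    send("smart_glasses", {"type": "visual", "payload": visual_info})
    send("smart_earphone", {"type": "voice", "payload": voice_info})

# Minimal in-memory transport for demonstration.
sent = []
dispatch_teaching_info("step-3 overlay", "step-3 narration",
                       lambda device, msg: sent.append((device, msg)))
```

A production system would additionally verify that the matching degree between the two payloads meets the preset matching threshold before dispatching, as required by this embodiment.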
In this optional embodiment, the matching degree between the target teaching visual information and the target teaching voice information being greater than or equal to the preset matching threshold means that the content expressed by the target teaching visual information is consistent with the content expressed by the target teaching voice information.
Therefore, implementing this optional embodiment can map the target teaching visual information to the smart glasses corresponding to the target object, so that the target object views it through the smart glasses and performs the matched target surgical operation, and can transmit the target teaching voice information to the smart earphone corresponding to the target object, so that the target object listens to it through the smart earphone and performs the matched target surgical operation. The target object can thus complete the learning of the surgical operation by receiving teaching information through the wearable devices (smart glasses and smart earphone) without a teacher present, which improves the convenience and efficiency with which the target object learns the surgical operation, improves the convenience and efficiency of surgical teaching by teachers, makes the surgical process clearer for students, and thereby improves the teaching effect of surgical teaching.
In yet another alternative embodiment, determining whether the target object meets the preset surgical teaching guidance condition according to the second surgical operation data includes:
Predicting a first stage surgical operation result of the target object according to the performed surgical operation data of the target object, wherein the stage surgical operation result comprises a stage surgical result of the target object after the performed surgical operation data is completed;
according to the executed operation data of the target object, determining a second-stage operation result matched with the executed operation data in a preset teaching-stage operation result set, wherein an operation stage corresponding to the second-stage operation result is matched with an operation stage corresponding to the first-stage operation result of the target object; each teaching stage operation result included in the preset teaching stage operation result set meets the preset operation teaching result condition;
Judging whether the first-stage operation result is matched with the second-stage operation result;
when the first-stage operation result is judged to be matched with the second-stage operation result, determining that the target object does not meet the preset operation teaching guide condition;
and when the first-stage operation result is not matched with the second-stage operation result, determining that the target object meets the preset operation teaching guide condition.
In this alternative embodiment, optionally, the first-stage surgical operation result includes the blood loss and bleeding site of the surgical object corresponding to the target object, together with human vital sign data such as blood pressure, heart rate, blood oxygen saturation, and body temperature.
In this optional embodiment, optionally, determining, from the performed surgical operation data of the target object, a second stage surgical operation result matching the performed surgical operation data in a preset teaching stage surgical result set, includes: according to the executed operation data, determining a target stage corresponding to the executed operation data, determining a target stage result corresponding to the target stage in a preset teaching stage operation result set, and determining the target stage result as a second stage operation result.
In this alternative embodiment, optionally, determining whether the first stage surgical operation result matches the second stage surgical operation result includes: judging whether the first-stage operation result is the same as the second-stage operation result, if so, matching the first-stage operation result with the second-stage operation result, and if not, not matching the first-stage operation result with the second-stage operation result.
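The matching judgment above can be sketched as a small predicate. Representing a stage surgical operation result as a plain dict of named indicators is an assumption for illustration; the embodiment itself only requires judging whether the two results are the same:

```python
def needs_teaching_guidance(first_stage_result, second_stage_result):
    """Return True when the target object meets the preset surgical
    teaching guidance condition, i.e. when the predicted first-stage
    surgical operation result does NOT match the reference second-stage
    result drawn from the preset teaching-stage surgical result set."""
    matched = first_stage_result == second_stage_result
    return not matched

# Reference result for the current surgical stage (illustrative values).
reference = {"blood_loss_ml": 100, "heart_rate": 75}
on_track = needs_teaching_guidance({"blood_loss_ml": 100, "heart_rate": 75}, reference)
off_track = needs_teaching_guidance({"blood_loss_ml": 350, "heart_rate": 75}, reference)
```

In practice one might compare each indicator within a clinical tolerance band rather than by strict equality, but that refinement is not specified in this embodiment.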
It can be seen that implementing this alternative embodiment can predict the first-stage surgical operation result of the target object according to its performed surgical operation data, determine from that data a matching second-stage surgical operation result in the preset teaching-stage surgical result set, and judge whether the two results match: if they match, the target object does not meet the preset surgical teaching guidance condition; if not, it does. Determining whether the target object needs surgical teaching guidance from its performed surgical operation data improves the accuracy and reliability of this judgment, which is beneficial to guiding the surgical operation of the target object in a targeted manner, improves the convenience and efficiency with which the target object learns the surgical operation, improves the convenience and efficiency of surgical teaching by teachers, makes the surgical process clearer for students, and thereby improves the teaching effect of surgical teaching.
In yet another alternative embodiment, determining second surgical operation data of the target object from the first surgical operation data includes:
Acquiring target operation information corresponding to a target object, wherein the target operation information corresponding to the target object comprises one or more of operation type information, operation duration information, operation object information and operation safety information of the target object;
Determining operation data to be executed of a target object according to first operation data of the target object and target operation information corresponding to the target object; the surgical operation data to be executed of the target object comprises one or more of surgical operation duration data to be executed of the target object and surgical operation tool data to be executed;
And generating second surgical operation data of the target object based on the target surgical information corresponding to the target object and the surgical operation data to be executed of the target object.
In this alternative embodiment, the surgery type information may optionally include surgical site type information, such as one of heart surgery, bone surgery, brain surgery, and stomach surgery; the surgery duration information includes the required duration of the surgery and the duration already performed; the surgery safety information includes blood loss information, bleeding site information, blood pressure information, heart rate information, blood oxygen saturation information, and body temperature information of the surgery.
In this optional embodiment, optionally, the surgical operation data to be performed of the target object may further include surgical operation type data to be performed of the target object, wherein the surgical operation type data to be performed includes one or more of a cutting type, a suturing type, and a resection type.
In this optional embodiment, optionally, generating second surgical operation data of the target object based on the target surgical information corresponding to the target object and the surgical operation data to be performed of the target object includes: and determining the target operation information corresponding to the target object and the operation data to be executed of the target object as second operation data of the target object.
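The combination described in this optional embodiment can be sketched as a trivial merge; the dict keys are illustrative assumptions:

```python
def build_second_operation_data(target_surgery_info, to_be_performed):
    """Per the optional embodiment above, the second surgical operation
    data is the combination of the target surgery information of the
    target object and its to-be-performed surgical operation data."""
    return {
        "target_surgery_info": target_surgery_info,
        "to_be_performed": to_be_performed,
    }

second = build_second_operation_data(
    {"surgery_type": "heart", "required_duration_s": 7200},
    {"duration_s": 600, "tools": ["scalpel", "suture kit"]},
)
```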
It can be seen that implementing this alternative embodiment can acquire the target surgery information corresponding to the target object, determine the to-be-performed surgical operation data of the target object according to its first surgical operation data and target surgery information, and generate the second surgical operation data of the target object based on both. Determining the second surgical operation data from the target surgery information and the first surgical operation data improves its accuracy and reliability, improves the convenience and efficiency with which the target object learns the surgical operation, improves the convenience and efficiency of surgical teaching by teachers, makes the surgical process clearer for students, and thereby improves the teaching effect of surgical teaching.
In yet another alternative embodiment, before generating the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data, the method further comprises:
Judging whether the wearable device corresponding to the target object receives request-for-help information from the target object, the request-for-help information including one or more of body posture information, gesture information, and voice information;
When the wearable equipment corresponding to the target object is judged to receive the request help information from the target object, the request help information is analyzed, and a request help result is obtained;
determining a first solution matching the requested help result based on the requested help result;
acquiring real-time environment information corresponding to a target object, determining a data transmission type matched with the target object according to the real-time environment information, and executing data type conversion operation on the first solution to obtain a second solution so as to enable the data type of the second solution to be matched with the data transmission type matched with the target object;
generating surgical teaching guidance information according to the first surgical operation data and the second surgical operation data, including:
and generating the surgical teaching guide information according to the first surgical operation data, the second surgical operation data and the second solution.
In this optional embodiment, optionally, for example, when the wearable device corresponding to the target object receives voice information from the target object such as "what should I do in the next step?", it is determined that the wearable device corresponding to the target object has received the request-for-help information of the target object.
In this optional embodiment, optionally, the real-time environment information includes one or more of gesture information of the target object, volume information of the current environment, and number of people information of the current environment.
In this alternative embodiment, the data transmission type may optionally include a video information transmission type and/or a voice information transmission type. For example, when the data transmission type matched with the target object is the video information transmission type, performing a data type conversion operation on the first solution to obtain a second solution with the data type being the video information transmission type; and when the data transmission type matched with the target object is the video information transmission type and the voice information transmission type, performing data type conversion operation on the first solution to respectively obtain a second solution with the data type being the video information transmission type and the data type being the voice information transmission type.
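Selecting the data transmission type from the real-time environment information and performing the data type conversion can be sketched as below. The thresholds and posture labels are illustrative assumptions; the embodiment only states that the choice is made from environment information such as the wearer's posture, ambient volume, and the number of people present:

```python
def select_transmission_types(ambient_volume_db, posture):
    """Choose the data transmission type(s) matched to the target object
    from its real-time environment.  Thresholds and posture labels are
    illustrative assumptions (crowd size, also named in the embodiment,
    is omitted here for brevity)."""
    types = []
    if ambient_volume_db < 70:            # quiet enough to hear voice guidance
        types.append("voice")
    if posture != "head_down_occupied":   # wearer can glance at the display
        types.append("visual")
    return types or ["visual"]            # always deliver at least one form

def convert_solution(first_solution, transmission_types):
    """Data type conversion operation: produce one second solution per
    matched transmission type, so its data type matches the transmission
    type matched with the target object."""
    return {t: {"type": t, "content": first_solution}
            for t in transmission_types}
```

For example, in a quiet room with the wearer standing, both a voice and a visual second solution would be produced; with high ambient noise and the wearer's head down over the surgical field, only a visual solution would be delivered.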
In this optional embodiment, optionally, when it is determined that the wearable device corresponding to the target object does not receive the request help information from the target object, the present process may be ended.
It can be seen that, before the surgical teaching guidance information is generated according to the first and second surgical operation data, implementing this optional embodiment can judge whether the wearable device corresponding to the target object has received request-for-help information from the target object. If so, the request-for-help information is analyzed to obtain a request-for-help result, a data transmission type matched with the target object is determined based on the acquired real-time environment information corresponding to the target object, a data type conversion operation is performed on the first solution to obtain a second solution whose data type matches that transmission type, and the surgical teaching guidance information is generated based on the first surgical operation data, the second surgical operation data, and the second solution. This improves the accuracy, reliability, and intelligence of the surgical teaching guidance information, improves the convenience and efficiency with which the target object learns the surgical operation, improves the convenience and efficiency of surgical teaching by teachers, makes the surgical process clearer for students, and thereby improves the teaching effect of surgical teaching.
Embodiment Two
Referring to fig. 2, fig. 2 is a flow chart of another wearable-device-based surgical teaching implementation method disclosed in an embodiment of the present invention. The wearable-device-based surgical teaching implementation method described in fig. 2 can be applied to a wearable-device-based surgical teaching implementation apparatus, to a cloud server or local server for wearable-device-based surgical teaching, or to the wearable device itself; the embodiment of the present invention is not limited in this respect. As shown in fig. 2, the wearable-device-based surgical teaching implementation method may include the following operations:
201. Acquire real-time surgical operation information of a target object.
202. Determine first surgical operation data of the target object based on the real-time surgical operation information.
203. Determine second surgical operation data of the target object according to the first surgical operation data.
204. Judge whether the target object meets a preset surgical teaching guidance condition according to the second surgical operation data.
205. When it is judged that the target object meets the preset surgical teaching guidance condition, generate surgical teaching guidance information according to the first surgical operation data and the second surgical operation data.
206. Determine target surgical teaching guidance information from the surgical teaching guidance information, and transmit the target surgical teaching guidance information to the wearable device corresponding to the target object, so that the target object receives the target surgical teaching guidance information through the wearable device and performs a target surgical operation.
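Purely as an illustrative sketch, the control flow of steps 201–206 can be written out in Python. Every helper below is a stand-in stub with assumed names and trivial logic; the real determinations are the subject of the patent, not shown here.

```python
# Illustrative stubs only; names and logic are assumptions, not the disclosure.
def determine_first_operation_data(info):                      # step 202
    return {"progress": info["progress"], "safety": info["safety"]}

def determine_second_operation_data(first):                    # step 203
    return {"performed": first["progress"], "to_perform": 1.0 - first["progress"]}

def meets_teaching_guidance_condition(second):                 # step 204
    # toy rule: guidance is needed while operations remain to be performed
    return second["to_perform"] > 0

def generate_guidance(first, second):                          # step 205
    return [f"visual guidance at {second['performed']:.0%}",
            f"voice guidance at {second['performed']:.0%}"]

def select_target_guidance(guidance):                          # step 206
    return guidance[0]

info = {"progress": 0.4, "safety": "normal"}                   # step 201 (stub input)
first = determine_first_operation_data(info)
second = determine_second_operation_data(first)
if meets_teaching_guidance_condition(second):
    target = select_target_guidance(generate_guidance(first, second))
print(target)  # visual guidance at 40%
```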
In the embodiment of the present invention, for detailed descriptions of step 201 to step 206, please refer to the related descriptions of step 101 to step 106 in the first embodiment; they are not repeated here.
207. Collect the surgical execution result of the target object.
In the embodiment of the present invention, the surgical execution result of the target object may be collected by the wearable device, that is, by the smart glasses and/or smart earphones worn by the target object.
208. Calculate a surgical result matching parameter between the surgical execution result and a preset surgical target result.
In the embodiment of the invention, the preset operation target result is an operation target teaching result.
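The patent does not specify how the matching parameter of step 208 is computed; one plausible, purely illustrative choice is the fraction of target-result items the execution result satisfies:

```python
# Assumed sketch: the matching parameter is the fraction of preset target
# teaching-result items that the actual execution result satisfies.
def result_matching_parameter(execution_result, target_result):
    matched = sum(1 for k, v in target_result.items()
                  if execution_result.get(k) == v)
    return matched / len(target_result)

target = {"bleeding_controlled": True, "suture_complete": True, "duration_ok": True}
execution = {"bleeding_controlled": True, "suture_complete": False, "duration_ok": True}
param = result_matching_parameter(execution, target)
print(round(param, 2))  # 0.67
```

A threshold on this parameter would then serve as the "preset surgical teaching condition" of step 209.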
209. Judge whether the surgical result matching parameter meets a preset surgical teaching condition.
210. When it is judged that the surgical result matching parameter does not meet the preset surgical teaching condition, analyze the target reason why the surgical result matching parameter does not meet the preset surgical teaching condition, and generate an optimized teaching scheme for the target object based on the target reason.
In the embodiment of the present invention, optionally, generating an optimized teaching scheme of the target object based on the target reason includes:
Determine a reason keyword in the target reason, determine, according to the reason keyword, a target optimization scheme matched with the reason keyword in a preset set of optimized teaching schemes, and determine the target optimization scheme as the optimized teaching scheme of the target object.
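The keyword lookup just described can be sketched as follows; the preset scheme set, its keys, and the scheme texts are illustrative assumptions, not content from the patent.

```python
# Assumed preset set of optimized teaching schemes, keyed by reason keyword.
PRESET_SCHEMES = {
    "suture": "add suturing drills with video playback review",
    "bleeding": "add hemostasis practice with step-by-step voice guidance",
    "duration": "add timed run-throughs of each surgical stage",
}

def optimization_scheme(target_reason):
    """Match a reason keyword in the target reason to a preset scheme."""
    for keyword, scheme in PRESET_SCHEMES.items():
        if keyword in target_reason:
            return scheme
    return None  # no keyword matched

print(optimization_scheme("suture spacing was uneven"))
```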
It can be seen that implementing the wearable-device-based surgical teaching implementation method described in fig. 2 can collect the surgical execution result of the target object, calculate the surgical result matching parameter between the surgical execution result and the preset surgical target result, and judge whether the surgical result matching parameter meets the preset surgical teaching condition; if not, the target reason why the surgical result matching parameter does not meet the preset surgical teaching condition is analyzed, and an optimized teaching scheme for the target object is generated based on the target reason. Determining the optimized teaching scheme according to the surgical execution result of the target object improves the intelligence, accuracy and reliability of surgical teaching, which is beneficial to optimizing the existing surgical teaching scheme, improving the accuracy, reliability and intelligence of the generated surgical teaching guidance information, improving the convenience and efficiency with which the target object learns surgical operations, improving the convenience and efficiency with which teachers conduct surgical teaching, and thereby improving the teaching effect of surgical teaching for students.
Embodiment Three
Referring to fig. 3, fig. 3 is a schematic structural diagram of an operation teaching implementation device based on a wearable device according to an embodiment of the present invention. As shown in fig. 3, the surgical teaching implementation device based on the wearable device may include:
The acquiring module 301 is configured to acquire real-time surgical operation information of a target object, where the target object is an object wearing a wearable device; the real-time operation information comprises real-time operation image information of the target object;
a determining module 302, configured to determine first surgical operation data of the target object based on the real-time surgical operation information; the first surgical operation data comprises one or more of surgical operation progress data of a target object, surgical operation safety data of the target object, surgical operation duration data of the target object and surgical operation action data of the target object;
The determining module 302 is further configured to determine second surgical operation data of the target object according to the first surgical operation data; the second surgical operation data includes the performed surgical operation data of the target object and the surgical operation data to be performed of the target object;
A judging module 303, configured to judge whether the target object meets a preset surgical teaching guiding condition according to the second surgical operation data;
A generating module 304, configured to generate, when the judging module 303 judges that the target object meets the preset surgical teaching guidance condition, surgical teaching guidance information according to the first surgical operation data and the second surgical operation data, where the surgical teaching guidance information includes teaching guidance visual information and teaching guidance voice information, and the teaching guidance visual information includes teaching guidance image information and teaching guidance video information;
The determining module 302 is further configured to determine target surgical teaching guidance information from the surgical teaching guidance information;
The transmission module 305 is configured to transmit the target surgical teaching guiding information to a wearable device corresponding to the target object, so that the target object receives the target surgical teaching guiding information through the wearable device and performs the target surgical operation.
As can be seen, implementing the apparatus described in fig. 3 can acquire real-time surgical operation information of a target object, where the target object is an object wearing the wearable device; determine first surgical operation data of the target object based on the real-time surgical operation information; further determine second surgical operation data of the target object; and judge whether the target object meets a preset surgical teaching guidance condition. If so, surgical teaching guidance information is generated according to the first surgical operation data and the second surgical operation data, target surgical teaching guidance information is determined from the surgical teaching guidance information and transmitted to the wearable device corresponding to the target object, and the target object receives the target surgical teaching guidance information through the wearable device and performs the target surgical operation. This can improve the convenience and efficiency with which teachers conduct surgical teaching and the clarity of the surgical procedure for students, thereby improving the teaching effect of surgical teaching.
In an optional embodiment, the specific manner in which the generating module 304 generates the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data includes:
Determining target operation information of a target object according to the first operation data and the second operation data, wherein the target operation information comprises one or more of target operation tool information, target operation area information and target operation category information;
Generating target mark information according to the target operation information, wherein the target mark information comprises operation mark information of an operation object corresponding to the target object, and the operation mark information is used for assisting the target object to complete corresponding operation on the operation object;
Generating entity operation marking information according to the operation marking point and real-time operation information by combining a preset entity space conversion model, wherein the entity operation marking information comprises image guiding marking information of the operation marking point in the current field of view of the target object and voice guiding marking information of the operation marking point in the current field of view of the target object, the voice guiding marking information comprises voice guiding information of the operation marking point in the current field of view of the target object, and the voice guiding marking information is used for guiding the target object to find the operation marking point and executing corresponding operation on the operation marking point;
and generating operation teaching guide information according to the entity operation marking information.
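The "preset physical space conversion model" above is not specified in the patent. As a toy stand-in only, the sketch below projects a mark point given in world coordinates into the wearer's current field-of-view pixel coordinates and emits the two kinds of guidance marking information; the linear model and every number are illustrative assumptions.

```python
# Toy stand-in for the preset physical-space conversion model: a linear map
# from 2-D world-space coordinates to view-space pixels. Illustrative only.
def to_field_of_view(mark_point, view_origin, pixels_per_unit=100):
    """Map a world-space mark point into the current field of view (pixels)."""
    return ((mark_point[0] - view_origin[0]) * pixels_per_unit,
            (mark_point[1] - view_origin[1]) * pixels_per_unit)

def entity_mark_info(mark_point, view_origin):
    """Produce image and voice guidance marking information for one mark point."""
    x, y = to_field_of_view(mark_point, view_origin)
    return {
        "image_guidance": f"overlay marker at pixel ({x:.0f}, {y:.0f})",
        "voice_guidance": "move toward the highlighted marker",
    }

print(entity_mark_info((0.35, 0.20), (0.30, 0.10)))
```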
As can be seen, implementing the apparatus described in fig. 3 can determine the target operation information of the target object according to the first operation data and the second operation data, generate the target mark information according to the target operation information, and combine the preset physical space conversion model according to the operation mark point and the real-time operation information to generate the physical operation mark information corresponding to the current field of view of the target object, and generate the operation teaching guide information according to the physical operation mark information, where the physical operation mark information includes the image guide mark information of the operation mark point in the current field of view of the target object and the voice guide mark information of the operation mark point in the current field of view of the target object, and the voice guide mark information includes the voice guide information of the operation mark point in the current field of view of the target object, and is used for guiding the target object to find the operation mark point and executing the corresponding operation on the operation mark point.
In another optional embodiment, the wearable device includes smart glasses and smart earphones, each worn on the target object; the target surgical teaching guidance information includes at least target teaching visual information and target teaching voice information;
the specific ways of transmitting the target surgical teaching guiding information to the wearable device corresponding to the target object by the transmission module 305 so that the target object receives the target surgical teaching guiding information through the wearable device and performs the target surgical operation include:
Mapping the target teaching visual information to the smart glasses corresponding to the target object, so that the target object can view the target teaching visual information through the smart glasses and perform the target surgical operation matched with the target teaching visual information; and
Transmitting the target teaching voice information to the smart earphone corresponding to the target object, so that the target object can listen to the target teaching voice information through the smart earphone and perform the target surgical operation matched with the target teaching voice information;
The matching degree between the target teaching visual information and the target teaching voice information is greater than or equal to a preset matching threshold.
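The split transmission and the matching-threshold check above can be sketched as follows; the matching-degree function (word overlap) and the threshold value are placeholder assumptions, since the patent does not define how the matching degree is computed.

```python
# Sketch: visual guidance goes to the smart glasses, voice guidance to the
# smart earphones, but only when the matching degree between the two streams
# reaches the preset threshold. Matching function and threshold are assumed.
MATCH_THRESHOLD = 0.8

def matching_degree(visual, voice):
    # placeholder: fraction of shared words between the two guidance texts
    a, b = set(visual.split()), set(voice.split())
    return len(a & b) / max(len(a | b), 1)

def dispatch(visual, voice, glasses, earphones):
    if matching_degree(visual, voice) < MATCH_THRESHOLD:
        return False          # streams disagree; do not transmit either one
    glasses.append(visual)    # map visual information to the smart glasses
    earphones.append(voice)   # transmit voice information to the earphones
    return True

glasses, earphones = [], []
ok = dispatch("clamp the vessel now", "clamp the vessel now", glasses, earphones)
print(ok, glasses, earphones)
```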
Therefore, implementing the apparatus described in fig. 3 can map the target teaching visual information to the smart glasses corresponding to the target object, so that the target object can view the target teaching visual information through the smart glasses and perform the target surgical operation matched with it, and can transmit the target teaching voice information to the smart earphone corresponding to the target object, so that the target object can listen to the target teaching voice information through the smart earphone and perform the target surgical operation matched with it. The target object can thus receive the teaching information and complete the learning of the surgical operation through the wearable device (the smart glasses and the smart earphone) without a teacher present, which can improve the convenience and efficiency with which the target object learns surgical operations, improve the convenience and efficiency with which teachers conduct surgical teaching, improve the clarity of the surgical procedure for students, and thereby improve the teaching effect of surgical teaching.
In yet another alternative embodiment, the specific manner of determining whether the target object meets the preset surgical teaching guidance condition by the determining module 303 according to the second surgical operation data includes:
Predicting a first-stage surgical operation result of the target object according to the performed surgical operation data of the target object, where the first-stage surgical operation result includes the stage surgical result of the target object after the performed surgical operations are completed;
according to the executed operation data of the target object, determining a second-stage operation result matched with the executed operation data in a preset teaching-stage operation result set, wherein an operation stage corresponding to the second-stage operation result is matched with an operation stage corresponding to the first-stage operation result of the target object; each teaching stage operation result included in the preset teaching stage operation result set meets the preset operation teaching result condition;
Judging whether the first-stage operation result is matched with the second-stage operation result;
when the first-stage operation result is judged to be matched with the second-stage operation result, determining that the target object does not meet the preset operation teaching guide condition;
and when the first-stage operation result is not matched with the second-stage operation result, determining that the target object meets the preset operation teaching guide condition.
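The guidance-condition test described above can be sketched in Python; the preset teaching-stage result set, the prediction stub, and the data shapes are all illustrative assumptions.

```python
# Sketch of the guidance condition: predict the trainee's first-stage result,
# look up the matching teaching-stage result, and require guidance when they
# disagree. All helpers and data are assumptions for illustration.
TEACHING_STAGE_RESULTS = {"incision": "clean incision", "suture": "even sutures"}

def predict_first_stage_result(performed):
    # stub: in practice this would be predicted from the performed operations
    return performed["observed_result"]

def needs_guidance(performed):
    first = predict_first_stage_result(performed)
    second = TEACHING_STAGE_RESULTS[performed["stage"]]
    return first != second  # mismatch -> preset guidance condition is met

print(needs_guidance({"stage": "suture", "observed_result": "uneven sutures"}))   # True
print(needs_guidance({"stage": "incision", "observed_result": "clean incision"})) # False
```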
As can be seen, implementing the apparatus described in fig. 3 can predict the first-stage surgical operation result of the target object according to the performed surgical operation data of the target object, determine the second-stage surgical operation result matched with the performed surgical operation data in the preset set of teaching-stage surgical operation results, and judge whether the first-stage surgical operation result matches the second-stage surgical operation result; if they match, it is determined that the target object does not meet the preset surgical teaching guidance condition, and if they do not match, it is determined that the target object meets the preset surgical teaching guidance condition. Judging, through the performed surgical operation data of the target object, whether the target object needs surgical teaching guidance can improve the accuracy and reliability of determining whether the target object meets the preset surgical teaching guidance condition, which is beneficial to performing targeted surgical teaching guidance on the target object, improving the convenience and efficiency with which the target object learns surgical operations, improving the convenience and efficiency with which teachers conduct surgical teaching, and improving the teaching effect of surgical teaching for students.
In yet another optional embodiment, the specific manner in which the determining module 302 determines the second surgical operation data of the target object according to the first surgical operation data includes:
Acquiring target operation information corresponding to a target object, wherein the target operation information corresponding to the target object comprises one or more of operation type information, operation duration information, operation object information and operation safety information of the target object;
Determining operation data to be executed of a target object according to first operation data of the target object and target operation information corresponding to the target object; the surgical operation data to be executed of the target object comprises one or more of surgical operation duration data to be executed of the target object and surgical operation tool data to be executed;
And generating second surgical operation data of the target object based on the target surgical information corresponding to the target object and the surgical operation data to be executed of the target object.
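As a hedged sketch of the assembly just described, the snippet below combines first operation data with target surgery information into second operation data; the field names and structures are assumptions, not from the patent.

```python
# Illustrative sketch: build second operation data (performed + to-be-performed)
# from first operation data and the target surgery information. All field
# names and values are assumptions for illustration.
def to_be_performed(first, surgery_info):
    remaining = surgery_info["planned_duration"] - first["elapsed"]
    return {"remaining_duration": max(remaining, 0),
            "tools": surgery_info["tools"][first["steps_done"]:]}

def second_operation_data(first, surgery_info):
    return {"performed": {"elapsed": first["elapsed"],
                          "steps_done": first["steps_done"]},
            "to_perform": to_be_performed(first, surgery_info)}

surgery_info = {"planned_duration": 90, "tools": ["scalpel", "clamp", "suture kit"]}
first = {"elapsed": 30, "steps_done": 1}
print(second_operation_data(first, surgery_info))
```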
It can be seen that implementing the apparatus described in fig. 3 can acquire the target operation information corresponding to the target object, determine the to-be-performed surgical operation data of the target object according to the first surgical operation data of the target object and the target operation information, and generate the second surgical operation data of the target object based on the target operation information and the to-be-performed surgical operation data. Determining the second surgical operation data based on the target operation information and the first surgical operation data can improve the accuracy and reliability of the second surgical operation data, improve the convenience and efficiency with which the target object learns surgical operations, improve the convenience and efficiency with which teachers conduct surgical teaching, improve the clarity of the surgical procedure for students, and thereby improve the teaching effect of surgical teaching.
In yet another optional embodiment, the judging module 303 is further configured to judge, before the generating module 304 generates the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data, whether the wearable device corresponding to the target object has received request help information from the target object, where the request help information includes one or more of body posture information, gesture information and voice information;
as shown in fig. 4, the apparatus further includes:
The analysis module 306 is configured to analyze the request help information to obtain a request help result when the determination module 303 determines that the wearable device corresponding to the target object receives the request help information from the target object;
A determining module 302, further configured to determine, based on the request help result, a first solution that matches the request help result;
The acquiring module 301 is further configured to acquire real-time environment information corresponding to the target object;
The determining module 302 is further configured to determine a data transmission type matched with the target object according to the real-time environmental information;
a conversion module 307, configured to perform a data type conversion operation on the first solution to obtain a second solution, so that a data type of the second solution matches a data transmission type matched with the target object;
The specific manner of generating the surgical teaching guidance information by the generating module 304 according to the first surgical operation data and the second surgical operation data includes:
and generating the surgical teaching guide information according to the first surgical operation data, the second surgical operation data and the second solution.
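The help-request branch handled by the modules above can be sketched end to end; the gesture/voice analysis, the noise-based rule for choosing a transmission type, and the conversion are all illustrative assumptions.

```python
# End-to-end sketch of the help-request branch: detect and analyze the request,
# pick transmission types from the real-time environment, then convert the
# first solution into matching second solutions. All rules are assumptions.
def analyze_help_request(request):
    # e.g. a spoken question or a recognized gesture yields a help topic
    return request.get("voice") or request.get("gesture")

def transmission_type(environment):
    # assumption: in a noisy theatre use video only, otherwise video and voice
    return ["video"] if environment["noise_db"] > 70 else ["video", "voice"]

def second_solution(first_solution, types):
    return {t: f"{t}:{first_solution}" for t in types}

request = {"voice": "how do I control the bleeding?"}
topic = analyze_help_request(request)
first = f"solution for '{topic}'"
types = transmission_type({"noise_db": 55})
print(second_solution(first, types))
```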
It can be seen that, before the apparatus described in fig. 4 generates the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data, it can judge whether the wearable device corresponding to the target object has received request help information from the target object; if so, the request help information is analyzed to obtain a request help result, a data transmission type matched with the target object is determined based on the acquired real-time environment information corresponding to the target object, a data type conversion operation is performed on the first solution to obtain a second solution whose data type matches the data transmission type matched with the target object, and the surgical teaching guidance information is generated based on the first surgical operation data, the second surgical operation data and the second solution. This can improve the accuracy, reliability and intelligence of the generated surgical teaching guidance information, improve the convenience and efficiency with which the target object learns surgical operations, improve the convenience and efficiency with which teachers conduct surgical teaching, improve the clarity of the surgical procedure for students, and thereby improve the teaching effect of surgical teaching.
In yet another alternative embodiment, as shown in fig. 4, the apparatus further comprises:
The acquisition module 308 is configured to acquire the surgical execution result of the target object after the transmission module 305 transmits the target surgical teaching guidance information to the wearable device corresponding to the target object, so that the target object performs, based on the target surgical teaching guidance information in the wearable device, the surgical operation matched with that information;
a calculating module 309, configured to calculate a surgical result matching parameter between a surgical execution result and a preset surgical target result; the preset operation target result is an operation target teaching result;
The judging module 303 is further configured to judge whether the surgical result matching parameter meets a preset surgical teaching condition;
the analysis module 306 is further configured to, when the judging module 303 judges that the operation result matching parameter does not meet the preset operation teaching condition, analyze a target reason that the operation result matching parameter does not meet the preset operation teaching condition;
the generating module 304 is further configured to generate an optimized teaching scheme of the target object based on the target reason.
It can be seen that implementing the apparatus described in fig. 4 can collect the surgical execution result of the target object, calculate the surgical result matching parameter between the surgical execution result and the preset surgical target result, and judge whether the surgical result matching parameter meets the preset surgical teaching condition; if not, the target reason why the surgical result matching parameter does not meet the preset surgical teaching condition is analyzed, and an optimized teaching scheme for the target object is generated based on the target reason. Determining the optimized teaching scheme according to the surgical execution result of the target object improves the intelligence, accuracy and reliability of surgical teaching, which is beneficial to optimizing the existing surgical teaching scheme, improving the accuracy, reliability and intelligence of the generated surgical teaching guidance information, improving the convenience and efficiency with which the target object learns surgical operations, improving the convenience and efficiency with which teachers conduct surgical teaching, improving the clarity of the surgical procedure for students, and thereby improving the teaching effect of surgical teaching.
Embodiment Four
Referring to fig. 5, fig. 5 is a schematic structural diagram of another surgical teaching implementation device based on a wearable device according to an embodiment of the present invention. As shown in fig. 5, the surgical teaching implementation device based on the wearable device may include:
a memory 401 storing executable program codes;
a processor 402 coupled with the memory 401;
The processor 402 invokes the executable program code stored in the memory 401 to perform the steps in the wearable device-based surgical instruction implementation method described in the first or second embodiment of the present invention.
Embodiment Five
The embodiment of the invention discloses a computer storage medium which stores computer instructions for executing the steps in the wearable device-based surgical teaching implementation method described in the first or second embodiment of the invention when the computer instructions are called.
Embodiment Six
An embodiment of the present invention discloses a computer program product, which includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform the steps in the wearable device-based surgical teaching implementation method described in the first or second embodiment.
The apparatus embodiments described above are merely illustrative, wherein the modules illustrated as separate components may or may not be physically separate, and the components shown as modules may or may not be physical, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
From the above detailed description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a necessary general hardware platform, or of course by means of hardware. Based on such understanding, the foregoing technical solutions may be embodied essentially or in part in the form of a software product that may be stored in a computer-readable storage medium, including Read-Only Memory (ROM), Random Access Memory (RAM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), One-Time Programmable Read-Only Memory (OTPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disc memory, magnetic disc memory, tape memory, or any other medium that can be used for computer-readable carrying or storing of data.
Finally, it should be noted that the wearable-device-based surgical teaching implementation method and apparatus disclosed in the embodiments of the present invention are disclosed only to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions recorded in the various embodiments can still be modified, or some of the technical features therein can be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.
Claims (5)
1. A method for implementing surgical teaching based on a wearable device, the method comprising:
Acquiring real-time operation information of a target object, wherein the target object is an object wearing the wearable device; the real-time surgical operation information comprises real-time operation image information of the target object;
Determining first surgical operation data of the target object based on the real-time surgical operation information; the first surgical operation data comprises one or more of surgical operation progress data of the target object, surgical operation safety data of the target object, surgical operation duration data of the target object and surgical operation action data of the target object;
determining second surgical operation data of the target object according to the first surgical operation data; the second surgical operation data comprises the performed surgical operation data of the target object and the surgical operation data to be performed of the target object;
judging whether the target object meets a preset operation teaching guide condition according to the second operation data;
When the target object is judged to meet the preset operation teaching guide condition, operation teaching guide information is generated according to the first operation data and the second operation data, the operation teaching guide information comprises teaching guide visual information and teaching guide voice information, and the teaching guide visual information comprises teaching guide image information and teaching guide video information;
Determining target operation teaching guide information from the operation teaching guide information, and transmitting the target operation teaching guide information to wearable equipment corresponding to the target object, so that the target object receives the target operation teaching guide information through the wearable equipment and executes target operation;
the generating the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data includes:
determining target operation information of the target object according to the first operation data and the second operation data, wherein the target operation information comprises one or more of target operation appliance information, target operation area information and target operation category information;
Generating target mark information according to the target operation information, wherein the target mark information comprises an operation marking point on the operation object corresponding to the target object, and the operation marking point is used for assisting the target object to complete the corresponding operation on the operation object;
Generating physical operation marking information according to the operation marking point and the real-time operation information by combining a preset physical space conversion model, wherein the physical operation marking information comprises image guiding marking information and voice guiding marking information of the operation marking point in the current field of view of the target object, and the voice guiding marking information is used for guiding the target object to find the operation marking point and execute the corresponding operation on the operation marking point;
generating the operation teaching guide information according to the physical operation marking information;
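The marker-generation step above maps target operation information (instrument, area, category) to a concrete marking point. A minimal lookup-table sketch, with an entirely invented table and key scheme (the patent does not specify how markers are derived):

```python
# Illustrative only: the marker table contents and keys are assumptions.
MARKER_TABLE = {
    ("incise", "abdomen"): {"point": (100, 60), "label": "incision point"},
    ("suture", "abdomen"): {"point": (120, 85), "label": "suture line start"},
}

def generate_marker_info(target_op):
    # Look up a marking point by (category, area); attach the instrument hint.
    key = (target_op["category"], target_op["area"])
    marker = MARKER_TABLE.get(key)
    if marker is None:
        return None
    return {"marking_point": marker["point"],
            "instruction": f"{marker['label']} with {target_op['instrument']}"}

info = generate_marker_info({"category": "incise", "area": "abdomen",
                             "instrument": "scalpel"})
```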
before generating the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data, the method further includes:
Judging whether wearable equipment corresponding to the target object receives request help information from the target object or not; the request help information comprises one or more of body posture information, gesture information and voice information;
When the wearable equipment corresponding to the target object is judged to receive the request help information from the target object, analyzing the request help information to obtain a request help result;
determining a first solution matching the request help result based on the request help result;
Acquiring real-time environment information corresponding to the target object, determining a data transmission type matched with the target object according to the real-time environment information, and executing a data type conversion operation on the first solution to obtain a second solution, so that the data type of the second solution matches the data transmission type matched with the target object;
the generating the surgical teaching guidance information according to the first surgical operation data and the second surgical operation data includes:
Generating surgical teaching guide information according to the first surgical operation data, the second surgical operation data and the second solution;
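The environment-driven conversion described above can be sketched as a small policy function plus a wrapper. The sensor keys (`noise_db`, `bandwidth_mbps`) and the thresholds are illustrative assumptions, not values disclosed by the patent:

```python
def choose_transmission_type(env):
    # In a noisy operating room prefer visual delivery; on low bandwidth
    # fall back to voice; otherwise full video is assumed acceptable.
    if env["noise_db"] > 70:
        return "visual"
    if env["bandwidth_mbps"] < 2:
        return "voice"
    return "video"

def convert_solution(solution_text, transmission_type):
    # Wrap the first solution in the payload type the environment supports,
    # yielding the "second solution" of the claim.
    return {"type": transmission_type, "payload": solution_text}

second = convert_solution(
    "clamp the vessel before cutting",
    choose_transmission_type({"noise_db": 80, "bandwidth_mbps": 10}))
```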
Wherein, the generating physical operation marking information according to the operation marking point and the real-time operation information by combining a preset physical space conversion model comprises:
merging the operation marking point into the real-time operation image information through the preset physical space conversion model, so as to generate physical operation marking information corresponding to the current field of view of the target object;
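One plausible reading of the space conversion model is projecting a 3D marking point into the wearer's current camera image. The sketch below assumes a pinhole camera model with invented intrinsics; the patent does not specify the conversion model, so treat every number here as a placeholder.

```python
# Hypothetical space-conversion sketch: pinhole projection of a marking
# point (in the camera frame) into pixel coordinates of the current view.

def project_point(point_3d, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    x, y, z = point_3d          # camera-frame coordinates, z > 0 (meters)
    u = fx * x / z + cx         # horizontal pixel coordinate
    v = fy * y / z + cy         # vertical pixel coordinate
    return (round(u), round(v))

def merge_marker(image_size, point_3d):
    # Merge the marking point into the real-time image by computing its
    # pixel location and whether it lies inside the current field of view.
    u, v = project_point(point_3d)
    w, h = image_size
    in_view = 0 <= u < w and 0 <= v < h
    return {"pixel": (u, v), "in_current_view": in_view}

mark = merge_marker((640, 480), (0.1, -0.05, 1.0))
```

When the point falls outside the frame, the voice guiding channel of the claim would take over and direct the wearer to look toward it.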
The step of judging whether the target object meets the preset operation teaching guide condition according to the second operation data comprises the following steps:
Predicting a first stage surgical operation result of the target object according to the executed surgical operation data of the target object, wherein the first stage surgical operation result comprises a stage surgical result of the target object after the executed surgical operation data is completed;
Determining a second-stage operation result matched with the executed operation data in a preset teaching-stage operation result set according to the executed operation data of the target object, wherein an operation stage corresponding to the second-stage operation result is matched with an operation stage corresponding to the first-stage operation result of the target object; each teaching stage operation result included in the preset teaching stage operation result set meets the preset operation teaching result condition;
judging whether the first-stage operation result is matched with the second-stage operation result;
When the first-stage operation result is judged to be matched with the second-stage operation result, determining that the target object does not meet the preset operation teaching guide condition;
When the first-stage operation result is not matched with the second-stage operation result, determining that the target object meets the preset operation teaching guide condition;
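The condition check above compares a predicted first-stage result against a preset teaching-stage reference. A toy sketch, where "predicting" a stage result is reduced to collecting completed operation names and the reference set is invented:

```python
# Illustrative only: the predictor and reference results are assumptions.

def predict_stage_result(performed_ops):
    # Toy predictor: the stage result is the set of completed op names.
    return {"stage": len(performed_ops), "ops": set(performed_ops)}

REFERENCE_RESULTS = {  # preset teaching-stage surgical results (invented)
    2: {"stage": 2, "ops": {"incise", "clamp"}},
}

def needs_guidance(performed_ops):
    predicted = predict_stage_result(performed_ops)
    reference = REFERENCE_RESULTS.get(predicted["stage"])
    # Guidance is triggered only when the predicted first-stage result
    # deviates from the matching teaching-stage reference result.
    return reference is not None and predicted["ops"] != reference["ops"]
```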
the determining second surgical operation data of the target object according to the first surgical operation data includes:
Acquiring target operation information corresponding to the target object, wherein the target operation information corresponding to the target object comprises one or more of operation type information, operation duration information, operation object information and operation safety information of the target object;
determining operation data to be executed of the target object according to the first operation data of the target object and the target operation information corresponding to the target object; the surgical operation data to be executed of the target object comprises one or more of surgical operation duration data to be executed of the target object and surgical operation tool data to be executed;
Generating second surgical operation data of the target object based on the target surgical information corresponding to the target object and the surgical operation data to be executed of the target object;
wherein the determining, according to the performed operation data of the target object, a second stage operation result matched with the performed operation data in a preset teaching stage operation result set includes:
determining a target stage corresponding to the executed operation data according to the executed operation data, determining a target stage result corresponding to the target stage in a preset teaching stage operation result set, and determining the target stage result as a second stage operation result;
After the transmitting of the target teaching information to the wearable device corresponding to the target object, so that the target object performs the surgical execution operation matched with the target teaching information based on the target teaching information in the wearable device, the method further comprises:
Collecting the operation execution result of the target object;
Calculating a surgical result matching parameter between the surgical execution result and a preset surgical target result; the preset operation target result is an operation target teaching result;
Judging whether the surgical result matching parameters meet preset surgical teaching conditions or not;
When the operation result matching parameters are judged to not meet the preset operation teaching conditions, analyzing target reasons that the operation result matching parameters do not meet the preset operation teaching conditions, and generating an optimized teaching scheme of the target object based on the target reasons;
the generating the optimized teaching scheme of the target object based on the target reason comprises the following steps:
And determining a reason keyword in the target reason, determining a target optimization scheme matched with the reason keyword in a preset optimization teaching scheme set according to the reason keyword, and determining the target optimization scheme as an optimization teaching scheme of the target object.
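The evaluation loop above (match parameter, threshold check, cause-keyword lookup) can be sketched compactly. The matching parameter here is simply the fraction of target criteria satisfied, and the scheme set is invented; neither is disclosed by the patent.

```python
def result_match_parameter(result, target):
    # Fraction of target criteria the surgical execution result satisfies.
    met = sum(1 for k, v in target.items() if result.get(k) == v)
    return met / len(target)

SCHEME_SET = {  # preset optimization teaching schemes (invented examples)
    "bleeding": "replay hemostasis module",
    "duration": "practice timed drills",
}

def optimize(result, target, cause, threshold=0.8):
    if result_match_parameter(result, target) >= threshold:
        return None  # teaching condition met, no optimization needed
    # Pick the scheme whose keyword appears in the analyzed cause text.
    for keyword, scheme in SCHEME_SET.items():
        if keyword in cause:
            return scheme
    return "general review"

plan = optimize({"sutured": True, "hemostasis": False},
                {"sutured": True, "hemostasis": True},
                "excessive bleeding during closure")
```

Here only one of two criteria is met (0.5 < 0.8), and the cause text contains the keyword "bleeding", so the hemostasis scheme would be selected.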
2. The surgical teaching implementation method based on a wearable device according to claim 1, wherein the wearable device comprises smart glasses and smart earphones, and the smart glasses and the smart earphones are worn on the target object; the target surgical teaching guide information at least comprises target teaching visual information and target teaching voice information;
The transmitting the target surgical teaching guiding information to the wearable device corresponding to the target object, so that the target object receives the target surgical teaching guiding information through the wearable device and performs a target surgical operation, including:
Mapping the target teaching visual information to the smart glasses corresponding to the target object, so that the target object views the target teaching visual information through the smart glasses and executes the target operation matched with the target teaching visual information; and
Transmitting the target teaching voice information to the smart earphones corresponding to the target object, so that the target object listens to the target teaching voice information through the smart earphones and executes the target operation matched with the target teaching voice information;
Wherein the matching degree between the target teaching visual information and the target teaching voice information is greater than or equal to a preset matching threshold.
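The two-channel dispatch with a consistency threshold can be sketched as follows. The matching-degree metric (word overlap between the two channels' scripts) and the threshold are illustrative assumptions only:

```python
def matching_degree(visual, voice):
    # Toy alignment score: Jaccard overlap of words in the two scripts.
    a, b = set(visual.split()), set(voice.split())
    return len(a & b) / max(len(a | b), 1)

def dispatch(visual, voice, threshold=0.3):
    # Send visual info to the glasses and voice info to the earphones
    # only when the two channels are sufficiently consistent.
    if matching_degree(visual, voice) < threshold:
        raise ValueError("visual/voice guidance are inconsistent")
    return {"smart_glasses": visual, "smart_earphones": voice}

out = dispatch("clamp the left vessel now", "clamp the left vessel now")
```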
3. A surgical teaching implementation device based on a wearable device, characterized in that the device comprises:
the acquisition module is used for acquiring real-time operation information of a target object, wherein the target object is an object wearing the wearable equipment; the real-time surgical operation information comprises real-time operation image information of the target object;
A determining module for determining first surgical operation data of the target object based on the real-time surgical operation information; the first surgical operation data comprises one or more of surgical operation progress data of the target object, surgical operation safety data of the target object, surgical operation duration data of the target object and surgical operation action data of the target object;
The determining module is further configured to determine second surgical operation data of the target object according to the first surgical operation data; the second surgical operation data comprises the performed surgical operation data of the target object and the surgical operation data to be performed of the target object;
The judging module is used for judging whether the target object meets the preset operation teaching guide condition according to the second operation data;
The generation module is used for generating surgical teaching guide information according to the first surgical operation data and the second surgical operation data when the judgment module judges that the target object meets the preset surgical teaching guide condition, wherein the surgical teaching guide information comprises teaching guide visual information and teaching guide voice information, and the teaching guide visual information comprises teaching guide image information and teaching guide video information;
the determining module is further used for determining target operation teaching guide information from the operation teaching guide information;
The transmission module is used for transmitting the target surgical teaching guide information to wearable equipment corresponding to the target object so that the target object receives the target surgical teaching guide information through the wearable equipment and performs target surgical operation;
the specific mode of generating the operation teaching guide information by the generating module according to the first operation data and the second operation data comprises the following steps:
determining target operation information of the target object according to the first operation data and the second operation data, wherein the target operation information comprises one or more of target operation appliance information, target operation area information and target operation category information;
Generating target mark information according to the target operation information, wherein the target mark information comprises an operation marking point on the operation object corresponding to the target object, and the operation marking point is used for assisting the target object to complete the corresponding operation on the operation object;
Generating physical operation marking information according to the operation marking point and the real-time operation information by combining a preset physical space conversion model, wherein the physical operation marking information comprises image guiding marking information and voice guiding marking information of the operation marking point in the current field of view of the target object, and the voice guiding marking information is used for guiding the target object to find the operation marking point and execute the corresponding operation on the operation marking point;
generating the operation teaching guide information according to the physical operation marking information;
The specific mode of the generating module generating the physical operation marking information according to the operation marking point and the real-time operation information by combining a preset physical space conversion model comprises:
merging the operation marking point into the real-time operation image information through the preset physical space conversion model, so as to generate physical operation marking information corresponding to the current field of view of the target object;
The judging module is further configured to judge whether the wearable device corresponding to the target object receives the request help information from the target object before the generating module generates the surgical teaching guide information according to the first surgical operation data and the second surgical operation data; the request help information comprises one or more of body posture information, gesture information and voice information;
the apparatus further comprises:
the analysis module is used for analyzing the request help information to obtain a request help result when the judgment module judges that the wearable equipment corresponding to the target object receives the request help information from the target object;
The determining module is further configured to determine, based on the request help result, a first solution that matches the request help result;
the acquisition module is further used for acquiring real-time environment information corresponding to the target object;
The determining module is further used for determining a data transmission type matched with the target object according to the real-time environment information;
The conversion module is used for executing data type conversion operation on the first solution to obtain a second solution so as to enable the data type of the second solution to be matched with the data transmission type matched with the target object;
the specific mode of generating the operation teaching guide information by the generating module according to the first operation data and the second operation data comprises the following steps:
Generating surgical teaching guide information according to the first surgical operation data, the second surgical operation data and the second solution;
The specific mode of judging whether the target object meets the preset operation teaching guide condition according to the second operation data by the judging module comprises the following steps:
Predicting a first stage surgical operation result of the target object according to the executed surgical operation data of the target object, wherein the first stage surgical operation result comprises a stage surgical result of the target object after the executed surgical operation data is completed;
Determining a second-stage operation result matched with the executed operation data in a preset teaching-stage operation result set according to the executed operation data of the target object, wherein an operation stage corresponding to the second-stage operation result is matched with an operation stage corresponding to the first-stage operation result of the target object; each teaching stage operation result included in the preset teaching stage operation result set meets the preset operation teaching result condition;
judging whether the first-stage operation result is matched with the second-stage operation result;
When the first-stage operation result is judged to be matched with the second-stage operation result, determining that the target object does not meet the preset operation teaching guide condition;
When the first-stage operation result is not matched with the second-stage operation result, determining that the target object meets the preset operation teaching guide condition;
the specific mode of the determining module determining the second surgical operation data of the target object according to the first surgical operation data comprises:
Acquiring target operation information corresponding to the target object, wherein the target operation information corresponding to the target object comprises one or more of operation type information, operation duration information, operation object information and operation safety information of the target object;
determining operation data to be executed of the target object according to the first operation data of the target object and the target operation information corresponding to the target object; the surgical operation data to be executed of the target object comprises one or more of surgical operation duration data to be executed of the target object and surgical operation tool data to be executed;
Generating second surgical operation data of the target object based on the target surgical information corresponding to the target object and the surgical operation data to be executed of the target object;
The specific manner of determining the second stage surgical operation result matched with the executed surgical operation data in the preset teaching stage surgical result set by the judging module according to the executed surgical operation data of the target object includes:
determining a target stage corresponding to the executed operation data according to the executed operation data, determining a target stage result corresponding to the target stage in a preset teaching stage operation result set, and determining the target stage result as a second stage operation result;
The acquisition module is further used for acquiring the surgical execution result of the target object after the transmission module transmits the target teaching information to the wearable device corresponding to the target object and the target object performs the surgical execution operation matched with the target teaching information in the wearable device;
the calculation module is used for calculating a surgical result matching parameter between the surgical execution result and a preset surgical target result; the preset operation target result is an operation target teaching result;
The judging module is also used for judging whether the operation result matching parameters meet preset operation teaching conditions or not;
The analysis module is further used for analyzing a target reason that the operation result matching parameter does not meet the preset operation teaching condition when the judgment module judges that the operation result matching parameter does not meet the preset operation teaching condition;
the generating module is further used for generating an optimized teaching scheme of the target object based on the target reason;
the specific mode of generating the optimized teaching scheme of the target object based on the target reason by the generating module comprises the following steps:
And determining a reason keyword in the target reason, determining a target optimization scheme matched with the reason keyword in a preset optimization teaching scheme set according to the reason keyword, and determining the target optimization scheme as an optimization teaching scheme of the target object.
4. A surgical teaching implementation device based on a wearable device, characterized in that the device comprises:
A memory storing executable program code;
A processor coupled to the memory;
the processor invokes the executable program code stored in the memory to perform the wearable device-based surgical teaching implementation method of any one of claims 1-2.
5. A computer storage medium storing computer instructions which, when invoked, perform the wearable device-based surgical teaching implementation method of any one of claims 1-2.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310273331.1A CN116343538B (en) | 2023-03-20 | 2023-03-20 | Method and device for surgical teaching based on wearable devices |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116343538A CN116343538A (en) | 2023-06-27 |
CN116343538B true CN116343538B (en) | 2024-11-12 |
Family
ID=86887189
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310273331.1A Active CN116343538B (en) | 2023-03-20 | 2023-03-20 | Method and device for surgical teaching based on wearable devices |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116343538B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20210067010A (en) * | 2019-11-28 | 2021-06-08 | (주)다울디엔에스 | Unity Plug-in for Image and Voice Recognition and Patient Information Processing for Surgery Using AR Glass and Deep Learning |
CN113768619A (en) * | 2020-06-10 | 2021-12-10 | 长庚大学 | Path positioning method, information display device, storage medium and integrated circuit chip |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10806518B2 (en) * | 2016-04-27 | 2020-10-20 | Arthrology Consulting, Llc | Methods for augmenting a surgical field with virtual guidance and tracking and adapting to deviation from a surgical plan |
CN109801193B (en) * | 2017-11-17 | 2020-09-15 | 深圳市鹰硕教育服务股份有限公司 | Follow-up teaching system with voice evaluation function |
EP3810013A1 (en) * | 2018-06-19 | 2021-04-28 | Tornier, Inc. | Neural network for recommendation of shoulder surgery type |
CN113133814A (en) * | 2021-04-01 | 2021-07-20 | 上海复拓知达医疗科技有限公司 | Augmented reality-based puncture surgery navigation device and computer-readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111275207B (en) | Semi-supervision-based transverse federal learning optimization method, equipment and storage medium | |
JP6816925B2 (en) | Data processing method and equipment for childcare robots | |
CN110931103A (en) | Control method and system of rehabilitation equipment | |
CN103559812B (en) | A kind of educational supervision's appraisal report generation system | |
CN111027486A (en) | Auxiliary analysis and evaluation system and method for big data of teaching effect of primary and secondary school classroom | |
CN111080593A (en) | Image processing device, method and storage medium | |
BR112021005417A2 (en) | system and method for enhancing user interaction by monitoring users' emotional state and reinforcing goal states | |
JP2022510350A (en) | Interactive health assessment method and its system | |
WO2023035980A1 (en) | Storage medium, robotic system, and computer device | |
CN113703574A (en) | VR medical learning method and system based on 5G | |
CN117133402A (en) | Method, device, equipment and readable storage medium for dynamically supervising patient rehabilitation | |
CN117158938A (en) | A health monitoring method, device and electronic equipment applied to smart watches | |
CN116343538B (en) | Method and device for surgical teaching based on wearable devices | |
KR102550839B1 (en) | Electronic apparatus for utilizing avatar matched to user's problem-solving ability, and learning management method | |
CN107811606A (en) | Intellectual vision measurer based on wireless sensor network | |
CN108985667A (en) | Home education auxiliary robot and home education auxiliary system | |
CN108814584A (en) | Electrocardiograph signal detection method, terminal and computer readable storage medium | |
CN117877741A (en) | Multimodal surgical large model system, interaction method, electronic device and storage medium | |
CN118016238A (en) | Assessment and training methods, systems, equipment and media for limb rehabilitation exercises | |
Park et al. | Differences in expectations of passing standards in communication skills for pre-clinical and clinical medical students | |
Penteado et al. | Detecting behavioral trajectories in continued education online courses | |
CN209447349U (en) | Wearable device and smart classroom system | |
CN112422901A (en) | Method and device for generating operation virtual reality video | |
CN113380399B (en) | Automatic generation method and system for oral examination record | |
CN110727883A (en) | Method and system for analyzing personalized growth map of child |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||