CN114694442B - Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment - Google Patents
Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment
- Publication number: CN114694442B
- Application number: CN202011639679.0A
- Authority
- CN
- China
- Prior art keywords
- virtual
- ultrasonic
- dimensional
- probe
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
Abstract
The invention discloses a virtual reality-based ultrasonic training method and device, a storage medium and ultrasonic equipment. The method comprises the following steps: according to an examination site selected by a user, randomly loading from a model library a virtual three-dimensional training model that at least includes the examination site, together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model comprises a target ultrasonic image; displaying a guide path on the surface of the virtual three-dimensional training model according to real-time pose information of the virtual ultrasonic probe and target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image; and controlling, through a simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image. The invention achieves a training effect close to that of hands-on operation on a real patient, with low cost, a strong sense of realism and good teaching results.
Description
Technical Field
The invention relates to the technical field of medical imaging, and in particular to a virtual reality-based ultrasonic training method and device, a storage medium and ultrasonic equipment.
Background
Ultrasonic diagnosis is a diagnostic method that applies ultrasonic detection technology to the human body: by measuring specific parameters, it obtains data on human physiology or tissue structure, discovers diseases and provides diagnostic cues. Ultrasound diagnosis is highly "operator dependent"; that is, the operator must have received specialized training in ultrasound operation and ultrasound imaging to obtain accurate examination results. Sound ultrasound training is therefore the basis for the clinical application of ultrasonic diagnostic techniques.
Current ultrasound training courses are divided into two parts: classroom theory and clinical teaching. On the one hand, theoretical explanation often differs considerably from actual operation, so learners cannot intuitively grasp the key points of the operating technique; on the other hand, clinical teaching is limited by the availability of patients and operating environments, cannot be carried out at scale, and students cannot directly perform ultrasound operations on patients, so it is difficult for conventional clinical teaching to expose students to the ultrasonic manifestations of typical symptoms. These shortcomings make the effect of medical ultrasound training unsatisfactory, and students consequently fail to master clinical ultrasound skills well.
Disclosure of Invention
In view of the above, embodiments of the present invention provide a virtual reality-based ultrasonic training method and device, a storage medium and ultrasonic equipment, so as to solve the technical problem that conventional ultrasonic training does not enable students to master clinical ultrasound skills well.
The technical scheme provided by the invention is as follows:
the first aspect of the embodiment of the invention provides an ultrasonic training method based on virtual reality, which comprises the following steps:
Randomly loading, from a model library and according to an examination site selected by a user, a virtual three-dimensional training model that at least includes the examination site, together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model comprises a target ultrasonic image;
displaying a guide path on the surface of the virtual three-dimensional training model according to the real-time pose information of the virtual ultrasonic probe and the target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image;
And controlling, through a simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image.
Further, the method further comprises the following steps:
Displaying a virtual ultrasound image obtained by the virtual ultrasound probe in front of one eye of the user;
the target ultrasound image is displayed in front of the other eye of the user.
Further, the method further comprises the following steps:
Hiding the guide path in response to a control instruction of the user;
Calculating the examination time taken by the user, through the simulation probe, to control the virtual ultrasonic probe to obtain the target ultrasonic image once;
and when the examination time is less than a preset time, judging that the user has passed the training.
Further, the virtual three-dimensional training model is obtained by the following method:
Controlling an ultrasonic probe to carry out ultrasonic scanning on a real examination part along a preset direction, and obtaining an ultrasonic image corresponding to each section of the real examination part;
Acquiring pose information of an ultrasonic probe corresponding to each ultrasonic image;
and inputting the ultrasonic image corresponding to each section and the pose information of the ultrasonic probe corresponding to each ultrasonic image into a trained three-dimensional reconstruction network model to obtain the virtual three-dimensional training model.
Further, the three-dimensional reconstruction network model is obtained through training by the following method:
inputting ultrasonic images of a plurality of real examination objects into a first convolutional neural network for feature extraction to obtain image features;
inputting pose information of the ultrasonic probe corresponding to the plurality of ultrasonic images into a second convolutional neural network for feature extraction to obtain pose features;
fusing the image features corresponding to the ultrasonic images of the plurality of real examination objects with the pose features of the ultrasonic probe corresponding to the plurality of ultrasonic images to obtain fusion features;
and generating the three-dimensional reconstruction network model according to the fusion features of the ultrasonic images of the plurality of real examination objects and the corresponding ultrasonic probe poses.
Further, the virtual three-dimensional training model is obtained by the following method:
capturing real scene information at least comprising a real examination part to generate a real three-dimensional model;
obtaining a virtual three-dimensional ultrasonic model of a real examination part;
And fusing the virtual three-dimensional ultrasonic model and the real three-dimensional model to obtain a virtual three-dimensional training model.
Further, controlling, through the simulation probe, the virtual ultrasonic probe to move along the guide path to examine the virtual three-dimensional training model so as to obtain the target ultrasonic image specifically comprises the following steps:
Acquiring pose information of the simulation probe through one or more of a magnetic sensor, an IMU or a camera;
generating a pose adjusting instruction according to pose information of the simulation probe;
and synchronously controlling the virtual ultrasonic probe in response to the pose adjustment instruction so as to obtain the target ultrasonic image.
A second aspect of an embodiment of the present invention provides a virtual reality-based ultrasonic training system, comprising: a VR device, which randomly loads a virtual three-dimensional training model that at least includes the selected examination site, together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model comprises a target ultrasonic image; a path guiding module, which displays a guide path on the surface of the virtual three-dimensional training model according to real-time pose information of the virtual ultrasonic probe and target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image; and a control module, which controls, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image.
A third aspect of the embodiments of the present invention provides a computer-readable storage medium storing computer instructions for causing the computer to perform the virtual reality based ultrasound training method according to any one of the first aspect and the first aspect of the embodiments of the present invention.
A fourth aspect of an embodiment of the present invention provides an ultrasound apparatus, including: the ultrasonic training system comprises a memory and a processor, wherein the memory and the processor are in communication connection, the memory stores computer instructions, and the processor executes the computer instructions so as to execute the ultrasonic training method based on virtual reality according to any one of the first aspect and the first aspect of the embodiment of the invention.
The technical scheme provided by the invention has the following effects:
According to the virtual reality-based ultrasonic training method and device, storage medium and ultrasonic equipment provided by the embodiments of the invention, a virtual three-dimensional training model is generated from ultrasound examination cases in real scenes through virtual reality technology, so that inexperienced users can train repeatedly under the guidance of the guide path to obtain the target ultrasonic image. The invention achieves a training effect close to that of hands-on operation, with low cost, a strong sense of realism and good teaching results.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a virtual reality based ultrasound training method according to an embodiment of the invention;
FIG. 2 is a block diagram of a virtual reality based ultrasound training system according to an embodiment of the invention;
FIG. 3 is a schematic diagram of a computer-readable storage medium provided according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an ultrasonic apparatus according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The first aspect of the embodiment of the invention provides an ultrasonic training method based on virtual reality, which comprises the following steps:
S100, randomly loading, from a model library and according to an examination site selected by a user, a virtual three-dimensional training model that at least includes the examination site, together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model comprises a target ultrasonic image;
in one embodiment the virtual three-dimensional training model is obtained by the following method:
Controlling an ultrasonic probe to carry out ultrasonic scanning on a real examination part along a preset direction, and obtaining an ultrasonic image corresponding to each section of the real examination part;
Acquiring pose information of an ultrasonic probe corresponding to each ultrasonic image;
and inputting the ultrasonic image corresponding to each section and the pose information of the ultrasonic probe corresponding to each ultrasonic image into a trained three-dimensional reconstruction network model to obtain the virtual three-dimensional training model.
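Purely for illustration (this sketch is not part of the claimed invention), the idea behind assembling posed 2D slices into a three-dimensional model can be shown with a much simpler, non-learned scheme: each slice is written into a voxel grid at the position given by its probe pose. The pose is reduced here to a single integer offset along the scan direction, which is an assumption for brevity.

```python
import numpy as np

def reconstruct_volume(slices, poses, volume_shape=(64, 64, 64)):
    """Place each 2D ultrasound slice into a 3D voxel grid using the
    probe pose recorded for that slice. Hypothetical simplification:
    pose = integer z-offset of the slice along the scan direction."""
    volume = np.zeros(volume_shape, dtype=np.float32)
    for img, z in zip(slices, poses):
        if 0 <= z < volume_shape[2]:
            # copy the slice into the plane selected by its pose
            volume[:img.shape[0], :img.shape[1], z] = img
    return volume
```

A trained reconstruction network, as described above, would replace this direct voxel placement with learned interpolation between slices and full 6-DoF poses.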
Further, the three-dimensional reconstruction network model is obtained through training by the following method:
inputting ultrasonic images of a plurality of real examination objects into a first convolutional neural network for feature extraction to obtain image features;
inputting pose information of the ultrasonic probe corresponding to the plurality of ultrasonic images into a second convolutional neural network for feature extraction to obtain pose features;
fusing the image features corresponding to the ultrasonic images of the plurality of real examination objects with the pose features of the ultrasonic probe corresponding to the plurality of ultrasonic images to obtain fusion features;
and generating the three-dimensional reconstruction network model according to the fusion features of the ultrasonic images of the plurality of real examination objects and the corresponding ultrasonic probe poses.
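The two-branch structure above (one feature extractor for images, one for probe poses, followed by feature fusion) can be sketched as follows. This is only an illustration of the wiring, not the patented network: each convolutional branch is stood in for by a single linear layer with ReLU, and the feature dimensions (256-element image vectors, 6-DoF poses) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(x, weights):
    # stand-in for a convolutional branch: one linear layer + ReLU
    return np.maximum(x @ weights, 0.0)

# hypothetical dimensions: 256-pixel image vectors, 6-DoF probe poses
w_img = rng.standard_normal((256, 32))   # first branch (image features)
w_pose = rng.standard_normal((6, 8))     # second branch (pose features)

def fuse(image_batch, pose_batch):
    img_feats = extract_features(image_batch, w_img)
    pose_feats = extract_features(pose_batch, w_pose)
    # fusion by concatenation along the feature axis
    return np.concatenate([img_feats, pose_feats], axis=1)

fused = fuse(rng.standard_normal((4, 256)), rng.standard_normal((4, 6)))
print(fused.shape)  # (4, 40)
```

The fused features would then feed the reconstruction head that produces the virtual three-dimensional training model.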
In another embodiment, the virtual three-dimensional training model is obtained by:
capturing real scene information at least comprising a real examination part to generate a real three-dimensional model;
obtaining a virtual three-dimensional ultrasonic model of a real examination part;
And fusing the virtual three-dimensional ultrasonic model and the real three-dimensional model to obtain a virtual three-dimensional training model.
S120, displaying a guide path on the surface of the virtual three-dimensional training model according to the real-time pose information of the virtual ultrasonic probe and the target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image;
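As a minimal sketch of step S120 (an illustration, not the claimed implementation), the guide path between the probe's current position and the target position can be generated by interpolating waypoints; a real system would additionally project the waypoints onto the surface of the training model.

```python
import numpy as np

def guide_path(current_pos, target_pos, n_points=10):
    """Linearly interpolated waypoints between the virtual probe's
    current position and the target position (simplified to a straight
    line; projection onto the body surface is omitted)."""
    t = np.linspace(0.0, 1.0, n_points)[:, None]
    return (1 - t) * np.asarray(current_pos, float) + t * np.asarray(target_pos, float)
```

The path is recomputed from the probe's real-time pose, so it shortens as the user approaches the target pose.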
and S130, controlling the virtual ultrasonic probe to move along the guide path through the simulation probe to check the virtual three-dimensional training model so as to obtain the target ultrasonic image. The method specifically comprises the following steps:
S131, acquiring pose information of the simulation probe through one or more of a magnetic sensor, an IMU or a camera. In an embodiment, a camera is arranged outside the simulation probe to collect its pose information; the camera may be a three-dimensional camera.
In one embodiment, an inertial measurement unit (IMU) is disposed in the simulation probe to obtain its real-time pose information, such as real-time X-axis, Y-axis and Z-axis coordinate information. Alternatively, the pose information of the simulation probe can be obtained through the magnetic sensor.
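Since several pose sources may coexist, a simple selection scheme could be sketched as below. The priority order (magnetic, then IMU, then camera) and the dictionary-based interface are assumptions for illustration only; the patent does not prescribe either.

```python
def read_probe_pose(sources):
    """Return the first available pose among the configured sensors.
    'sources' maps sensor name -> pose dict or None (hypothetical
    interface); priority order is an illustrative assumption."""
    for name in ("magnetic", "imu", "camera"):
        pose = sources.get(name)
        if pose is not None:
            return name, pose
    raise RuntimeError("no pose source available")
```

In practice the readings of multiple sensors could also be fused rather than prioritized.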
S132, generating a pose adjusting instruction according to pose information of the simulation probe;
and S133, synchronously controlling the virtual ultrasonic probe in response to the pose adjusting instruction so as to obtain the target ultrasonic image.
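Steps S132 and S133 can be sketched as computing and applying a pose delta so that the virtual probe mirrors the physical simulation probe. The 6-element pose vector [x, y, z, roll, pitch, yaw] is a hypothetical minimal representation, not the patent's data format.

```python
import numpy as np

def pose_adjustment(sim_pose, virtual_pose):
    """S132 (sketch): the adjustment instruction is the delta the
    virtual probe must apply to match the simulation probe."""
    return np.asarray(sim_pose, float) - np.asarray(virtual_pose, float)

def apply_adjustment(virtual_pose, delta):
    """S133 (sketch): synchronously move the virtual probe by the delta."""
    return np.asarray(virtual_pose, float) + delta
```

Applying the adjustment each frame keeps the virtual probe's pose locked to the user's hand movements.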
In an embodiment, further comprising: displaying a virtual ultrasound image obtained by the virtual ultrasound probe in front of one eye of the user; the target ultrasound image is displayed in front of the other eye of the user.
In order to check whether the user has met the requirements after a period of training, the method further comprises:
S140, hiding the guide path in response to a control instruction of the user;
S150, calculating the examination time taken by the user, through the simulation probe, to control the virtual ultrasonic probe to obtain the target ultrasonic image once;
and S160, judging that the user has passed the training when the examination time is less than a preset time.
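The pass criterion of S160 reduces to a single comparison; a sketch, with the threshold value left as a configurable assumption:

```python
def passed_training(exam_seconds, preset_seconds):
    """User passes when one unguided acquisition of the target image
    takes less than the preset time (threshold is configurable; the
    patent does not fix a value)."""
    return exam_seconds < preset_seconds
```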
According to the virtual reality-based ultrasonic training method provided by the embodiment of the invention, a virtual three-dimensional training model is generated from ultrasound examination cases in real scenes through virtual reality technology, so that inexperienced users can train repeatedly under the guidance of the guide path to obtain the target ultrasonic image. The invention achieves a training effect close to that of hands-on operation, with low cost, a strong sense of realism and good teaching results.
As shown in fig. 2, a second aspect of the embodiment of the present invention provides a virtual reality-based ultrasonic training system, comprising: a VR device, which randomly loads a virtual three-dimensional training model that at least includes the selected examination site, together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model comprises a target ultrasonic image; a path guiding module, which displays a guide path on the surface of the virtual three-dimensional training model according to real-time pose information of the virtual ultrasonic probe and target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image; and a control module, which controls, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image.
According to the virtual reality-based ultrasonic training system provided by the embodiment of the invention, a virtual three-dimensional training model is generated from ultrasound examination cases in real scenes through virtual reality technology, so that inexperienced users can train repeatedly under the guidance of the guide path to obtain the target ultrasonic image. The invention achieves a training effect close to that of hands-on operation, with low cost, a strong sense of realism and good teaching results.
A third aspect of the embodiments of the present invention provides a computer-readable storage medium storing computer instructions for causing the computer to perform the virtual reality based ultrasound training method according to any one of the first aspect and the first aspect of the embodiments of the present invention.
As shown in fig. 3, a computer program 601 is stored thereon; when executed by a processor, its instructions implement the steps of the virtual reality-based ultrasound training method of the above embodiments. The storage medium may also store audio and video stream data, feature frame data, interaction request signaling, encrypted data, preset data sizes and the like. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), a solid-state drive (SSD), or the like; the storage medium may also comprise a combination of the above types of memory.
Those skilled in the art will appreciate that all or part of the method of the above embodiments may be implemented by a computer program instructing related hardware; the program may be stored in a computer-readable storage medium and, when executed, may carry out the method of the above embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), a solid-state drive (SSD), or the like; the storage medium may also comprise a combination of the above types of memory.
A fourth aspect of an embodiment of the present invention provides an ultrasound apparatus, including: the ultrasonic training system comprises a memory and a processor, wherein the memory and the processor are in communication connection, the memory stores computer instructions, and the processor executes the computer instructions so as to execute the ultrasonic training method based on virtual reality according to any one of the first aspect and the first aspect of the embodiment of the invention.
As shown in fig. 4, the ultrasound device may include a processor 51 and a memory 52, where the processor 51 and the memory 52 may be connected by a bus or otherwise, as exemplified by a bus connection in fig. 4.
The processor 51 may be a central processing unit (CPU). The processor 51 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or a combination thereof.
The memory 52 serves as a non-transitory computer readable storage medium that may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as corresponding program instructions/modules in embodiments of the present invention. The processor 51 executes various functional applications of the processor and data processing by running non-transitory software programs, instructions, and modules stored in the memory 52, i.e., implementing the virtual reality-based ultrasound training method in the method embodiments described above.
Memory 52 may include a storage program area that may store an operating system, at least one application program required for functionality, and a storage data area; the storage data area may store data created by the processor 51, etc. In addition, memory 52 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 52 may optionally include memory located remotely from processor 51, which may be connected to processor 51 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Although embodiments of the present invention have been described in connection with the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope of the invention as defined by the appended claims.
Claims (9)
1. An ultrasonic training method based on virtual reality is characterized by comprising the following steps:
Randomly loading, from a model library and according to an examination site selected by a user, a virtual three-dimensional training model that at least includes the examination site, together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model comprises a target ultrasonic image;
displaying a guide path on the surface of the virtual three-dimensional training model according to the real-time pose information of the virtual ultrasonic probe and the target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image;
Controlling, through a simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image;
The virtual three-dimensional training model is obtained by the following method:
Controlling an ultrasonic probe to carry out ultrasonic scanning on a real examination part along a preset direction, and obtaining an ultrasonic image corresponding to each section of the real examination part;
Acquiring pose information of an ultrasonic probe corresponding to each ultrasonic image;
and inputting the ultrasonic image corresponding to each section and the pose information of the ultrasonic probe corresponding to each ultrasonic image into a trained three-dimensional reconstruction network model to obtain the virtual three-dimensional training model.
2. The virtual reality-based ultrasound training method of claim 1, further comprising:
Displaying a virtual ultrasound image obtained by the virtual ultrasound probe in front of one eye of the user;
the target ultrasound image is displayed in front of the other eye of the user.
3. The virtual reality-based ultrasound training method of claim 1, further comprising:
Hiding the guide path in response to a control instruction of the user;
calculating the examination time taken by the user, through the simulation probe, to control the virtual ultrasonic probe to obtain the target ultrasonic image once;
and when the examination time is less than a preset time, judging that the user has passed the training.
4. The virtual reality-based ultrasound training method of claim 1, wherein the three-dimensional reconstruction network model is obtained by training by:
inputting ultrasonic images of a plurality of real examination objects into a first convolutional neural network for feature extraction to obtain image features;
inputting pose information of the ultrasonic probe corresponding to the plurality of ultrasonic images into a second convolutional neural network for feature extraction to obtain pose features;
fusing the image features corresponding to the ultrasonic images of the plurality of real examination objects with the pose features of the ultrasonic probe corresponding to the plurality of ultrasonic images to obtain fusion features;
and generating the three-dimensional reconstruction network model according to the fusion features of the ultrasonic images of the plurality of real examination objects and the corresponding ultrasonic probe poses.
5. A virtual reality-based ultrasound training method according to any of claims 1-3, characterized in that the virtual three-dimensional training model is obtained by:
capturing real scene information at least comprising a real examination part to generate a real three-dimensional model;
obtaining a virtual three-dimensional ultrasonic model of a real examination part;
And fusing the virtual three-dimensional ultrasonic model and the real three-dimensional model to obtain a virtual three-dimensional training model.
6. A virtual reality-based ultrasound training method according to any of claims 1-3, characterized in that controlling, through a simulation probe, the virtual ultrasonic probe to move along the guide path to examine the virtual three-dimensional training model so as to obtain the target ultrasonic image specifically comprises:
Acquiring pose information of the simulation probe through one or more of a magnetic sensor, an IMU or a camera;
generating a pose adjusting instruction according to pose information of the simulation probe;
and synchronously controlling the virtual ultrasonic probe in response to the pose adjustment instruction so as to obtain the target ultrasonic image.
7. An ultrasonic training system based on virtual reality, comprising:
a VR device, configured to load a virtual three-dimensional training model and a virtual ultrasonic probe, wherein the virtual three-dimensional training model comprises at least an examination part and contains a target ultrasonic image;
a path guiding module, configured to display a guide path on the surface of the virtual three-dimensional training model according to real-time pose information of the virtual ultrasonic probe and target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image;
a control module, configured to control, via a simulation probe, the virtual ultrasonic probe to move along the guide path to examine the virtual three-dimensional training model so as to obtain the target ultrasonic image;
wherein the virtual three-dimensional training model is obtained by:
controlling an ultrasonic probe to perform an ultrasonic scan of a real examination part along a preset direction to obtain an ultrasonic image corresponding to each section of the real examination part;
acquiring pose information of the ultrasonic probe corresponding to each ultrasonic image;
and inputting the ultrasonic image corresponding to each section and the pose information of the ultrasonic probe corresponding to each ultrasonic image into a trained three-dimensional reconstruction network model to obtain the virtual three-dimensional training model.
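The slices-plus-poses-to-volume step at the end of claim 7 is performed by a trained network in the patent. As a hedged, classical stand-in (freehand 3-D ultrasound compounding reduced to its simplest form), the sketch below places parallel 2-D slices into a voxel grid using only each slice's position along the sweep direction; `compound_slices` and the toy sizes are assumptions made for this example.

```python
import numpy as np

def compound_slices(slices, z_positions, depth):
    """Stack parallel 2-D ultrasonic slices into a 3-D volume, using each
    slice's probe position (reduced here to a z-offset along the sweep)."""
    h, w = slices[0].shape
    volume = np.zeros((depth, h, w))
    for img, z in zip(slices, z_positions):
        volume[z] = img          # nearest-voxel placement of the slice
    return volume

# toy sweep: 3 slices acquired at z = 0, 2, 4 along the preset direction
slices = [np.full((8, 8), fill) for fill in (1.0, 2.0, 3.0)]
vol = compound_slices(slices, [0, 2, 4], depth=5)

print(vol.shape, vol[2, 0, 0])   # (5, 8, 8) and the middle slice's value 2.0
```

A learned reconstruction model would additionally interpolate the empty voxels between slices and handle tilted probe orientations; this sketch only shows how pose indexes each image into the volume.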
8. A computer-readable storage medium storing computer instructions for causing a computer to perform the virtual reality-based ultrasound training method of any of claims 1-6.
9. An ultrasound device, comprising: a memory and a processor communicatively coupled to each other, the memory storing computer instructions, and the processor executing the computer instructions to perform the virtual reality-based ultrasound training method of any of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011639679.0A CN114694442B (en) | 2020-12-31 | 2020-12-31 | Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114694442A CN114694442A (en) | 2022-07-01 |
CN114694442B true CN114694442B (en) | 2024-07-02 |
Family
ID=82135766
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011639679.0A Active CN114694442B (en) | 2020-12-31 | 2020-12-31 | Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114694442B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115938181B (en) * | 2022-12-22 | 2025-03-28 | 先临三维科技股份有限公司 | Scanning guidance method, device and system, medium, and computer equipment |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109310396A (en) * | 2016-06-20 | 2019-02-05 | Butterfly Network, Inc. | Automated image acquisition for assisting a user to operate an ultrasound device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3662827B2 (en) * | 2000-10-02 | 2005-06-22 | アロカ株式会社 | Ultrasonic probe and ultrasonic diagnostic apparatus |
IN2014CN04516A (en) * | 2011-12-03 | 2015-09-11 | Koninkl Philips Nv | |
JP6643741B2 (en) * | 2016-04-15 | 2020-02-12 | 株式会社ソシオネクスト | Ultrasonic probe control method and program |
CN110689792A (en) * | 2019-11-19 | 2020-01-14 | 南方医科大学深圳医院 | Ultrasound examination virtual diagnosis training system and method |
CN111415564B (en) * | 2020-03-02 | 2022-03-18 | 武汉大学 | Navigation method and system for pancreatic endoscopic ultrasonography based on artificial intelligence |
- 2020-12-31 CN CN202011639679.0A patent/CN114694442B/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109310396A (en) * | 2016-06-20 | 2019-02-05 | Butterfly Network, Inc. | Automated image acquisition for assisting a user to operate an ultrasound device |
Also Published As
Publication number | Publication date |
---|---|
CN114694442A (en) | 2022-07-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Basdogan et al. | VR-based simulators for training in minimally invasive surgery | |
RU2478980C2 (en) | System and method for automatic calibration of tracked ultrasound | |
CN112331049B (en) | Ultrasonic simulation training method and device, storage medium and ultrasonic equipment | |
US9974618B2 (en) | Method for determining an imaging specification and image-assisted navigation as well as device for image-assisted navigation | |
US10672125B2 (en) | Method and system for supporting medical personnel | |
JP2022527007A (en) | Auxiliary imaging device, control method and device for analysis of movement disorder disease | |
CN113116386A (en) | Ultrasound imaging guidance method, ultrasound apparatus, and storage medium | |
CN106156398A (en) | For the operating equipment of area of computer aided simulation and method | |
Huang et al. | On mimicking human’s manipulation for robot-assisted spine ultrasound imaging | |
CN115804652A (en) | Surgical operating system and method | |
KR101791927B1 (en) | Method and apparatus for estimating roughness of skin surface for haptic feedback apparatus based on perceptual analysis | |
WO2018119676A1 (en) | Display data processing method and apparatus | |
CN114694442B (en) | Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment | |
CN117289802A (en) | Eye movement measuring method, system and equipment based on eye movement tracking and AR display | |
KR20180080848A (en) | Rehabilitation system and the method for the same which use virtual reality provided by game engine | |
Bogar et al. | Validation of a novel, low-fidelity virtual reality simulator and an artificial intelligence assessment approach for peg transfer laparoscopic training | |
US11766234B2 (en) | System and method for identifying and navigating anatomical objects using deep learning networks | |
US20230137369A1 (en) | Aiding a user to perform a medical ultrasound examination | |
WO2024140749A1 (en) | Ultrasonic scanning method, apparatus and system, and electronic device and storage medium | |
Wagner et al. | Intraocular surgery on a virtual eye | |
CN118873169A (en) | Ultrasonic imaging method, system, device, medium and program product | |
WO2021171464A1 (en) | Processing device, endoscope system, and captured image processing method | |
CN115953532A (en) | Method and device for displaying ultrasonic image for teaching and teaching system of ultrasonic image | |
CN113870636B (en) | Ultrasonic simulation training method, ultrasonic device and storage medium | |
JP7250279B2 (en) | Information processing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |