
CN108335599B - Operation model training method based on three-dimensional modeling image technology - Google Patents


Info

Publication number
CN108335599B
CN108335599B (granted publication of application CN201810052392.4A)
Authority
CN
China
Prior art keywords
information
dimensional
training model
training
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810052392.4A
Other languages
Chinese (zh)
Other versions
CN108335599A (en)
Inventor
苏道庆 (Su Daoqing)
王伟峰 (Wang Weifeng)
纪振刚 (Ji Zhengang)
张萌 (Zhang Meng)
孟元 (Meng Yuan)
钟迪 (Zhong Di)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Liaocheng People's Hospital
Original Assignee
Liaocheng people's hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Liaocheng People's Hospital
Priority to CN201810052392.4A
Publication of CN108335599A
Application granted
Publication of CN108335599B
Expired - Fee Related
Anticipated expiration

Links

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes, for medicine

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Medicinal Chemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Algebra (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Medical Informatics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A surgical model training method based on a three-dimensional modeling image technology comprises the following steps: S1, acquiring three-dimensional image information of different tissues and organs across different body types, ages and sexes, and performing key point calibration on the three-dimensional image information to obtain a three-dimensional training model set comprising the corresponding three-dimensional training model information; S2, establishing triggering performance information for the key points of the three-dimensional training model information, and generating a random burst condition instance set; S3, generating instrument control conversion parameters according to the different operation types corresponding to different tissues and organs, acquiring operation type training request information of a user, and determining the corresponding instrument control conversion parameters according to that request information; and S4, extracting three-dimensional training model information from the three-dimensional training model set as the object of surgery simulation training, receiving operation information of the user, and converting it, through the instrument control conversion parameters, into operation information of the three-dimensional training model information.

Description

Operation model training method based on three-dimensional modeling image technology
Technical Field
The invention relates to the technical field of operation simulation training, in particular to an operation model training method based on a three-dimensional modeling image technology.
Background
Existing surgical training mostly takes place in medical colleges and their affiliated hospitals, and because opportunities for clinical operations are limited, medical trainees cannot obtain sufficient practice. Some surgical training devices do exist in the prior art, but they cannot fully simulate a real operating scene. Most importantly, after long use one training device adapts to only a single type of operation, so a trainee's level cannot keep improving within that one type of training; moreover, the existing surgical simulation training methods are highly scripted and cannot realistically simulate the various sudden conditions that arise during an operation.
Disclosure of Invention
In view of this, the present invention provides a method for training an operation model based on a three-dimensional modeling image technology.
A surgical model training method based on a three-dimensional modeling image technology comprises the following steps:
S1, acquiring three-dimensional image information of different tissues and organs in different body types, ages and sexes, and performing key point calibration on the three-dimensional image information to obtain a three-dimensional training model set comprising corresponding different three-dimensional training model information;
S2, establishing triggering performance information of key points of the three-dimensional training model information; generating a random burst condition instance set;
S3, generating instrument control conversion parameters according to different operation types corresponding to different tissues and organs; acquiring operation type training request information of a user, and determining corresponding instrument control conversion parameters according to the operation type training request information;
S4, extracting three-dimensional training model information from the three-dimensional training model set as an object of surgery simulation training, receiving operation information of a user, and converting the operation information into operation information of the three-dimensional training model information through the instrument control conversion parameters;
S5, matching and recording the operation result of the user according to the triggering performance information of the key points of the three-dimensional training model information to obtain a first record set;
S6, randomly extracting a random burst condition example from the random burst condition example set, generating burst triggering expression information according to the random burst condition example, transforming the three-dimensional training model to obtain burst three-dimensional training model information, converting the operation information of the user into operation information of the burst three-dimensional training model information through an instrument control conversion parameter, and matching and recording the operation result of the user according to the burst triggering expression information of the burst three-dimensional training model information to obtain a second record set;
and S7, generating a surgical model training evaluation result for the user according to the first record set and the second record set.
In the surgical model training method based on the three-dimensional modeling image technology, the step S1 includes:
acquiring image information of different tissues and organs corresponding to different angles in different body types, ages and sexes;
establishing corresponding three-dimensional image information according to the image information of different angles, wherein the three-dimensional image information is represented by three-dimensional coordinate points;
and carrying out key point calibration on the corresponding three-dimensional coordinate points according to the probability and the weight distribution of the tissue and the organ corresponding to different positions and depths in different operation simulation training types.
In the surgical model training method based on the three-dimensional modeling image technology of the invention,
in step S2, establishing the triggering performance information of the key points of the three-dimensional training model information includes:
establishing a trigger performance information sequence, wherein the trigger performance information sequence comprises: sequence information, processing time information, position information, precision information, and deformation threshold information;
configuring a mapping relation between the trigger performance information sequence and an operation simulation training type;
and performing serial code conversion on the value of the mapping relation and the trigger performance information sequence, and configuring the resulting serial code value to a three-dimensional coordinate point after key point calibration.
In the surgical model training method based on the three-dimensional modeling image technology of the invention,
the step S2 of generating the random burst condition instance set includes:
configuring burst instances for the surgery simulation training type, and configuring control parameters corresponding to each burst instance, wherein the control parameters comprise: three-dimensional training model deformation control parameters and three-dimensional training model expansion control parameters.
The beneficial technical effects are as follows: compared with the prior art, the invention can realize that: performing simulation training on different operation types according to three-dimensional image information of different tissues and organs in different body types, ages and sexes; and by introducing the random burst condition example set, various burst conditions can be simulated in the surgical training process, and the capability of a user for coping with the burst conditions is effectively trained.
Drawings
Fig. 1 is a flowchart of a surgical model training method based on a three-dimensional modeling image technology according to an embodiment of the present invention.
Detailed Description
As shown in fig. 1, in an embodiment of the present invention, a method for training a surgical model based on a three-dimensional modeling image technology includes the following steps:
s1, three-dimensional image information of different tissues and organs in different body types, ages and sexes is obtained, and key point calibration is carried out on the three-dimensional image information to obtain a three-dimensional training model set comprising corresponding different three-dimensional training model information.
Optionally, the step S1 includes:
acquiring image information of different tissues and organs corresponding to different angles in different body types, ages and sexes;
establishing corresponding three-dimensional image information according to the image information of different angles, wherein the three-dimensional image information is represented by three-dimensional coordinate points;
and carrying out key point calibration on the corresponding three-dimensional coordinate points according to the probability and the weight distribution of the tissue and the organ corresponding to different positions and depths in different operation simulation training types.
In a preferred embodiment of the present invention, because different tissues and organs present different etiologies, the positions where surgical instruments contact an organ differ across surgical training. The probability over positions and depths represents how likely an instrument is to contact the organ at each position, and the weight distribution represents how strongly operating at that position affects the overall surgical training. The three-dimensional training model information formed from the three-dimensional coordinate points after key point calibration therefore improves the precision of the training results for different operation types, and is particularly suitable for surgical training scenes based on virtual reality technology.
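As a rough illustration of this calibration step, the sketch below marks a coordinate point as a key point when the product of its instrument-contact probability and its training weight exceeds a threshold. All names, values and the 0.5 threshold are illustrative assumptions; the patent does not specify a concrete scoring rule.

```python
def calibrate_key_points(coords, contact_prob, weight, threshold=0.5):
    # Mark a coordinate point as a key point when the product of its
    # instrument-contact probability and its training weight exceeds
    # the threshold. The product rule and 0.5 are assumptions.
    return [p for p in coords if contact_prob[p] * weight[p] > threshold]

# Toy model: three coordinate points on an organ surface.
coords = [(0, 0, 1), (0, 1, 2), (1, 1, 3)]
contact_prob = {coords[0]: 0.9, coords[1]: 0.8, coords[2]: 0.2}
weight = {coords[0]: 0.9, coords[1]: 0.4, coords[2]: 0.9}

key_points = calibrate_key_points(coords, contact_prob, weight)
# → [(0, 0, 1)]  (0.81 > 0.5; the other products are 0.32 and 0.18)
```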
S2, establishing triggering performance information of key points of the three-dimensional training model information; and generating a random burst condition instance set.
Optionally,
in step S2, establishing the triggering performance information of the key points of the three-dimensional training model information includes:
establishing a trigger performance information sequence, wherein the trigger performance information sequence comprises: sequence information, processing time information, position information, precision information, and deformation threshold information;
configuring a mapping relation between the trigger performance information sequence and an operation simulation training type;
and performing serial code conversion on the value of the mapping relation and the trigger performance information sequence, and configuring the resulting serial code value to a three-dimensional coordinate point after key point calibration.
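The serial-code conversion above could, under loose assumptions, look like the following sketch: the trigger performance sequence and its mapped training type are serialized and reduced to a single code value attached to a calibrated coordinate point. The JSON-plus-CRC32 encoding and every field name are assumptions chosen for illustration; the patent does not specify an encoding format.

```python
import json
import zlib

def encode_trigger_sequence(seq, training_type):
    # Serialize the trigger performance sequence together with the
    # training type it maps to, then reduce it to one integer code
    # that can be attached to a calibrated 3-D coordinate point.
    # (JSON + CRC32 is an assumed encoding, not from the patent.)
    payload = {"training_type": training_type, **seq}
    blob = json.dumps(payload, sort_keys=True).encode("utf-8")
    return zlib.crc32(blob)

trigger_seq = {
    "order": 1,            # sequence information
    "time_ms": 1500,       # processing time information
    "pos": [0, 0, 1],      # position information
    "precision_mm": 0.5,   # precision information
    "deform_limit": 0.2,   # deformation threshold information
}

# Attach the serial code to the calibrated key point (0, 0, 1).
model = {(0, 0, 1): encode_trigger_sequence(trigger_seq, "appendectomy")}
```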
The step S2 of generating the random burst condition instance set includes:
configuring burst instances for the surgery simulation training type, and configuring control parameters corresponding to each burst instance, wherein the control parameters comprise: three-dimensional training model deformation control parameters and three-dimensional training model expansion control parameters.
The three-dimensional training model deformation control parameters change properties of the three-dimensional training model such as its shape and size; the three-dimensional training model expansion control parameters add a further three-dimensional training model in addition to the existing one, and the user must handle the added model as well.
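A minimal sketch of such a burst instance set, assuming simple dictionary-valued control parameters (all field names and values are hypothetical, not taken from the patent):

```python
import random

# Each burst (sudden-event) instance carries the two control-parameter
# families named in the text: deformation parameters that reshape the
# existing model, and expansion parameters that add another model the
# trainee must also handle.
burst_instances = [
    {"name": "excessive_bleeding",
     "deform": {"scale": 1.1, "recolor": "dark_red"},
     "expand": {"add_model": "blood_pool"}},
    {"name": "tissue_swelling",
     "deform": {"scale": 1.3, "recolor": None},
     "expand": {"add_model": None}},
]

def draw_burst_instance(rng=random):
    # Step S6 extracts one instance at random from the set.
    return rng.choice(burst_instances)

instance = draw_burst_instance()
```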
S3, generating instrument control conversion parameters according to different operation types corresponding to different tissues and organs; acquiring operation type training request information of a user, and determining corresponding instrument control conversion parameters according to the operation type training request information;
Optionally, determining the model of the corresponding instrument control conversion parameter according to the operation type training request information includes:
[The formula is shown only as an image in the original publication (Figure BDA0001552733440000041); its symbols are defined below.]
wherein λ is the control conversion parameter model, α is a progress parameter, β is a feedback force parameter, χ is a rotation parameter, e(t) is a sequence value based on time t, θ is a calculation factor, and M, N and Q are a 6th-order motion matrix, a 6th-order feedback force matrix and a 6th-order force conversion matrix, respectively; λ, α and β are obtained through statistical analysis.
By implementing this preferred embodiment, the control force the user applies to the instrument along each degree of freedom can be accurately converted into the user's operation of the instrument on the organ during surgical training, avoiding misjudgment of the user's training result caused by excessive conversion error.
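Because the patent's conversion formula is reproduced only as an image, the sketch below substitutes a generic linear mapping: the user's 6-degree-of-freedom input is passed through the three 6th-order matrices M, N and Q and scaled by the progress, feedback-force and rotation parameters. This is an illustrative stand-in, not the patented formula.

```python
def mat_vec(m, u):
    # Multiply a 6x6 matrix by a 6-vector.
    return [sum(m[i][j] * u[j] for j in range(6)) for i in range(6)]

def convert_operation(u, m_motion, n_feedback, q_force, alpha, beta, chi):
    # Hypothetical linear stand-in for the patented model: combine the
    # motion, feedback-force and force-conversion matrices, scaled by
    # the progress (alpha), feedback-force (beta) and rotation (chi)
    # parameters, mapping the user's 6-DoF input (3 forces, 3 torques)
    # to an instrument operation on the organ model.
    mu = mat_vec(m_motion, u)
    nu = mat_vec(n_feedback, u)
    qu = mat_vec(q_force, u)
    return [alpha * a + beta * b + chi * c for a, b, c in zip(mu, nu, qu)]

identity6 = [[1.0 if i == j else 0.0 for j in range(6)] for i in range(6)]
u = [1.0] * 6  # unit force/torque on every degree of freedom
out = convert_operation(u, identity6, identity6, identity6,
                        alpha=0.5, beta=0.3, chi=0.2)
# With identity matrices, each output component is 0.5 + 0.3 + 0.2 = 1.0
```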
S4, extracting three-dimensional training model information from the three-dimensional training model set as an object of surgery simulation training, receiving operation information of a user, and converting the operation information into operation information of the three-dimensional training model information through instrument control conversion parameters according to the operation information of the user;
S5, matching and recording the operation result of the user according to the triggering performance information of the key points of the three-dimensional training model information to obtain a first record set;
S6, randomly extracting a random burst condition example from the random burst condition example set, generating burst triggering expression information according to the random burst condition example, transforming the three-dimensional training model to obtain burst three-dimensional training model information, converting the operation information of the user into operation information of the burst three-dimensional training model information through an instrument control conversion parameter, and matching and recording the operation result of the user according to the burst triggering expression information of the burst three-dimensional training model information to obtain a second record set;
Step S6 is a major point of improvement in the embodiments of the present invention. By setting up the random burst condition instance set, different random burst condition instances are configured; each instance transforms the three-dimensional training model into burst three-dimensional training model information and may also generate burst triggering performance information. For example, one random burst condition instance is excessive bleeding, which changes the color of an organ and forms a blood pool around it, thereby changing the shape of the organ model. The burst three-dimensional training model information obtained by adding such an instance strongly affects the user's performance during surgical training and brings the simulation closer to a real surgical scene.
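A toy version of the model transformation in step S6 might look as follows, assuming a burst instance carries a uniform scale factor as its deformation control parameter (the scaling rule is a hypothetical simplification of whatever deformation the patent intends):

```python
def apply_burst(model_points, instance):
    # Produce the "burst" three-dimensional training model of step S6
    # by applying the instance's deformation control parameter.
    # Uniform scaling about the origin is a hypothetical simplification.
    s = instance["deform"]["scale"]
    return [(s * x, s * y, s * z) for (x, y, z) in model_points]

bleeding = {"name": "excessive_bleeding", "deform": {"scale": 2.0}}
burst_model = apply_burst([(1.0, 0.0, 0.5)], bleeding)
# → [(2.0, 0.0, 1.0)]
```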
And S7, generating a surgical model training evaluation result for the user according to the first record set and the second record set.
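Step S7's evaluation could be sketched, under the assumption that each record set holds per-key-point match scores in [0, 1] and the two sets are combined with a fixed weight (both assumptions are illustrative, not from the patent):

```python
def evaluate(first_record_set, second_record_set, burst_weight=0.5):
    # Combine the routine-procedure records (step S5) and the
    # burst-response records (step S6) into one training score, as in
    # step S7. Each record is a per-key-point match score in [0, 1];
    # the fixed 50/50 weighting is an illustrative assumption.
    routine = sum(first_record_set) / len(first_record_set)
    burst = sum(second_record_set) / len(second_record_set)
    return (1 - burst_weight) * routine + burst_weight * burst

score = evaluate([1.0, 0.8, 0.6], [0.5, 0.7])
# → 0.7 (routine mean 0.8, burst mean 0.6, averaged)
```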
Compared with the prior art, the embodiments of the invention subdivide the surgical training process through three-dimensional modeling and key point calibration: they do not merely digitize organs in three dimensions, but establish an accurate correspondence between the user's operations and the resulting changes in the organs, so that the user's training can be evaluated precisely.
Persons skilled in the art will appreciate that both organs and instruments can be realized through three-dimensional modeling, and that the user's operations can be captured as parameters through sensing devices such as force feedback devices and speed acquisition devices, thereby realizing virtual reality modeling.
The beneficial technical effects are as follows: compared with the prior art, the invention can realize that: performing simulation training on different operation types according to three-dimensional image information of different tissues and organs in different body types, ages and sexes; and by introducing the random burst condition example set, various burst conditions can be simulated in the surgical training process, and the capability of a user for coping with the burst conditions is effectively trained.
It is understood that various other changes and modifications may be made by those skilled in the art based on the technical idea of the present invention, and all such changes and modifications should fall within the protective scope of the claims of the present invention.

Claims (2)

1. A surgical model training method based on a three-dimensional modeling image technology is characterized by comprising the following steps:
S1, acquiring three-dimensional image information of different tissues and organs in different body types, ages and sexes, and performing key point calibration on the three-dimensional image information to obtain a three-dimensional training model set comprising corresponding different three-dimensional training model information;
S2, establishing triggering performance information of key points of the three-dimensional training model information; generating a random burst condition instance set;
S3, generating instrument control conversion parameters according to different operation types corresponding to different tissues and organs; acquiring operation type training request information of a user, and determining corresponding instrument control conversion parameters according to the operation type training request information; the model for determining the corresponding instrument control conversion parameter according to the operation type training request information comprises the following steps:
[The formula is shown only as an image in the original publication (Figure FDA0002204121110000011); its symbols are defined below.]
wherein λ is the control conversion parameter model, α is a progress parameter, β is a feedback force parameter, χ is a rotation parameter, e(t) is a sequence value based on time t, θ is a calculation factor, and M, N and Q are a 6th-order motion matrix, a 6th-order feedback force matrix and a 6th-order force conversion matrix, respectively, wherein λ, α and β are obtained through statistical analysis;
S4, extracting three-dimensional training model information from the three-dimensional training model set as an object of surgery simulation training, receiving operation information of a user, and converting the operation information into operation information of the three-dimensional training model information through the instrument control conversion parameters;
S5, matching and recording the operation result of the user according to the triggering performance information of the key points of the three-dimensional training model information to obtain a first record set;
S6, randomly extracting a random burst condition example from the random burst condition example set, generating burst triggering expression information according to the random burst condition example, transforming the three-dimensional training model to obtain burst three-dimensional training model information, converting the operation information of the user into operation information of the burst three-dimensional training model information through an instrument control conversion parameter, and matching and recording the operation result of the user according to the burst triggering expression information of the burst three-dimensional training model information to obtain a second record set;
S7, generating a surgical model training evaluation result for the user according to the first record set and the second record set;
the step S1 includes:
acquiring image information of different tissues and organs corresponding to different angles in different body types, ages and sexes;
establishing corresponding three-dimensional image information according to the image information of different angles, wherein the three-dimensional image information is represented by three-dimensional coordinate points;
performing key point calibration on corresponding three-dimensional coordinate points according to the probability and weight distribution of different positions and depths corresponding to tissues and organs in different operation simulation training types;
the establishing, in step S2, of the triggering performance information of the key points of the three-dimensional training model information includes:
establishing a trigger performance information sequence, wherein the trigger performance information sequence comprises: sequence information, processing time information, position information, precision information, and deformation threshold information;
configuring a mapping relation between the trigger performance information sequence and an operation simulation training type;
and performing serial code conversion on the value of the mapping relation and the trigger performance information sequence, and configuring the resulting serial code value to a three-dimensional coordinate point after key point calibration.
2. The surgical model training method based on three-dimensional modeling image technology according to claim 1,
the step S2 of generating the random burst condition instance set includes: configuring burst instances for the surgery simulation training type, and configuring control parameters corresponding to each burst instance, wherein the control parameters comprise: deformation control parameters of the three-dimensional training model and expansion control parameters of the three-dimensional training model; the deformation control parameters change properties of the three-dimensional training model such as its shape and size, and the expansion control parameters add a further three-dimensional training model in addition to the existing one, which the user must also handle.
CN201810052392.4A 2018-01-19 2018-01-19 Operation model training method based on three-dimensional modeling image technology Expired - Fee Related CN108335599B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810052392.4A CN108335599B (en) 2018-01-19 2018-01-19 Operation model training method based on three-dimensional modeling image technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810052392.4A CN108335599B (en) 2018-01-19 2018-01-19 Operation model training method based on three-dimensional modeling image technology

Publications (2)

Publication Number Publication Date
CN108335599A CN108335599A (en) 2018-07-27
CN108335599B (en) 2020-02-04

Family

ID=62925146

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810052392.4A Expired - Fee Related CN108335599B (en) 2018-01-19 2018-01-19 Operation model training method based on three-dimensional modeling image technology

Country Status (1)

Country Link
CN (1) CN108335599B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109410721A (en) * 2018-10-30 2019-03-01 深圳市墨优科技开发有限公司 A kind of emergency care training method and terminal
CN112734704B (en) * 2020-12-29 2023-05-16 上海索验智能科技有限公司 Skill training evaluation method under neural network machine learning recognition objective lens
CN114121218A (en) * 2021-11-26 2022-03-01 中科麦迪人工智能研究院(苏州)有限公司 Virtual scene construction method, device, equipment and medium applied to operation

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105825752A (en) * 2016-04-22 2016-08-03 吉林大学 Force feedback device-based virtual corneal surgery training system
CN107067856A (en) * 2016-12-31 2017-08-18 歌尔科技有限公司 A kind of medical simulation training system and method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101889642B1 (en) * 2010-12-08 2018-08-17 바이엘 헬스케어 엘엘씨 Generating a suitable model for estimating patient radiation dose resulting from medical imaging scans
CN103961179B (en) * 2014-04-09 2016-04-27 深圳先进技术研究院 Surgical instrument movement analogy method
CN204029245U (en) * 2014-08-01 2014-12-17 卓思生命科技有限公司 A surgical simulation system
CN106297471A (en) * 2016-10-25 2017-01-04 深圳市科创数字显示技术有限公司 The removable cornea intelligent operation training system that AR and VR combines
CN107331272B (en) * 2017-08-25 2019-07-02 福州盛世凌云环保科技有限公司 Based on the medical teaching of emulation technology manikin application method and use system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105825752A (en) * 2016-04-22 2016-08-03 吉林大学 Force feedback device-based virtual corneal surgery training system
CN107067856A (en) * 2016-12-31 2017-08-18 歌尔科技有限公司 A kind of medical simulation training system and method

Also Published As

Publication number Publication date
CN108335599A (en) 2018-07-27

Similar Documents

Publication Publication Date Title
CN104685551B (en) Mixed reality emulation mode and system
CN108320645B (en) Medical simulation training method
CN108682456B (en) Surgical simulation training method based on virtual reality technology
CN108335599B (en) Operation model training method based on three-dimensional modeling image technology
KR20190100011A (en) Method and apparatus for providing surgical information using surgical video
CN111026269B (en) Haptic feedback method, device and equipment for biological tissue structure based on force feedback
TW202038867A (en) Optical tracking system and training system for medical equipment
JP2008134373A (en) Method and system of preparing biological data for operation simulation, operation simulation method, and operation simulator
Ashapkina et al. Metric for exercise recognition for telemedicine systems
d’Aulignac et al. Modeling the dynamics of the human thigh for a realistic echographic simulator with force feedback
KR20210150633A (en) System and method for measuring angle and depth of implant surgical instrument
KR102666380B1 (en) Artificial intelligence-based initial registration apparatus and method for surgical navigation
WO2021064912A1 (en) Correction method, correction program, and information processing system
CN109171604A (en) A kind of intelligent endoscope operating system having AR function
CN117579979B (en) Game panoramic sound generation method, device, equipment and storage medium
US20230045451A1 (en) Architecture, system, and method for modeling, viewing, and performing a medical procedure or activity in a computer model, live, and combinations thereof
KR20230071737A (en) Method and apparatus for user motion accuracy evaluation
EP3889738A1 (en) A system and a method for calibrating a user interface
Owlia et al. Real-time tracking of laparoscopic instruments using kinect for training in virtual reality
CN114266831A (en) Data processing method, device, equipment, medium and system for assisting operation
KR102341673B1 (en) Evaluating system of surgical navigation device and method for evaluating surgical navigation device using the same
CN113838219A (en) Virtual dance training method and device based on human motion capture
US20250014295A1 (en) Systems and methods for determining material property values for a three-dimensional virtual model of an anatomical object
WO2024055493A1 (en) Heterogeneous and three-dimensional observation registration method based on deep phase correlation, and medium and device
WO2020210967A1 (en) Optical tracking system and training system for medical instruments

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Su Daoqing

Inventor after: Wang Weifeng

Inventor after: Ji Zhengang

Inventor after: Zhang Meng

Inventor after: Meng Yuan

Inventor after: Zhong Di

Inventor before: Zhong Di

CB03 Change of inventor or designer information
TA01 Transfer of patent application right

Effective date of registration: 20200106

Address after: 252000 Shandong city of Liaocheng Province Dongchang Road No. 67

Applicant after: Liaocheng People's Hospital

Address before: 430000 Wuhan Donghu New Technological Development Zone, Hubei Province, Guannan Science and Technology Industrial Park, Phase I, Level 3, Unit 21, No. 9 (Admitted to Wuhan Chuangyijia Business Secretary Service Co., Ltd; Trusteeship No. 000348)

Applicant before: Wuhan Kang Huiran Information Technology Consulting Co., Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200204

Termination date: 20210119

CF01 Termination of patent right due to non-payment of annual fee