CN109475385B - System and method for intraoperative surgical planning - Google Patents
- Publication number
- CN109475385B (application CN201780045886.3A)
- Authority
- CN
- China
- Legal status
- Active
Classifications
- A61B34/10 — Computer-aided planning, simulation or modelling of surgical operations
- A61B34/25 — User interfaces for surgical systems
- A61F2/30942 — Designing or manufacturing processes for designing or making customized prostheses, e.g. using templates, CT or NMR scans, finite-element analysis or CAD-CAM techniques
- A61F2/4657 — Measuring instruments used for implanting artificial joints
- G16H50/50 — ICT specially adapted for simulation or modelling of medical disorders
- A61B2017/00203 — Electrical control of surgical instruments with speech control or speech recognition
- A61B2017/00207 — Electrical control of surgical instruments with hand gesture control or hand gesture recognition
- A61B2017/00216 — Electrical control of surgical instruments with eye tracking or head position tracking control
- A61B2034/102 — Modelling of surgical devices, implants or prosthesis
- A61B2034/105 — Modelling of the patient, e.g. for ligaments or bones
- A61B2034/108 — Computer aided selection or customisation of medical implants or cutting guides
- A61B2034/2048 — Tracking techniques using an accelerometer or inertia sensor
- A61B2034/252 — User interfaces for surgical systems indicating steps of a surgical procedure
- A61B2034/254 — User interfaces for surgical systems being adapted depending on the stage of the surgical procedure
- A61B2034/256 — User interfaces for surgical systems having a database of accessory information, e.g. including context sensitive help or scientific articles
- A61B2090/061 — Measuring instruments for measuring dimensions, e.g. length
- A61B2090/067 — Measuring instruments for measuring angles
- A61B2090/363 — Use of fiducial points
- A61B2090/364 — Correlation of different images or relation of image positions in respect to the body
- A61B2090/365 — Correlation of different images: augmented reality, i.e. correlating a live optical image with another image
- A61B2090/3966 — Radiopaque markers visible in an X-ray image
- A61F2/40 — Joints for shoulders
- A61F2/4014 — Humeral heads or necks; connections of endoprosthetic heads or necks to endoprosthetic humeral shafts
- A61F2/4059 — Humeral shafts
- A61F2002/3008 — Properties of materials and coating materials: radio-opaque, e.g. radio-opaque markers
- A61F2002/4018 — Heads or epiphyseal parts of humerus
- A61F2002/4022 — Heads or epiphyseal parts of humerus having a concave shape, e.g. hemispherical cups
- A61F2002/4632 — Special tools for implanting artificial joints using computer-controlled surgery, e.g. robotic surgery
- A61F2002/4633 — Special tools for implanting artificial joints using computer-controlled surgery, for selection of endoprosthetic joints or for pre-operative planning
- A61F2002/4658 — Measuring instruments used for implanting artificial joints, for measuring dimensions, e.g. length
Abstract
The present subject matter includes systems, methods, and prosthetic devices for joint reconstructive surgery. A computer-assisted intraoperative planning method may include accessing a first medical image providing a first view of a joint within a surgical site and receiving a selection of a first component of a modular prosthetic device implanted in a first bone of the joint. The method continues by displaying a graphical representation of the first component overlaid on the first medical image, and updating that graphical representation upon receiving positioning input representing the implant position of the first component relative to a landmark on the first bone visible in the first medical image. The method concludes by presenting a selection interface that visualizes other components of the modular prosthetic device, virtually connected to the first component and overlaid on the first medical image.
Description
Priority claim
This application claims the benefit of priority to U.S. provisional patent application serial No. 62/351,564, filed June 17, 2016, the entire contents of which are hereby incorporated by reference herein.
Background
The shoulder joint is a complex joint in which the scapula, clavicle, and humerus all come together to achieve a wide range of motion, at least in a normally functioning joint. In a normally functioning shoulder joint, the humeral head fits within a shallow socket in the scapula, commonly referred to as the glenoid. Articulation of the shoulder joint involves movement of the humeral head within the glenoid, with the configuration of the mating surfaces and the surrounding tissue providing a wide range of motion.
The complexity of the shoulder joint makes any traumatic injury particularly difficult to repair. Fractures can lead to bone fragment migration and other challenging problems that orthopedic surgeons must address, often with limited opportunity for preoperative planning.
Disclosure of Invention
The present inventors have recognized that the problem to be solved may include providing trauma surgeons with tools for practical pre- and intra-operative planning of joint reconstruction procedures. The background section briefly introduced some of the challenges presented by trauma to the shoulder joint, which is used as the exemplary joint in much of the discussion below; the shoulder joint provides an excellent vehicle for describing the capabilities of the systems and methods discussed. The systems and methods are described herein primarily in terms of anatomic shoulder arthroplasty; however, they are equally applicable to reverse shoulder arthroplasty. They are similarly applicable to other joints of the body, such as the wrist, knee, hip, or elbow joints. In particular, certain hip replacement systems are implanted using compatible methods and would benefit from implementation of the concepts discussed herein. In addition, the types of intraoperative plans discussed below are novel to joint reconstruction and/or replacement procedures.
The concepts discussed herein relate to the use of medical images, such as X-rays, fluoroscopy, Computed Tomography (CT), etc., in conjunction with a prosthesis system including various fiducial markers to enable pre-operative placement and intra-operative refinement. The systems and methods discussed below may also be used to assist in the placement of bone fragments intraoperatively.
In one example, the surgeon accesses a planning system using a handheld device (e.g., one available from Apple Computer of Cupertino, California). The surgeon opens the planning application, accesses the patient's file, and invokes the master planning interface. The planning interface loads an image of the target joint. For comparison purposes, the image may include an anterior-posterior (AP) view of the target joint and, optionally, of the contralateral joint. Comparison with a "normal" contralateral joint helps determine proper placement and sizing of the prosthesis, particularly when the target joint is severely damaged, and also aids bone fragment placement.
Once the patient images are loaded, the surgeon may begin the preoperative plan by selecting a first component of the prosthesis system and sizing the first component from the provided images. In some examples, the surgeon may skip the pre-operative planning step and proceed directly to implant the first component. In the example of shoulder reconstruction, the first component would be the distal stem of the modular shoulder prosthesis system. The systems and methods discussed herein use multi-component prostheses for reconstruction, but these methods are also applicable to single-component prostheses.
Once the first component is implanted, the surgeon may continue to use the planning interface intraoperatively by adjusting the virtual representation of the first component according to the fiducial markers on the implanted first component. The virtual first component includes fiducial markers corresponding to those on the implanted first component, providing the surgeon with visual cues as to the final position of the implanted first component relative to the anatomy.
Next, the surgeon may select and virtually position a second component of the modular prosthesis within the planning interface. Each additional component of the modular prosthesis has fiducial markers that are visible on the physical component and reproduced on the virtual representation in the planning interface to aid the surgeon in placement during implantation. Because the surgeon can update the virtual representation to match what has actually been implanted, the fiducial markers also allow the intraoperative plan to be refined through the planning interface as the surgery progresses.
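One way to register the virtual component to the implanted one is to solve for the 2D rotation and translation that map the component's modeled fiducial positions onto those detected in the medical image. The following sketch uses two markers and assumes a calibrated image (equal scale); the marker layout and function name are assumptions for illustration:

```python
import math

def pose_from_two_markers(model_a, model_b, image_a, image_b):
    """Recover (angle, tx, ty) mapping the modeled fiducial pair
    (model_a, model_b) onto the detected pair (image_a, image_b),
    assuming equal scale (i.e., a calibrated image)."""
    # Rotation: difference between the heading of the two marker pairs.
    ang_model = math.atan2(model_b[1] - model_a[1], model_b[0] - model_a[0])
    ang_image = math.atan2(image_b[1] - image_a[1], image_b[0] - image_a[0])
    angle = ang_image - ang_model
    # Translation: rotate model_a by the recovered angle, then
    # translate it onto the detected position image_a.
    c, s = math.cos(angle), math.sin(angle)
    rx = c * model_a[0] - s * model_a[1]
    ry = s * model_a[0] + c * model_a[1]
    return angle, image_a[0] - rx, image_a[1] - ry
```

With more than two markers, a least-squares fit over all marker correspondences would be more robust to detection noise.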
After implanting all of the components of the modular prosthesis system, the surgeon may use the planning interface to assist in bone fragment placement. The planning interface may display multiple views of the target joint and the contralateral joint to assist in this portion of the procedure.
This summary is intended to provide an overview of the subject matter of this document. This summary discusses the subject matter of the present invention in a general, non-limiting manner to provide an introduction to the more detailed description provided below with reference to the various figures included in this disclosure. It is not intended to provide an exclusive or exhaustive explanation of the invention. Detailed descriptions are included to provide further information about the present document.
Drawings
The figures are not necessarily to scale, and like reference numerals may depict similar parts throughout the different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate by way of example, and not by way of limitation, various embodiments discussed in the present document.
Fig. 1 is a block diagram illustrating an intraoperative planning system within a surgical environment, according to some example embodiments.
Fig. 2 is a block diagram illustrating an intraoperative planning system, according to some example embodiments.
Fig. 3 is a diagram illustrating a modular shoulder prosthesis, according to some example embodiments.
FIG. 4 is a flowchart illustrating interactions between a surgical procedure and an interactive planning program, according to some example embodiments.
Fig. 5A-5F are diagrams illustrating intraoperative planning interfaces, according to some example embodiments.
Fig. 6 is a flow diagram illustrating a planning procedure according to some example embodiments.
Fig. 7 is a block diagram illustrating an example of a software architecture that may be installed on a machine, according to some example embodiments.
Fig. 8 illustrates a diagrammatic representation of a machine in the form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed, according to an example embodiment.
Headings are provided herein for convenience only and do not necessarily affect the scope or meaning of the terms used.
Detailed Description
Joint reconstruction and/or replacement surgery, such as shoulder arthroplasty (total or reverse), is a complex procedure in which experience can produce significant differences in patient outcomes. While experience may provide insight into proper prosthesis selection and placement, the intraoperative planning systems and procedures discussed herein may provide even the most experienced surgeon with additional assistance in achieving the best results.
Fig. 1 is a block diagram illustrating an intraoperative planning system 120 within a surgical environment 100, according to some example embodiments. In this example, surgical environment 100 includes a patient 150, a surgeon 160, a set of modular implants 110, an imaging device 130, an intraoperative planning system 120, and optionally a medical image database 140. In this example, a surgeon 160 (or other medical personnel) uses the imaging device 130 to obtain a preoperative medical image of a joint of the patient 150 targeted for repair or replacement. The imaging device 130 may be an X-ray similar to medical imaging devices typically used for imaging during (or prior to) arthroplasty procedures. The medical images may be transferred to and retained by the medical image database 140 and then may be accessed by the intraoperative planning system 120. In another example, the medical image database 140 may be integrated within the intraoperative planning system 120, or the images may be sent directly to the intraoperative planning system 120 bypassing the medical image database 140.
Once the pre-operative imaging is complete, the surgeon 160 may perform the desired reconstruction or replacement procedure on the patient 150 using the modular implant 110 and the intraoperative planning system 120. The following discussion of fig. 2-6 describes example procedures relating to shoulder reconstruction or replacement using a modular shoulder prosthesis. The basic procedure and related concepts are applicable to the reconstruction or replacement of other joints, such as the wrist, elbow, hip, knee, or ankle.
Fig. 2 is a block diagram of an intraoperative planning system 200, according to some example embodiments. In this example, the intraoperative planning system 200 includes a computing device 120, a medical image database 140, an input device 230, a prosthesis database 240, and a display device 250. In some examples, the computing device 120 integrates some or all of these components, such as the medical image database 140, the prosthesis database 240, the input device 230, and the display device 250. For example, the intraoperative planning program and interfaces discussed herein may run as an application on a tablet or similar handheld device. In one example, external data, such as data from the medical image database 140 and/or the prosthesis database 240, may be accessed over a network, with the input device 230 and the display device 250 integrated into the host computing device 120. In another example, the host computing device 120 may also manage at least portions of the medical image database 140 and the prosthesis database 240.
In this example, computing device 120 includes an image module 222, an interactive interface module 224, and an input module 226. The image module 222 handles operations related to the medical images used in intraoperative planning of a procedure, such as accessing and displaying these images. The image module 222 may scale the images as needed to allow for the proper relationship between the virtual prosthesis and the medical image. The interactive interface module 224 handles operations that support the planning procedure, such as displaying and manipulating virtual prosthesis models within the planning interface. The input module 226 processes user inputs received by the computing device 120, such as receiving touch inputs that manipulate the position of a virtual representation of a portion of the modular prosthesis within the planning interface.
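The scaling performed by the image module 222 amounts to a pixels-per-millimeter conversion between the radiograph and the display, so that a virtual prosthesis defined in physical units overlays the anatomy at the correct size. A minimal sketch of the idea follows; the class and function names are illustrative assumptions, not part of the disclosed system.

```python
from dataclasses import dataclass


@dataclass
class MedicalImage:
    """A 2D radiograph with a known physical scale (illustrative)."""
    width_px: int
    height_px: int
    px_per_mm: float  # typically derived from a calibration marker in the image


def scale_factor(image: MedicalImage, display_px_per_mm: float) -> float:
    """Factor by which to resample the radiograph so a virtual prosthesis,
    defined in millimeters, overlays the anatomy at the correct size."""
    return display_px_per_mm / image.px_per_mm


def prosthesis_size_on_screen(length_mm: float, display_px_per_mm: float) -> float:
    """On-screen length, in pixels, of a prosthesis feature of known size."""
    return length_mm * display_px_per_mm


# Example: a 10 px/mm radiograph shown at 4 px/mm is downscaled by 0.4,
# and a 120 mm distal stem then spans 480 px on screen.
img = MedicalImage(width_px=2000, height_px=2500, px_per_mm=10.0)
print(scale_factor(img, 4.0))
print(prosthesis_size_on_screen(120.0, 4.0))
```

Keeping the prosthesis geometry in millimeters and converting only at display time is what lets the same virtual component be overlaid consistently on images captured at different resolutions.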
Fig. 3 is a diagram illustrating a modular anatomical shoulder prosthesis 300, according to an example embodiment. In this example, the shoulder prosthesis 300 includes components such as a distal stem 310, a proximal stem 330 (referred to as an adapter 330), and a modular head 350. The distal stem 310 has fiducial markers for both height 315 and rotation angle (version) 320. The rotation angle fiducial markers 320 are shown around the circumference of the proximal portion of the distal stem 310 and projected onto the top cylindrical distal surface of the distal stem 310. The proximal stem, or adapter, 330 also has fiducial markers for height 335 and rotation angle (rotation) 340 that can be used to intraoperatively position the proximal stem 330 at the proper height and rotation relative to the distal stem 310. The proximal stem 330 has features to connect to other modular components, including the modular humeral head 350.
Fig. 4 is a flow diagram illustrating a process 400, according to some example embodiments, the process 400 illustrating the interplay between a surgical procedure and an interactive planning program. In this example, process 400 includes operations such as: capturing a medical image at 405, implanting a distal rod at 410, placing a virtual distal rod relative to the image at 415, analyzing a contralateral (comparative) image at 420, selecting a (virtual) adapter at 425, positioning the virtual adapter at 430, implanting the adapter at 435, measuring the actual placement of the adapter at 440, updating the rotation angle and height in the plan to the actual position at 445, selecting a head at 450, implanting the head at 455, optionally planning bone fragment placement at 460, and optionally attaching bone fragments at 465.
In this example, the process 400 begins at 405 with medical personnel capturing x-ray or similar images of the joint targeted for reconstruction and the contralateral joint. While capturing images of the contralateral joint is optional, the planning interface provides additional benefits in visualization, implant selection, and positioning when contralateral images are included. Even without contralateral images, the planning interface may provide visualization, selection, and positioning benefits not readily available to surgeons in the operating room. The angle or view captured may depend on the details of the target joint. For example, in shoulder reconstruction, AP views of the target joint and the contralateral joint will be captured, and optionally an axial view and a medial-lateral (ML) view may be captured. The planning interface may be configured to utilize any standard angle for views used in a particular joint procedure.
At 410, the process 400 continues with the surgeon implanting the distal stem (the first component of the shoulder prosthesis). The surgeon implants the distal rod to a desired depth depending on the patient anatomy. The planning procedure is specifically designed to accommodate the conditions encountered during the surgical procedure. At 415, the process 400 continues within the intraoperative planning system 200 with the surgeon (or support medical personnel) selecting the distal rod and positioning the virtual representation of the distal rod relative to the medical images displayed within the planning interface. For example, the intraoperative planning system 200 can display AP views of the target joint and the contralateral joint and allow placement of the distal stem (the first component of the modular prosthesis system) with reference to these images. The planning interface, described further below with reference to fig. 5A-5F, also includes a graphical representation of the fiducials on the distal rod. Fiducial markers on the distal shaft allow for greater visualization and quantification of the position of the implant in the anatomy. The conversion of physical fiducial markers into a virtual planning interface allows the intraoperative plan to adjust to the actual reality of the surgical procedure.
At 420, the process 400 continues with the surgeon analyzing the contralateral image using the intraoperative planning system 200, which may include determining or checking the target height and rotation angle of the implant from the contralateral image. In this example, the intraoperative planning system 200 displays AP and/or axial views of the target joint and the contralateral joint side-by-side. The surgeon may experimentally position a portion of the prosthesis system using the planning interface to help determine a target height and rotation angle for implantation of the first portion of the prosthesis system.
At 425, the process 400 continues with the surgeon selecting an adapter (proximal shaft in the shoulder example) using the intraoperative planning system 200. At 425 and 430, the intraoperative planning system 200 allows the surgeon to virtually try different sizes and positions for the next portion of the modular prosthesis system, such as the proximal shaft. At operation 430 within the planning interface, the surgeon may position and rotate the virtual proximal rod and visualize it relative to the target joint and the contralateral joint, which still has a complete anatomical structure.
At 435, the process 400 continues with the surgeon implanting the proximal shaft according to the plan generated within the intraoperative planning system 200. In this example, the proximal shaft includes fiducial markers to help the surgeon replicate the rotation angle (rotation) and height (if adjustable) as planned within the intraoperative planning system 200. Once the proximal shaft is implanted, the process 400 continues at 440 with the surgeon measuring the actual final position of the proximal shaft. The measurements can be made using the fiducial markers on the proximal shaft (prosthetic component) to reference bony landmarks or the distal shaft. In some procedures, the surgeon may alter the planned implant location based on experience or conditions encountered within the surgical site. Therefore, measurements of the actual implant location are made and used to adjust the planning procedure. In the shoulder example, the height and rotation angle (rotation) of the proximal shaft can be determined after implantation and adjustment of the actual implant.
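The update at operations 440-445 can be thought of as replacing the planned placement values with the values the surgeon reads off the component's fiducial markers. A minimal sketch, assuming a simple height/version representation; the names and values are hypothetical, not from the disclosure:

```python
from dataclasses import dataclass


@dataclass
class ComponentPlan:
    """Planned placement of one modular component (illustrative)."""
    name: str
    height_mm: float    # seating height read from the height fiducials
    version_deg: float  # rotation (version) read from the angle fiducials


def update_to_actual(plan: ComponentPlan, measured_height_mm: float,
                     measured_version_deg: float) -> ComponentPlan:
    """Replace the planned placement with the placement actually achieved,
    as measured intraoperatively against the fiducial markers."""
    return ComponentPlan(plan.name, measured_height_mm, measured_version_deg)


proximal = ComponentPlan("proximal stem", height_mm=12.0, version_deg=30.0)
# The surgeon seats the stem 2 mm deeper and at 25 degrees of version:
actual = update_to_actual(proximal, 10.0, 25.0)
print(actual.height_mm, actual.version_deg)
```

Downstream component selection (the head, in the shoulder example) would then be planned against `actual` rather than the original plan, which is what keeps the virtual model in step with the patient.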
At 445, the process 400 continues with the surgeon or medical personnel updating the plan within the intraoperative planning system 200 to reflect the actual situation within the patient. In some examples, the process 400 continues at 450 with selection of a head, or third component, of the modular prosthesis system. As before, the intraoperative planning system 200 enables the surgeon to try different sizes and orientations to plan implantation of the head. The intraoperative planning system 200 may retrieve available prosthetic components from a database, such as the prosthesis database 240. Finally, the process 400 may end at 455 with the surgeon implanting the head according to the plan.
Optionally, at 460, process 400 may include planning bone fragment placement. In this alternative example, intraoperative planning system 200 includes an interface that enables a surgeon to identify, locate, collect, and reposition (e.g., position) bone fragments. In reconstructions caused by traumatic injury, it is not uncommon for an orthopaedic surgeon to need to locate and align bone fragments. Being able to perform this procedure in a virtual manner may help to speed up this part of the surgical procedure. Process 400 shows the bone fragment planning completed intraoperatively at the end of the entire process, which is generally preferred because the implant location is now known in process 400. However, bone fragment planning may occur at any time after the images are loaded into intraoperative planning system 200. At 465, the process 400 continues with the surgeon attaching bone fragments as planned.
Fig. 5A-5F are diagrams illustrating intraoperative planning interfaces, according to some example embodiments. These figures illustrate some example interfaces that implement at least some portions of the processes discussed in figs. 4 and 6. The interfaces shown may be presented on a tablet or similar handheld touch-screen device. Fig. 5A depicts an interface 500A with instructions for a surgical procedure, which in this example is to implant the distal stem.
Fig. 5B depicts a component selection and positioning interface 500B. The interface 500B shows a humerus 505, a distal stem 510, virtual fiducial markers 520, a longitudinal (or anatomical) axis 530 of the humerus, and a stem selection control 540. Virtual fiducial markers 520 correspond to physical fiducial markers on a distal shaft implanted in the patient. Interface 500B provides visualization between the virtual distal shaft and the implanted distal shaft through graphical elements such as virtual fiducial markers 520 and longitudinal axis 530.
Fig. 5C depicts another planning interface 500C, which in this example includes a humerus 505, a distal rod 510, a proximal rod 512, a head 514, virtual fiducial markers 520 on the distal rod, virtual fiducial markers 525 on the proximal rod, a longitudinal (or anatomical) axis 530 of the humerus, and a head selection control 545. In this example, a second set of virtual fiducial markers 525 is associated with the proximal shaft 512. The head selection control 545 enables selection of different sized heads to evaluate the best fit. Fig. 5D illustrates a planning interface 500D, which in this example depicts a planning interface that enables a surgeon to adjust the rotation angle of the proximal shaft. In this example, the planning interface 500D includes an axial image of the humerus 505, which is overlaid with a proximal stem 512, a rotation angle indicator 555, a medial-lateral plane indicator 560, and a rotation angle selection control 550. In some examples, the rotation angle selection control 550 may provide finer control, limited by the increments allowed by the physical prosthetic component. While this interface is shown here for an anatomical reconstruction of the shoulder, it is also applicable to a reverse shoulder reconstruction.
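Constraining the rotation angle selection control 550 to the discrete settings the physical component supports can be sketched as a clamp-and-round step. The 5-degree increment and the ±30-degree range below are assumed values for illustration only, not taken from the disclosure:

```python
def snap_version(requested_deg: float, increment_deg: float = 5.0,
                 lo: float = -30.0, hi: float = 30.0) -> float:
    """Clamp and round a requested rotation angle to the discrete settings
    the physical prosthetic component supports (increment/range assumed)."""
    snapped = round(requested_deg / increment_deg) * increment_deg
    return max(lo, min(hi, snapped))


print(snap_version(22.4))  # rounds down to the nearest 5-degree setting
print(snap_version(23.0))  # rounds up
print(snap_version(48.0))  # clamped to the assumed upper limit
```

Snapping in the interface, rather than letting the surgeon pick arbitrary angles, keeps the on-screen plan reproducible on the physical fiducial markers during implantation.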
FIG. 5E depicts an interface 500E with instructions for a surgical procedure; in this example, the surgical procedure is to implant the proximal shaft and adjust the rotation angle. Fig. 5F depicts an interface 500F, which in this example is a bone fragment placement planning interface. Interface 500F includes a distal rod 510, a proximal rod 512, a fragment identification control 570, and a fragment identification tool 575. The interface 500F overlays information on four different images: the AP and ML images of the target joint and the AP and ML images of the contralateral joint. The fragments are manipulated within the target joint images, while the contralateral joint is shown to assist the surgeon in placement. The fragment identification tool 575 may be used for fragment identification, which in this example includes drawing a contour line around a bone fragment in the image. The intraoperative planning system 200 allows the surgeon to move and rotate bone fragments to position them at a desired location around the implant.
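Moving and rotating an outlined bone fragment, as interface 500F allows, reduces to applying a rigid 2D transform to the fragment's contour points. A minimal sketch under that assumption (the function name and coordinates are illustrative):

```python
import math


def rotate_translate(points, angle_deg, dx, dy):
    """Apply a rigid 2D transform (rotation about the origin, then
    translation) to a bone fragment outlined as (x, y) image coordinates."""
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    return [(x * cos_a - y * sin_a + dx, x * sin_a + y * cos_a + dy)
            for x, y in points]


# A fragment outlined by the surgeon as a triangle, rotated 90 degrees
# and moved 5 px to the right to sit against the implanted stem:
outline = [(0.0, 0.0), (2.0, 0.0), (0.0, 1.0)]
moved = rotate_translate(outline, 90.0, 5.0, 0.0)
print([(round(x, 6), round(y, 6)) for x, y in moved])
```

A rigid transform preserves the fragment's shape and size, which is the physically meaningful constraint when repositioning a virtual bone fragment on a calibrated radiograph.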
Fig. 6 is a flow diagram of a planning procedure 600 according to some example embodiments. In this example, the planning procedure 600 includes operations such as: accessing a medical image at 605, receiving a selection of a first component at 610, displaying the first component overlaid on the medical image at 615, updating a position of the first component based on an implant position at 620, presenting a selection interface at 625, optionally updating a visualization at 630, optionally displaying an adjustment interface at 635, optionally determining whether other components are to be planned at 640, displaying a plan for implanting other components at 645, optionally displaying a bone fragment interface at 650, optionally receiving an input identifying bone fragments at 655, and optionally receiving an input locating bone fragments at 660.
At 605, the planning procedure 600 begins with the intraoperative planning system 200 accessing medical image(s) associated with a target joint being repaired or reconstructed. The intraoperative planning system 200 may have medical images stored locally or may access them over a network from a medical image database (e.g., the medical image database 140). At 610, the planning procedure 600 continues with the intraoperative planning system receiving a selection of a first component via a user interface, such as those discussed above with reference to figs. 5A-5F. The selection of the first component corresponds to the first component of the modular prosthesis system used in joint repair or reconstruction. In some examples, the selection of the first component corresponds to a first component that has already been implanted in the patient. In other examples, the surgeon uses the intraoperative planning system 200 to confirm the size and ideal implant location. For example, in the shoulder example discussed above, the surgeon may use the intraoperative planning system 200 to visualize the distal stem size and seating depth within the patient's humerus prior to performing the implantation procedure.
At 615, the planning procedure 600 continues with the intraoperative planning system 200 displaying the first component overlaid on the medical image within the planning interface. As described above, the surgeon may manipulate the position of the first component relative to the medical image, which is scaled to properly display the selected first component relative to the patient's anatomy. At 620, the planning procedure 600 continues with the intraoperative planning system 200 updating the position of the virtual first component based on the input representing the actual position of the implanted first component. Enabling the surgeon to adjust the planning procedure 600 based on the intra-operative results allows for greater accuracy in placement and visualization of other components of the modular prosthesis system.
At 625, the planning procedure 600 continues with the intraoperative planning system 200 presenting a selection interface to visualize selection and positioning of other components of the modular prosthesis system, such as the proximal stem in the modular shoulder prosthesis. At 630, the planning procedure 600 optionally continues with the intraoperative planning system 200 updating the planning interface visualization with the selected additional components. At 635, the planning procedure 600 optionally continues with the intraoperative planning system 200 displaying an adjustment interface for the additional components, allowing for positional shift or rotational adjustment. At 640, if the modular prosthesis system includes additional components for size and position planning, the planning procedure 600 loops back to add the additional components. For example, in the shoulder reconstruction discussed, the head of the prosthesis system still needs to be selected and positioned.
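The loop over operations 625-640 can be sketched as a select/position/confirm cycle repeated once per remaining component. The callback structure below is an illustrative assumption about how such a loop might be organized, not the disclosed implementation:

```python
def plan_components(components, select, position, confirm_actual):
    """Run the select/position/update cycle of planning procedure 600
    (operations 625-640) once for each remaining prosthesis component."""
    plan = []
    for name in components:
        choice = select(name)                  # 625: selection interface
        placement = position(choice)           # 630/635: visualize, adjust
        plan.append(confirm_actual(choice, placement))  # update to actual
    return plan


# Stub callbacks standing in for the interactive planning interface:
def select(name):
    return {"component": name}


def position(choice):
    return {**choice, "version_deg": 30.0}   # surgeon's planned angle


def confirm_actual(choice, placement):
    # In surgery, the measured placement may differ from the plan:
    return {**placement, "version_deg": 25.0}


plan = plan_components(["proximal stem", "head"], select, position, confirm_actual)
print([p["component"] for p in plan])
```

The point of the `confirm_actual` step is the same as operation 445 in fig. 4: each later component is planned against measured reality, not the original intent.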
At 645, after all components are planned, the planning procedure 600 may continue with the intraoperative planning system 200 displaying an implantation plan for implanting additional components. In some examples, the intraoperative planning system 200 displays the implantation plan after adding each additional component to facilitate repeated implantation and planning of each additional component.
Optionally, at 650, the planning procedure 600 continues with the intraoperative planning system displaying a bone fragment planning interface capable of identifying and locating bone fragments. At 655, the planning procedure 600 continues with the intraoperative planning system 200 receiving input identifying bone fragments. The input identifying the bone fragments may comprise touch input on individual bone fragments or drawing a contour line around a set of bone fragments. Once the bone fragments are identified, the planning procedure 600 may continue at 660 with the intraoperative planning system 200 receiving input positioning the bone fragments within the interface, which may display a plurality of medical images from different angles to aid in positioning. The surgeon may then use the visualization provided by the intraoperative planning system 200 to implement a bone fragment repair plan.
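Resolving a touch input at 655 to a previously outlined fragment can be done with a standard ray-casting point-in-polygon test. This sketch is illustrative, not part of the disclosed system:

```python
def point_in_polygon(px, py, polygon):
    """Ray-casting test: does a touch point (px, py) fall inside a bone
    fragment outlined as a closed polygon of (x, y) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges that a horizontal ray from the point would cross:
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside


fragment = [(0, 0), (4, 0), (4, 3), (0, 3)]  # rectangular outline
print(point_in_polygon(2, 1, fragment))  # touch lands inside the outline
print(point_in_polygon(5, 1, fragment))  # touch lands outside
```

With several outlined fragments on screen, the interface would run this test against each outline to decide which fragment the surgeon's touch selects.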
Software architecture
Fig. 7 is a block diagram 700 illustrating a representative software architecture 702 that may be used in connection with the various hardware architectures described herein. FIG. 7 is only a non-limiting example of a software architecture, and it should be understood that many other architectures can be implemented to facilitate the functionality described herein. The software architecture 702 may be executed on hardware, such as the machine 800 of fig. 8, where the machine 800 includes a processor 810, a memory 830, and I/O components 850. A representative hardware layer 704 is shown and may represent, for example, the machine 800 of fig. 8. The representative hardware layer 704 includes one or more processing units 706 having associated executable instructions 708. The executable instructions 708 represent the executable instructions of the software architecture 702, including implementations of the methods, modules, and the like of figs. 4-6. The hardware layer 704 also includes memory and/or storage modules 710, which also have the executable instructions 708. The hardware layer 704 may also include other hardware 712, representing any other hardware of the hardware layer 704, such as the other hardware illustrated as part of the machine 800.
In the example architecture of fig. 7, the software 702 may be conceptualized as a stack of layers, where each layer provides specific functionality. For example, the software 702 may include layers such as an operating system 714, libraries 716, frameworks/middleware 718, applications 720, and a presentation layer 722. In operation, the applications 720 and/or other components in these layers may invoke Application Programming Interface (API) calls 724 through the software stack and receive responses, return values, and so forth, illustrated as messages 726, in response to the API calls 724. The layers shown are representative in nature and not all software architectures have all layers. For example, some mobile or special-purpose operating systems may not provide the framework/middleware layer 718, while other operating systems may provide such a layer. Other software architectures may include additional or different layers.
The operating system 714 may manage hardware resources and provide common services. The operating system 714 may include, for example, a kernel 728, services 730, and drivers 732. The kernel 728 may act as an abstraction layer between the hardware and the other software layers. For example, the kernel 728 may be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and the like. The services 730 may provide other common services for the other software layers. The drivers 732 may be responsible for controlling or interfacing with the underlying hardware. The drivers 732 may include, for example, display drivers, camera drivers, wireless communication drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), audio drivers, power management drivers, and so forth, depending on the hardware configuration.
The libraries 716 may provide a common infrastructure that may be used by the applications 720 and/or other components and/or layers. The libraries 716 generally provide functionality that allows other software modules to perform tasks more easily than by interfacing directly with the underlying operating system 714 functionality (e.g., kernel 728, services 730, and/or drivers 732). The libraries 716 may include system libraries 734 (e.g., a C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematical functions, and the like. In addition, the libraries 716 may include API libraries 736 such as media libraries (e.g., libraries supporting presentation and manipulation of various media formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG), graphics libraries (e.g., an OpenGL framework that may be used to render 2D and 3D graphical content on a display), database libraries (e.g., SQLite, which may provide various relational database functions), web libraries (e.g., WebKit, which may provide web browsing functionality), and the like. The libraries 716 may also include a wide variety of other libraries 738 to provide many other APIs to the applications 720 and other software components/modules.
The frameworks 718 (also sometimes referred to as middleware) may provide a higher-level common infrastructure that may be used by the applications 720 and/or other software components/modules. For example, the frameworks 718 may provide various graphical user interface (GUI) functions, high-level resource management, high-level location services, and so forth. The frameworks 718 may provide a broad spectrum of other APIs that may be used by the applications 720 and/or other software components/modules, some of which may be specific to a particular operating system or platform.
The applications 720 include built-in applications 740, third-party applications 742, and an intraoperative planning application 744. Examples of representative built-in applications 740 may include, but are not limited to, a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, and/or a gaming application. Third-party applications 742 may include any of a wide assortment of applications beyond the built-in applications. In a specific example, a third-party application 742 (e.g., an application developed using a software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as iOS™, Android™, or another mobile operating system. In this example, the third-party application 742 may invoke the API calls 724 provided by a mobile operating system, such as the operating system 714, to facilitate the functionality described herein. The intraoperative planning application 744 may include programming logic implementing the methods and user interfaces discussed above, providing the intraoperative planning capabilities discussed herein. The intraoperative planning application 744 serves to improve the operation of a computing device used by a surgeon or related medical personnel within an orthopedic surgical environment. In this example, without the intraoperative planning application 744, the computing device would not be able to perform any of the functions discussed with reference to figs. 4-6.
The applications 720 may utilize built-in operating system functions (e.g., kernel 728, services 730, and/or drivers 732), libraries (e.g., system libraries 734, API libraries 736, and other libraries 738), and frameworks/middleware 718 to create user interfaces to interact with users of the system. Alternatively or additionally, in some systems, interactions with a user may occur through a presentation layer (e.g., presentation layer 722). In these systems, the application/module "logic" may be separated from the aspects of the application/module that interact with the user.
Some software architectures utilize virtual machines. In the example of fig. 7, this is illustrated by the virtual machine 748. A virtual machine creates a software environment where applications/modules can execute as if they were executing on a hardware machine (e.g., the machine of fig. 8). A virtual machine is hosted by a host operating system (operating system 714 in fig. 7) and typically, although not always, has a virtual machine monitor 746 that manages the operation of the virtual machine and the interface with the host operating system (i.e., operating system 714). A software architecture executes within the virtual machine 748, and may include an operating system 750, libraries 752, frameworks/middleware 754, applications 756, and/or a presentation layer 758. These layers of the software architecture executing within the virtual machine 748 can be the same as the corresponding layers previously described or may be different.
Example machine Structure and machine-readable Medium
Fig. 8 is a block diagram illustrating components of a machine 800, the machine 800 capable of reading instructions from a machine-readable medium (e.g., a machine-readable storage medium) and performing any one or more of the methodologies discussed herein, according to some example embodiments. In particular, fig. 8 illustrates a diagrammatic representation of the machine 800 in the example form of a computer system within which instructions 816 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 800 to perform any one or more of the methodologies discussed herein may be executed. For example, the instructions may cause the machine to perform the flow diagrams of figs. 4 and 6. Additionally or alternatively, the instructions may implement the modules 222-226 of fig. 2, and so on. Further, the instructions may generate the planning interfaces shown in figs. 5A-5F. The instructions transform the general, non-programmed machine into a particular machine programmed to carry out the described and illustrated functions in the manner described. In alternative embodiments, the machine 800 operates as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment, the machine 800 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 800 may include, but is not limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), or any machine capable of executing the instructions 816, sequentially or otherwise, that specify actions to be taken by the machine 800.
Further, while only a single machine 800 is illustrated, the term "machine" shall also be taken to include a collection of machines 800 that individually or jointly execute the instructions 816 to perform any one or more of the methodologies discussed herein.
The machine 800 may include processors 810, memory 830, and I/O components 850, which may be configured to communicate with each other, for example, via a bus 802. In an example embodiment, the processors 810 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, a processor 812 and a processor 814 that may execute the instructions 816. The term "processor" is intended to include multi-core processors, which may include two or more independent processors (sometimes referred to as "cores") capable of executing instructions simultaneously. Although fig. 8 shows multiple processors, the machine 800 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.
The memory/storage 830 may include a memory 832, such as a main memory or other memory storage, and a storage unit 836, both accessible to the processors 810, such as via the bus 802. The storage unit 836 and the memory 832 store the instructions 816 embodying any one or more of the methodologies or functions described herein. The instructions 816 may also reside, completely or partially, within the memory 832, within the storage unit 836, within at least one of the processors 810 (e.g., within a processor's cache memory), or within any suitable combination thereof during execution by the machine 800. Accordingly, the memory 832, the storage unit 836, and the memory of the processors 810 are examples of machine-readable media.
As used herein, a "machine-readable medium" means a device able to store instructions and data temporarily or permanently, and may include, but is not limited to, Random Access Memory (RAM), Read Only Memory (ROM), cache memory, flash memory, optical media, magnetic media, other types of storage devices (e.g., electrically erasable programmable read-only memory (EEPROM)), and/or any suitable combination thereof. The term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store the instructions 816. The term "machine-readable medium" shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 816) for execution by a machine (e.g., machine 800), such that the instructions, when executed by one or more processors of the machine 800 (e.g., the processors 810), cause the machine 800 to perform any one or more of the methodologies described herein. Accordingly, "machine-readable medium" refers to a single storage device or apparatus, as well as "cloud-based" storage systems or storage networks that include multiple storage devices or apparatus. The term "machine-readable medium" excludes signals per se.
The I/O components 850 may include a wide variety of components to receive input, provide output, generate output, send information, exchange information, capture measurements, and the like. The particular I/O components 850 included in a particular machine will depend on the type of machine. For example, a portable machine such as a mobile phone may include a touch input device or other such input mechanism, while a headless server machine may not include such a touch input device. It will be appreciated that the I/O components 850 may include many other components not shown in FIG. 8. The I/O components 850 are grouped by function only to simplify the following discussion, and the grouping is in no way limiting. In various example embodiments, the I/O components 850 may include output components 852 and input components 854. The output components 852 may include visual components (e.g., a display such as a Plasma Display Panel (PDP), a Light Emitting Diode (LED) display, a Liquid Crystal Display (LCD), a projector, or a Cathode Ray Tube (CRT)), acoustic components (e.g., a speaker), haptic components (e.g., a vibration motor, a resistance mechanism), other signal generators, and so forth. The input components 854 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, an optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., physical buttons, a touch screen that provides the location and/or force of a touch or touch gesture, or other tactile input components), audio input components (e.g., a microphone), and so forth.
In further example embodiments, the I/O components 850 may include biometric components 856, motion components 858, environmental components 860, or position components 862, among a wide array of other components. For example, the biometric components 856 may include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram-based identification), and the like. The motion components 858 may include acceleration sensor components (e.g., an accelerometer), gravitation sensor components, rotation sensor components (e.g., a gyroscope), and so forth. The environmental components 860 may include, for example, illumination sensor components (e.g., a photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., a barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to the surrounding physical environment. The position components 862 may include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., an altimeter or barometer that detects air pressure from which altitude may be derived), orientation sensor components (e.g., a magnetometer), and so forth.
Communication may be implemented using a wide variety of technologies. The I/O components 850 may include communication components 864 operable to couple the machine 800 to a network 880 or a device 870 via a coupling 882 and a coupling 872, respectively. For example, the communication components 864 may include a network interface component or another suitable device to interface with the network 880. In further examples, the communication components 864 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components that provide communication via other modalities. The device 870 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).
Moreover, the communication components 864 may detect identifiers or include components operable to detect identifiers. For example, the communication components 864 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar codes, multi-dimensional bar codes such as Quick Response (QR) codes, Aztec codes, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, and UCC RSS-2D bar codes, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived via the communication components 864, such as location via Internet Protocol (IP) geolocation, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.
Transmission medium
In various example embodiments, one or more portions of the network 880 may be an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a Wireless Wide Area Network (WWAN), a Metropolitan Area Network (MAN), the Internet, a portion of the Public Switched Telephone Network (PSTN), a Plain Old Telephone Service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example, the network 880 or a portion of the network 880 may include a wireless or cellular network, and the coupling 882 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling. In this example, the coupling 882 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1xRTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), the Long Term Evolution (LTE) standard, other standards defined by various standard-setting organizations, other long-range protocols, or other data transfer technology.
The instructions 816 may be transmitted or received over the network 880 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 864) and utilizing any of a number of well-known transfer protocols (e.g., the HyperText Transfer Protocol (HTTP)). Similarly, the instructions 816 may be transmitted or received using a transmission medium via the coupling 872 (e.g., a peer-to-peer coupling) to the device 870. The term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 816 for execution by the machine 800, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
Language
Throughout the specification, multiple instances may implement a component, an operation, or a structure described as a single instance. While the individual operations of one or more methods are illustrated and described as separate operations, one or more of the separate operations may be performed concurrently and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in the example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements may fall within the scope of the subject matter herein.
Although the summary of the present subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to the embodiments without departing from the broader scope of the embodiments of the disclosure. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or inventive concept if more than one is in fact disclosed.
The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the disclosed teachings. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The detailed description is, therefore, not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled. As used herein, the term "or" may be interpreted in an inclusive or exclusive sense. Furthermore, multiple instances may be provided for a resource, operation, or structure described herein as a single instance. In addition, the boundaries between the various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are contemplated and may fall within the scope of various embodiments of the disclosure. In general, structures and functionality presented as separate resources in example configurations may be implemented as a combined structure or resource. Similarly, the structure and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements may fall within the scope of the embodiments of the disclosure as represented by the claims that follow. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Various notes & examples
Each of the following non-limiting examples may exist independently, or may be combined in various permutations or with one or more of the other examples.
The foregoing detailed description includes references to the accompanying drawings, which form a part of the detailed description. The drawings show, by way of illustration, specific embodiments in which the invention can be practiced. These embodiments are also referred to herein as "examples." Such examples can include elements in addition to those shown or described. However, the inventors also contemplate examples in which only those elements shown or described are provided. Moreover, the inventors also contemplate examples using any combination or permutation of those elements (or one or more aspects thereof) shown or described with respect to a particular example (or one or more aspects thereof), or with respect to other examples (or one or more aspects thereof) shown or described herein.
If usage between this document and any document incorporated by reference is inconsistent, then usage in this document controls.
In this document, as is common in patent documents, terms in the singular include one or more than one, independent of any other instances or usages of "at least one" or "one or more." In this document, the term "or" is used to refer to a nonexclusive or, such that "A or B" includes "A but not B," "B but not A," and "A and B," unless otherwise indicated. As used herein, the terms "including" and "in which" are used as the plain-English equivalents of the respective terms "comprising" and "wherein." Furthermore, in the following claims, the terms "comprises" and "comprising" are open-ended; that is, a system, apparatus, article, composition, formulation, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim. Furthermore, in the following claims, the terms "first," "second," and "third," etc. are used merely as labels, and are not intended to impose numerical requirements on their objects.
The method examples described herein may be machine or computer-implemented at least in part. Some examples may include a computer-readable or machine-readable medium encoded with instructions operable to configure an electronic device to perform a method as described in the above examples. Implementations of such methods may include code, such as microcode, assembly language code, higher level language code, and the like. Such code may include computer readable instructions for performing various methods. The code may form part of a computer program product. Further, in an example, the code can be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, e.g., during execution or at other times. Examples of such tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic tape, memory cards or sticks, Random Access Memories (RAMs), Read Only Memories (ROMs), and the like.
The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other. Other embodiments can be used, such as by one of ordinary skill in the art upon reviewing the above description. The Abstract is provided to comply with 37 C.F.R. § 1.72(b), to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Also, in the foregoing detailed description, various features may be grouped together to streamline the disclosure. This should not be interpreted as intending that an unclaimed disclosed feature is essential to any claim. Rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that such embodiments can be combined with each other in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
Examples:
Example 1 describes an intraoperative surgical planning technique operable on a computing device available in an operating room. In this example, the technique may be performed by accessing, on a computing device operating an intraoperative surgical planning interface, a first medical image providing a first view of a joint within a surgical site. The technique may continue by receiving a selection of a first component of a modular prosthetic device within the intraoperative surgical planning interface, the first component implanted in a first bone of the joint. The technique further includes an operation for displaying, within the intraoperative surgical planning interface, a graphical representation of the first component of the modular prosthetic device overlaid on the first medical image. The technique further includes updating the graphical representation of the first component based on receiving a positioning input within the intraoperative surgical planning interface, the positioning input representing an implant position of the first component relative to a landmark on the first bone visible in the first medical image. In this example, the technique may conclude by presenting a selection interface within the intraoperative surgical planning interface that enables visualization of other components of the modular prosthetic device virtually connected to the first component and overlaid on the first medical image.
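The interface flow of example 1 — access an image, select a component, overlay it, apply a positioning input, then offer compatible parts — can be sketched in a few lines of Python. All class and method names, the two-entry component catalog, and the 2-D overlay coordinates are illustrative assumptions, not elements of the disclosed system.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    position: tuple = (0.0, 0.0)  # overlay position relative to a bone landmark

@dataclass
class PlanningInterface:
    """Minimal model of the planning flow described in example 1."""
    image: str = ""
    overlays: list = field(default_factory=list)

    def access_image(self, image_id: str) -> None:
        # step 1: load the first medical image of the joint
        self.image = image_id

    def select_component(self, name: str) -> Component:
        # steps 2-3: choose a modular component and overlay its graphic
        comp = Component(name)
        self.overlays.append(comp)
        return comp

    def apply_positioning_input(self, comp: Component, dx: float, dy: float) -> None:
        # step 4: move the overlay to the implant position relative to a landmark
        x, y = comp.position
        comp.position = (x + dx, y + dy)

    def compatible_components(self, comp: Component) -> list:
        # step 5: the selection interface lists parts that can couple to `comp`;
        # this catalog is invented for illustration
        catalog = {"distal stem": ["proximal stem"], "proximal stem": ["head", "tray"]}
        return catalog.get(comp.name, [])
```

A session might then read: access the image, select the distal stem, nudge it to the landmark, and query which parts may attach next.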
In example 2, the subject matter of example 1 can optionally include the first component including a plurality of fiducial markers indicating position, and the technique can further include receiving a positioning input including an indication of a relationship between at least one fiducial marker of the plurality of fiducial markers on the first component and the first bone.
In example 3, the subject matter of any of examples 1 and 2 can optionally include displaying the graphical representation of the first component by: displaying fiducial markers on the first component corresponding to fiducial markers on the implanted first component within the intraoperative surgical planning interface to help associate the intraoperative surgical planning interface with the surgical site.
In example 4, the subject matter of any of examples 1 to 3 can optionally include, after receiving a selection of a second component via the selection interface, updating the graphical representation to include the first component coupled to the second component of the modular prosthetic device.
In example 5, the subject matter of example 4 can optionally include presenting an adjustment interface within the intraoperative surgical planning interface, the adjustment interface enabling adjustment of the second component relative to the first component, wherein adjustments available within the intraoperative surgical planning interface are constrained by available physical adjustments between the first and second components of the modular prosthetic device.
In example 6, the subject matter of example 5 can optionally include the available physical adjustments including a height adjustment along a longitudinal axis and a rotational adjustment relative to the longitudinal axis.
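The constraint in examples 5 and 6 — on-screen adjustments limited to the physically available height range and rotation increments of the modular parts — can be modeled as a clamp-and-snap function. The 0-15 mm height range and 10° rotation step below are placeholder values chosen for illustration, not dimensions from this disclosure.

```python
def constrain_adjustment(height_mm, rotation_deg,
                         height_range=(0.0, 15.0), rotation_step=10.0):
    """Limit a requested adjustment to the physically available range.

    `height_range` and `rotation_step` are hypothetical: height telescopes
    along the longitudinal axis within a fixed range, and rotation about
    that axis snaps to discrete increments.
    """
    lo, hi = height_range
    height = min(max(height_mm, lo), hi)                             # clamp height
    rotation = round(rotation_deg / rotation_step) * rotation_step   # snap rotation
    return height, rotation % 360.0
```

So a request for 20 mm of height and 47° of rotation would be limited to 15 mm and snapped to 50°.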
In example 7, the subject matter of any of examples 5 and 6 can optionally include presenting the adjustment interface by: fiducial markers on the first and second components corresponding to the physical fiducial markers on the first and second components are displayed to enable adjustments performed within the adjustment interface to be translated to the modular prosthetic device implanted within the joint.
In example 8, the subject matter of any of examples 5 to 7 can optionally include accessing the first medical image together with accessing a second medical image that provides a second view of a contralateral joint, wherein presenting the adjustment includes presenting a second graphical representation of the modular prosthetic device overlaid on the second medical image.
In example 9, the subject matter of any of examples 1 to 8 can optionally include accessing the first medical image together with accessing a second medical image that provides a second view of a contralateral joint, wherein presenting the selection interface includes presenting a second graphical representation of the modular prosthetic device overlaid on the second medical image.
In example 10, the subject matter of any of examples 1 to 9 can optionally include that the technique further comprises presenting a bone fragment positioning interface within the intraoperative surgical planning interface such that identification and positioning of bone fragments within the first medical image relative to the modular prosthetic device is enabled.
In example 11, the subject matter of example 10 can optionally include presenting the bone fragment positioning interface by receiving, via the bone fragment positioning interface, a bone fragment identification input that identifies a first bone fragment.
In example 12, the subject matter of example 11 can optionally include presenting a bone fragment positioning interface by receiving a positioning input via the bone fragment positioning interface to position the first bone fragment relative to the modular prosthetic device.
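The fragment workflow of examples 10-12 — identify a bone fragment, then apply a positioning input relative to the prosthesis — can be sketched as follows. The class name and 2-D image coordinates are illustrative assumptions only.

```python
class FragmentInterface:
    """Sketch of the bone fragment workflow in examples 10-12."""

    def __init__(self):
        self.fragments = {}

    def identify(self, fragment_id, centroid):
        # example 11: a bone fragment identification input registers a fragment
        # at a centroid in (hypothetical) 2-D image coordinates
        self.fragments[fragment_id] = centroid

    def position(self, fragment_id, dx, dy):
        # example 12: a positioning input moves the fragment relative to the
        # virtual representation of the modular prosthetic device
        x, y = self.fragments[fragment_id]
        self.fragments[fragment_id] = (x + dx, y + dy)
        return self.fragments[fragment_id]
```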
Example 13 describes a computer-assisted shoulder reconstruction method usable in an operating room during a reconstruction procedure. The method can include implanting a distal stem of a modular shoulder prosthesis within a humerus to a depth providing good fixation, the distal stem including fiducial markers along at least a portion of a longitudinal length. In this example, the method may include accessing, from a computing device running a surgical planning application, a plurality of medical images of a shoulder joint involved in the shoulder reconstruction, the plurality of medical images including a first medical image depicting a first view including the shoulder joint and a second medical image depicting a second view including a contralateral shoulder joint. The method may further include selecting, in the surgical planning application, a virtual distal stem corresponding to the distal stem implanted in the humerus. The method may continue by adjusting, in the surgical planning application, the position of the virtual distal stem with reference to a representation of the humerus in the first and second medical images using a comparison between the fiducial markers on the distal stem and corresponding virtual fiducial markers on the virtual distal stem. The method may further include selecting a proximal stem and additional modular components based on an interactive visualization provided within the surgical planning application, the interactive visualization presented with reference to the virtual distal stem in the first and second medical images. In this example, the method may conclude by implanting the selected proximal stem and additional modular components based on the interactive visualization provided by the surgical planning application.
In another example, the method may conclude by outputting an implantation plan for implanting the proximal stem and additional modular components, selected based on the interactive visualization provided by the surgical planning application, for use by a surgeon.
In example 14, the subject matter of example 13 can optionally include selecting the proximal stem by manipulating, in the interactive visualization provided by the surgical planning application, a position of a virtual proximal stem with reference to the virtual distal stem visible within the first medical image and the second medical image.
In example 15, the subject matter of example 14 can optionally include manipulating the position of the virtual proximal stem relative to the virtual distal stem being constrained by physical limitations of coupling the proximal stem with the distal stem.
In example 16, the subject matter of example 15 can optionally include manipulating a position of the virtual proximal stem by adjusting a telescoping height along a longitudinal axis of the modular shoulder prosthesis to adjust a head height.
In example 17, the subject matter of any of examples 14 to 16 can optionally include manipulating a position of the virtual proximal stem relative to the virtual distal stem by adjusting rotation of the virtual proximal stem within a second interactive visualization interface comprising a third medical image depicting a first axial view of a humerus and a fourth medical image depicting a second axial view of a contralateral humerus.
In example 18, the subject matter of example 17 can optionally include adjusting the rotation of the virtual proximal stem by rotating it in predetermined degree increments relative to the distal stem to set a rotation angle of the modular shoulder prosthesis.
In example 19, the subject matter of any of examples 17 and 18 can optionally include implanting the proximal stem by rotating the proximal stem relative to the distal stem to match the rotation of the virtual proximal stem within the second interactive visualization interface.
In example 20, the subject matter of any of examples 13 to 19 can optionally include implanting the proximal stem and a head by matching fiducial markers on the virtual proximal stem and a virtual head within the interactive visualization with fiducial markers on the proximal stem and the head of the modular shoulder prosthesis.
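The marker matching in example 20 can be illustrated by pairing each virtual fiducial marker with the nearest physical marker. The one-dimensional millimeter positions and the tolerance are simplifying assumptions made for illustration, not a method prescribed by this disclosure.

```python
def match_fiducials(virtual_marks, physical_marks, tolerance=1.0):
    """Pair each virtual fiducial mark with the nearest physical mark.

    Positions are modeled as distances (mm) along the stem's longitudinal
    axis; marks farther apart than `tolerance` are left unmatched.
    """
    pairs = []
    for v in virtual_marks:
        nearest = min(physical_marks, key=lambda p: abs(p - v))
        if abs(nearest - v) <= tolerance:
            pairs.append((v, nearest))
    return pairs
```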
In example 21, the subject matter of any of examples 13 to 20 can optionally include utilizing a fragment interface within the surgical planning application to identify bone fragments to be reattached during the shoulder reconstruction.
In example 22, the subject matter of example 21 can optionally include identifying the bone fragment further comprising manipulating a position or rotation of the bone fragment relative to the virtual representation of the modular shoulder prosthesis.
In example 23, the subject matter of example 22 can optionally include manipulating a position or rotation of a bone fragment by visualizing the bone fragment in the first and second medical images.
In example 24, the subject matter of example 23 can optionally include visualizing the bone fragment by: the position and orientation of the bone fragments are depicted in a third medical image providing a third view of the shoulder joint including the modular prosthesis overlaid thereon and a fourth medical image providing a fourth view of the contralateral shoulder joint including the modular prosthesis overlaid thereon.
In example 25, the subject matter of any of examples 22 to 24 can optionally include attaching the bone fragment based on a visualization of a final placement of the bone fragment within the fragment interface provided in the surgical planning application.
Example 26 describes a modular shoulder prosthesis used in conjunction with any of examples 1-25. The modular shoulder prosthesis includes a distal stem including a fiducial marker along at least a portion of a longitudinal length. The prosthesis also includes a proximal stem detachably coupleable with the distal stem, the proximal stem including a height-indicating fiducial marker along at least a portion of a longitudinal length and a rotation-indicating fiducial marker around a circumference of a portion of the proximal stem. Finally, the prosthesis includes a head, tray, or other modular component that can be removably coupled with the proximal stem.
In example 27, the prosthesis of example 26 may optionally include the distal stem including rotation-indicating fiducial markers around a circumference of a proximal portion of the distal stem that correspond to the rotation-indicating fiducial markers on the proximal stem.
Example 28 describes a shoulder reconstruction system for use in conjunction with any of examples 1-25. The system includes a plurality of distal stems, each of the plurality of distal stems having a different diameter or length and including a fiducial marker along at least a portion of a longitudinal length. The system also includes a plurality of proximal stems, each detachably coupleable with a distal stem of the plurality of distal stems, each proximal stem including a height-indicating fiducial marker along at least a portion of a longitudinal length and a rotation-indicating fiducial marker around a circumference of a portion of the proximal stem. The system also includes a plurality of heads, trays, or other modular components that can be removably coupled with a proximal stem of the plurality of proximal stems.
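A component system like example 28 implies a selection step over a catalog of stems of differing diameters and lengths. The sketch below pairs a minimal stem record with one possible selection rule (the largest stem that still fits the prepared canal); both the rule and the dimensions are hypothetical, not taken from this disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DistalStem:
    diameter_mm: float
    length_mm: float

def select_distal_stem(stems, canal_diameter_mm, min_length_mm):
    """Pick the widest stem that fits the prepared canal and meets a
    minimum length -- an illustrative rule, not one prescribed here."""
    fitting = [s for s in stems
               if s.diameter_mm <= canal_diameter_mm
               and s.length_mm >= min_length_mm]
    return max(fitting, key=lambda s: s.diameter_mm, default=None)
```

For instance, given 9, 11, and 13 mm stems and a 12 mm canal, the rule returns the 11 mm stem; if nothing fits, it returns `None`.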
Claims (14)
1. A modular shoulder prosthesis system, comprising:
a distal stem comprising a fiducial marker along at least a portion of a longitudinal length;
a proximal stem detachably coupleable with the distal stem, the proximal stem including a height-indicating fiducial marker along at least a portion of a longitudinal length and a rotation-indicating fiducial marker around a circumference of a portion of the proximal stem;
a head, tray, or other modular component removably coupleable with the proximal stem; and
a computer system comprising a memory, a processor, and a display, the memory containing instructions that when executed by the processor cause the computer system to:
accessing a first medical image providing a first view of a joint within a surgical site;
receiving a selection of a first component of a modular prosthetic device implanted in a first bone of the joint, the first component corresponding to the distal stem;
displaying a virtual graphical representation of the distal stem overlaid on the first medical image of the joint, the virtual graphical representation including virtual fiducial markers corresponding to respective fiducial markers of the distal stem;
updating a graphical representation of the first component based on receiving a positioning input representing an implant position of the distal stem relative to a landmark on the first bone visible in the first medical image; and
presenting a selection interface enabling visualization of other components of the modular prosthetic device that are virtually connected to the first component and overlaid on the first medical image.
2. The modular shoulder prosthesis system of claim 1, wherein receiving the positioning input comprises receiving an indication of a relationship between at least one of the fiducial markers on the distal stem and a bone displayed within the first medical image.
3. The modular shoulder prosthesis system of claim 1 or 2, wherein the memory contains additional instructions that cause the computer system to update the virtual graphical representation to include the proximal stem coupled to the distal stem.
4. The modular shoulder prosthesis system of claim 3, wherein the memory contains additional instructions that cause the computer system to: present an adjustment interface enabling adjustment of the proximal stem relative to the virtual representation of the distal stem, wherein adjustments available within the intraoperative surgical planning interface are constrained by the physical adjustments available between the distal stem and the proximal stem of the modular shoulder prosthesis.
5. The modular shoulder prosthesis system of claim 4, wherein the available physical adjustment between the distal stem and the proximal stem comprises:
height adjustment along a longitudinal axis; and
rotational adjustment relative to the longitudinal axis.
6. The modular shoulder prosthesis system of claim 5, wherein the proximal stem provides telescoping height adjustment relative to the distal stem.
7. The modular shoulder prosthesis system of claim 4, wherein presenting an adjustment interface includes displaying fiducial markers on the virtual representation of the distal and proximal stems that correspond to physical fiducial markers on the distal and proximal stems, such that adjustments performed within the adjustment interface can be translated to the connected distal and proximal stems.
8. The modular shoulder prosthesis system of claim 3, wherein the memory contains additional instructions that cause the computer system to:
access a second medical image providing a second view of a contralateral joint; and
wherein presenting the virtual graphical representation comprises presenting a second graphical representation of the distal stem and the proximal stem overlaid on the second medical image.
9. The modular shoulder prosthesis system of claim 8, wherein the memory contains additional instructions that cause the computer system to: present an adjustment interface enabling manipulation of a position of the virtual representation of the distal stem relative to the virtual representation of the proximal stem overlaid within the first medical image and the second medical image.
10. The modular shoulder prosthesis system of claim 9, wherein the memory contains additional instructions that cause the computer system to: present a bone fragment positioning interface enabling identification and virtual positioning of bone fragments within the first medical image relative to at least one of the distal stem and the proximal stem.
11. The modular shoulder prosthesis system of claim 10, wherein presenting the bone fragment positioning interface comprises receiving, via the bone fragment positioning interface, a bone fragment identification input identifying a first bone fragment.
12. The modular shoulder prosthesis system of claim 11, wherein presenting the bone fragment positioning interface comprises receiving a positioning input via the bone fragment positioning interface to position the first bone fragment relative to at least one of the distal stem and the proximal stem.
13. The modular shoulder prosthesis system of claim 12, wherein positioning the first bone fragment comprises manipulating a position or rotation of the first bone fragment.
14. The modular shoulder prosthesis system of claim 13, wherein manipulating the position or rotation of the first bone fragment comprises visualizing the first bone fragment within the first medical image and the second medical image.
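The planning loop recited in claim 1, together with the constrained adjustments of claims 4-6 (telescoping height along the longitudinal axis, rotation about it), can be sketched as follows. The numeric ranges and all class and function names are illustrative assumptions; the patent does not specify adjustment limits.

```python
# Sketch of the adjustment interface of claims 4-6: virtual adjustments
# are clamped to the physically available adjustments between the distal
# and proximal stems. Ranges and names are assumptions, not patent text.

def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))

class AdjustmentInterface:
    """Constrains virtual stem adjustments to the physically available
    range (assumed limits for illustration)."""

    def __init__(self, height_range_mm=(0.0, 20.0),
                 rotation_range_deg=(-180.0, 180.0)):
        self.height_range_mm = height_range_mm
        self.rotation_range_deg = rotation_range_deg
        self.height_mm = 0.0      # telescoping height along the longitudinal axis
        self.rotation_deg = 0.0   # rotation about the longitudinal axis

    def adjust(self, d_height_mm=0.0, d_rotation_deg=0.0):
        # Apply the requested delta, clamped to the physical range,
        # so the virtual plan never exceeds what the hardware allows.
        self.height_mm = clamp(self.height_mm + d_height_mm,
                               *self.height_range_mm)
        self.rotation_deg = clamp(self.rotation_deg + d_rotation_deg,
                                  *self.rotation_range_deg)
        return self.height_mm, self.rotation_deg

ui = AdjustmentInterface()
ui.adjust(d_height_mm=25.0)  # request exceeds the assumed range, clamped to 20.0
```

The fiducial markers of claim 7 would then let the surgeon translate the clamped virtual values into matching physical marks on the connected stems.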
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662351564P | 2016-06-17 | 2016-06-17 | |
US62/351,564 | 2016-06-17 | ||
PCT/US2017/037932 WO2017218929A1 (en) | 2016-06-17 | 2017-06-16 | System and method for intraoperative surgical planning |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109475385A CN109475385A (en) | 2019-03-15 |
CN109475385B true CN109475385B (en) | 2021-06-18 |
Family
ID=59254036
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201780045886.3A Active CN109475385B (en) | 2016-06-17 | 2017-06-16 | System and method for intraoperative surgical planning |
Country Status (5)
Country | Link |
---|---|
US (3) | US10390887B2 (en) |
EP (2) | EP4238522A3 (en) |
CN (1) | CN109475385B (en) |
AU (2) | AU2017283631B2 (en) |
WO (1) | WO2017218929A1 (en) |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11331149B2 (en) | 2012-05-16 | 2022-05-17 | Feops Nv | Method and system for determining a risk of hemodynamic compromise after cardiac intervention |
US10736697B2 (en) | 2013-10-10 | 2020-08-11 | Imascap Sas | Methods, systems and devices for pre-operatively planned shoulder surgery guides and implants |
WO2015068035A1 (en) | 2013-11-08 | 2015-05-14 | Imascap | Methods, systems and devices for pre-operatively planned adaptive glenoid implants |
EP3068317B1 (en) | 2013-11-13 | 2018-08-22 | Tornier | Shoulder patient specific instrument |
AU2017283631B2 (en) * | 2016-06-17 | 2019-05-16 | Zimmer, Inc. | System and method for intraoperative surgical planning |
AU2017295728B2 (en) | 2016-07-15 | 2021-03-25 | Mako Surgical Corp. | Systems for a robotic-assisted revision procedure |
US11612307B2 (en) | 2016-11-24 | 2023-03-28 | University Of Washington | Light field capture and rendering for head-mounted displays |
EP3609424A1 (en) * | 2017-04-14 | 2020-02-19 | Stryker Corporation | Surgical systems and methods for facilitating ad-hoc intraoperative planning of surgical procedures |
US11432875B2 (en) | 2017-09-28 | 2022-09-06 | Siemens Medical Solutions Usa, Inc. | Left atrial appendage closure guidance in medical imaging |
US12207817B2 (en) * | 2017-12-28 | 2025-01-28 | Cilag Gmbh International | Safety systems for smart powered surgical stapling |
US11376002B2 (en) | 2017-12-28 | 2022-07-05 | Cilag Gmbh International | Surgical instrument cartridge sensor assemblies |
EP3542757A1 (en) | 2018-03-23 | 2019-09-25 | FEops NV | Method and system for patient-specific virtual percutaneous structural heart intervention |
DE102018213872A1 (en) * | 2018-05-06 | 2019-11-07 | Carl Zeiss Ag | Apparatus and method for imaging in the implantation of retinal implants |
EP3810013A1 (en) | 2018-06-19 | 2021-04-28 | Tornier, Inc. | Neural network for recommendation of shoulder surgery type |
US11589928B2 (en) * | 2018-09-12 | 2023-02-28 | Orthogrid Systems Holdings, Llc | Artificial intelligence intra-operative surgical guidance system and method of use |
US11540794B2 (en) * | 2018-09-12 | 2023-01-03 | Orthogrid Systems Holdings, LLC | Artificial intelligence intra-operative surgical guidance system and method of use |
US10623660B1 (en) | 2018-09-27 | 2020-04-14 | Eloupes, Inc. | Camera array for a mediated-reality system |
JP7242898B2 (en) | 2019-03-29 | 2023-03-20 | ホウメディカ・オステオニクス・コーポレイション | Premorbid Characterization of Anatomical Objects Using Statistical Shape Modeling (SSM) |
CN110313973B (en) * | 2019-07-05 | 2020-04-28 | 北京积水潭医院 | Angle Measurement System and Data Processing Method for Rotational Osteotomy of Long Bone |
US12349979B2 (en) | 2019-10-29 | 2025-07-08 | Howmedica Osteonics Corp. | Use of bony landmarks in computerized orthopedic surgical planning |
CN110827960A (en) * | 2019-11-04 | 2020-02-21 | 杭州依图医疗技术有限公司 | Medical image display method and display equipment |
US11918307B1 (en) | 2019-11-15 | 2024-03-05 | Verily Life Sciences Llc | Integrating applications in a surgeon console user interface of a robotic surgical system |
US11931119B1 (en) | 2019-11-15 | 2024-03-19 | Verily Life Sciences Llc | Integrating applications in a surgeon console user interface of a robotic surgical system |
US11701176B2 (en) | 2020-05-06 | 2023-07-18 | Warsaw Orthopedic, Inc. | Spinal surgery system and methods of use |
US10949986B1 (en) | 2020-05-12 | 2021-03-16 | Proprio, Inc. | Methods and systems for imaging a scene, such as a medical scene, and tracking objects within the scene |
US11295460B1 (en) | 2021-01-04 | 2022-04-05 | Proprio, Inc. | Methods and systems for registering preoperative image data to intraoperative image data of a scene, such as a surgical scene |
US11890058B2 (en) | 2021-01-21 | 2024-02-06 | Arthrex, Inc. | Orthopaedic planning systems and methods of repair |
US12178515B2 (en) | 2021-04-26 | 2024-12-31 | Arthrex, Inc. | Systems and methods for density calibration |
US11759216B2 (en) | 2021-09-22 | 2023-09-19 | Arthrex, Inc. | Orthopaedic fusion planning systems and methods of repair |
US12261988B2 (en) | 2021-11-08 | 2025-03-25 | Proprio, Inc. | Methods for generating stereoscopic views in multicamera systems, and associated devices and systems |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5880976A (en) * | 1997-02-21 | 1999-03-09 | Carnegie Mellon University | Apparatus and method for facilitating the implantation of artificial components in joints |
WO2015171577A1 (en) * | 2014-05-05 | 2015-11-12 | Noww, Llc | Acetabular component anteversion and abduction measurement system and method |
Family Cites Families (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB9403165D0 (en) | 1994-02-18 | 1994-04-06 | Johnson Robert | Knee joint prostheses |
US5507817A (en) * | 1994-02-22 | 1996-04-16 | Kirschner Medical Corporation | Modular humeral prosthesis for reconstruction of the humerus |
US5769092A (en) * | 1996-02-22 | 1998-06-23 | Integrated Surgical Systems, Inc. | Computer-aided system for revision total hip replacement surgery |
GB9707371D0 (en) * | 1997-04-11 | 1997-05-28 | Minnesota Mining & Mfg | A modular humeral prosthesis |
GB9707853D0 (en) * | 1997-04-18 | 1997-06-04 | Headcorn Instrumentation Ltd | Prosthesis |
US6942699B2 (en) * | 2001-07-11 | 2005-09-13 | Biomet, Inc. | Shoulder prosthesis |
US7542791B2 (en) * | 2003-01-30 | 2009-06-02 | Medtronic Navigation, Inc. | Method and apparatus for preplanning a surgical procedure |
US7660623B2 (en) * | 2003-01-30 | 2010-02-09 | Medtronic Navigation, Inc. | Six degree of freedom alignment display for medical procedures |
US7517364B2 (en) * | 2003-03-31 | 2009-04-14 | Depuy Products, Inc. | Extended articulation orthopaedic implant and associated method |
US8070820B2 (en) * | 2003-10-08 | 2011-12-06 | Biomet Manufacturing Corp. | Shoulder implant assembly |
US20050267353A1 (en) * | 2004-02-04 | 2005-12-01 | Joel Marquart | Computer-assisted knee replacement apparatus and method |
US7763080B2 (en) * | 2004-04-30 | 2010-07-27 | Depuy Products, Inc. | Implant system with migration measurement capacity |
US7567834B2 (en) * | 2004-05-03 | 2009-07-28 | Medtronic Navigation, Inc. | Method and apparatus for implantation between two vertebral bodies |
US20060161052A1 (en) * | 2004-12-08 | 2006-07-20 | Perception Raisonnement Action En Medecine | Computer assisted orthopaedic surgery system for ligament graft reconstruction |
EP1908023A1 (en) * | 2005-06-02 | 2008-04-09 | Depuy International Limited | Surgical system and method |
US20080319491A1 (en) * | 2007-06-19 | 2008-12-25 | Ryan Schoenefeld | Patient-matched surgical component and methods of use |
US8265949B2 (en) * | 2007-09-27 | 2012-09-11 | Depuy Products, Inc. | Customized patient surgical plan |
US8617171B2 (en) * | 2007-12-18 | 2013-12-31 | Otismed Corporation | Preoperatively planning an arthroplasty procedure and generating a corresponding patient specific arthroplasty resection guide |
CA2736525C (en) * | 2008-09-10 | 2019-10-22 | OrthAlign, Inc. | Hip surgery systems and methods |
US8961526B2 (en) * | 2010-11-23 | 2015-02-24 | University Of Massachusetts | System and method for orienting orthopedic implants |
BR112013032144A2 (en) * | 2011-06-16 | 2016-12-13 | Smith & Nephew Inc | surgical alignment using references |
US8977021B2 (en) * | 2011-12-30 | 2015-03-10 | Mako Surgical Corp. | Systems and methods for customizing interactive haptic boundaries |
US9694223B2 (en) | 2012-02-13 | 2017-07-04 | Factory Mutual Insurance Company | System and components for evaluating the performance of fire safety protection devices |
US20150133945A1 (en) * | 2012-05-02 | 2015-05-14 | Stryker Global Technology Center | Handheld tracking system and devices for aligning implant systems during surgery |
US10743945B2 (en) * | 2012-10-02 | 2020-08-18 | Radlink, Inc. | Surgical method and workflow |
TWI548403B (en) * | 2013-03-07 | 2016-09-11 | 國立成功大學 | Implant implant path planning method and system |
US9247998B2 (en) * | 2013-03-15 | 2016-02-02 | Intellijoint Surgical Inc. | System and method for intra-operative leg position measurement |
US9585768B2 (en) * | 2013-03-15 | 2017-03-07 | DePuy Synthes Products, Inc. | Acetabular cup prosthesis alignment system and method |
US11793424B2 (en) * | 2013-03-18 | 2023-10-24 | Orthosensor, Inc. | Kinetic assessment and alignment of the muscular-skeletal system and method therefor |
US10070929B2 (en) * | 2013-06-11 | 2018-09-11 | Atsushi Tanji | Surgical operation support system, surgical operation support apparatus, surgical operation support method, surgical operation support program, and information processing apparatus |
US9237936B2 (en) * | 2013-07-12 | 2016-01-19 | Pacesetter, Inc. | System and method for integrating candidate implant location test results with real-time tissue images for use with implantable device leads |
US9839448B2 (en) * | 2013-10-15 | 2017-12-12 | Si-Bone Inc. | Implant placement |
US10433914B2 (en) * | 2014-02-25 | 2019-10-08 | JointPoint, Inc. | Systems and methods for intra-operative image analysis |
US10758198B2 (en) * | 2014-02-25 | 2020-09-01 | DePuy Synthes Products, Inc. | Systems and methods for intra-operative image analysis |
WO2016022856A1 (en) * | 2014-08-06 | 2016-02-11 | Somersault Orthopedics Inc. | Method for creating a customized arthroplasty resection guide utilizing two-dimensional imaging |
US20160287337A1 (en) * | 2015-03-31 | 2016-10-06 | Luke J. Aram | Orthopaedic surgical system and method for patient-specific surgical procedure |
EP3782587B1 (en) * | 2015-07-08 | 2024-03-13 | Zimmer, Inc. | Sensor-based shoulder system |
US10092361B2 (en) * | 2015-09-11 | 2018-10-09 | AOD Holdings, LLC | Intraoperative systems and methods for determining and providing for display a virtual image overlaid onto a visual image of a bone |
US10201320B2 (en) * | 2015-12-18 | 2019-02-12 | OrthoGrid Systems, Inc | Deformed grid based intra-operative system and method of use |
WO2017160651A1 (en) * | 2016-03-12 | 2017-09-21 | Lang Philipp K | Devices and methods for surgery |
AU2017283631B2 (en) * | 2016-06-17 | 2019-05-16 | Zimmer, Inc. | System and method for intraoperative surgical planning |
WO2018132835A1 (en) * | 2017-01-16 | 2018-07-19 | Smith & Nephew, Inc. | Tracked surgical tool with controlled extension |
FR3062297B1 (en) * | 2017-02-01 | 2022-07-15 | Laurent Cazal | METHOD AND DEVICE FOR ASSISTING THE PLACEMENT OF A PROSTHESIS, PARTICULARLY OF THE HIP, BY A SURGEON FOLLOWING DIFFERENT SURGICAL PROTOCOLS |
EP3596658A1 (en) * | 2017-03-13 | 2020-01-22 | Zimmer, Inc. | Augmented reality diagnosis guidance |
US11007013B2 (en) * | 2018-05-02 | 2021-05-18 | Intellijoint Surgical Inc. | System, method and apparatus for automatic registration in computer assisted bone milling surgery |
US11065065B2 (en) * | 2019-02-22 | 2021-07-20 | Warsaw Orthopedic, Inc. | Spinal implant system and methods of use |
- 2017
- 2017-06-16 AU AU2017283631A patent/AU2017283631B2/en active Active
- 2017-06-16 US US15/625,260 patent/US10390887B2/en active Active
- 2017-06-16 CN CN201780045886.3A patent/CN109475385B/en active Active
- 2017-06-16 WO PCT/US2017/037932 patent/WO2017218929A1/en unknown
- 2017-06-16 EP EP23182479.8A patent/EP4238522A3/en active Pending
- 2017-06-16 EP EP17734205.2A patent/EP3471646B1/en active Active
- 2019
- 2019-07-19 US US16/516,700 patent/US10849692B2/en active Active
- 2019-08-13 AU AU2019216618A patent/AU2019216618B2/en active Active
- 2020
- 2020-10-22 US US17/077,372 patent/US11490965B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20210038317A1 (en) | 2021-02-11 |
US11490965B2 (en) | 2022-11-08 |
US10849692B2 (en) | 2020-12-01 |
US20190336223A1 (en) | 2019-11-07 |
AU2019216618B2 (en) | 2020-08-27 |
EP3471646B1 (en) | 2023-07-05 |
US10390887B2 (en) | 2019-08-27 |
CN109475385A (en) | 2019-03-15 |
WO2017218929A1 (en) | 2017-12-21 |
US20170360510A1 (en) | 2017-12-21 |
EP4238522A3 (en) | 2023-11-15 |
AU2017283631A1 (en) | 2019-01-03 |
EP4238522A2 (en) | 2023-09-06 |
AU2017283631B2 (en) | 2019-05-16 |
AU2019216618A1 (en) | 2019-09-05 |
EP3471646A1 (en) | 2019-04-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109475385B (en) | System and method for intraoperative surgical planning | |
US11883110B2 (en) | Sensor-based shoulder system and method | |
US9855106B2 (en) | System and methods for positioning bone cut guide | |
EP3968887A1 (en) | Bone wall tracking and guidance for orthopedic implant placement |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||