
WO2025189285A1 - Movable surgical tracker - Google Patents

Movable surgical tracker

Info

Publication number
WO2025189285A1
WO2025189285A1 (PCT/CA2025/050335)
Authority
WO
WIPO (PCT)
Prior art keywords
joint
tracker
tracking
movement
trackable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/CA2025/050335
Other languages
French (fr)
Inventor
Sharif Sharifzadeh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orthosoft ULC
Original Assignee
Orthosoft ULC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Orthosoft ULC filed Critical Orthosoft ULC
Publication of WO2025189285A1 (patent/WO2025189285A1/en)
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39: Markers, e.g. radio-opaque or breast lesions markers
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00: Surgical instruments, devices or methods
    • A61B2017/00681: Aspects not otherwise provided for
    • A61B2017/00725: Calibration or performance testing
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101: Computer-aided simulation of surgical operations
    • A61B2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046: Tracking techniques
    • A61B2034/2055: Optical tracking systems
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046: Tracking techniques
    • A61B2034/2059: Mechanical position encoders
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39: Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3983: Reference marker arrangements for use with image guided surgery
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25: User interfaces for surgical systems
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90: Identification means for patients or instruments, e.g. tags

Definitions

  • the present disclosure relates to computer-assisted surgery including bone and tool tracking, and to trackers tied to the patient in the context of computer-assisted surgery.
  • CAS: computer-assisted surgery
  • the end effector, the tools, bodily parts are tracked for position and/or orientation in such a way that relative navigation information pertaining to bodily parts is obtained.
  • the information is then used in various interventions (e.g., orthopedic surgery, neurological surgery) with respect to the body, such as bone alterations, implant positioning, incisions and the like during surgery.
  • optical tracking is commonly used in different forms, for instance by the presence of optically-detectable trackers on the tools, on the end effector and/or operating end of a robotic arm, in addition to being present on the patient.
  • the optically-detectable trackers are passive retro-reflective components on tools and bones, though other types of trackers may be used.
  • the trackers are viewed by a tracking device, such as a tracking camera (e.g., Navitracker®) or a depth camera, and by triangulation the position and orientation of the tracker is calculable to output navigation data.
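The triangulation step can be illustrated with a minimal sketch, assuming an idealized rectified stereo pair (two identical, horizontally offset pinhole cameras). The function name and all numbers are illustrative, not the actual tracking camera's model:

```python
import math

def triangulate_rectified(f_px, baseline_m, xl, xr, yl):
    """Recover a 3D point from a marker matched between two
    horizontally offset, rectified cameras (pinhole model).
    f_px: focal length in pixels; baseline_m: camera separation in metres;
    xl, xr: horizontal pixel coordinates in left/right images;
    yl: vertical pixel coordinate in the left image."""
    disparity = xl - xr                 # horizontal shift between the views
    z = f_px * baseline_m / disparity   # depth along the optical axis
    x = xl * z / f_px                   # back-project to metric coordinates
    y = yl * z / f_px
    return (x, y, z)

# A marker 2 m away with a 0.1 m baseline and 800 px focal length
# produces a disparity of f*B/z = 800*0.1/2.0 = 40 px:
print(triangulate_rectified(800.0, 0.1, 100.0, 60.0, 40.0))
```

Repeating this for each retro-reflective element yields the set of 3D centres from which the tracker's pose is then resolved.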
  • the robot arm may also be equipped with a tracker device.
  • Figs. 1A and 1B are exemplary trackers 1 that are respectively attached to the tibia and to the femur.
  • Both trackers include a base 1A that is fixed to the bone by fasteners 1B, and an arm 1C that supports a trackable portion 1D at a distance from the bone, in such a way that the trackable portion 1D is on display for the tracker device.
  • the trackable portion 1D is consequently in a fixed relation relative to the bone.
  • a system for tracking an object in computer-assisted surgery comprising: a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: tracking the object optically using a tracker having a trackable portion with a joint enabling movement, the trackable portion being viewed by a tracker device; receiving signalling indicative of a change of position and/or orientation of the trackable portion relative to the object, by movement at the joint; and tracking the object optically using the tracker having the trackable portion viewed by the tracker device using the change of position and/or orientation.
  • receiving signalling includes receiving signals from a joint in the tracker.
  • receiving signals from the joint in the tracker includes receiving signals quantifying the change.
  • tracking the object using the change of position and/or orientation includes tracking the object continuously while receiving the signals.
  • receiving signalling includes receiving an interaction requesting a pause in the tracking.
  • the tracker may be recalibrated to quantify the change after the pause.
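The recalibration idea can be sketched as follows: when the joint quantifies its own change, the stored geometric offset between the bone and the trackable portion is updated, rather than the bone being re-registered. This is a minimal illustration assuming a single rotational DOF about a known (z) axis; the frame conventions are assumptions:

```python
import math

def rot_z(theta):
    """3x3 rotation matrix about the joint's (assumed) z axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_vec(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def update_offset(offset_bone_to_trackable, joint_delta_rad):
    """When the joint reports a quantified rotation, re-express the stored
    bone-to-trackable offset so that optical tracking of the bone can
    continue without repeating the registration."""
    return mat_vec(rot_z(joint_delta_rad), offset_bone_to_trackable)

# Trackable portion initially 100 mm along x from the bone anchor;
# the joint then rotates by 90 degrees:
print(update_offset([100.0, 0.0, 0.0], math.pi / 2))
```

The same composition applies with a full 3-DOF spherical joint, with the single-axis rotation replaced by the rotation measured by the joint's sensor.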
  • reference points are obtained on a surface of the object prior to or during the tracking of the object.
  • the object is a bone and the tracker is fixed to the bone.
  • the tracker is included, with the joint and the trackable portion.
  • the joint is a joint providing at least one rotational degree of freedom.
  • the joint is a spherical joint.
  • the joint is a joint providing at least one translational degree of freedom.
  • the joint is a cylindrical joint.
  • the joint has encoding capacity to emit signals quantifying movement at the joint.
  • the tracker device is included for optically tracking the tracker.
  • a tracker for optical tracking during computer-assisted surgery comprising: a base configured to be secured to a bone; a trackable portion having at least one optically detectable element; at least one stem spacing the base from the trackable portion; and a joint in the tracker enabling at least one degree of freedom of movement between the base and the trackable portion, the joint having a sensor configured to emit a signal quantifying the movement.
  • the joint is in the stem, whereby the stem has a first segment from the base to the joint, and a second segment from the joint to the trackable portion.
  • a ratio of length between the second segment and the first segment is at least 5:1.
  • the joint is between the stem and the base.
  • the joint is a spherical joint providing at least two rotational degrees of freedom of movement.
  • a ball portion of the spherical joint is fixed relative to the base.
  • a channel extends from a surface of the ball portion and into the base, the channel configured to receive a fastener.
  • the sensor is an assembly including a 3D Hall-effect sensor and magnet.
  • the sensor includes a wireless chip configured for wireless communication of the signal.
  • the joint is a cylindrical joint providing one rotational degree of freedom of movement and one translational degree of freedom of movement.
  • a strap holder is on the stem.
  • Figs. 1A and 1B are trackers on bone in accordance with the prior art;
  • Fig. 2A is a perspective view of a movable surgical tracker with movable adjustment capacity in accordance with an aspect of the present disclosure;
  • Fig. 2B is a perspective view of a movable surgical tracker with movable adjustment capacity in accordance with another aspect of the present disclosure;
  • Fig. 2C is a perspective view of a movable surgical tracker with movable adjustment capacity in accordance with yet another aspect of the present disclosure;
  • Fig. 2D is a perspective view of the movable surgical tracker with movable adjustment capacity of Fig. 2C, having its base secured to a bone;
  • Fig. 2E is an assembly view of the movable surgical tracker with movable adjustment capacity of Fig. 2D, having a trackable portion thereof directed toward the base for assembly;
  • Fig. 2F is a perspective view showing a pair of the movable surgical trackers with movable adjustment capacity of Fig. 2D, respectively secured to the tibia and femur during knee surgery;
  • Fig. 3 is a schematic view of a robotic surgery system in accordance with an aspect of the present disclosure, relative to a patient, using the movable surgical tracker of any one of Figs. 2A to 2C;
  • Fig. 4 is a block diagram of the tracking system for robotized computer-assisted surgery of Fig. 3;
  • Fig. 5 is a flow chart of a method for calibrating a tracker device with a CAS system.
  • a robotic surgery system for computer-assisted surgery (CAS) is generally shown at 10, and is used to provide surgery assistance to an operator.
  • Robotic capability is merely optional, as the system 10 may be without a robot.
  • the system 10 will be referred to herein as the CAS system 10.
  • the system 10 is shown relative to a dummy patient in prone decubitus, but only as an example.
  • the system 10 could be used for any body parts, including non-exhaustively hip joint, spine, and shoulder bones, for orthopedic surgery, but could also be used in other types of surgery.
  • the system 10 could be used for surgery of all sorts, such as brain surgery, and soft tissue surgery.
  • the CAS system 10 may be robotized in a variant, and has, may include or may be used with a robot 20.
  • the CAS system 10 may further include optical trackers such as movable surgical tracker device 30, a tracker device 40, a CAS controller 50 (also known as a super controller 50), a tracking module 60, and a robot controller 70 (also known as a robot driver), present if a robot is used, or any combination thereof:
  • the robot 20, shown by its robot arm 20A may optionally be present as the working end of the system 10, and may be used to perform or guide bone alterations as planned by an operator and/or the CAS controller 50 and as controlled by the CAS controller 50.
  • the robot arm 20A may also be configured for a collaborative/cooperative mode in which the operator may manipulate the robot arm 20A, or the tool supported by the robot arm 20A, though the tool may be operated by a human operator.
  • the tooling end also known as end effector, may be manipulated by the operator while supported by the robot arm 20A.
  • the robot 20 may be the coordinate measuring machine (CMM) of the CAS system 10;
  • Optical trackers such as the trackers 1 of Figs. 1A and 1 B and the movable surgical tracker 30 of Figs. 2A, 2B and 2C are positioned on the robot 20, on patient tissue (e.g., bones B), and/or on the tool(s) T and surgical instruments, and provide tracking data for the robot 20, the patient and/or tools.
  • the tracking device 40, also known as a sensor device or apparatus, performs optical tracking of the optical trackers 30, so as to enable the tracking in space (a.k.a., navigation) of the robot 20, the patient and/or tools;
  • the CAS controller 50, also known as the super controller, includes the processor(s) and appropriate hardware and software to run a computer-assisted surgery procedure in accordance with one or more workflows.
  • the CAS controller 50 may include or operate the tracking device 40, the tracking module 60, and/or the robot controller 70. As described hereinafter, the CAS controller 50 may also drive the robot arm 20A through a planned surgical procedure;
  • the tracking module 60 is tasked with determining the position and/or orientation of the various relevant objects during the surgery procedure, such as the end effector of the robot arm 20, bone(s) B and tool(s) T, using data acquired by the tracking device 40 and by the robot 20, and/or obtained from the robot controller 70.
  • the position and/or orientation may be used by the CAS controller 50 to control the robot arm 20A;
  • the robot controller 70 is optionally present, and is tasked with powering or controlling the various joints of the robot arm 20A, based on operator demands or on surgery planning, and may also be referred to as a robot controller module that is part of the super controller 50.
  • the robot controller 70 may also optionally calculate robot movements of the robot arm 20A, so as to control movements of the robot arm 20A autonomously in some instances, i.e., without intervention from the CAS controller 50;
  • An additional camera(s) may be present, for instance as a complementary registration tool.
  • the camera may for instance be mounted on the robot 20, such as on the robot arm 20A, such that the point of view of the camera is known in the frame of reference, also known as the coordinate system.
  • the robot 20 may have the robot arm 20A stand from a base 20B, for instance in a fixed relation relative to the operating-room (OR) table supporting the patient, whether it is attached or detached from the table.
  • the robot arm 20A has a plurality of joints 21 and links 22, of any appropriate form, to support an end effector 23 that may interface with the patient, or may be used during surgery without interfacing with the patient.
  • the end effector or tool head may optionally incorporate a force/torque sensor for collaborative/cooperative control mode, in which an operator manipulates the robot arm 20A.
  • the robot arm 20A is shown being a serial mechanism, arranged for the tool head 23 to be displaceable in a desired number of degrees of freedom (DOF).
  • DOF: degrees of freedom
  • the tool head 23 may for example be a support that is not actuated, the support being used to support a tool, with the robot arm 20A used to position the tool relative to the patient.
  • the robot arm 20A controls 6-DOF movements of the tool head, i.e., X, Y, Z in the coordinate system, and pitch, roll and yaw. Fewer or additional DOFs may be present.
  • only a fragmented illustration of the joints 21 and links 22 is provided, but more joints 21 of different types may be present to move the end effector 23 in the manner described above.
  • the joints 21 are powered for the robot arm 20A to move as controlled by the CAS controller 50 in the six DOFs, and in such a way that the position and orientation of the end effector 23 in the coordinate system may be known, for instance by readings from encoders on the various joints 21. Therefore, the powering of the joints is such that the end effector 23 of the robot arm 20A may execute precise movements, such as moving along a single direction in one translation DOF, or being restricted to moving along a plane, among possibilities.
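The way encoder readings on the joints 21 determine the end effector's position can be sketched with forward kinematics. A planar two-link serial chain is used here purely for illustration; a real robot arm 20A would chain full 3D joint transforms the same way:

```python
import math

def planar_fk(link_lengths, joint_angles_rad):
    """End-effector position of a serial planar arm from encoder
    readings: each joint angle is accumulated along the chain and
    each link's contribution is summed."""
    x = y = 0.0
    heading = 0.0
    for length, angle in zip(link_lengths, joint_angles_rad):
        heading += angle                 # joint rotates everything downstream
        x += length * math.cos(heading)  # add this link's reach
        y += length * math.sin(heading)
    return x, y

# Two links of 0.4 m and 0.3 m, first joint at +90 deg, second at -90 deg:
print(planar_fk([0.4, 0.3], [math.pi / 2, -math.pi / 2]))
```

Because the map from encoder readings to pose is deterministic, the robot can serve as its own coordinate measuring machine, as noted below.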
  • Such robot arms 20A are known, for instance as described in United States Patent Application Serial No. 11/610,728, incorporated herein by reference.
  • the end effector 23 of robot arm 20A may be defined by a chuck or like tool interface, typically actuatable in rotation.
  • numerous tools may be used as end effector for the robot arm 20, such tools including a registration pointer as shown in Fig. 1 , equipped with a tracker device 30, a reamer (e.g., cylindrical, tapered), a reciprocating saw, a retractor, a camera, an ultrasound unit, a laser rangefinder or light-emitting device (e.g., the indicator device of US Patent No. 8,882,777), a laminar spreader, an instrument holder, or a cutting guide, depending on the nature of the surgery.
  • the various tools may be part of a multi-mandible configuration or may be interchangeable, whether with human assistance, or as an automated process.
  • the installation of a tool in the tool head may then require some calibration in order to track the installed tool in the X, Y, Z coordinate system of the robot arm 20.
  • the end effector 23 of the robot arm 20A may be positioned by the robot 20 relative to surgical area A in a desired orientation according to a surgical plan, such as a plan based on preoperative imaging. Due to the proximity between the robot 20 and the surgical area A, the robot 20 may be covered partially with a surgical drape D, also known as a surgical robotic drape.
  • the surgical drape D is a sterile panel (or panels), tubes, bags or the like that form(s) a physical barrier between the sterile zone (e.g., surgical area) and some equipment that may not fully comply with sterilization standards, such as the robot 20.
  • the surgical drape D is transparent such that one can see through the drape D.
  • the robot is entirely covered with the surgical drape D, and this includes the base 20B, but with the exception of the end effector 23. Indeed, as the end effector 23 interacts or may interact with the human body, it may be sterilized and may not need to be covered by the surgical drape D, to access the patient. Some part of the robot 20 may also be on the sterile side of the surgical drape D. In a variant, a portion of the robot arm 20 is covered by the surgical drape D.
  • the surgical drape D may be in accordance with United States Patent Application No. 15/803,247, filed on November 3, 2017 and incorporated herein by reference.
  • the CAS controller 50 can manipulate the robot arm 20A automatically (without human intervention), or by a surgeon manually operating the robot arm 20A (e.g. physically manipulating, via a remote controller through the interface l/F) to move the end effector 23 of the robot arm 20A to the desired location, e.g., a location called for by a surgical plan to align an instrument relative to the anatomy.
  • a step of a surgical procedure can be performed, such as by using the end effector 23.
  • a tracker device 30 may optionally be secured to the distalmost link, and may be distinct from the tracker device 30 on the instrument supported by the end effector 23.
  • the robot arm 20A may include sensors 25 in its various joints 21 and links 22.
  • the sensors 25 may be of any appropriate type, such as rotary encoders, optical sensors, position switches that are a non-exhaustive list of potential sensors, for the position and orientation of the end effector, and of the tool in the end effector 23 to be known.
  • the tracking module 60 may determine the position and orientation of the robot 20 in a frame of reference of the robot 20, such as by obtaining the position (x, y, z) and orientation (phi, theta, rho) of the end effector 23 from the CAS controller 50 using the sensors 25 in the robot arm 20A; i.e., robot coordinates may be an integrated function of the robot 20, in that it may determine the position and orientation of its end effector 23 with respect to its coordinate system.
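The position (x, y, z) and orientation (phi, theta, rho) tuple can be packed into a single homogeneous transform for use in the frame of reference. The Euler-angle order below is an assumption, since the disclosure names the angles but not their convention:

```python
import math

def pose_to_matrix(x, y, z, phi, theta, rho):
    """Build a 4x4 homogeneous transform from a position and
    Z-Y-X Euler angles (the angle convention is assumed here)."""
    cf, sf = math.cos(phi), math.sin(phi)
    ct, st = math.cos(theta), math.sin(theta)
    cr, sr = math.cos(rho), math.sin(rho)
    # R = Rz(phi) @ Ry(theta) @ Rx(rho)
    r = [
        [cf * ct, cf * st * sr - sf * cr, cf * st * cr + sf * sr],
        [sf * ct, sf * st * sr + cf * cr, sf * st * cr - cf * sr],
        [-st,     ct * sr,                ct * cr],
    ]
    # Append the translation column and the homogeneous bottom row.
    return [r[0] + [x], r[1] + [y], r[2] + [z], [0.0, 0.0, 0.0, 1.0]]
```

With every pose in this 4x4 form, chaining frames (camera to tracker, tracker to bone, robot base to end effector) reduces to matrix multiplication.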
  • the robot 20 may be the coordinate measuring machine (CMM) of the CAS system 10, with a frame of reference (e.g., coordinate system, referential system) of the procedure being relative to the fixed position of the base 20B of the robot 20.
  • the sensors 25 must provide the precision and accuracy appropriate for surgical procedures.
  • the coupling of tools to the robot arm 20A may automatically cause a registration of the position and orientation of the tools in the frame of reference of the robot 20, though steps of calibration could be performed, as explained below.
  • the tracker 30 may be known as trackable elements, markers, trackable reference, reference tracker, navigation markers, active sensors (e.g., wired or wireless) that may for example include infrared emitters.
  • the tracker 30 has a base 31, by which the tracker 30 may be secured to a bone.
  • the base 31 may be a plate, a block, a bracket, etc. It may be attached in any appropriate way to a bone, such as via surgical screws 32, pins, bolts, spikes, etc., or in non-invasive manners such as via a strap, belt, etc.
  • a shaft 33 may project from the base 31 and spaces a trackable portion 34 from the base 31.
  • the shaft 33 may be referred to as an arm, a stem, a support, a spacer, etc.
  • the shaft 33 may be separated into two segments, shown as 33A and 33B, separated by a joint 35.
  • the joint 35 is described in further detail hereinafter, and may serve to enable the movement of the trackable portion 34 relative to the base 31 .
  • the location of the joint 35 is approximate, as it is possible to have the joint 35 closer to the base 31, or closer to the trackable portion 34. In a variant, there may be more than one such joint 35.
  • the trackable portion 34 may have passive retro-reflective elements 34’, that reflect light.
  • the tracker 30 has a known geometry so as to be recognizable through detection by the tracker device 40.
  • the retro-reflective elements 34’ are spheres (i.e., quasi-spheres).
  • the retro-reflective elements 34’ are arranged in a given geometrical pattern to be recognized by the optical tracker device 40 of the CAS system 10.
  • the retro-reflective elements 34’ are arranged in a scalene triangle defined by the centers of the optical elements 34’. There may be more or fewer optically detected elements.
  • triangular geometrical patterns are illustrated, it is contemplated to use other geometrical patterns, such as lines and various polygonal shapes.
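Recognition of such a geometrical pattern can be sketched by comparing sorted side lengths, which are invariant to the order in which the camera reports the marker centres. The reference coordinates and tolerance below are illustrative:

```python
import math

def side_lengths(pts):
    """Sorted side lengths of the triangle through three marker centres."""
    a, b, c = pts
    return sorted([math.dist(a, b), math.dist(b, c), math.dist(a, c)])

def matches_pattern(detected, reference, tol=1.0):
    """True if the detected centres form the known scalene triangle,
    regardless of detection order. tol is in the coordinate units
    (e.g., mm); a scalene reference makes the match unambiguous."""
    return all(abs(u - v) <= tol
               for u, v in zip(side_lengths(detected), side_lengths(reference)))

# Known scalene pattern, then the same markers reported in another order:
ref = [(0.0, 0.0, 0.0), (60.0, 0.0, 0.0), (20.0, 45.0, 0.0)]
seen = [ref[2], ref[0], ref[1]]
print(matches_pattern(seen, ref))
```

Because the triangle is scalene, the sorted side lengths also tell the tracking module which detected centre corresponds to which physical element, fixing the tracker's orientation.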
  • the joint 35 is shown as being a spherical joint, i.e., a joint that provides two or three rotational degrees of freedom (DOF) between the segments 33A and 33B.
  • the joint 35 could also be a pivot joint, or any other such joint providing a single degree of freedom of rotation.
  • the joint 35 has self-tracking capacity, i.e., sensor technology by which the movement and relation between the segments 33A and 33B may be tracked and quantified.
  • the joint 35 may integrate encoder technology for any movement in the joint 35 to be known in real-time, or other sensing technologies such as a 3D Hall-effect sensor and magnet.
  • the joint 35 may provide joint signals indicative of any movement that occurs at the joint 35, and therefore joint signals representative of movement between the segments 33A and 33B.
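As one illustration of how a 3D Hall-effect sensor and magnet can quantify movement at the joint: with a diametrically magnetised magnet rotating over the sensor, the in-plane field components rotate with it, so the joint angle falls out of an atan2 of the two readings. Calibration of real sensor counts to field units is omitted in this sketch:

```python
import math

def joint_angle_from_field(bx, by):
    """Rotation angle of a diametrically magnetised magnet over a
    Hall-effect sensor: the in-plane field components (bx, by)
    rotate with the magnet, so their atan2 gives the angle in
    radians relative to the reference position."""
    return math.atan2(by, bx)

# Field readings taken a quarter-turn from the reference position:
print(math.degrees(joint_angle_from_field(0.0, 1.0)))
```

A second axis pair on a 3D sensor extends the same idea to the two rotational DOFs of a spherical joint.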
  • the joint 35 may have a locking device, such as set screw 35A, though this is optional.
  • the joint 35 may be self-powered, e.g., by way of a battery, and may communicate with the CAS controller 50 in any appropriate way (e.g., wired, wireless, etc). In a variant, there are no joint signals electronically output by the joint 35.
  • While a single joint 35 is shown, alternative configurations are possible, such as providing two separate rotational DOF joints, to separate the shaft 33 into three or more segments. As another possibility, there may be joints 35 with one or more rotational DOFs between the base 31 and the shaft 33, and between the shaft 33 and the trackable portion 34. As shown in Fig. 2B, the joint 35 may also allow a translation, such as by being a telescopic joint. The telescopic joint could also optionally provide joint signals.
  • the tracker 30 may be known as trackable elements, markers, trackable reference, reference tracker, navigation markers, active sensors (e.g., wired or wireless) that may for example include infrared emitters.
  • the tracker 30 has a base 31, by which the tracker 30 may be secured to a bone.
  • the base 31 may be a plate, a block, a bracket, etc. It may be attached in any appropriate way to a bone, such as via surgical screws 32, pins, bolts, spikes, etc., or in non-invasive manners such as via a strap, belt, etc.
  • a shaft 33 may project from the base 31 and spaces a trackable portion 34 from the base 31.
  • the shaft 33 may be referred to as an arm, a support, a spacer, etc.
  • the shaft 33 may be separated into two segments, shown as 33A and 33B, separated by a joint 35.
  • the joint 35 is described in further detail hereinafter, and may serve to enable the movement of the trackable portion 34 relative to the base 31.
  • the location of the joint 35 is approximate, as it is possible to have the joint 35 closer to the base 31, or closer to the trackable portion 34. In a variant, there may be more than one such joint 35.
  • the trackable portion 34 may have passive retro-reflective elements that reflect light.
  • the trackers 30 have a known geometry so as to be recognizable through detection by the tracker device 40.
  • the trackers 30 may be retro-reflective lenses.
  • the trackable portion 34 may be as described in U.S. Patent No. 8,386,022.
  • the tracker 30 may thus be known as a multifaceted tracker.
  • the tracker 30 of the exemplary embodiment has three tracker ends 130’ supported by arms 130” that interface the tracker ends 130’ to the shaft 33.
  • the tracker ends 130’ together provide three sets of three detectable elements.
  • the tracker ends 130’ are each provided with a pyramidal body having faces 131A, 131B, 131C (concurrently, the faces 131).
  • the faces 131 each define an opening 132 having a given geometrical shape.
  • the given geometrical shape is a circle.
  • Retro-reflective surfaces are positioned in the openings 132, so as to form circular optical elements 133A, 133B, and 133C of the tracker ends 130’. Other shapes are also considered for the optical elements 133A, 133B, and 133C.
  • the retro-reflective surfaces are made of a retro-reflective material that will be detected by the optical tracker device 40 associated with the CAS system 10. For instance, the material Scotch Lite™ is suited to be used as retro-reflective surface.
  • the optical elements 133A, 133B, and 133C must be in a given geometrical pattern to be recognized by the optical tracker device 40 of the CAS system 10.
  • the optical elements 133A, 133B, and 133C are regrouped in one embodiment in sets of three. Referring to Fig. 2B, a first set of three optical elements consists of the optical elements 133A, each of which is in a different one of the tracker ends 130’. Similarly, a second set consists of the elements 133B, and a third set consists of the elements 133C.
  • each of the elements of a same set (e.g., the first set of elements 133A) are parallel to a same plane. Accordingly, the elements 133A are visible from a same field of view.
  • the sets of elements 133A, 133B, 133C are strategically positioned with respect to one another so as to optimize a range of visibility of the tracker 30. More specifically, the sets are positioned such that once the tracker device 40 of the CAS system 10 loses sight of one of the sets, another set is visible. This ensures the continuous tracking of the tool T having a tracker 30 within a given range of field of view.
  • the sets each form a geometrical pattern that is recognized by the tracking module 60 of the CAS system 10.
  • the combination of circular openings and retro-reflective surface gives a circular shape to the optical elements 133A, 133B, 133C. According to the angle of view of the tracker device 40, these circles will not always appear as being circular in shape. Therefore, the position of the center of the circles can be calculated as a function of the shape perceived from the angle of view by the optical sensor apparatus.
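The computation of the circle centre from the perceived shape can be sketched as follows; here the centre is estimated as the centroid of evenly sampled boundary points of the perceived ellipse (a least-squares conic fit would be the refined version, and the sample data is synthetic):

```python
import math

def ellipse_center(boundary_pts):
    """Centre of the perceived (elliptical) image of a circular
    optical element, estimated as the centroid of evenly sampled
    boundary points."""
    n = len(boundary_pts)
    return (sum(p[0] for p in boundary_pts) / n,
            sum(p[1] for p in boundary_pts) / n)

# A circular element viewed obliquely appears as an ellipse with
# semi-axes 10 px and 4 px around pixel (50, 30):
pts = [(50 + 10 * math.cos(t), 30 + 4 * math.sin(t))
       for t in (2 * math.pi * k / 36 for k in range(36))]
print(ellipse_center(pts))
```

Under strong perspective the true projected circle centre deviates slightly from the ellipse centroid, which is why a full conic fit is preferred when sub-pixel accuracy is needed.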
  • the geometrical pattern therefore consists of a triangle defined by the centers of the optical elements 133A, 133B or 133C of the sets. It is suggested that the three triangles of the three different sets of optical elements 133A, 133B, 133C be of different shape, with each triangle being associated with a specific orientation with respect to the tool. Alternatively, the three triangles formed by the three different sets may be the same, but the perceived shape of the circular reflective surfaces must then be used to identify which of the three sets of reflective surfaces is seen. There may be more or fewer optical elements, and sets of optical elements, as described in U.S. Patent No. 8,386,022. Moreover, although triangular geometrical patterns are illustrated, it is contemplated to use other geometrical patterns, such as lines and various polygonal shapes.
  • the joint 35 is shown as being a cylindrical joint, i.e., a joint that provides two degrees of freedom (DOF) between the segments 33A and 33B, one rotational DOF and one translational DOF.
  • the joint 35 has self-tracking capacity, i.e., sensor technology by which the movement and relation between the segments 33A and 33B may be tracked and quantified.
  • the joint 35 may integrate encoder technology for any movement in the joint 35 to be known in real-time, 3D Hall-effect sensor and magnet, etc.
  • the joint 35 may provide joint signals indicative of any movement that occurs at the joint 35, and therefore joint signals representative of movement between the segments 33A and 33B.
  • the joint 35 may have a locking device, such as set screw 35A, though this is optional.
  • the joint 35 may be self-powered, e.g., by way of a battery, and may communicate with the CAS controller 50 in any appropriate way (e.g., wired, wireless, etc). In a variant, there are no joint signals electronically output by the joint 35.
  • the trackable portion 34 may be embodied by other detectable devices, patterns, etc.
  • the trackable portion 34 could include a QR code laid on a flat surface, or other optically-detectable element(s), and this applies to all trackers 30 described herein.
  • the trackers 30 may be active emitters.
  • Referring to Figs. 2C-2E, another variant of the tracker 30 is shown.
  • the tracker 30 has components that may be found in the trackers 30 of Fig. 2A and/or Fig. 2B, whereby like reference numerals will pertain to like components.
  • the tracker 30 has a base 31 , by which the tracker 30 may be secured to a bone.
  • the base 31 is illustrated as being a plate 31A, of circular shape, but this is only an option, as it could have other shapes.
  • a shaft segment 33A may project from the base 31, and may be relatively short in contrast to a shaft segment 33B, which may also be referred to as an arm (e.g., the shaft segment 33B may be at least 5 times as long as the shaft segment 33A).
  • the segments 33A and 33B may be separated by a joint 35, shown as being a spherical joint.
  • the spherical joint 35 may have a ball portion 35B.
  • the ball portion 35B may be mounted to the segment 33A.
  • the plate 31A, the shaft segment 33A and the ball portion 35B form a monoblock, referred to as a base component.
  • a central passage may be defined in the ball portion 35B, so as to extend into the shaft segment 33A and the plate 31A. Therefore, the fastener 32 may be inserted into the central passage, via the ball portion 35B, to secure the base component to the bone.
  • An appropriate shoulder (e.g., a counterbore) may be defined in the channel for the head of the fastener 32 to abut, so as to be concealed in the ball portion 35B.
  • the base component is therefore fixed to the bone, and is immovable relative to the bone once fixed.
  • the fastener 32 may be a surgical screw, pin, bolt, spike, etc.
  • the central passage allows the centralizing of the fastener 32 and may therefore contribute to the stability of the securing of the base component.
  • the joint 35 may further include a socket 35C.
  • the socket 35C may be part of a receptacle or housing 35D from which the segment 33B projects.
  • the socket 35C is a cavity that may have the geometry of a truncated sphere, such that a spherical joint is formed when the ball portion 35B is housed into the socket 35C.
  • Appropriate components such as a set screw, a circlip, etc may be used for the ball portion 35B to be captive in the socket 35C, yet rotatable in two or more rotational degrees of freedom.
  • the housing 35D may include electronic components 35D’, such as a battery, the 3D Hall-effect sensor, etc.
  • the electronic components may include a communications chip, such as a Bluetooth® chip, a wi-fi chip, etc, for joint signals to be emitted by the joint 35, as an option.
  • the shaft segment 33B may not be circular in cross-sectional shape. It is illustrated as being an elongated strip or bar, that may not be straight all along its length.
  • the shaft segment 33B is shown having various segments, in such a way that one of its subsegments, shown as 33C, may be brought closer to the patient body.
  • the subsegment 33C may have a bracket 33D that may be used to receive and hold a strap 33E (Fig. 2F).
  • the bracket 33D defines a slot for the strap to pass through it.
  • the bracket 33D is optional or could be located elsewhere. It is for example possible to use the strap 33E without the bracket 33D, or to have the bracket 33D at other locations on the segment 33B.
  • a post 33F may optionally be present at an end of the shaft segment 33B.
  • the post 33F may project in a different direction than a remainder of the shaft segment 33B, and is used to support the trackable portion 34.
  • the post 33F may have elbows so as not to be straight, for instance to be oriented in a particular direction based on the typical set-up in an operating room.
  • the trackable portion 34 is similar to that shown in Fig. 2B, i.e., a multi-faceted tracker.
  • the trackable portion 34 could be mounted directly to the post 33F.
  • a support 33G may be provided at the free end of the post 33F.
  • the support 33G may emulate the shape of the trackable portion 34, and hence define two or more connection points (three shown). This may contribute to the rigidity of the assembly.
  • Fig. 2F shows two of the trackers 30 on a limb, with one of the trackers 30 secured to the tibia, and another secured to the femur.
  • the trackers 30 are strapped to the limbs by way of straps 33E, such that the trackers 30 are relatively immovable. However, a part of the trackers 30 is on soft tissue, such that some small movements are possible. If movement occurs relative to the base 31 (and ball portion 35B), the movement may be quantified by the electronic components in the joints 35, and adjustments can be made in the tracking. Because of the configuration of the tracker 30 of Figs. 2C-2E, the optical detection may be observed as taking less volume, with the low-profile arrangement being such that the segment 33B is stowed along the limb, as opposed to projecting away from the limb as in prior-art trackers (e.g., Figs. 1A and 1B).
  • the relatively low height of the segment 33A may also contribute to the low profile of the tracker 30.
  • although the joint 35 is shown separating the shaft 33 into a pair of segments 33A and 33B (or stems, rods, bars), it is possible to provide the joint 35 directly on the base 31, such that the shaft 33 projects directly from the base 31.
  • the embodiment of Fig. 2C does not have a shaft or stem segment 33A, considering its small height.
  • the joint 35 could be directly at the trackable portion 34, such that the tracker 30 may be without shaft segment 33B.
  • the tracker device 40 is shown as being embodied by an image capture device, capable of illuminating its environment.
  • the tracker device 40 may have two (or more) points of view, such that triangulation can be used to determine the position of the trackers 30 in space, i.e., in the coordinate system of the CAS system 10.
  • the tracker device 40 may emit light, or use ambient light, to observe the trackers 30 from its points of view, so as to determine a position of the trackers 30 relative to itself.
  • the tracker device 40 can produce navigation data enabling the locating of objects within the coordinate system of the CAS system 10.
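The triangulation from two points of view mentioned above can be illustrated with the textbook rectified-stereo relation. This is an idealized model for illustration only, not the internals of any specific tracker device: two viewpoints separated by a known baseline observe the same marker, and the disparity between the two image positions yields depth.

```python
def triangulate(f, baseline, xl, xr, yl):
    """Ideal rectified stereo pair: focal length f (pixels), camera centers
    separated by `baseline` along x. (xl, yl) and xr are the image
    coordinates of the same marker in the left and right views. The
    disparity xl - xr gives depth Z, from which X and Y follow."""
    disparity = xl - xr
    Z = f * baseline / disparity
    X = xl * Z / f
    Y = yl * Z / f
    return X, Y, Z
```

With two or more such viewpoints, each retro-reflective element of the trackable portion 34 can be located in 3D, and the set of located elements gives the tracker's position and orientation.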
  • the tracker device 40 is of the type known as the Polaris products by Northern Digital Inc.
  • the tracker device 40 may form the complementary part of the CMM function of the CAS system 10, with the trackers 30 on the robot base 20A for example.
  • the tracker device 40 may be a depth camera as another possibility, such as when a QR token is used as an alternative to the retro-reflective elements shown in Figs. 2A, 2B and 2C.
  • the CAS controller 50 is shown in greater detail relative to the other components of the CAS system 10.
  • the CAS controller 50 has a processing unit 51 and a non-transitory computer-readable memory 52 communicatively coupled to the processing unit 51 and storing computer-readable program instructions executable by the processing unit 51 to perform functions, such as tracking the patient tissue and tools, using the position and orientation data from the robot 20, signals from the joint 35 if available, and the readings from the tracker device 40.
  • the computer-readable program instructions may include an operating system that may be viewed by a user or operator as a GUI on one or more of the interfaces of the CAS system 10.
  • the CAS system 10 may comprise various types of interfaces I/F, for the information to be provided to the operator.
  • the interfaces I/F may include monitors and/or screens, including wireless portable devices (e.g., phones, tablets), audio guidance, LED displays, head-mounted displays for virtual reality, augmented reality, mixed reality, among many other possibilities.
  • the interface I/F comprises a graphic-user interface (GUI) operated by the system 10.
  • the CAS controller 50 may also display images captured pre-operatively, or using cameras associated with the procedure (e.g., 3D camera, laparoscopic cameras, tool mounted cameras), for instance to be used in the collaborative/cooperative control mode of the system 10, or for visual supervision by the operator of the system 10, with augmented reality for example.
  • the CAS controller 50 may drive the robot arm 20A, in performing the surgical procedure based on the surgery planning achieved pre-operatively, or in maintaining a given position and orientation to support a tool.
  • the CAS controller 50 may run various modules, in the form of algorithms, code, non-transient executable instructions, etc, in order to operate the CAS system 10 in the manner described herein.
  • the CAS controller 50 may be part of any suitable processor unit, such as a personal computer or computers including laptops and desktops, tablets, servers, etc.
  • the tracking module 60 may be a subpart of the CAS controller 50, or an independent module or system.
  • the tracking module 60 receives the position and orientation data from the robot 20, the joint signals from the joint(s) 35 and the readings from the tracker device 40.
  • the tracking module 60 may hence determine the relative position of the objects in the referential system of the CAS system 10, such as relative to the robot arm 20A in a manner described below.
  • the tracking module 60 may track an object such as a bone using the movable surgical tracker 30 of Fig. 2A, 2B or Fig. 2C.
  • the movable surgical tracker 30 is in a fixed position and orientation relative to the bone, such that the tracking by the tracking module 60 is achieved using readings from the tracker device 40 or like camera.
  • the joint 35 has signalling capacity and can produce signals to indicate movement between the links it joins (e.g., segments 33A and 33B)
  • the joint signals can be used by the tracking module 60 as a transform indicative of the relation between the trackable portion 34 and the bone.
  • when the trackable portion 34 has changed its position and/or orientation relative to the bone, the tracking module 60 must take this changed position and/or orientation into consideration for the tracking.
  • a calibration step may be performed for the tracking module 60 to track the bone using the new position and/or orientation.
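The use of joint signals as a transform can be sketched as a chain of homogeneous transforms from the camera-observed trackable portion down to the bone. The function and frame names below are illustrative assumptions, not the patent's software; the point is that when the joint moves, only the joint-level transform changes, and the rest of the chain is reused.

```python
def mat_mul(A, B):
    """Multiply two 4x4 homogeneous transforms."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def bone_from_camera(T_cam_trackable, T_trackable_joint, T_joint_base, T_base_bone):
    """Chain of transforms: camera -> trackable portion -> joint -> base -> bone.
    On movement at the joint, only T_trackable_joint is updated (from the
    joint signals); the other links remain calibrated and unchanged."""
    T = mat_mul(T_cam_trackable, T_trackable_joint)
    T = mat_mul(T, T_joint_base)
    return mat_mul(T, T_base_bone)
```

This is why a self-sensing joint 35 can spare a full recalibration: its signals directly supply the one link of the chain that moved.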
  • the tracking module 60 may also be provided with models of the objects to be tracked.
  • the tracking module 60 may track bones and tools, and may use virtual bone models and tool models that may be merged with optically captured data.
  • the bone models may be acquired from pre-operative imaging (e.g., MRI, CT-scans), for example in 3D or in multiple 2D views, including with 2D X-ray to 3D bone model technologies.
  • the virtual bone models may also include some image processing done preoperatively, for example to remove soft tissue or refine the surfaces that will be exposed and tracked.
  • the virtual bone models may be of greater resolution at the parts of the bone that will be tracked during surgery, such as the knee articulation in knee surgery.
  • the bone models may also carry additional orientation data, such as various axes (e.g., longitudinal axis, mechanical axis, etc).
  • the bone models may therefore be patient specific. It is also considered to obtain bone models from a bone model library, with the data obtained from the video images used to match a generated 3D surface of the bone with a bone from the bone atlas.
  • the virtual tool models may be provided by the tool manufacturer, or may also be generated in any appropriate way so as to be a virtual 3D representation of the tool(s).
  • Additional data may also be available, such as tool orientation (e.g., axis data and geometry).
  • the tracking module 60 may obtain additional information, such as the axes related to bones or tools.
  • the CAS controller 50 may have the robot controller 70 integrated therein, if a robot is used in the CAS system 10. However, the robot controller 70 may be physically separated from the CAS controller 50, for instance by being integrated into the robot 20 (e.g., in the robot base 20B). The robot controller 70 is tasked with powering and/or controlling the various joints of the robot arm 20A. The robot controller 70 may also optionally calculate robot movements of the robot arm 20A, so as to control movements of the robot arm 20A autonomously in some instances, i.e., without intervention from the CAS controller 50. There may be some force feedback provided by the robot arm 20A to avoid damaging the bones, to avoid impacting other parts of the patient or equipment and/or personnel.
  • the robot controller 70 may perform actions based on a surgery planning.
  • the surgery planning may be a module programmed specifically for any given patient, according to the parameters of surgery desired by an operator such as an engineer and/or surgeon.
  • the parameters may include geometry of selected, planned bone cuts, planned cut depths, sequence or workflow of alterations with a sequence of surgical steps and tools, tools used, etc.
  • the trackers 30 and the tracker device 40 may be complementary tracking technology.
  • the position and orientation of the surgical tool calculated by the tracking module 60 using optical tracking and joint signals from the joint(s) 35 may be redundant over the tracking data provided by the robot controller 70 and/or the CAS controller 50 and its embedded robot arm sensors 25, referred to as maneuvering data for the robot arm 20A.
  • the redundancy may assist in ensuring the accuracy of the tracking of the surgical tool, and end effector 23.
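The redundancy between optical tracking and the robot's maneuvering data could be exploited by a simple agreement check between the two independently computed poses. The tolerances and function names below are assumed values for illustration only.

```python
import math

def rotation_angle_deg(R):
    """Angle of a 3x3 rotation matrix, from its trace (clamped for safety)."""
    tr = R[0][0] + R[1][1] + R[2][2]
    return math.degrees(math.acos(max(-1.0, min(1.0, (tr - 1.0) / 2.0))))

def poses_agree(p_optical, p_robot, R_relative, tol_mm=1.0, tol_deg=1.0):
    """Redundancy check: compare the tool position from optical tracking
    with the one from the robot arm sensors, plus the relative rotation
    between the two orientation estimates. Tolerances are assumptions."""
    return (math.dist(p_optical, p_robot) <= tol_mm
            and rotation_angle_deg(R_relative) <= tol_deg)
```

A disagreement beyond tolerance could prompt the CAS controller to pause and request recalibration, which is one way such redundancy "assists in ensuring accuracy."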
  • the calibration file may include all three of the geometrical patterns (e.g., the triangular patterns of 134A, of 134B and of 134C).
  • a contemplated procedure performed with the CAS system 10 or with a similar CAS system is set forth, with reference to a flow chart 100 illustrative of a method for tracking an object in computer-assisted surgery with a movable surgical tracker such as those shown in Figs. 2A, 2B and 2C, the method itself depicted in Fig. 5.
  • the method is an example of a procedure that may be performed by the CAS controller 50 and/or other parts of the CAS system 10 of the present disclosure.
  • the method 100 may be computer-readable program instructions in the non-transitory computer-readable memory 52 for example, executable by the processing unit 51 communicatively coupled to the memory 52.
  • a tracker such as the tracker 30 of Figs. 2A, 2B and 2C is on an object, such as a bone or tool.
  • the object is a bone.
  • the user may adjust the position and/or orientation of the trackable portion 34 relative to the base 31, for a line of sight to be present between the trackable portion 34 and the tracker device 40. For instance, this may include a set-up with two or more trackers, such as in Fig. 2F. This may include imparting movement via the joint(s) 35. If the joint 35 is lockable or blockable, such as via a set screw 35A (Figs. 2A, 2B and 2C), the joint 35 may be unlocked for the movement and then locked. In a variant, the joint 35 has inherent friction to prevent movement at the joint 35 unless an appreciable amount of force is applied to the tracker 30.
  • the object may be tracked, using the tracker device 40 optically viewing the tracker 30.
  • the tracking may be continuous, i.e., it may be nonstop during one or more steps.
  • the continuous tracking may include pauses.
  • reference points may be digitized for the object, e.g., the bone. This may include one or more axes, and surface data such as a cloud of points. 103 may occur before 102, or during 102, for example. In some instances, models may be associated with the trackers 30, via calibration steps, model generation with depth cameras, etc, such that the digitizing (also known as registering) may be minimal or unnecessary.
  • the reference points or like object data may be recorded in the coordinate system of the CAS system 10. The coordinate system may be on the bone.
  • the reference points or like object data may be trackable with reference to the tracker 30.
  • signalling may be received indicative of a change of position and/or orientation of the trackable portion 34 relative to the object, by movement at the joint 35.
  • the signalling may come from a user interface.
  • the user may ask for a pause of tracking while the trackable portion 34 is moved to another position and/or orientation using the joint 35, for instance to liberate some space or enhance the ergonomics of the surgical site. This may include unlocking and locking the joint 35, and/or removing the strap 33E, relocating the segment 33B (e.g., Figs. 2A-2C) and reinstalling the strap 33E.
  • the bone is kept immobile during movement of the trackable portion 34.
  • the tracking module 60 may recalibrate the trackable portion 34 using the reference points on the bone as reference to set the new position and/or orientation of the trackable portion 34. This may include recording points on the bone as per 103, or using the segment 33A as reference as another possibility. Other approaches are considered, such as using image processing to quantify the change.
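One hypothetical way to recalibrate from re-digitized reference points is to rebuild a bone coordinate frame from three non-collinear points: digitizing the same three points before and after the movement gives the bone frame in the trackable portion's old and new coordinates, from which the change can be quantified. The frame construction below is a standard three-point method, offered as a sketch rather than the patent's approach.

```python
import math

def frame_from_points(p1, p2, p3):
    """Orthonormal frame (origin + rotation rows) from three non-collinear
    reference points digitized on the bone: x along p1->p2, z normal to
    the plane of the three points, y completing the right-handed frame."""
    def sub(a, b):
        return [a[i] - b[i] for i in range(3)]
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0]]
    def norm(a):
        n = math.sqrt(sum(x * x for x in a))
        return [x / n for x in a]
    x = norm(sub(p2, p1))
    z = norm(cross(x, sub(p3, p1)))
    y = cross(z, x)
    return p1, [x, y, z]
```

Comparing the frame computed before the move with the frame computed after yields the transform to apply to the tracking, in the spirit of the recalibration described above.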
  • the joint 35 provides signals through its electronic components, to quantify the movement.
  • the signals may be provided in real time, for the tracking module 60 to receive transform data.
  • the tracking module 60 may recalibrate the trackable portion 34 using the reference points on the bone as reference and the transform data from the joint 35 (if electronic) to set the new position and/or orientation of the trackable portion 34.
  • the object is tracked optically using the tracker having the trackable portion viewed by the tracker device using the change of position and/or orientation.
  • the tracking of 105 is continuous, and may be with or without interruption, such as if the signals are provided in real-time by the joint 35.
  • 105 may also include outputting the tracking data, and this may be in the form of images on an interface, numerical data, etc.
  • the method 100 may generally be described as including: tracking the object optically using a tracker having a trackable portion with a joint enabling movement, the trackable portion being viewed by a tracker device; receiving signalling indicative of a change of position and/or orientation of the trackable portion relative to the object, by movement at the joint; and tracking the object optically using the tracker having the trackable portion viewed by the tracker device using the change of position and/or orientation. It may be necessary to use a computing device, notably because the human eye does not have the capacity to quantify changes of position and/or orientation.
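The overall flow summarized above can be sketched minimally as follows. This is a hypothetical API, not the patent's software, and the trackable-to-object "transform" is reduced to a translation offset for brevity: optical readings of the trackable portion are combined with the known offset to locate the object, and signalling of movement at the joint updates that offset so tracking can continue.

```python
class TrackingSession:
    """Minimal sketch of the method: track optically, receive signalling of
    a change at the joint, keep tracking with the updated relation."""

    def __init__(self, trackable_to_object):
        # offset from the trackable portion to the object (e.g., bone), in mm
        self.trackable_to_object = list(trackable_to_object)

    def locate_object(self, trackable_pose):
        # optical reading of the trackable portion + known offset -> object position
        return [p + o for p, o in zip(trackable_pose, self.trackable_to_object)]

    def on_joint_movement(self, delta):
        # signalling indicative of a change at the joint updates the offset,
        # so tracking may resume (or continue) without re-fixing the base
        self.trackable_to_object = [o + d
                                    for o, d in zip(self.trackable_to_object, delta)]
```

As the text notes, a computing device is needed here precisely because quantifying such changes is beyond unaided human perception.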
  • the CAS system 10 may generally be described as being a system for tracking an object in computer-assisted surgery.
  • the CAS system 10 may include a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: tracking the object optically using a tracker having a trackable portion with a joint enabling movement, the trackable portion being viewed by a tracker device; receiving signalling indicative of a change of position and/or orientation of the trackable portion relative to the object, by movement at the joint; and tracking the object optically using the tracker having the trackable portion viewed by the tracker device using the change of position and/or orientation.

Abstract

A system for tracking an object in computer-assisted surgery may include a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: tracking the object optically using a tracker having a trackable portion with a joint enabling movement, the trackable portion being viewed by a tracker device; receiving signalling indicative of a change of position and/or orientation of the trackable portion relative to the object, by movement at the joint; and tracking the object optically using the tracker having the trackable portion viewed by the tracker device using the change of position and/or orientation.

Description

MOVABLE SURGICAL TRACKER
CROSS-REFERENCE TO RELATED APPLICATION
[0001] The present disclosure claims the benefit of United States Patent Application No. 63/563,487, filed on March 11, 2024, the content of which is incorporated herein by reference.
TECHNICAL FIELD
[0002] The present disclosure relates to computer-assisted surgery including bone and tool tracking, and to trackers tied to the patient in the context of computer-assisted surgery.
BACKGROUND OF THE ART
[0003] Tracking of surgical instruments or tools is an integral part of computer-assisted surgery (hereinafter “CAS”), including robotized CAS. The end effector, the tools, and bodily parts are tracked for position and/or orientation in such a way that relative navigation information pertaining to bodily parts is obtained. The information is then used in various interventions (e.g., orthopedic surgery, neurological surgery) with respect to the body, such as bone alterations, implant positioning, incisions and the like during surgery.
[0004] In CAS, optical tracking is commonly used in different forms, for instance by the presence of optically-detectable trackers on the tools, on the end effector and/or operating end of a robotic arm, in addition to being present on the patient. For example, the optically-detectable trackers are passive retro-reflective components on tools and bones, though other types of trackers may be used. The trackers are viewed by a tracking device, such as a tracking camera (e.g., Navitracker®), a depth camera, and by triangulation the position and orientation of the trackers is calculable to output navigation data. The robot arm may also be equipped with a tracker device.
[0005] One known constraint of optical trackers is that a line of sight must be maintained between the tracker and the tracker device. There may be challenges in reconciling this line-of-sight requirement with the limited space available at the surgical site. Moreover, a tendency in surgery is to limit invasiveness of surgical procedures, to the extent possible. Accordingly, the incisions in the soft tissue are often small. However, trackers must be fixed to the patient’s bone and must not be too close to resections. Figs. 1A and 1B are exemplary trackers 1 that are respectively attached to the tibia and to the femur. Both trackers include a base 1A that is fixed to the bone by fasteners 1B, and an arm 1C that supports a trackable portion 1D at a distance from the bone, in such a way that the trackable portion 1D is on display for the tracker device. The trackable portion 1D is consequently in a fixed relation relative to the bone.
SUMMARY
[0006] In accordance with a first aspect of the present disclosure, there is provided a system for tracking an object in computer-assisted surgery, comprising: a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: tracking the object optically using a tracker having a trackable portion with a joint enabling movement, the trackable portion being viewed by a tracker device; receiving signalling indicative of a change of position and/or orientation of the trackable portion relative to the object, by movement at the joint; and tracking the object optically using the tracker having the trackable portion viewed by the tracker device using the change of position and/or orientation.
[0007] Further in accordance with the first aspect, for instance, receiving signalling includes receiving signals from a joint in the tracker.
[0008] Still further in accordance with the first aspect, for instance, receiving signals from the joint in the tracker includes receiving signals quantifying the change.
[0009] Still further in accordance with the first aspect, for instance, tracking the object using the change of position and/or orientation includes tracking the object continuously while receiving the signals.
[0010] Still further in accordance with the first aspect, for instance, receiving signalling includes receiving an interaction requesting a pause in the tracking.
[0011] Still further in accordance with the first aspect, for instance, the tracker may be recalibrated to quantify the change after the pause.
[0012] Still further in accordance with the first aspect, for instance, reference points are obtained on a surface of the object prior to or during the tracking of the object.
[0013] Still further in accordance with the first aspect, for instance, the object is a bone and the tracker is fixed to the bone.
[0014] Still further in accordance with the first aspect, for instance, the tracker is included with the joint and the trackable portion.
[0015] Still further in accordance with the first aspect, for instance, the joint is a joint providing at least one rotational degree of freedom.
[0016] Still further in accordance with the first aspect, for instance, the joint is a spherical joint.
[0017] Still further in accordance with the first aspect, for instance, the joint is a joint providing at least one translational degree of freedom.
[0018] Still further in accordance with the first aspect, for instance, the joint is a cylindrical joint.
[0019] Still further in accordance with the first aspect, for instance, the joint has encoding capacity to emit signals quantifying movement at the joint.
[0020] Still further in accordance with the first aspect, for instance, the tracker device is included for optically tracking the tracker.
[0021] In accordance with a second aspect, there is provided a tracker for optical tracking during computer-assisted surgery comprising: a base configured to be secured to a bone; a trackable portion having at least one optically detectable element; at least one stem spacing the base from the trackable portion; and a joint in the tracker enabling at least one degree of freedom of movement between the base and the trackable portion, the joint having a sensor configured to emit a signal quantifying the movement.
[0022] Further in accordance with the second aspect, for instance, the joint is in the stem, whereby the stem has a first segment from the base to the joint, and a second segment from the joint to the trackable portion.
[0023] Still further in accordance with the second aspect, for instance, a ratio of length between the second segment and the first segment is at least 5:1.
[0024] Still further in accordance with the second aspect, for instance, the joint is between the stem and the base.
[0025] Still further in accordance with the second aspect, for instance, the joint is a spherical joint providing at least two rotational degrees of freedom of movement.
[0026] Still further in accordance with the second aspect, for instance, a ball portion of the spherical joint is fixed relative to the base.
[0027] Still further in accordance with the second aspect, for instance, a channel extends from a surface of the ball portion and into the base, the channel configured to receive a fastener.
[0028] Still further in accordance with the second aspect, for instance, the sensor is an assembly including a 3D Hall-effect sensor and magnet.
[0029] Still further in accordance with the second aspect, for instance, the sensor includes a wireless chip configured for wireless communication of the signal.
[0030] Still further in accordance with the second aspect, for instance, the joint is a cylindrical joint providing one rotational degree of freedom of movement and one translational degree of freedom of movement.
[0031] Still further in accordance with the second aspect, for instance, a strap holder is on the stem.
DESCRIPTION OF THE DRAWINGS
[0032] Figs. 1A and 1B are trackers on bone in accordance with the prior art;
[0033] Fig. 2A is a perspective view of a movable surgical tracker with movable adjustment capacity in accordance with an aspect of the present disclosure;
[0034] Fig. 2B is a perspective view of a movable surgical tracker with movable adjustment capacity in accordance with another aspect of the present disclosure;
[0035] Fig. 2C is a perspective view of a movable surgical tracker with movable adjustment capacity in accordance with yet another aspect of the present disclosure;
[0036] Fig. 2D is a perspective view of the movable surgical tracker with movable adjustment capacity of Fig. 2C, having its base secured to a bone;
[0037] Fig. 2E is an assembly view of the movable surgical tracker with movable adjustment capacity of Fig. 2D, having a trackable portion thereof directed toward the base for assembly;
[0038] Fig. 2F is a perspective view showing a pair of the movable surgical tracker with movable adjustment capacity of Fig. 2D, respectively secured to the tibia and femur during knee surgery;
[0039] Fig. 3 is a schematic view of a robotic surgery system in accordance with an aspect of the present disclosure, relative to a patient, using the movable surgical tracker of any one of Fig. 2A to 2C;
[0040] Fig. 4 is a block diagram of the tracking system for robotized computer-assisted surgery of Fig. 3; and
[0041] Fig. 5 is a flow chart of a method for calibrating a tracker device with a CAS system.
DETAILED DESCRIPTION
[0042] Referring to Figs. 3 and 4, a robotic surgery system for computer-assisted surgery (CAS) is generally shown at 10, and is used to provide surgery assistance to an operator. Robotic capability is merely optional, as the system 10 may be without a robot. For simplicity, it will be referred to herein as the CAS system 10. In Fig. 3, the system 10 is shown relative to a dummy patient in prone decubitus, but only as an example. The system 10 could be used for any body parts, including non-exhaustively hip joint, spine, and shoulder bones, for orthopedic surgery, but could also be used in other types of surgery. For example, the system 10 could be used for surgery of all sorts, such as brain surgery, and soft tissue surgery.
[0043] The CAS system 10 may be robotized in a variant, and has, may include or may be used with a robot 20. The CAS system 10 may further include optical trackers such as movable surgical tracker device 30, a tracker device 40, a CAS controller 50 (also known as a super controller 50), a tracking module 60, and a robot controller 70 (also known as a robot driver), present if a robot is used, or any combination thereof:
• The robot 20, shown by its robot arm 20A, may optionally be present as the working end of the system 10, and may be used to perform or guide bone alterations as planned by an operator and/or the CAS controller 50 and as controlled by the CAS controller 50. The robot arm 20A may also be configured for a collaborative/cooperative mode in which the operator may manipulate the robot arm 20A, or the tool supported by the robot arm 20A, though the tool may be operated by a human operator. For example, the tooling end, also known as end effector, may be manipulated by the operator while supported by the robot arm 20A. The robot 20 may be the coordinate measuring machine (CMM) of the CAS system 10;
Optical trackers, such as the trackers 1 of Figs. 1A and 1 B and the movable surgical tracker 30 of Figs. 2A, 2B and 2C are positioned on the robot 20, on patient tissue (e.g., bones B), and/or on the tool(s) T and surgical instruments, and provide tracking data for the robot 20, the patient and/or tools.
• The tracking device 40, also known as a sensor device, apparatus, etc., performs optical tracking of the optical trackers 30, so as to enable the tracking in space (a.k.a., navigation) of the robot 20, the patient and/or tools;
• The CAS controller 50, also known as the super controller, includes the processor(s) and appropriate hardware and software to run a computer-assisted surgery procedure in accordance with one or more workflows. The CAS controller 50 may include or operate the tracking device 40, the tracking module 60, and/or the robot controller 70. As described hereinafter, the CAS controller 50 may also drive the robot arm 20A through a planned surgical procedure;
• The tracking module 60 is tasked with determining the position and/or orientation of the various relevant objects during the surgery procedure, such as the end effector of the robot arm 20A, bone(s) B and tool(s) T, using data acquired by the tracking device 40 and by the robot 20, and/or obtained from the robot controller 70. The position and/or orientation may be used by the CAS controller 50 to control the robot arm 20A;
• The robot controller 70 is optionally present, and is tasked with powering or controlling the various joints of the robot arm 20A, based on operator demands or on surgery planning, and may also be referred to as a robot controller module that is part of the super controller 50. The robot controller 70 may also optionally calculate robot movements of the robot arm 20A, so as to control movements of the robot arm 20A autonomously in some instances, i.e., without intervention from the CAS controller 50;
• An additional camera or cameras may be present, for instance as a complementary registration tool. The camera may for instance be mounted on the robot 20, such as on the robot arm 20A, such that the point of view of the camera is known in the frame of reference, also known as the coordinate system.
[0044] Other components, devices, and systems may be present, such as surgical instruments and tools T, interfaces I/F such as displays, screens, computer stations, servers, and the like. Secondary tracking systems may also be used for redundancy.
[0045] Referring to Fig. 3, the robot 20 may have the robot arm 20A standing from a base 20B, for instance in a fixed relation relative to the operating-room (OR) table supporting the patient, whether it is attached or detached from the table. The robot arm 20A has a plurality of joints 21 and links 22, of any appropriate form, to support an end effector 23 that may interface with the patient, or may be used during surgery without interfacing with the patient. For example, the end effector or tool head may optionally incorporate a force/torque sensor for the collaborative/cooperative control mode, in which an operator manipulates the robot arm 20A. The robot arm 20A is shown as being a serial mechanism, arranged for the tool head 23 to be displaceable in a desired number of degrees of freedom (DOF). The tool head 23 may for example be a support that is not actuated, the support being used to support a tool, with the robot arm 20A used to position the tool relative to the patient. In a variant, the robot arm 20A controls 6-DOF movements of the tool head, i.e., X, Y, Z in the coordinate system, and pitch, roll and yaw. Fewer or additional DOFs may be present. For simplicity, only a fragmented illustration of the joints 21 and links 22 is provided, but more joints 21 of different types may be present to move the end effector 23 in the manner described above. The joints 21 are powered for the robot arm 20A to move as controlled by the CAS controller 50 in the six DOFs, and in such a way that the position and orientation of the end effector 23 in the coordinate system may be known, for instance by readings from encoders on the various joints 21. Therefore, the powering of the joints is such that the end effector 23 of the robot arm 20A may execute precise movements, such as moving along a single direction in one translation DOF, or being restricted to moving along a plane, among possibilities.
Such robot arms 20A are known, for instance as described in United States Patent Application Serial No. 11/610,728, incorporated herein by reference.
[0046] The end effector 23 of the robot arm 20A may be defined by a chuck or like tool interface, typically actuatable in rotation. As a non-exhaustive example, numerous tools may be used as the end effector for the robot arm 20A, such tools including a registration pointer as shown in Fig. 1, equipped with a tracker device 30, a reamer (e.g., cylindrical, tapered), a reciprocating saw, a retractor, a camera, an ultrasound unit, a laser rangefinder or light-emitting device (e.g., the indicator device of US Patent No. 8,882,777), a laminar spreader, an instrument holder, or a cutting guide, depending on the nature of the surgery. The various tools may be part of a multi-mandible configuration or may be interchangeable, whether with human assistance, or as an automated process. The installation of a tool in the tool head may then require some calibration in order to track the installed tool in the X, Y, Z coordinate system of the robot arm 20A.
[0047] The end effector 23 of the robot arm 20A may be positioned by the robot 20 relative to surgical area A in a desired orientation according to a surgical plan, such as a plan based on preoperative imaging. Due to the proximity between the robot 20 and the surgical area A, the robot 20 may be covered partially with a surgical drape D, also known as a surgical robotic drape. The surgical drape D is a sterile panel (or panels), tubes, bags or the like that form(s) a physical barrier between the sterile zone (e.g., surgical area) and some equipment that may not fully comply with sterilization standards, such as the robot 20. In an embodiment, the surgical drape D is transparent such that one can see through the drape D. In an embodiment, the robot is entirely covered with the surgical drape D, and this includes the base 20B, but with the exception of the end effector 23. Indeed, as the end effector 23 interacts or may interact with the human body, it may be sterilized and may not need to be covered by the surgical drape D, to access the patient. Some part of the robot 20 may also be on the sterile side of the surgical drape D. In a variant, a portion of the robot arm 20 is covered by the surgical drape D. For example, the surgical drape D may be in accordance with United States Patent Application No. 15/803,247, filed on November s, 2017 and incorporated herein by reference.
[0048] In order to position the end effector 23 of the robot arm 20A relative to the patient B, the CAS controller 50 can manipulate the robot arm 20A automatically (without human intervention), or a surgeon can manually operate the robot arm 20A (e.g., by physically manipulating it, or via a remote controller through the interface I/F) to move the end effector 23 of the robot arm 20A to the desired location, e.g., a location called for by a surgical plan to align an instrument relative to the anatomy. Once aligned, a step of a surgical procedure can be performed, such as by using the end effector 23. To assist in the maneuvering and navigating of the robot arm 20A, a tracker device 30 may optionally be secured to the distalmost link, and may be distinct from the tracker device 30 on the instrument supported by the end effector 23.
[0049] As shown in Fig. 4, the robot arm 20A may include sensors 25 in its various joints 21 and links 22. The sensors 25 may be of any appropriate type, such as rotary encoders, optical sensors or position switches, as a non-exhaustive list of potential sensors, for the position and orientation of the end effector 23, and of the tool in the end effector 23, to be known. More particularly, the tracking module 60 may determine the position and orientation of the robot 20 in a frame of reference of the robot 20, such as by obtaining the position (x, y, z) and orientation (phi, theta, rho) of the end effector 23 from the CAS controller 50 using the sensors 25 in the robot arm 20A, i.e., robot coordinates may be an integrated function of the robot 20 in that it may determine the position and orientation of its end effector 23 with respect to its coordinate system. Using the data from the sensors 25, the robot 20 may be the coordinate measuring machine (CMM) of the CAS system 10, with a frame of reference (e.g., coordinate system, referential system) of the procedure being relative to the fixed position of the base 20B of the robot 20. The sensors 25 must provide the precision and accuracy appropriate for surgical procedures. The coupling of tools to the robot arm 20A may automatically cause a registration of the position and orientation of the tools in the frame of reference of the robot 20, though steps of calibration could be performed, as explained below.
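The CMM function described above amounts to composing the per-joint transforms measured by the sensors 25 into a single pose of the end effector 23 in the frame of reference of the base 20B. Purely as an illustrative sketch (not the actual kinematics of any robot arm 20A; a planar two-joint revolute chain is assumed for brevity), the composition could look like:

```python
import numpy as np

def rot_z(theta):
    """Rotation about Z by theta (radians), as a 4x4 homogeneous transform."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def trans(x, y, z):
    """Pure translation as a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def end_effector_pose(joint_angles, link_lengths):
    """Compose per-joint transforms (encoder angle followed by a fixed
    link offset) into the pose of the end effector in the base frame."""
    T = np.eye(4)
    for theta, length in zip(joint_angles, link_lengths):
        T = T @ rot_z(theta) @ trans(length, 0.0, 0.0)
    return T

# Hypothetical two-joint planar example: both encoders read 90 degrees,
# both links of unit length.
pose = end_effector_pose([np.pi / 2, np.pi / 2], [1.0, 1.0])
```

The same chain-of-transforms idea extends to six DOFs with additional joint types; the encoder readings simply parameterize each joint's transform.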
[0050] Referring to Fig. 2A, a movable surgical tracker 30 in accordance with an aspect of the present disclosure is shown. The tracker 30 may be known as a trackable element, marker, trackable reference, reference tracker, navigation marker, or active sensor (e.g., wired or wireless) that may for example include infrared emitters. In the variant shown, the tracker 30 has a base 31, by which the tracker 30 may be secured to a bone. The base 31 may be a plate, a block, a bracket, etc. It may be attached in any appropriate way to a bone, such as via surgical screws 32, pins, bolts, spikes, etc., or in non-invasive manners such as via a strap, belt, etc.
[0051] A shaft 33 may project from the base 31 and spaces a trackable portion 34 from the base 31. The shaft 33 may be referred to as an arm, a stem, a support, a spacer, etc. The shaft 33 may be separated into two segments, shown as 33A and 33B, separated by a joint 35. The joint 35 is described in further detail hereinafter, and may serve to enable the movement of the trackable portion 34 relative to the base 31. The location of the joint 35 is approximate, as it is possible to have the joint 35 closer to the base 31, or closer to the trackable portion 34. In a variant, there may be more than one such joint 35.
[0052] The trackable portion 34 may have passive retro-reflective elements 34’, that reflect light. The tracker 30 has a known geometry so as to be recognizable through detection by the tracker device 40. In Fig. 2A, the retro-reflective elements 34’ are spheres (i.e., quasi-spheres).
[0053] The retro-reflective elements 34’ are arranged in a given geometrical pattern to be recognized by the optical tracker device 40 of the CAS system 10. In Fig. 2A, the retro-reflective elements 34’ are arranged in a scalene triangle defined by the centers of the optical elements 34’. There may be more or fewer optically detected elements. Moreover, although triangular geometrical patterns are illustrated, it is contemplated to use other geometrical patterns, such as lines and various polygonal shapes.
[0054] In Fig. 2A, the joint 35 is shown as being a spherical joint, i.e., a joint that provides two or three rotational degrees of freedom (DOF) between the segments 33A and 33B. The joint 35 could also be a pivot joint, or any other such joint providing a single degree of freedom of rotation. In a variant, the joint 35 has self-tracking capacity, i.e., sensor technology by which the movement and relation between the segments 33A and 33B may be tracked and quantified. For example, the joint 35 may integrate encoder technology for any movement in the joint 35 to be known in real-time, or other sensing technologies such as a 3D Hall-effect sensor and magnet. Hence, the joint 35 may provide joint signals indicative of any movement that occurs at the joint 35, and therefore joint signals representative of movement between the segments 33A and 33B. Moreover, the joint 35 may have a locking device, such as set screw 35A, though this is optional. The joint 35 may be self-powered, e.g., by way of a battery, and may communicate with the CAS controller 50 in any appropriate way (e.g., wired, wireless, etc). In a variant, there are no joint signals electronically output by the joint 35.
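As a hedged illustration of how joint signals from such a self-tracking joint 35 could be used: the reported angles define a rotation between the segments 33A and 33B, which in turn updates the offset of the trackable portion 34 relative to the base 31. The angle convention, function names and values below are assumptions for illustration, not part of the disclosure:

```python
import numpy as np

def joint_rotation(rx, ry, rz):
    """Rotation matrix from the joint's reported angles (radians);
    an X-Y-Z rotation convention is assumed here for illustration."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def trackable_offset(joint_angles, segment_b_vector):
    """Offset of the trackable portion relative to the base, after the
    joint signals report a rotation between segments 33A and 33B."""
    return joint_rotation(*joint_angles) @ segment_b_vector

# Segment 33B nominally along +Z; the joint reports a 90-degree tilt
# about X (hypothetical reading).
offset = trackable_offset((np.pi / 2, 0.0, 0.0), np.array([0.0, 0.0, 1.0]))
```

A single-DOF pivot joint would be the same computation with only one of the three angles ever non-zero.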
[0055] While a single joint 35 is shown, alternative configurations are possible, such as providing two separate rotational DOF joints, to separate the shaft 33 into three or more segments. As another possibility, there may be joints 35 with one or more rotational DOFs between the base 31 and the shaft 33, and between the shaft 33 and the trackable portion 34. As shown in Fig. 2B, the joint 35 may also allow a translation, such as by being a telescopic joint. The telescopic joint could also optionally provide joint signals.
[0056] Referring to Fig. 2B, a movable surgical tracker 30 in accordance with another aspect of the present disclosure is shown. The tracker 30 may be known as a trackable element, marker, trackable reference, reference tracker, navigation marker, or active sensor (e.g., wired or wireless) that may for example include infrared emitters. In the variant shown, the tracker 30 has a base 31, by which the tracker 30 may be secured to a bone. The base 31 may be a plate, a block, a bracket, etc. It may be attached in any appropriate way to a bone, such as via surgical screws 32, pins, bolts, spikes, etc., or in non-invasive manners such as via a strap, belt, etc.
[0057] A shaft 33 may project from the base 31 and spaces a trackable portion 34 from the base 31. The shaft 33 may be referred to as an arm, a support, a spacer, etc. The shaft 33 may be separated into two segments, shown as 33A and 33B, separated by a joint 35. The joint 35 is described in further detail hereinafter, and may serve to enable the movement of the trackable portion 34 relative to the base 31. The location of the joint 35 is approximate, as it is possible to have the joint 35 closer to the base 31, or closer to the trackable portion 34. In a variant, there may be more than one such joint 35. [0058] The trackable portion 34 may have passive retro-reflective elements, that reflect light. The trackers 30 have a known geometry so as to be recognizable through detection by the tracker device 40. For example, the trackers 30 may be retro-reflective lenses. The trackable portion 34 may be as described in U.S. Patent No. 8,386,022.
[0059] The tracker 30 may thus be known as a multifaceted tracker. The tracker 30 of the exemplary embodiment has three tracker ends 130’ supported by arms 130” that interface the tracker ends 130’ to the shaft 33. Together, the tracker ends 130’ provide three sets of three detectable elements. For example, the tracker ends 130’ are each provided with a pyramidal body having faces 131A, 131B, 131C (collectively, the faces 131). The faces 131 each define an opening 132 having a given geometrical shape. In the embodiment of Fig. 2B, the given geometrical shape is a circle.
[0060] Retro-reflective surfaces are positioned in the openings 132, so as to form circular optical elements 133A, 133B, and 133C of the tracker ends 130’. Other shapes are also considered for the optical elements 133A, 133B, and 133C. The retro-reflective surfaces are made of a retro-reflective material that will be detected by the optical tracker device 40 associated with the CAS system 10. For instance, the material Scotch Lite™ is suited to be used as a retro-reflective surface.
[0061] As the optical elements 133A, 133B, and 133C must be in a given geometrical pattern to be recognized by the optical tracker device 40 of the CAS system 10, the optical elements 133A, 133B, and 133C are regrouped in one embodiment in sets of three. Referring to Fig. 2B, a first set of three optical elements consists of the optical elements 133A, each of which is in a different one of the tracker ends 130’. Similarly, a second set consists of the elements 133B, and a third set consists of the elements 133C.
[0062] In the embodiment of Fig. 2B, each of the elements of a same set (e.g., the first set of elements 133A) is parallel to a same plane. Accordingly, the elements 133A are visible from a same field of view. The sets of elements 133A, 133B, 133C are strategically positioned with respect to one another so as to optimize a range of visibility of the tracker device 40. More specifically, the sets are positioned such that once the tracker device 40 of the CAS system 10 loses sight of one of the sets, another set is visible. This ensures the continuous tracking of the tool T having a tracker device 30 within a given range of field of view.
[0063] The sets each form a geometrical pattern that is recognized by the tracking module 60 of the CAS system 10. The combination of circular openings and retro-reflective surface gives a circular shape to the optical elements 133A, 133B, 133C. According to the angle of view of the tracker device 40, these circles will not always appear as being circular in shape. Therefore, the position of the center of the circles can be calculated as a function of the shape perceived from the angle of view by the optical sensor apparatus.
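A minimal sketch of this center calculation, under the common assumption that a circle viewed at an angle projects approximately to an ellipse: fit a general conic to the detected edge points and extract the ellipse center. The routine below is illustrative only, not the algorithm of any particular tracking module:

```python
import numpy as np

def ellipse_center(points):
    """Fit a conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 to edge
    points (least squares via the SVD null space) and return the center,
    which solves [[2a, b], [b, 2c]] @ (x0, y0) = (-d, -e)."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(A)
    a, b, c, d, e, _ = Vt[-1]
    return np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])

# Synthetic data: the perceived (elliptical) shape of a circular optical
# element viewed at an angle, centered at (3, -2) in image coordinates.
t = np.linspace(0.0, 2.0 * np.pi, 50, endpoint=False)
pts = np.column_stack([3.0 + 2.0 * np.cos(t), -2.0 + 0.5 * np.sin(t)])
center = ellipse_center(pts)
```

Under perspective projection the projected center of the circle does not coincide exactly with the ellipse center; the ellipse center is a first-order approximation that a real system would refine.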
[0064] In the embodiment of Fig. 2B, the geometrical pattern therefore consists of a triangle defined by the centers of the optical elements 133A, 133B or 133C of the sets. It is suggested that the three triangles of the three different sets of optical elements 133A, 133B, 133C be of different shape, with each triangle being associated with a specific orientation with respect to the tool. Alternatively, the three triangles formed by the three different sets may be the same, but the perceived shape of the circular reflective surfaces must be used to identify which of the three sets of reflective surfaces is seen. There may be more or fewer optical elements, and sets of optical elements, as described in U.S. Patent No. 8,386,022. Moreover, although triangular geometrical patterns are illustrated, it is contemplated to use other geometrical patterns, such as lines and various polygonal shapes.
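As an illustrative sketch of the set identification suggested above, when the three triangles are of different shape: the sorted side lengths of the observed triangle can be matched against the calibrated triangle of each set. The calibration values, set names and matching criterion below are hypothetical:

```python
import numpy as np

def side_lengths(tri):
    """Sorted side lengths of a triangle given as three point coordinates."""
    a, b, c = tri
    return sorted([np.linalg.norm(a - b),
                   np.linalg.norm(b - c),
                   np.linalg.norm(c - a)])

def identify_set(observed, calibrated):
    """Return the name of the calibrated pattern whose sorted side
    lengths best match the observed triangle (assumes the three
    triangles differ in shape)."""
    obs = side_lengths(observed)
    return min(calibrated,
               key=lambda name: sum(abs(o - s)
                                    for o, s in zip(obs, calibrated[name])))

# Hypothetical calibration file: sorted side lengths for each set's triangle.
calibration = {"133A": [3.0, 4.0, 5.0],
               "133B": [2.0, 6.0, 7.0],
               "133C": [4.0, 4.0, 4.0]}

tri = np.array([[0.0, 0.0], [3.0, 0.0], [3.0, 4.0]])  # a 3-4-5 triangle
which = identify_set(tri, calibration)
```

When the triangles are identical, as the paragraph notes, the perceived shape of the circular elements would have to disambiguate the sets instead.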
[0065] In Fig. 2B, the joint 35 is shown as being a cylindrical joint, i.e., a joint that provides two degrees of freedom (DOF) between the segments 33A and 33B, one rotational DOF and one translational DOF. In a variant, the joint 35 has self-tracking capacity, i.e., sensor technology by which the movement and relation between the segments 33A and 33B may be tracked and quantified. For example, the joint 35 may integrate encoder technology for any movement in the joint 35 to be known in real-time, or other sensing technologies such as a 3D Hall-effect sensor and magnet. Hence, the joint 35 may provide joint signals indicative of any movement that occurs at the joint 35, and therefore joint signals representative of movement between the segments 33A and 33B. Moreover, the joint 35 may have a locking device, such as set screw 35A, though this is optional. The joint 35 may be self-powered, e.g., by way of a battery, and may communicate with the CAS controller 50 in any appropriate way (e.g., wired, wireless, etc). In a variant, there are no joint signals electronically output by the joint 35.
[0066] In a variant, the trackable portion 34 may be embodied by other detectable devices, patterns, etc. For example, the trackable portion 34 could include a QR code laid on a flat surface, or other optically-detectable element(s), and this applies to all trackers 30 described herein. In an embodiment, the trackers 30 may be active emitters.
[0067] Referring to Figs. 2C-2E another variant of the tracker 30 is shown. The tracker 30 has components that may be found in the trackers 30 of Fig. 2A and/or Fig. 2B, whereby like reference numerals will pertain to like components.
[0068] In the variant of Figs. 2C-2E, the tracker 30 has a base 31, by which the tracker 30 may be secured to a bone. The base 31 is illustrated as being a plate 31A, of circular shape, but this is only an option, as it could have other shapes. A shaft segment 33A may project from the base 31, and may be relatively short in contrast to a shaft segment 33B, that may also be referred to as an arm (e.g., the shaft segment 33B may be at least 5 times as long as the shaft segment 33A). The segments 33A and 33B may be separated by a joint 35, shown as being a spherical joint. The spherical joint 35 may have a ball portion 35B. The ball portion 35B may be mounted to the segment 33A. In a variant, the plate 31A, the shaft segment 33A and the ball portion 35B form a monoblock, referred to as a base component. However, other configurations are possible. As observed from Fig. 2D, a central passage may be defined in the ball portion 35B, so as to extend into the shaft segment 33A and the plate 31A. Therefore, the fastener 32 may be inserted into the central passage, via the ball portion 35B, to secure the base component to the bone. An appropriate shoulder may be defined in the channel (e.g., a counterbore) for the head of the fastener 32 to be abutted and concealed in the ball portion 35B. The base component is therefore fixed to the bone and immovable when fixed to it. The fastener 32 may be a surgical screw, pin, bolt, spike, etc. The central passage allows the centralizing of the fastener 32 and may therefore contribute to the stability of the securing of the base component.
However, other fastening configurations are possible, including that of Fig. 2A.
[0069] The joint 35 may further include a socket 35C. The socket 35C may be part of a receptacle or housing 35D from which the segment 33B projects. The socket 35C is a cavity that may have the geometry of a truncated sphere, such that a spherical joint is formed when the ball portion 35B is housed into the socket 35C. Appropriate components such as a set screw, a circlip, etc. may be used for the ball portion 35B to be captive in the socket 35C, yet rotatable in two or more rotational degrees of freedom. The housing 35D may include electronic components 35D’, a battery, the 3D Hall-effect sensor, etc. In a variant, the electronic components may include a communications chip, such as a Bluetooth® chip, a Wi-Fi chip, etc., for joint signals to be emitted by the joint 35, as an option. Although the ball portion 35B is on the side of the base 31, the reverse arrangement is possible, with the socket 35C being on the side of the base 31.
[0070] As best seen in Fig. 2C, the shaft segment 33B may not be circular in cross-sectional shape. It is illustrated as being an elongated strip or bar, that may not be straight all along its length. The shaft segment 33B is shown having various segments, in such a way that one of its subsegments, shown as 33C, may be brought closer to the patient body. The subsegment 33C may have a bracket 33D that may be used to receive and hold a strap 33E (Fig. 2F). The bracket 33D defines a slot for the strap to pass through it. However, the bracket 33D is optional or could be located elsewhere. It is for example possible to use the strap 33E without the bracket 33D, or to have the bracket 33D at other locations on the segment 33B.
[0071] A post 33F may optionally be present at an end of the shaft segment 33B. The post 33F may project in a different direction than a remainder of the shaft segment 33B, and is used to support the trackable portion 34. The post 33F may have elbows so as not to be straight, for instance to be oriented in a particular direction based on the typical set-up in an operating room. In the illustrated variant, the trackable portion 34 is similar to that shown in Fig. 2B, i.e., a multi-faceted tracker. The trackable portion 34 could be mounted directly to the post 33F. However, in Fig. 2C, a support 33G may be provided at the free end of the post 33F. The support 33G may emulate the shape of the trackable portion 34, and hence define two or more connection points (three shown). This may contribute to the rigidity of the assembly.
[0072] Fig. 2F shows two of the trackers 30 on a limb, with one of the trackers 30 secured to the tibia, and another secured to the femur. The trackers 30 are strapped to the limb by way of straps 33E, such that the trackers 30 are relatively immovable. However, a part of the trackers 30 is on soft tissue, such that some small movements are possible. If movement occurs relative to the base 31 (and ball portion 35B), the movement may be quantified by the electronic components in the joints 35, and adjustments can be made in the tracking. Because of the configuration of the tracker 30 of Fig. 2C, the tracker 30 may be observed as taking up less volume, with the low-profile arrangement being such that the segment 33B is stowed along the limb, as opposed to projecting away from the limb as in prior art trackers (e.g., Figs. 1A and 1B). The relatively low height of the segment 33A may also contribute to the low profile of the tracker 30.
[0073] Although the joint 35 is shown separating the shaft 33 into a pair of segments 33A and 33B (or stems, rods, bars), it is possible to provide the joint 35 directly on the base 31, such that the shaft 33 projects directly from the base 31. For example, it may be said that the embodiment of Fig. 2C does not have a shaft or stem segment 33A, considering its small height. Likewise, the joint 35 could be directly at the trackable portion 34, such that the tracker 30 may be without the shaft segment 33B. The shaft 33 (i.e., stem, bar, arm) may be present to space the base 31 from the trackable portion 34, even though the shaft 33 may not be connected to one or the other of the base 31 and trackable portion 34, being connected to the joint 35 instead at one end.
[0074] In Figs. 3 and 4, the tracker device 40 is shown as being embodied by an image capture device, capable of illuminating its environment. In a variant, the tracker device 40 may have two (or more) points of view, such that triangulation can be used to determine the position of the tracker devices 30 in space, i.e., in the coordinate system of the CAS system 10. The tracker device 40 may emit light, or use ambient light, to observe the trackers 30 from its points of view, so as to determine a position of the trackers 30 relative to itself. By knowing the geometry of the arrangements of trackers 30, the tracker device 40 can produce navigation data enabling the locating of objects within the coordinate system of the CAS system 10. In an embodiment, the tracker device 40 is of the type known as the Polaris products by Northern Digital Inc. The tracker device 40 may form the complementary part of the CMM function of the CAS system 10, with the trackers 30 on the robot base 20B for example. The tracker device 40 may be a depth camera as another possibility, such as when a QR token is used as an alternative to the retro-reflective elements shown in Figs. 2A, 2B and 2C.
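As a hedged sketch of the triangulation principle mentioned above (not the actual algorithm of any particular tracker device 40): with the two points of view modeled as rays toward a detected marker, the marker position can be estimated as the midpoint of the rays' closest approach. The camera placement below is hypothetical:

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint of the closest approach between two viewing rays
    (origin o, direction d), one per camera point of view."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b           # zero only for parallel rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1, p2 = o1 + s * d1, o2 + t * d2
    return (p1 + p2) / 2.0

# Two camera origins 0.5 m apart, both seeing a marker at (0, 0, 2).
marker = np.array([0.0, 0.0, 2.0])
o1, o2 = np.array([-0.25, 0.0, 0.0]), np.array([0.25, 0.0, 0.0])
p = triangulate(o1, marker - o1, o2, marker - o2)
```

With noisy observations the two rays do not intersect, and the midpoint of their closest approach serves as the least-squares style estimate.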
[0075] Referring to Fig. 4, the CAS controller 50 is shown in greater detail relative to the other components of the CAS system 10. The CAS controller 50 has a processing unit 51 and a non-transitory computer-readable memory 52 communicatively coupled to the processing unit 51 and configured for executing computer-readable program instructions executable by the processing unit 51 to perform some functions, such as tracking the patient tissue and tools, using the position and orientation data from the robot 20, signals from the joint 35 if available, and the readings from the tracker device 40. Accordingly, as part of the operation of the CAS controller 50, the computer-readable program instructions may include an operating system that may be viewed by a user or operator as a GUI on one or more of the interfaces of the CAS system 10. It is via this or these interfaces that the user or operator may interface with the CAS system 10, be guided by a surgical workflow, obtain navigation data, etc. The CAS controller 50 may also control the movement of the robot arm 20A via the robot controller module 70, if present. The CAS system 10 may comprise various types of interfaces I/F, for the information to be provided to the operator. The interfaces I/F may include displays and/or screens, including wireless portable devices (e.g., phones, tablets), audio guidance, LED displays, head-mounted displays for virtual reality, augmented reality, mixed reality, among many other possibilities. For example, the interface I/F comprises a graphic-user interface (GUI) operated by the system 10. The CAS controller 50 may also display images captured pre-operatively, or using cameras associated with the procedure (e.g., 3D camera, laparoscopic cameras, tool-mounted cameras), for instance to be used in the collaborative/cooperative control mode of the system 10, or for visual supervision by the operator of the system 10, with augmented reality for example.
The CAS controller 50 may drive the robot arm 20A, in performing the surgical procedure based on the surgery planning achieved pre-operatively, or in maintaining a given position and orientation to support a tool. The CAS controller 50 may run various modules, in the form of algorithms, code, non-transient executable instructions, etc, in order to operate the CAS system 10 in the manner described herein. The CAS controller 50 may be part of any suitable processor unit, such as a personal computer or computers including laptops and desktops, tablets, server, etc.
[0076] The tracking module 60 may be a subpart of the CAS controller 50, or an independent module or system. The tracking module 60 receives the position and orientation data from the robot 20, the joint signals from the joint(s) 35 and the readings from the tracker device 40. The tracking module 60 may hence determine the relative position of the objects in the referential system of the CAS system 10, such as relative to the robot arm 20A in a manner described below. For example, the tracking module 60 may track an object such as a bone using the movable surgical tracker 30 of Fig. 2A, 2B or Fig. 2C. In a variant, the movable surgical tracker 30 is in a fixed position and orientation relative to the bone, such that the tracking by the tracking module 60 is achieved using readings from the tracker device 40 or like camera. If the joint 35 has signalling capacity and can produce signals to indicate movement between the links it joins (e.g., segments 33A and 33B), the joint signals can be used by the tracking module 60 as a transform indicative of the relation between the trackable portion 34 and the bone. Stated differently, the trackable portion 34 has changed its position and/or orientation relative to the bone, such that the tracking module 60 must take this changed position and/or orientation into consideration for the tracking. If the joint 35 does not have signalling capacity, a calibration step may be performed for the tracking module 60 to track the bone using the new position and/or orientation.
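The use of joint signals as a transform, as described above, can be sketched as a composition of homogeneous transforms: the optically measured pose of the trackable portion 34 is combined with a trackable-to-base transform that is updated whenever the joint 35 reports movement. All numerical values below are hypothetical, and the rotation convention is assumed:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def bone_pose(T_cam_trackable, T_trackable_base):
    """Pose of the bone-fixed base 31 in the camera frame: the optically
    measured pose of the trackable portion 34, composed with the
    trackable-to-base transform maintained from the joint 35 signals."""
    return T_cam_trackable @ T_trackable_base

# Nominal trackable->base transform (from the tracker's calibration).
T_tb = make_T(np.eye(3), [0.0, 0.0, -0.2])

# The joint reports a 90-degree rotation about X between its segments:
# apply that delta to the trackable->base transform.
c, s = 0.0, 1.0
R_joint = np.array([[1, 0, 0], [0, c, -s], [0, s, c]], dtype=float)
T_tb_updated = make_T(R_joint, [0.0, 0.0, 0.0]) @ T_tb

# Optically measured pose of the trackable portion in the camera frame.
T_cam_trackable = make_T(np.eye(3), [0.0, 0.0, 1.0])
pose = bone_pose(T_cam_trackable, T_tb_updated)
```

Without joint signals, as the paragraph notes, the same updated trackable-to-base transform would instead have to be re-established through a calibration step.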
[0077] The tracking module 60 may also be provided with models of the objects to be tracked. For example, the tracking module 60 may track bones and tools, and may use virtual bone models and tool models that may be merged with optically captured data. The bone models may be acquired from pre-operative imaging (e.g., MRI, CT-scans), for example in 3D or in multiple 2D views, including with 2D X-ray to 3D bone model technologies. The virtual bone models may also include some image processing done preoperatively, for example to remove soft tissue or refine the surfaces that will be exposed and tracked. The virtual bone models may be of greater resolution at the parts of the bone that will be tracked during surgery, such as the knee articulation in knee surgery. The bone models may also carry additional orientation data, such as various axes (e.g., longitudinal axis, mechanical axis, etc). The bone models may therefore be patient specific. It is also considered to obtain bone models from a bone model library, with the data obtained from the video images used to match a generated 3D surface of the bone with a bone from the bone atlas. The virtual tool models may be provided by the tool manufacturer, or may also be generated in any appropriate way so as to be a virtual 3D representation of the tool(s).
[0078] Additional data may also be available, such as tool orientation (e.g., axis data and geometry). By having access to bone and tool models, the tracking module 60 may obtain additional information, such as the axes related to bones or tools.
[0079] Still referring to Fig. 4, the CAS controller 50 may have the robot controller 70 integrated therein, if a robot is used in the CAS system 10. However, the robot controller 70 may be physically separated from the CAS controller 50, for instance by being integrated into the robot 20 (e.g., in the robot base 20B). The robot controller 70 is tasked with powering and/or controlling the various joints of the robot arm 20A. The robot controller 70 may also optionally calculate robot movements of the robot arm 20A, so as to control movements of the robot arm 20A autonomously in some instances, i.e., without intervention from the CAS controller 50. There may be some force feedback provided by the robot arm 20A to avoid damaging the bones, and to avoid impacting other parts of the patient, equipment and/or personnel. The robot controller 70 may perform actions based on a surgery planning. The surgery planning may be a module programmed specifically for any given patient, according to the parameters of surgery desired by an operator such as an engineer and/or surgeon. The parameters may include geometry of selected, planned bone cuts, planned cut depths, sequence or workflow of alterations with a sequence of surgical steps and tools, tools used, etc.
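The surgery planning parameters enumerated above could, by way of example, be carried in a structure along the following lines; the field names are illustrative assumptions:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class PlannedCut:
    name: str                                  # e.g., "distal femoral cut"
    depth_mm: float                            # planned cut depth
    plane_normal: Tuple[float, float, float]   # cut-plane geometry in the bone frame
    tool: str                                  # tool used for this alteration

@dataclass
class SurgeryPlan:
    patient_id: str
    cuts: List[PlannedCut] = field(default_factory=list)  # ordered workflow of alterations

    def next_cut(self) -> Optional[PlannedCut]:
        # the sequence of surgical steps is consumed in order
        return self.cuts[0] if self.cuts else None
```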
[0080] As observed herein, the trackers 30 and the tracker device 40 may be complementary tracking technologies. The position and orientation of the surgical tool calculated by the tracking module 60 using optical tracking and joint signals from the joint(s) 35 may be redundant with the tracking data provided by the robot controller 70 and/or the CAS controller 50 and its embedded robot arm sensors 25, referred to as maneuvering data for the robot arm 20A. However, the redundancy may assist in ensuring the accuracy of the tracking of the surgical tool and end effector 23. If the tracker 30 is one that corresponds to the multifaceted tracker as in Figs. 2B and 2C, the calibration file may include all three of the geometrical patterns (e.g., the triangular patterns of 134A, of 134B and of 134C).
[0081] Now that the various components of the CAS system 10 have been described, a contemplated procedure performed with the CAS system 10 or with a similar CAS system is set forth, with reference to a flow chart 100 illustrative of a method for tracking an object in computer-assisted surgery with a movable surgical tracker such as those shown in Figs. 2A, 2B and 2C, the method itself depicted in Fig. 5. The method is an example of a procedure that may be performed by the CAS controller 50 and/or other parts of the CAS system 10 of the present disclosure. For example, the method 100 may be implemented as computer-readable program instructions stored in the non-transitory computer-readable memory 52, and executable by the processing unit 51 communicatively coupled to the non-transitory computer-readable memory 52.
[0082] According to 101, a tracker such as the tracker 30 of Figs. 2A, 2B and 2C is installed on an object, such as a bone or tool. In a variant, the object is a bone. The user may adjust the position and/or orientation of the trackable portion 34 relative to the base 31, for a line of sight to be present between the trackable portion 34 and the tracker device 40. For instance, this may include a set-up with two or more trackers, such as in Fig. 2F. This may include imparting movement via the joint(s) 35. If the joint 35 is lockable or blockable, such as via a set screw 35A (Figs. 2A, 2B and 2C), the joint 35 may be unlocked for the movement and then locked. In a variant, the joint 35 has inherent friction to prevent movement at the joint 35 unless an appreciable amount of force is applied to the tracker 30.
[0083] According to 102, the object may be tracked, using the tracker device 40 optically viewing the tracker 30. The tracking may be continuous, i.e., it may be nonstop during one or more steps, though the continuous tracking may include pauses.
[0084] According to 103, reference points may be digitized for the object, e.g., the bone. This may include one or more axes, and/or surface data such as a cloud of points. 103 may occur before 102, or during 102, for example. In some instances, models may be associated to the trackers 30, via calibration steps, model generation with depth cameras, etc., such that the digitizing (also known as registering) may be minimal or unnecessary. The reference points or like object data may be recorded in the coordinate system of the CAS system 10. The coordinate system may be on the bone. The reference points or like object data may be trackable with reference to the tracker 30.
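Recording a digitized point so that it remains trackable with reference to the tracker 30 amounts to expressing it in the tracker frame via the inverse rigid transform; the following sketch, with illustrative function names, assumes the camera-to-tracker pose is given as a 3x3 rotation and a translation:

```python
def invert_rigid(rot, trans):
    # inverse of a rigid transform: R^T and -R^T t
    rt = [[rot[j][i] for j in range(3)] for i in range(3)]
    t_inv = [-sum(rt[i][j] * trans[j] for j in range(3)) for i in range(3)]
    return rt, t_inv

def to_tracker_frame(point_cam, rot_cam_tracker, trans_cam_tracker):
    # express a camera-frame digitized point in the tracker's coordinate system
    rt, t_inv = invert_rigid(rot_cam_tracker, trans_cam_tracker)
    return [sum(rt[i][j] * point_cam[j] for j in range(3)) + t_inv[i]
            for i in range(3)]
```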
[0085] According to 104, signalling may be received indicative of a change of position and/or orientation of the trackable portion 34 relative to the object, by movement at the joint 35. In a variant in which the joint 35 is not electronically equipped, the signalling may come from a user interface. For example, the user may ask for a pause of tracking while the trackable portion 34 is moved to another position and/or orientation using the joint 35, for instance to free up some space or enhance the ergonomics of the surgical site. This may include unlocking and locking the joint 35, and/or removing the strap 33E, relocating the segment 33B (e.g., Figs. 2A-2C) and reinstalling the strap 33E. In a variant, the bone is kept immobile during movement of the trackable portion 34. Once the change has occurred, the tracking module 60 may recalibrate the trackable portion 34 using the reference points on the bone as reference to set the new position and/or orientation of the trackable portion 34. This may include recording points on the bone as per 103, or using the segment 33A as reference as another possibility. Other approaches are considered, such as using image processing to quantify the change.
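One possible recalibration, sketched here under the simplifying assumption that the movement at the joint 35 was a pure rotation about a known axis (a full six-degree-of-freedom recalibration would instead use a point-set registration such as the Kabsch algorithm), recovers the angle from a single re-digitized reference point:

```python
import math

def recover_joint_angle(ref_before, ref_after):
    # ref_before / ref_after: the same reference point on the immobile bone,
    # expressed in the tracker frame before and after the trackable portion
    # was moved; assumes the motion was a pure rotation about the z axis.
    a0 = math.atan2(ref_before[1], ref_before[0])
    a1 = math.atan2(ref_after[1], ref_after[0])
    return a1 - a0
```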
[0086] In an alternative embodiment, the joint 35 provides signals through its electronic components, to quantify the movement. The signals may be provided in real-time, for the tracking module 60 to receive transform data. The tracking module 60 may recalibrate the trackable portion 34 using the reference points on the bone as reference and the transform data from the joint 35 (if electronic) to set the new position and/or orientation of the trackable portion 34.
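Where the joint 35 reports its motion electronically (e.g., an angle from an encoder or a 3D Hall-effect sensing assembly), the signal can be converted to a rotation matrix for the transform data; this sketch uses Rodrigues' formula, with the unit axis assumed known from the joint geometry:

```python
import math

def axis_angle_to_matrix(axis, angle):
    # Rodrigues' rotation formula: 3x3 rotation for a unit axis and an
    # angle in radians, e.g., derived from a joint sensor reading
    x, y, z = axis
    c, s = math.cos(angle), math.sin(angle)
    t = 1.0 - c
    return [
        [t * x * x + c,     t * x * y - s * z, t * x * z + s * y],
        [t * x * y + s * z, t * y * y + c,     t * y * z - s * x],
        [t * x * z - s * y, t * y * z + s * x, t * z * z + c],
    ]
```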
[0087] According to 105, the object is tracked optically using the tracker having the trackable portion viewed by the tracker device using the change of position and/or orientation. For example, the tracking of 105 is continuous, and may be with or without interruption, such as if the signals are provided in real-time by the joint 35. 105 may also include outputting the tracking data, and this may be in the form of images on an interface, numerical data, etc.

[0088] Hence, the method 100 may generally be described as including: tracking the object optically using a tracker having a trackable portion with a joint enabling movement, the trackable portion being viewed by a tracker device; receiving signalling indicative of a change of position and/or orientation of the trackable portion relative to the object, by movement at the joint; and tracking the object optically using the tracker having the trackable portion viewed by the tracker device using the change of position and/or orientation. It may be necessary to use a computing device, notably because the human eye does not have the capacity to quantify changes of position and/or orientation.
[0089] The CAS system 10 may generally be described as being a system for tracking an object in computer-assisted surgery. The CAS system 10 may include a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: tracking the object optically using a tracker having a trackable portion with a joint enabling movement, the trackable portion being viewed by a tracker device; receiving signalling indicative of a change of position and/or orientation of the trackable portion relative to the object, by movement at the joint; and tracking the object optically using the tracker having the trackable portion viewed by the tracker device using the change of position and/or orientation.

Claims

CLAIMS:
1. A system for tracking an object in computer-assisted surgery, comprising: a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: tracking the object optically using a tracker having a trackable portion with a joint enabling movement, the trackable portion being viewed by a tracker device; receiving signalling indicative of a change of position and/or orientation of the trackable portion relative to the object, by movement at the joint; and tracking the object optically using the tracker having the trackable portion viewed by the tracker device using the change of position and/or orientation.
2. The system according to claim 1, wherein receiving signalling includes receiving signals from a joint in the tracker.
3. The system according to claim 2, wherein receiving signals from the joint in the tracker includes receiving signals quantifying the change.
4. The system according to claim 3, wherein tracking the object using the change of position and/or orientation includes tracking the object continuously while receiving the signals.
5. The system according to claim 1, wherein receiving signalling includes receiving an interaction requesting a pause in the tracking.
6. The system according to claim 5, further comprising recalibrating the tracker to quantify the change after the pause.
7. The system according to any one of claims 1 to 6, including obtaining reference points on a surface of the object prior to or during the tracking of the object.
8. The system according to any one of claims 1 to 7, wherein the object is a bone and the tracker is fixed to the bone.
9. The system according to any one of claims 1 to 8, including the tracker with the joint and the trackable portion.
10. The system according to claim 9, wherein the joint is a joint providing at least one rotational degree of freedom.
11. The system according to claim 10, wherein the joint is a spherical joint.
12. The system according to claim 9 or claim 10, wherein the joint is a joint providing at least one translational degree of freedom.
13. The system according to claim 12, wherein the joint is a cylindrical joint.
14. The system according to any one of claims 9 to 13, wherein the joint has encoding capacity to emit signals quantifying movement at the joint.
15. The system according to any one of claims 1 to 14, including the tracker device for optically tracking the tracker.
16. The system according to any one of claims 1 to 15, including the tracker.
17. A tracker for optical tracking during computer-assisted surgery, comprising: a base configured to be secured to a bone; a trackable portion having at least one optically detectable element; at least one stem spacing the base from the trackable portion; and a joint in the tracker enabling at least one degree of freedom of movement between the base and the trackable portion, the joint having a sensor configured to emit a signal quantifying the movement.
18. The tracker according to claim 17, wherein the joint is in the stem, whereby the stem has a first segment from the base to the joint, and a second segment from the joint to the trackable portion.
19. The tracker according to claim 18, wherein a ratio of length between the second segment and the first segment is at least 5:1.
20. The tracker according to claim 17, wherein the joint is between the stem and the base.
21. The tracker according to any one of claims 17 to 20, wherein the joint is a spherical joint providing at least two rotational degrees of freedom of movement.
22. The tracker according to claim 21, wherein a ball portion of the spherical joint is fixed relative to the base.
23. The tracker according to claim 22, wherein a channel extends from a surface of the ball portion and into the base, the channel configured to receive a fastener.
24. The tracker according to claim 23, wherein the sensor is an assembly including a 3D Hall-effect sensor and a magnet.
25. The tracker according to any one of claims 17 to 24, wherein the sensor includes a wireless chip configured for wireless communication of the signal.
26. The tracker according to claim 17, wherein the joint is a cylindrical joint providing one rotational degree of freedom of movement and one translational degree of freedom of movement.
27. The tracker according to any one of claims 17 to 26, including a strap holder on the stem.
PCT/CA2025/050335 2024-03-11 2025-03-11 Movable surgical tracker Pending WO2025189285A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463563487P 2024-03-11 2024-03-11
US63/563,487 2024-03-11

Publications (1)

Publication Number Publication Date
WO2025189285A1 true WO2025189285A1 (en) 2025-09-18

Family

ID=97062558

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2025/050335 Pending WO2025189285A1 (en) 2024-03-11 2025-03-11 Movable surgical tracker

Country Status (1)

Country Link
WO (1) WO2025189285A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007087351A2 (en) * 2006-01-24 2007-08-02 Carnegie Mellon University Method, apparatus, and system for computer-aided tracking, navigation, and motion teaching
US20150282735A1 (en) * 2014-04-04 2015-10-08 Izi Medical Products,Llc Reference device for surgical navigation system
US20170340395A1 (en) * 2016-05-26 2017-11-30 Mako Surgical Corp. Navigation Tracker With Kinematic Connector Assembly
WO2023079530A1 (en) * 2021-11-08 2023-05-11 Neocis Inc. A robot system



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25769088

Country of ref document: EP

Kind code of ref document: A1