WO2025240248A1 - Burr tracking for surgical navigation procedures - Google Patents

Burr tracking for surgical navigation procedures

Info

Publication number
WO2025240248A1
Authority
WO
WIPO (PCT)
Prior art keywords
burr
instrument
outer sheath
fiducial
surgical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/028579
Other languages
French (fr)
Inventor
Paul Alexander Torrie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Smith and Nephew Asia Pacific Pte Ltd
Smith and Nephew Inc
Original Assignee
Smith and Nephew Asia Pacific Pte Ltd
Smith and Nephew Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smith and Nephew Asia Pacific Pte Ltd, Smith and Nephew Inc filed Critical Smith and Nephew Asia Pacific Pte Ltd
Publication of WO2025240248A1 publication Critical patent/WO2025240248A1/en
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 - Surgical instruments, devices or methods
    • A61B 17/32 - Surgical cutting instruments
    • A61B 17/320016 - Endoscopic cutting instruments, e.g. arthroscopes, resectoscopes
    • A61B 17/32002 - Endoscopic cutting instruments, e.g. arthroscopes, resectoscopes with continuously rotating, oscillating or reciprocating cutting instruments
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 - Tracking techniques
    • A61B 2034/2055 - Optical tracking systems
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/39 - Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2090/3937 - Visible markers
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/39 - Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2090/3983 - Reference marker arrangements for use with image guided surgery

Definitions

  • the present disclosure relates to surgical navigation systems and methods, and more particularly to instrument tracking techniques for surgical navigation systems.
  • Arthroscopic surgical procedures are minimally invasive surgical procedures in which access to the surgical site within the body is by way of small keyholes or ports through the patient’s skin.
  • the various tissues within the surgical site are visualized by way of an arthroscope placed through a port or portal, and the internal scene is shown on an external display device.
  • the tissue may be repaired or replaced through the same or additional ports.
  • computer-assisted surgical procedures e.g., surgical procedures associated with a knee or knee joint, surgical procedures associated with a hip or hip joint, etc.
  • the location of various objects within the surgical site may be tracked relative to the bone by way of images captured by an arthroscope and a three-dimensional model of the bone.
  • a surgical instrument configured to perform at least one of a shaving function and a burr function of a surgical procedure includes a handle, an outer sheath having a proximal end coupled to the handle and a distal end opposite the proximal end, the outer sheath defining an inner channel having an opening defined at the distal end, an inner cutting assembly disposed within the inner channel of the outer sheath, the inner cutting assembly being configured to rotate within the outer sheath to perform the at least one of the shaving function and the burr function, and the inner cutting assembly including a cutting end exposed within the opening of the inner channel, and at least one fiducial marker located on a portion of the distal end of the outer sheath that overlaps the cutting end of the inner cutting assembly, the at least one fiducial marker being configured to be detected by an optical tracking system.
  • the surgical instrument is a burr/shaver instrument.
  • the cutting end includes a burr tip, and wherein the at least one fiducial marker extends over at least 50% of the burr tip in a distal direction.
  • the outer sheath includes a cutaway exposing at least a portion of the burr tip. The cutaway is defined in two or more sides of the outer sheath. A gap between the cutaway and the at least one fiducial marker is less than 2.0 mm.
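The marker-coverage criterion above reduces to a one-dimensional overlap computation along the instrument's longitudinal axis. The following minimal sketch illustrates that computation; the function name, span convention, and numeric values are illustrative assumptions, not part of the disclosure:

```python
def tip_coverage(marker_span, tip_span):
    """Fraction of the burr tip's axial extent covered by a fiducial marker.

    Spans are (start_mm, end_mm) measured along the instrument's longitudinal
    axis, from the handle toward the distal tip. Illustrative helper only.
    """
    m0, m1 = sorted(marker_span)
    t0, t1 = sorted(tip_span)
    overlap = max(0.0, min(m1, t1) - max(m0, t0))
    tip_length = t1 - t0
    return overlap / tip_length if tip_length > 0 else 0.0

# A marker spanning 95-102 mm over a tip spanning 96-104 mm covers 75% of the tip,
# satisfying an "at least 50%" coverage criterion.
print(tip_coverage((95.0, 102.0), (96.0, 104.0)))  # 0.75
```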
  • the inner cutting assembly includes an inner shaft, the at least one fiducial marker includes a plurality of fiducial markers, and at least one of the plurality of fiducial markers is located on the inner shaft.
  • the outer sheath includes at least one of a window and a transparent portion aligned with the at least one of the plurality of fiducial markers located on the inner shaft.
  • the at least one fiducial marker is arranged on a plane that is not parallel to a longitudinal axis of the surgical instrument.
  • the at least one fiducial marker includes first and second fiducial markers, and the first and second fiducial markers have different shapes or different sizes relative to one another.
  • a system includes a computing device configured to receive one or more images of a surgical site, the one or more images including the at least one fiducial marker of the surgical instrument of claim 1, detect the at least one fiducial marker, and determine a location of the surgical instrument based on the detected at least one fiducial marker.
  • a system for tracking a position of a shaver/burr instrument relative to patient anatomy includes memory storing instructions and one or more processing devices configured to execute the instructions. Executing the instructions causes the one or more processing devices to receive one or more images of a surgical site, the one or more images including at least one fiducial marker of the shaver/burr instrument, the at least one fiducial marker overlapping a cutting end of the shaver/burr instrument in a distal direction, detect the at least one fiducial marker of the shaver/burr instrument, and determine a location of the shaver/burr instrument within the surgical site based on the detected at least one fiducial marker.
  • the system further includes the shaver/burr instrument.
  • the shaver/burr instrument includes a handle, an outer sheath having a proximal end coupled to the handle and a distal end opposite the proximal end, the outer sheath defining an inner channel having an opening defined at the distal end, and an inner cutting assembly disposed within the inner channel of the outer sheath, the inner cutting assembly being configured to rotate within the outer sheath to perform at least one of a shaving function and a burr function, and the inner cutting assembly including the cutting end.
  • the at least one fiducial marker is located on a portion of the distal end of the outer sheath that overlaps the cutting end of the inner cutting assembly.
  • the cutting end includes a burr tip, and the at least one fiducial marker extends over at least 50% of the burr tip in the distal direction.
  • the shaver/burr instrument includes an outer sheath and an inner cutting assembly configured to rotate within the outer sheath, the at least one fiducial marker includes a plurality of fiducial markers, and at least one of the plurality of fiducial markers is located on the inner cutting assembly.
  • the outer sheath includes at least one of a window and a transparent portion aligned with the at least one of the plurality of fiducial markers located on the inner cutting assembly.
  • the at least one fiducial marker is arranged on a plane that is not parallel to a longitudinal axis of the shaver/burr instrument.
  • the at least one fiducial marker includes first and second fiducial markers, and wherein the first and second fiducial markers have different shapes or different sizes relative to one another.
  • a method for tracking a position of a shaver/burr instrument includes, using one or more processors configured to execute instructions stored in memory, receiving one or more images of a surgical site, the one or more images including at least one fiducial marker arranged on a portion of the shaver/burr instrument overlapping, in a distal direction, a cutting end of an inner cutting assembly of the shaver/burr instrument, detecting the at least one fiducial marker of the shaver/burr instrument, determining a location of the shaver/burr instrument within the surgical site based on the detected at least one fiducial marker, and at least one of displaying visual guidance based on the determined location of the shaver/burr instrument and controlling operation of the shaver/burr instrument based on the determined location of the shaver/burr instrument.
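The claimed method (receive images, detect the fiducial markers, determine the instrument location, then display guidance or control operation) can be sketched as a per-frame loop. The detector and locator below are hypothetical stubs with assumed names; a real system would run optical marker detection and 3D pose estimation on arthroscope frames:

```python
def detect_fiducials(frame):
    # Stub: a real detector would find marker patterns in the image pixels.
    return frame.get("markers", [])

def locate_instrument(markers):
    # Stub: a real solver would estimate a 3D pose from the detected markers;
    # here we just average the marker coordinates as a stand-in "location".
    if not markers:
        return None
    xs = [m["x"] for m in markers]
    ys = [m["y"] for m in markers]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def track(frames, safe_zone):
    """Per frame: detect markers, locate the shaver/burr, then either emit
    visual guidance or inhibit cutting when the tip leaves the planned zone."""
    actions = []
    for frame in frames:
        markers = detect_fiducials(frame)
        location = locate_instrument(markers)
        if location is None:
            actions.append("hold: markers occluded")
        elif safe_zone(location):
            actions.append(f"guide: tip at {location}")
        else:
            actions.append("inhibit: outside planned zone")
    return actions

frames = [
    {"markers": [{"x": 1.0, "y": 2.0}, {"x": 3.0, "y": 2.0}]},
    {"markers": []},
    {"markers": [{"x": 9.0, "y": 9.0}]},
]
print(track(frames, safe_zone=lambda p: p[0] < 5.0))
# ['guide: tip at (2.0, 2.0)', 'hold: markers occluded', 'inhibit: outside planned zone']
```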
  • a surgical instrument configured to perform at least one of a shaving function and a burr function of a surgical procedure includes a handle, an outer sheath having a proximal end coupled to the handle and a distal end opposite the proximal end, the outer sheath defining an inner channel having an opening defined at the distal end, an inner cutting assembly disposed within the inner channel of the outer sheath, the inner cutting assembly being configured to rotate within the outer sheath to perform the at least one of the shaving function and the burr function, and the inner cutting assembly including a cutting end exposed within the opening of the inner channel, and at least one fiducial marker located on a portion of the distal end of the outer sheath located within 10.0 mm of the cutting end of the inner cutting assembly, the at least one fiducial marker being configured to be detected by an optical tracking system.
  • FIG. 1 shows a surgical system in accordance with at least some embodiments;
  • FIG. 2 shows a conceptual drawing of a surgical site with various objects within the surgical site tracked, in accordance with at least some embodiments;
  • FIG. 3 shows a method in accordance with at least some embodiments;
  • FIG. 4 is an example video display showing portions of a femur and a bone fiducial during a registration procedure, in accordance with at least some embodiments;
  • FIG. 5 shows a method in accordance with at least some embodiments;
  • FIG. 6 shows an example burr instrument in accordance with at least some embodiments;
  • FIGS. 7A, 7B, and 7C show (in a perspective view) another example burr instrument in accordance with at least some embodiments;
  • FIGS. 8A, 8B, and 8C show other example arrangements of fiducial markers on a burr instrument in accordance with at least some embodiments;
  • FIG. 10 shows an example computer system or computing device configured to implement the various systems and methods of the present disclosure.
  • a processor programmed to perform various functions refers to one processor programmed to perform each and every function, or more than one processor collectively programmed to perform each of the various functions.
  • an initial reference to “a [referent]”, and then a later reference for antecedent basis purposes to “the [referent]”, shall not obviate the fact the recited referent may be plural.
  • terms, such as “a,” “an,” or “the,” again, may be understood to convey a singular usage or to convey a plural usage, depending at least in part upon context.
  • the term “based on” may be understood as not necessarily intended to convey an exclusive set of factors and may, instead, allow for existence of additional factors not necessarily expressly described, again, depending at least in part on context.
  • a timer circuit may define a clock output.
  • the example timer circuit may create or drive a clock signal on the clock output.
  • these “inputs” and “outputs” define electrical connections and/or signals transmitted or received by those connections.
  • these “inputs” and “outputs” define parameters read by or written by, respectively, the instructions implementing the function.
  • “input” may refer to actions of a user, interactions with input devices or interfaces by the user, etc.
  • Controller shall mean, alone or in combination, individual circuit components, an application specific integrated circuit (ASIC), a microcontroller with controlling software, a reduced-instruction-set computer (RISC) with controlling software, a digital signal processor (DSP), a processor with controlling software, a programmable logic device (PLD), a field programmable gate array (FPGA), or a programmable system-on-a-chip (PSOC), configured to read inputs and drive outputs responsive to the inputs.
  • proximal refers to a point or direction nearest a handle of the probe (e.g., a direction opposite the probe tip).
  • distal refers to a point or direction nearest the probe tip (e.g., a direction opposite the handle).
  • a non-transitory computer readable medium stores computer data, which data can include computer program code (or computer-executable instructions) that is executable by a computer, in machine-readable form.
  • a computer readable medium may comprise computer readable storage media, for tangible or fixed storage of data, or communication media for transient interpretation of code-containing signals.
  • Computer readable storage media refers to physical or tangible storage (as opposed to signals) and includes without limitation volatile and non-volatile, removable and non-removable media implemented in any method or technology for the tangible storage of information such as computer-readable instructions, data structures, program modules or other data.
  • Computer readable storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, optical storage, cloud storage, magnetic storage devices, or any other physical or material medium which can be used to tangibly store the desired information or data or instructions and which can be accessed by a computer or processor.
  • server should be understood to refer to a service point that provides processing, database, and communication facilities.
  • server can refer to a single, physical processor with associated communications and data storage and database facilities, or it can refer to a networked or clustered complex of processors and associated network and storage devices, as well as operating software and one or more database systems and application software that support the services provided by the server. Cloud servers are examples.
  • a “network” should be understood to refer to a network that may couple devices so that communications may be exchanged, such as between a server and a client device or other types of devices, including between wireless devices coupled via a wireless network, for example.
  • a network may also include mass storage, such as network attached storage (NAS), a storage area network (SAN), a content delivery network (CDN) or other forms of computer or machine- readable media, for example.
  • a network may include the Internet, one or more local area networks (LANs), one or more wide area networks (WANs), wire-line type connections, wireless type connections, cellular or any combination thereof.
  • sub-networks which may employ differing architectures or may be compliant or compatible with differing protocols, may interoperate within a larger network.
  • a wireless network should be understood to couple client devices with a network.
  • a wireless network may employ stand-alone ad- hoc networks, mesh networks, Wireless LAN (WLAN) networks, cellular networks, or the like.
  • a wireless network may further employ a plurality of network access technologies, including Wi-Fi, Long Term Evolution (LTE), WLAN, Wireless Router (WR) mesh, or 2nd, 3rd, 4th, or 5th generation (2G, 3G, 4G, or 5G) cellular technology, mobile edge computing (MEC), Bluetooth, 802.11b/g/n, or the like.
  • Network access technologies may enable wide area coverage for devices, such as client devices with varying degrees of mobility, for example.
  • a wireless network may include virtually any type of wireless communication mechanism by which signals may be communicated between devices, such as a client device or a computing device, between or within a network, or the like.
  • a computing device may be capable of sending or receiving signals, such as via a wired or wireless network, or may be capable of processing or storing signals, such as in memory as physical memory states, and may, therefore, operate as a server.
  • devices capable of operating as a server may include, as examples, dedicated rackmounted servers, desktop computers, laptop computers, set top boxes, integrated devices combining various features, such as two or more features of the foregoing devices, or the like.
  • a client (or consumer or user) device may include a computing device capable of sending or receiving signals, such as via a wired or a wireless network.
  • a client device may, for example, include a desktop computer or a portable device, such as a cellular telephone, a smart phone, a display pager, a radio frequency (RF) device, an infrared (IR) device, a Near Field Communication (NFC) device, a Personal Digital Assistant (PDA), a handheld computer, a tablet computer, a phablet, a laptop computer, a set top box, a wearable computer, a smart watch, an integrated or distributed device combining various features, such as features of the foregoing devices, or the like.
  • the client device can also be, or can be communicatively coupled to, any type of known or to-be-known medical device (e.g., any type of Class I, II, or III medical device), such as, but not limited to, an MRI machine, a CT scanner, an electrocardiogram (ECG or EKG) device, a photoplethysmograph (PPG), a Doppler or transit-time flow meter, a laser Doppler, an endoscopic device, a neuromodulation device, a neurostimulation device, and the like, or some combination thereof.
  • Computer-Aided Surgery (CAS) and surgical navigation systems support surgeons in planning and performing complex surgical procedures with increased precision and accuracy.
  • arthroscopy is a minimally invasive medical procedure for diagnosing and treating joint problems.
  • An orthopedic surgeon makes a small incision in the skin of the patient and inserts a lens into the incision.
  • the lens is attached to a camera (e.g., an endoscopic camera) and coupled to a light source, allowing the joint to be visualized and treated.
  • Surgical navigation and CAS systems have had significant impact in minimally invasive surgeries (MIS) such as arthroscopic procedures because the increased difficulty in visualizing the anatomy of the patient further complicates the surgical workflow.
  • Video-based surgical navigation uses visual fiducials (also called visual markers) attached to patient anatomy to guide the surgeon throughout the medical procedure.
  • the video-based navigation process requires the precise registration of a pre-operative anatomical model with data acquired intra-operatively.
  • the registration process requires the surgeon to digitize the surface of interest that corresponds to the preoperative model.
  • the visual markers attached to the anatomies define reference frames to which the pre-operative model and the intra-operative acquired data are aligned.
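At its core, the registration step described above is a rigid (rotation plus translation) least-squares fit between model points and digitized points. A two-dimensional Procrustes fit with known correspondences is sketched below to keep the arithmetic simple; actual registration solves the 3D analogue (commonly via an SVD-based Kabsch solution), and the function name and sample points are illustrative assumptions:

```python
import math

def register_2d(model_pts, digitized_pts):
    """Least-squares rigid fit (rotation angle + translation) mapping model
    points onto digitized points with known correspondences (2D Procrustes)."""
    n = len(model_pts)
    cmx = sum(p[0] for p in model_pts) / n
    cmy = sum(p[1] for p in model_pts) / n
    cdx = sum(q[0] for q in digitized_pts) / n
    cdy = sum(q[1] for q in digitized_pts) / n
    # Optimal rotation angle from the centered cross- and dot-correlations.
    s = c = 0.0
    for (px, py), (qx, qy) in zip(model_pts, digitized_pts):
        px, py, qx, qy = px - cmx, py - cmy, qx - cdx, qy - cdy
        s += px * qy - py * qx
        c += px * qx + py * qy
    theta = math.atan2(s, c)
    # Translation carries the rotated model centroid onto the digitized centroid.
    tx = cdx - (cmx * math.cos(theta) - cmy * math.sin(theta))
    ty = cdy - (cmx * math.sin(theta) + cmy * math.cos(theta))
    return theta, tx, ty

model = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
# The same triangle rotated 90 degrees and shifted by (5, 3).
digitized = [(5.0, 3.0), (5.0, 4.0), (4.0, 3.0)]
theta, tx, ty = register_2d(model, digitized)
print(round(theta, 6), round(tx, 6), round(ty, 6))  # recovers pi/2, 5.0, 3.0
```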
  • fiducial markers with known visual patterns may be attached both to the targeted anatomy and to the instruments and subsequently tracked such that their relative poses can be accurately estimated (e.g., by applying 3D computer vision methods on the images/video acquired by a camera). These relative poses allow the instruments to be located with respect to the anatomy at every frame time instant.
  • Video-based surgical navigation (VBSN) facilitates the tracking of instruments with respect to the targeted anatomy to which a fiducial is rigidly attached (which may be referred to as a “base marker”).
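Given camera-relative poses of the base marker (anatomy) and of an instrument marker, the instrument's pose in the anatomy frame follows by composing homogeneous transforms: anatomy_T_instrument = inv(camera_T_anatomy) * camera_T_instrument. A minimal sketch with 4x4 matrices follows; the frame names and numeric values are illustrative assumptions:

```python
def mat_mul(a, b):
    # 4x4 matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(t):
    """Invert a 4x4 rigid transform: rotation -> R^T, translation -> -R^T t."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][k] * t[k][3] for k in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0.0, 0.0, 0.0, 1.0]]

def instrument_in_anatomy(cam_T_anatomy, cam_T_instrument):
    """anatomy_T_instrument = inv(cam_T_anatomy) * cam_T_instrument."""
    return mat_mul(rigid_inverse(cam_T_anatomy), cam_T_instrument)

# Example: the camera sees the base marker 10 mm ahead, and the instrument
# marker 10 mm ahead and 2 mm to the right (identity rotations assumed).
cam_T_anatomy = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 10], [0, 0, 0, 1]]
cam_T_instrument = [[1, 0, 0, 2], [0, 1, 0, 0], [0, 0, 1, 10], [0, 0, 0, 1]]
pose = instrument_in_anatomy(cam_T_anatomy, cam_T_instrument)
print([row[3] for row in pose[:3]])  # instrument origin in anatomy frame: [2, 0, 0]
```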
  • a shaver/burr instrument includes a handpiece held by the surgeon to guide and control the instrument, an outer sheath extending distally from the handpiece, and an inner shaver or burr cutting assembly configured to rotate within the sheath (e.g., responsive to control signals received via the handpiece).
  • the shaver/burr includes a corresponding distal tip (i.e., a blade or burr tip) configured to cut and remove tissue (e.g., using a shaver blade tip) or bone (e.g., using a burr tip).
  • Tracking locations of a shaver/burr may be difficult relative to other instruments used in arthroscopic procedures.
  • For example, the shaver/burr includes multiple components (i.e., the outer sheath and the inner rotating cutting assembly), and the distances between and relative positions of these components may vary (e.g., due to manufacturing tolerances, geometric constraints, etc.).
  • Consequently, placement of fiducial markings may be difficult, and visibility of the fiducial markings may be reduced for certain types of procedures (e.g., hip procedures) and certain positions/orientations of the shaver/burr instrument within a joint or other surgical site.
  • a shaver/burr includes various configurations or arrangements of fiducial markers to facilitate tracking by a surgical navigation system.
  • multiple fiducial markers are positioned on multiple planes at a distal end of the shaver/burr.
  • at least one fiducial may be positioned on the rotating cutting assembly to facilitate optical detection (e.g., by an optical tracking system).
  • Although described with respect to shaver/burr instruments, the principles of the present disclosure may be applied to other types of surgical instruments, including, but not limited to, radio frequency (RF) wands. Further, although described with respect to surgical procedures performed on joints (e.g., hips, knees, etc.), instruments configured according to the principles of the present disclosure may be used for other types of (e.g., non-joint) surgical procedures.
  • FIG. 1 shows an example surgical system (e.g., a system including or implementing an arthroscopic video-based navigation system) 100 in accordance with at least some embodiments of the present disclosure.
  • the example surgical system 100 comprises a tower or device cart 102 and various tools or instruments, such as an example mechanical resection instrument 104, an example plasma-based ablation instrument (hereafter just ablation instrument 106), and an endoscope in the example form of an arthroscope 108 and attached camera head or camera 110.
  • the arthroscope 108 may be a rigid device, unlike endoscopes for other procedures, such as upper-endoscopies.
  • the resection instrument 104 may correspond to a shaver/burr instrument configured in accordance with the principles of the present disclosure as described below in more detail.
  • the device cart 102 may comprise a display device 114, a resection controller 116, and a camera control unit (CCU) together with an endoscopic light source and video (e.g., video-based navigation (VBN)) controller 118.
  • the combined CCU and video controller 118 not only provides light to the arthroscope 108 and displays images received from the camera 110, but also implements various additional aspects, such as registering a three-dimensional bone model with the bone visible in the video images, and providing computer-assisted navigation during the surgery.
  • the combined CCU and video controller are hereafter referred to as surgical controller 118.
  • the CCU and video controller may be a separate and distinct system from the controller that handles registration and computer-assisted navigation, yet the separate devices would nevertheless be operationally coupled.
  • the example device cart 102 further includes a pump controller 122 (e.g., single or dual peristaltic pump). Fluidic connections of the mechanical resection instrument 104 and ablation instrument 106 to the pump controller 122 are not shown so as not to unduly complicate the figure. Similarly, fluidic connections between the pump controller 122 and the patient are not shown so as not to unduly complicate the figure. In the example system, both the mechanical resection instrument 104 and the ablation instrument 106 are coupled to the resection controller 116 being a dual-function controller. In other cases, however, there may be a mechanical resection controller separate and distinct from an ablation controller.
  • the example devices and controllers associated with the device cart 102 are merely examples, and other examples include vacuum pumps, patient-positioning systems, robotic arms holding various instruments, ultrasonic cutting devices and related controllers, patient-positioning controllers, and robotic surgical systems.
  • FIGS. 1 and 2 further show additional instruments that may be present during an arthroscopic surgical procedure.
  • FIGS. 1 and 2 show an example probe 124 (shown as a touch probe, but which may be a touchless probe in other examples), a drill guide or aimer 126, and a bone fiducial 128.
  • the probe 124 may be used during the surgical procedure to provide information to the surgical controller 118, such as information to register a three-dimensional bone model to an underlying bone visible in images captured by the arthroscope 108 and camera head 110.
  • the aimer 126 may be used as a guide for placement and drilling with a drill wire to create an initial or pilot tunnel through the bone.
  • the bone fiducial 128 may be affixed or rigidly attached to the bone and serve as an anchor location for the surgical controller 118 to know the position and orientation of the bone (e.g., after registration of a three-dimensional bone model). Additional tools and instruments may be present, such as the drill wire, various reamers for creating the throughbore and counterbore aspects of a tunnel through the bone, and various tools, such as for suturing and anchoring a graft. These additional tools and instruments are not shown so as not to further complicate the figure.
  • Example workflow for a surgical procedure is described below. While described with respect to an example anterior cruciate ligament repair procedure, the below techniques may also be performed for other types of surgical procedures, such as hip procedures or other procedures that include joint distraction.
  • a surgical procedure may begin with a planning phase.
  • An example procedure may start with imaging (e.g., X-ray imaging, computed tomography (CT), magnetic resonance imaging (MRI)) of the anatomy of the patient, including the relevant anatomy (e.g., for a knee procedure the lower portion of the femur, the upper portion of the tibia, and the articular cartilage; for a hip procedure, an upper portion of the femur, the acetabulum/hip joint, pelvis, etc.).
  • the imaging may be preoperative imaging, hours or days before the intraoperative repair, or the imaging may take place within the surgical setting just prior to the intraoperative repair.
  • the discussion that follows assumes MRI imaging, but again many different types of imaging may be used.
  • the image slices from the MRI imaging can be segmented such that a volumetric model or three-dimensional model of the anatomy is created. Any suitable currently available, or after developed, segmentation technology may be used to create the three-dimensional model. More specifically to the example of anterior cruciate ligament repair, a three-dimensional bone model of the lower portion of the femur, including the femoral condyles, is created. Conversely, for a hip procedure, a three- dimensional model of the upper portion of the femur and at least a portion of the pelvis (e.g., the acetabulum) is created.
  • an operative plan is created.
  • the results of the planning may include: a three-dimensional bone model of the distal end of the femur; a three-dimensional bone model for a proximal end of the tibia; an entry location and exit location through the femur and thus a planned-tunnel path for the femur; and an entry location and exit location through the tibia and thus a planned-tunnel path through the tibia.
  • Other surgical parameters may also be selected during the planning, such as tunnel throughbore diameters, tunnel counterbore diameters and depth, desired post-repair flexion, and the like, but those additional surgical parameters are omitted so as not to unduly complicate the specification.
  • the results of the planning may include a three-dimensional bone model of the proximal end of the femur; a three-dimensional bone model for at least a portion of the pelvis/hip joint (e.g., a region of the pelvis corresponding to the acetabulum); a surgical area of interest within the hip joint; and parameters associated with achieving an amount of distraction in the surgical area of interest to provide sufficient access to the surgical area of interest.
  • example hip procedures may include, but are not limited to, labral repair, femoroacetabular impingement (FAI) debridement (e.g., removal of bone spurs/growths), cartilage repair, and synovectomy (e.g., removal of inflamed tissue).
  • These example procedures typically require access to a specific surgical area of interest within the hip joint (i.e., in a specific area within an interface between the pelvis and the femoral head, such as an area around/surrounding a bone spur or growth, cartilage or tissue to be repaired or removed, etc.).
  • the intraoperative aspects include steps and procedures for setting up the surgical system to perform the various repairs. It is noted, however, that some of the intraoperative aspects (e.g., optical system calibration) may take place before any portals or incisions are made through the patient’s skin, and in fact before the patient is wheeled into the surgical room. Nevertheless, such steps and procedures may be considered intraoperative as they take place in the surgical setting and with the surgical equipment and instruments used to perform the actual repair.
  • FIG. 2 shows a conceptual drawing of a surgical site with various objects (e.g., surgical instruments/tools) within the surgical site.
• visible in FIG. 2 are a distal end of the arthroscope 108, a portion of a bone 200 (e.g., femur), the bone fiducial 128 within the surgical site, and the probe 124.
  • the arthroscope 108 illuminates the surgical site with visible light.
  • the illumination is illustrated by arrows 208.
  • the illumination provided to the surgical site is reflected by various objects and tissues within the surgical site, and the reflected light that returns to the distal end enters the arthroscope 108, propagates along an optical channel within the arthroscope 108, and is eventually incident upon a capture array within the camera 110 (FIG. 1 ).
  • the images detected by the capture array within the camera 110 are sent electronically to the surgical controller 118 (FIG. 1 ) and displayed on the display device 114 (FIG. 1 ).
• the arthroscope 108 is monocular or has a single optical path through the arthroscope for capturing images of the surgical site, notwithstanding that the single optical path may be constructed of two or more optical members (e.g., glass rods, optical fibers). That is to say, in example systems and methods the computer-assisted navigation provided by the arthroscope 108, the camera 110, and the surgical controller 118 is provided with the arthroscope 108 that is not a stereoscopic endoscope having two distinct optical paths separated by an interocular distance at the distal end of the endoscope.
  • Viewing direction refers to a line residing at the center of an angle subtended by the outside edges or peripheral edges of the view of an endoscope.
  • the viewing direction for some arthroscopes is aligned with the longitudinal central axis of the arthroscope, and such arthroscopes are referred to as “zero degree” arthroscopes (e.g., the angle between the viewing direction and the longitudinal central axis of the arthroscope is zero degrees).
  • the viewing direction of other arthroscopes forms a non-zero angle with the longitudinal central axis of the arthroscope.
  • the viewing direction forms a 30° angle to the longitudinal central axis of the arthroscope, the angle measured as an obtuse angle beyond the distal end of the arthroscope.
  • the view angle 210 of the arthroscope 108 forms a non-zero angle to the longitudinal central axis 212 of the arthroscope 108.
  • within the view of the arthroscope 108 is a portion of the bone 200 (in this example, within the intercondylar notch), along with the example bone fiducial 128, and the example probe 124.
• the example bone fiducial 128 is a multifaceted element, with each face or facet having a fiducial disposed or created thereon. However, the bone fiducial need not have multiple faces, and in fact may take any shape so long as that shape can be tracked within the video images.
• the bone fiducial, such as bone fiducial 128, may be attached to the bone 200 in any suitable form (e.g., via the screw portion of the bone fiducial 128 visible in FIG. 1).
  • the patterns of the fiducials on each facet are designed to provide information regarding the position and orientation of the bone fiducial 128 in the three-dimensional coordinate space of the view of the arthroscope 108. More particularly, the pattern is selected such that the position and orientation of the bone fiducial 128 may be determined from images captured by the arthroscope 108 and attached camera (FIG. 1 ).
  • the probe 124 is also shown as partially visible within the view of the arthroscope 108.
  • the probe 124 may be used, as discussed more below, to identify a plurality of surface features on the bone 200 as part of the registration of the bone 200 to the three-dimensional bone model.
• the probe 124 and/or the aimer 126 may carry their own, unique fiducials, such that their respective poses may be calculated from the one or more fiducials present in the video stream.
• the medical instrument used to help with registration of the three-dimensional bone model, be it the probe 124, the aimer 126, or any other suitable medical device, may omit carrying fiducials. Stated otherwise, in such examples the medical instrument has no fiducial markings. In such cases, the pose of the medical instrument may be determined by a machine learning model, discussed in more detail below.
  • the images captured by the arthroscope 108 and attached camera are subject to optical distortion in many forms.
• the visual field between the distal end of the arthroscope 108 and the bone 200 within the surgical site is filled with fluid, such as bodily fluids and saline used to distend the joint.
  • Many arthroscopes have one or more lenses at the distal end that widen the field of view, and the wider field of view causes a “fish eye” effect in the captured images.
• the optical elements within the arthroscope (e.g., rod lenses) may likewise introduce distortions inherent to their design or manufacture.
  • the camera may have various optical elements for focusing the images received onto the capture array, and the various optical elements may have aberrations inherent to the manufacturing and/or assembly process.
• the endoscopic optical system, prior to use within each surgical procedure, is calibrated to account for the various optical distortions.
• the calibration creates a characterization function that characterizes the optical distortion, and the frames of the video stream may be, prior to further analysis, compensated using the characterization function.
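As a minimal sketch of the compensation idea, a single-coefficient radial distortion model can be inverted by fixed-point iteration. The model, the coefficient value, and the function names below are assumptions for illustration; a real characterization function would be fit during the calibration described above (typically with several radial and tangential coefficients):

```python
# Sketch of distortion compensation using a one-coefficient radial model:
#   x_d = x_u * (1 + k1 * r_u^2)
# The inverse has no closed form, so it is recovered iteratively.

def distort(point, k1):
    x, y = point
    factor = 1.0 + k1 * (x * x + y * y)
    return (x * factor, y * factor)

def undistort(point, k1, iterations=20):
    """Invert the radial model by fixed-point iteration."""
    xd, yd = point
    xu, yu = xd, yd
    for _ in range(iterations):
        factor = 1.0 + k1 * (xu * xu + yu * yu)
        xu, yu = xd / factor, yd / factor
    return (xu, yu)

original = (0.30, 0.40)              # normalized image coordinates
warped = distort(original, k1=0.2)   # what the camera would capture
recovered = undistort(warped, k1=0.2)
print(round(recovered[0], 6), round(recovered[1], 6))  # 0.3 0.4
```

For small coefficients the iteration converges quickly, which is why compensation can be applied per-frame before the analysis steps that follow.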
  • the next example step in the intraoperative procedure is the registration of the bone model created during the planning stage.
  • the three-dimensional bone model is obtained by or provided to the surgical controller 118.
• the surgical controller 118 receives the three-dimensional bone model, and assuming the arthroscope 108 is inserted into the knee by way of a port or portal through the patient’s skin, the surgical controller 118 also receives video images of a portion of the lower end of the femur.
  • the surgical controller 118 registers the three-dimensional bone model to the images of the femur received by way of the arthroscope 108 and camera 110.
  • the bone fiducial 128 is attached to the femur.
  • the bone fiducial placement is such that the bone fiducial is within the field of view of the arthroscope 108.
• the bone fiducial 128 is placed within the intercondylar notch superior to the expected location of the tunnel through the lateral condyle.
  • the bone fiducial 128 is placed on the femoral head.
  • the surgical controller 118 (FIG. 1 ) is provided or determines a plurality of surface features of an outer surface of the bone.
• Identifying the surface features may take several forms, including a touch-based registration using the probe 124 without a carried fiducial, a touchless registration technique in which the surface features are identified after resolving the motion of the arthroscope 108 and camera relative to the bone fiducial 128, and a third technique that uses a patient-specific instrument.
  • the surgeon may touch a plurality of locations using the probe 124 (FIG. 1 ).
  • receiving the plurality of surface features of the outer surface of the bone may involve the surgeon “painting” the outer surface of the bone.
  • “Painting” is a term of art that does not involve application of color or pigment, but instead implies motion of the probe 124 when the distal end of the probe 124 is touching bone.
  • the probe 124 does not carry or have a fiducial visible to the arthroscope 108 and the camera 110. It follows that the pose of the probe 124 and the location of the distal tip of the probe 124 needs to be determined in order to gather the surface features for purposes of registering the three- dimensional bone model.
  • FIG. 3 shows a method 300 in accordance with at least some embodiments of the present disclosure.
  • the example method 300 may be implemented in software within a computer system, such as the surgical controller 118.
• the example method 300 comprises obtaining a three-dimensional bone model (block 302). That is to say, in the example method 300, what is obtained is the three-dimensional bone model that may be created by segmenting a plurality of non-invasive images (e.g., CT, MRI) taken preoperatively or intraoperatively. With the bone segmented from or within the images, the three-dimensional bone model may be created.
• the three-dimensional bone model may take any suitable form, such as a computer-aided design (CAD) model, a point cloud of data points with respect to an arbitrary origin, or a parametric representation of a surface expressed using analytical mathematical equations.
• the three-dimensional bone model is defined with respect to the origin and in any suitable orthogonal basis.
  • the next step in the example method 300 is determining locations of a distal tip of the medical instrument visible within the video images (block 306), where the distal tip is touching the bone in at least some of the frames of the video images, and the medical instrument does not have a fiducial. Determining the locations of the distal tip of the medical instrument may take any suitable form. In one example, determining the locations may include segmenting the medical instrument in the frames of the video images (block 308). The segmenting may take any suitable form, such as applying the video images to a segmentation machine learning algorithm.
• the segmentation machine learning algorithm may take any suitable form, such as a neural network or convolutional neural network trained with a training data set showing the medical instrument in a plurality of known orientations. The segmentation machine learning algorithm may produce segmented video images where the medical instrument is identified or highlighted in some way (e.g., box, brightness increased, other objects removed).
  • the example method 300 may estimate a plurality of poses of the medical instrument within a respective plurality of frames of the video images (block 310).
  • the estimating the poses may take any suitable form, such as applying the video images to a pose machine learning algorithm.
• the pose machine learning algorithm may take any suitable form, such as a neural network or convolutional neural network trained to perform six-dimensional pose estimation.
  • the resultant of the pose machine learning algorithm may be, for at least some of the frames of the video image, an estimated pose of the medical instrument in the reference frame of the video images and/or in the reference frame provided by the bone fiducial. That is, the resultant of the pose machine learning algorithm may be a plurality of poses, one pose each for at least some of the frames of the segmented video images.
  • the next step in the example method 300 is determining the locations based on the plurality of poses (block 312).
  • the location of the distal tip can be determined in the reference frame of the video images and/or the bone fiducial.
  • the resultant is a set of locations that, at least some of which, represent locations of the outer surface of the bone.
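The step of determining a tip location from an estimated pose can be sketched as a rigid transform of a known tip offset; the pose representation (3x3 rotation matrix plus translation), the offset value, and the function names are hypothetical illustrations, not the disclosed implementation:

```python
# Sketch: recovering the distal-tip location from an estimated 6D pose.
# The pose maps instrument-frame coordinates into the camera/fiducial
# frame, and TIP_OFFSET is the tip's position in the instrument's own
# frame (a value that would come from the instrument's geometry).

TIP_OFFSET = (0.0, 0.0, 120.0)  # hypothetical: tip 120 mm along the shaft

def tip_location(rotation, translation, tip_offset=TIP_OFFSET):
    """Return the tip position in the camera frame: p = R @ offset + t."""
    return tuple(
        sum(rotation[i][j] * tip_offset[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Identity orientation, instrument origin at (10, 20, 30) mm
R = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
t = (10.0, 20.0, 30.0)
print(tip_location(R, t))  # (10.0, 20.0, 150.0)
```

Repeating this computation per frame (while the tip touches bone) yields the set of surface locations described above.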
• FIG. 3 shows an example three-step process for determining the locations of the distal tip of the medical instrument.
• however, a single machine learning model, such as a convolutional neural network, may perform all three steps; that is, the convolutional neural network may segment the medical instrument, perform the six-dimensional pose estimation, and determine the location of the distal tip in each frame.
• the training data set in such a situation would include a data set in which each frame has the medical device segmented, the six-dimensional pose identified, and the location of the distal tip identified.
  • the output of the determining step 306 may be a segmented video stream distinct from the video images captured at step 304.
• the later method steps may use both the segmented video stream and the video images to perform the further tasks.
  • the location information may be combined with the video images, such as being embedded in the video images, or added as metadata to each frame of the video images.
  • FIG. 4 is an example video display showing portions of a femur and a bone fiducial during a registration procedure. Although described with respect to a distal end of a femur, the principles and techniques described and shown in FIG. 4 can be applied to other anatomical structures/procedures, such as a femoral head for hip procedures as described herein.
  • the display may be shown, for example, on the display device 114 associated with the device cart 102, or any other suitable location.
• visible in the main part of the display of FIG. 4 is an intercondylar notch 400, a portion of the lateral condyle 402, a portion of the medial condyle 404, and the example bone fiducial 128.
• Shown in the upper right corner of the example display is a depiction of the bone, which may be a rendering 406 of the bone created from the three-dimensional bone model. Shown on the rendering 406 is a recommended area 408, the recommended area 408 being portions of the surface of the bone to be “painted” as part of the registration process. Shown in the lower right corner of the example display is a depiction of the bone, which again may be a rendering 412 of the bone created from the three-dimensional bone model. Shown on the rendering 412 are a plurality of surface features 416 on the bone model that have been identified as part of the registration process. Further shown in the lower right corner of the example display is a progress indicator 418, showing the progress of providing and receiving of locations on the bone. The example progress indicator 418 is a horizontal bar having a length that is proportional to the number of locations received, but any suitable graphic or numerical display showing progress may be used (e.g., 0% to 100%).
  • the surgical controller 118 receives the surface features on the bone, and may display each location both within the main display as dots or locations 416, and within the rendering shown in the lower right corner. More specifically, the example surgical controller 118 overlays indications of identified surface features 416 on the display of the images captured by the arthroscope 108 and camera 110, and in the example case shown, also overlays indications of identified surface features 416 on the rendering 412 of the bone model. Moreover, as the number of identified locations 416 increases, the surgical controller 118 also updates the progress indicator 418.
  • the plurality of surface features 416 may be, or the example surgical controller 118 may generate, a registration model relative to the bone fiducial 128 (block 314).
  • the registration model may take any suitable form, such as a computer-aided design (CAD) model or point cloud of data points in any suitable orthogonal basis.
• the registration model, regardless of the form, may have fewer overall data points or less “structure” than the bone model created by the non-invasive computer imaging (e.g., MRI).
  • the goal of the registration model is to provide the basis for the coordinate transforms and scaling used to correlate the bone model to the registration model and relative to the bone fiducial 128.
  • the next step in the example method 300 is registering the bone model relative to the location of the bone fiducial based on the registration model (block 316).
  • Registration may conceptually involve testing a plurality of coordinate transformations and scaling values to find a correlation that has a sufficiently high correlation or confidence factor. Once a correlation is found with the sufficiently high confidence factor, the bone model is said to be registered to the location of the bone fiducial. Thereafter, the example registration method 300 may end (block 318); however, the surgical controller 118 may then use the registered bone model to provide computer-assisted navigation regarding a procedure involving the bone.
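The transform-and-scale search described above can be made concrete with a closed-form two-dimensional similarity registration (rotation, uniform scale, translation), where the fit residual stands in for the confidence factor. Actual bone registration is three-dimensional and typically iterative (e.g., ICP-style), so this is only an illustrative sketch with made-up point sets:

```python
# Sketch of the registration idea: find a 2D similarity transform that
# best maps registration points onto model points, then check the
# residual (a stand-in for the confidence factor). Complex numbers make
# the least-squares solution closed-form: dst ~= a * src + b, where the
# complex coefficient a encodes rotation and scale together.

def register_2d(source, target):
    src = [complex(x, y) for x, y in source]
    dst = [complex(x, y) for x, y in target]
    src_c = sum(src) / len(src)
    dst_c = sum(dst) / len(dst)
    src0 = [p - src_c for p in src]
    dst0 = [q - dst_c for q in dst]
    a = sum(p.conjugate() * q for p, q in zip(src0, dst0)) \
        / sum(abs(p) ** 2 for p in src0)
    b = dst_c - a * src_c
    residual = max(abs(a * p + b - q) for p, q in zip(src, dst))
    return a, b, residual

source = [(0, 0), (1, 0), (1, 1), (0, 1)]
# target = source rotated 90 degrees, scaled by 2, shifted by (5, 5)
target = [(5, 5), (5, 7), (3, 7), (3, 5)]
a, b, residual = register_2d(source, target)
print(abs(a), residual)  # scale ~= 2.0, residual ~= 0.0
```

A small residual corresponds to the “sufficiently high confidence factor” at which the bone model is considered registered.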
  • registration of the bone model involves a touch-based registration technique using the probe 124 without a carried fiducial.
  • other registration techniques are possible, such as a touchless registration technique.
  • the example touchless registration technique again relies on placement of the bone fiducial 128.
  • the bone fiducial may have fewer faces with respective fiducials. Once placed, the bone fiducial 128 represents a fixed location on the outer surface of the bone in the view of the arthroscope 108, even as the position of the arthroscope 108 is moved and changed relative to the bone fiducial 128.
  • the surgical controller 118 determines a plurality of surface features of an outer surface of the bone, and in this example determining the plurality of surface features is based on a touchless registration technique in which the surface features are identified based on motion of the arthroscope 108 and camera 110 relative to the bone fiducial 128.
  • Another technique for registering the bone model to the bone uses a patientspecific instrument.
  • a registration model is created, and the registration model is used to register the bone model to the bone visible in the video images.
  • the registration model is used to determine a coordinate transformation and scaling to align the bone model to the actual bone.
  • use of the registration model may be omitted, and instead the coordinate transformations and scaling may be calculated directly.
  • FIG. 5 shows a method 500 in accordance with at least some embodiments.
  • the example method may be implemented in software within one or more computer systems, such as, in part, the surgical controller 118.
  • the example method 500 comprises obtaining a three-dimensional bone model (block 502).
  • a three-dimensional bone model that may be created by segmenting a plurality of non-invasive images (e.g., MRI) taken preoperatively or intraoperatively.
  • the method 500 further includes generating a patient-specific instrument that has a feature designed to couple to the bone represented in the bone model in only one orientation (block 504).
  • Generating the patient-specific instrument may first involve selecting a location at which the patient-specific instrument will attach.
  • a device or computer system may analyze the bone model and select the attachment location.
  • the attachment location may be a unique location in the sense that, if a patient-specific instrument is made to couple to the unique location, the patient-specific instrument will not couple to the bone at any other location.
  • the location selected may be at or near the upper or superior portion on the intercondylar notch.
• if the bone model shows another location with a unique feature, such as a bone spur or other raised or sunken surface anomaly, that unique location may be selected as the attachment location for the patient-specific instrument.
  • the location may be selected based on a location, within the hip joint, of a bone spur or other anatomical feature associated with the hip procedure.
  • forming the patient-specific instrument may take any suitable form.
  • a device or computer system may directly print, such as using a 3D printer, the patient-specific instrument.
  • the device or computer system may print a model of the attachment location, and the model may then become the mold for creating the patient-specific instrument.
  • the model may be the mold for an injection-molded plastic or casting technique.
  • the patient-specific instrument carries one or more fiducials, but as mentioned above, in other cases the patient-specific instrument may itself be tracked and thus carry no fiducials.
  • the method 500 further includes coupling the patient-specific instrument to the bone, in some cases the patient-specific instrument having the fiducial coupled to an exterior surface (block 506).
  • the attachment location for the patient-specific instrument can be selected to be unique such that the patient-specific instrument couples to the bone in only one location and in only one orientation.
  • the patient-specific instrument may be inserted arthroscopically. That is, the attachment location may be selected such that a physical size of the patient-specific instrument enables insertion through the ports/portals in the patient’s skin.
  • the patient-specific instrument may be made or constructed of a flexible material that enables the patient-specific instrument to deform for insertion in the surgical site, yet return to the predetermined shape for coupling to the attachment location.
  • the patient-specific instrument may be a rigid device with fewer size restrictions.
  • the method 500 further includes capturing video images of the patientspecific instrument (block 508).
  • the capturing may be performed intraoperatively.
  • the capturing of video images is by the surgical controller 118 by way of arthroscope 108 and camera 110.
• the capturing may be by any suitable camera device, such as one or both cameras of a stereoscopic camera system, or a portable computing device, such as a tablet or smart-phone device.
  • the video images may be provided to the surgical controller 118 in any suitable form.
• the example method 500 further includes registering the bone model based on the location of the patient-specific instrument (block 510). That is, given that the patient-specific instrument couples to the bone at only one location and in only one orientation, the location and orientation of the patient-specific instrument is directly related to the location and orientation of the bone, and thus the coordinate transformations and scaling for the registration may be calculated directly. Thereafter, the example method 500 may end; however, the surgical controller 118 may then use the registered bone model to provide computer-assisted navigation regarding a surgical task or surgical procedure involving the bone.
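Because the patient-specific instrument fits the bone in only one pose, the direct calculation above amounts to composing two known rigid transforms. A minimal sketch with illustrative homogeneous matrices (the numeric values and names are assumptions, not data from any actual instrument):

```python
# Sketch: direct registration via a patient-specific instrument (PSI).
# T_bone->camera = T_psi->camera @ T_bone->psi, where the first factor
# comes from tracking the PSI's fiducial and the second is fixed by the
# PSI's designed fit to the bone. 4x4 homogeneous matrices throughout.

def mat4_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

# Pose of the PSI in the camera frame (from tracking its fiducial) and
# the fixed bone-in-PSI transform (known from how the PSI was designed).
T_psi_in_camera = translation(10, 0, 50)
T_bone_in_psi = translation(0, 5, -2)

T_bone_in_camera = mat4_mul(T_psi_in_camera, T_bone_in_psi)
print([row[3] for row in T_bone_in_camera])  # [10, 5, 48, 1]
```

No correlation search is needed here, which is why the method can skip the intermediate registration model.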
  • the surgical controller 118 may provide guidance regarding a surgical task of a surgical procedure.
  • the specific guidance is dependent upon the surgical procedure being performed and the stage of the surgical procedure.
  • a non-exhaustive list of guidance comprises: changing a drill path entry point; changing a drill path exit point; aligning an aimer along a planned drill path; showing location at which to cut and/or resect the bone; reaming the bone by a certain depth along a certain direction; placing a device (suture, anchor or other) at a certain location; placing a suture at a certain location; placing an anchor at a certain location; showing regions of the bone to touch and/or avoid; and identifying regions and/or landmarks of the anatomy.
  • the guidance may include highlighting within a version of the video images displayed on a display device, which can be the arthroscopic display or a see-through display, or by communicating to a virtual reality device or a robotic tool.
• the systems and methods of FIGS. 1-5 may be implemented with a shaver/burr instrument configured in accordance with the principles of the present disclosure.
  • tracking locations of a shaver/burr instrument using the systems and methods described above may be difficult (e.g., relative to other instruments used in arthroscopic procedures, such as the probe 124).
• a shaver/burr according to the present disclosure includes various configurations or arrangements of fiducial markers to facilitate tracking by a surgical navigation system (e.g., the system 100) as described below in more detail.
  • FIG. 6 shows an example shaver/burr instrument (in this example, a burr instrument 600) having a distal tip 602 configured in accordance with the principles of the present disclosure.
  • the principles of the present disclosure can be implemented with shaver instruments and/or other types of cutting instruments including stationary and rotating components similar to the burr instrument 600.
  • the burr instrument 600 includes an outer sheath 604 (e.g., extending distally from a handpiece, not shown in FIG. 6, held by a surgeon) and an inner burr cutting assembly 606 configured to rotate within the sheath 604 (e.g., responsive to control signals received via the handpiece).
  • the cutting assembly 606 includes a corresponding cutting end (e.g., a burr tip 608) configured to, for example, cut and remove bone.
  • the cutting assembly 606 passes through and extends from an opening of an inner channel 610 defined by the outer sheath 604.
  • the burr tip 608 is disposed at a distal end of an inner shaft 612 configured to rotate within the sheath 604.
  • the cutting end may refer to any portion of the burr tip 608.
  • the cutting end may refer to a distal tip of a cutting member or assembly (e.g., a blade tip).
  • the outer sheath 604 includes a cutaway 614 (a window, opening, etc.) to provide clearance for the cutting assembly 606.
  • the cutaway 614 corresponds to a portion of the sheath 604 that does not extend over the burr tip 608, thereby exposing (i.e., leaving uncovered) the burr tip 608.
  • the cutaway 614 is defined in two or more sides of the outer sheath 604 such that the cutaway completely exposes (i.e., does not cover any portion of) the burr tip 608 on the two or more sides. As shown, the burr tip 608 extends beyond the distal tip 602.
  • the outer sheath 604 includes one or more planes or planar surfaces 616. As one example, as shown in FIG. 6, the outer sheath 604 includes four (4) planar surfaces 616 (although only two of the planar surfaces 616 are visible). In other words, the outer sheath 604 may be generally square or rectangular. In other examples, the outer sheath 604 may include fewer or more than four of the planar surfaces 616.
• the burr instrument 600 includes one or more fiducial markers 620-1 and 620-2 (referred to collectively as the fiducial markers 620) positioned on respective planar surfaces 616.
  • Each of the fiducial markers 620 may include one or more actual fiducial markings.
• each of the fiducial markers 620 includes a pair of fiducial markings.
  • the fiducial markers 620 are located/positioned as close as possible to the distal tip 602 (and, accordingly, the burr tip 608) without intersecting the cutaway 614. In other words, a gap 622 between the fiducial markers 620 and the distal tip 602 and/or the cutaway 614 is minimized. As one example, the gap between at least one of the fiducial markings of each of the fiducial markers 620 and the cutaway 614 is less than 2.0 mm. In various examples, one or more of the fiducial markers 620 may be positioned within 10.00 mm of the cutting end (e.g., the burr tip 608). In various examples, one or more of the fiducial markers 620 may be positioned less than 10.00 mm from the cutting end (e.g., within 8.0 mm, 6.0 mm, 4.0 mm, or 2.0 mm of the cutting end).
  • At least one of the fiducial markers 620 extends over/overlaps the burr tip 608 in the distal direction.
  • the fiducial marker 620-1 extends past a proximal end or edge of the burr tip 608 in the distal direction.
  • the fiducial marker 620-1 extends over at least 50% of the diameter of the burr tip 608.
• the fiducial marker 620-1 is positioned as close as possible to the location of the burr tip 608. Accordingly, a detected location of the fiducial marker 620-1 is similar to the actual location of the burr tip 608.
  • the fiducial marker 620-1 does not overlap the burr tip 608 and instead terminates within 2.0 mm of the burr tip 608.
• the fiducial markers 620 may be positioned such that at least one of the fiducial markers 620 is visible (i.e., to an endoscopic camera) in a full 360° around a rotation axis of the burr instrument 600.
  • the burr instrument 600 may include four fiducial markers 620 arranged on respective sides/planar surfaces. In this manner, visibility of the fiducial markers 620 is increased (i.e., visibility is maximized regardless of a rotational position of the outer sheath 604 relative to an endoscope).
• Systems and methods according to the present disclosure are configured (e.g., using the techniques described above in FIGS. 1-5) to determine a location of the burr instrument 600 (and, more particularly, a location of the distal tip 602, the burr tip 608, etc.) based on locations of the fiducial markers 620. For example, for a given instrument, locations of various components or geometric entities of the burr instrument 600 relative to the fiducial markers 620 are known (e.g., as previously determined, such as during manufacturing, calibration, etc., and stored as data accessible by the system 100).
  • locations of various portions of the burr instrument 600 can be determined based on the locations of the fiducial markers and the stored data.
  • the location of the burr tip 608 relative to patient anatomy can also be determined.
  • FIGS. 7A, 7B, and 7C show (in a perspective view) another example burr instrument 700 according to the principles of the present disclosure. Similar to the burr instrument 600, the burr instrument 700 includes a distal tip 702, an outer sheath 704, an inner cutting assembly 706, and a burr tip 708 disposed at a distal end of an inner shaft 712. FIGS. 7A, 7B, and 7C show the cutting assembly 706 at different rotational positions relative to the outer sheath 704. A cutaway 714 in the sheath 704 provides clearance for and facilitates visibility of the burr tip 708 and the cutting assembly 706.
• the sheath 704 may include one or more planar surfaces 716. Although not shown in FIGS. 7A, 7B, and 7C, the planar surfaces 716 of the sheath 704 may include respective fiducial markers. However, in this example, the cutting assembly 706 (e.g., the inner shaft 712) includes one or more fiducial markers 720 arranged on a corresponding planar (or approximately planar) surface 724 of the shaft 712. In some examples, the surface 724 (i.e., a fiducial plane) is parallel to a rotation axis of the cutting assembly 706. In other examples, the surface 724 may not be parallel to the rotation axis of the cutting assembly 706.
  • In this manner, the fiducial marker 720 is arranged directly on the cutting assembly 706 to facilitate optical detection and location of the cutting assembly 706.
  • Optical detection/tracking techniques as described herein (e.g., techniques configured to detect and locate fiducial markers using images obtained using an endoscopic camera) may be used in combination with various mitigating technologies (e.g., strobes or tissue removers) to facilitate detection of the fiducial marker 720 while the cutting assembly 706 rotates.
  • At least a portion of the outer sheath 704 may be transparent or include an aperture or window 726 to enable viewing of the fiducial marker 720 at additional rotational positions of the sheath 704 (e.g., outer sheath positions relative to the inner cutting assembly, the endoscopic camera, etc.).
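The geometric consideration behind the window 726 can be sketched as a simple angular-visibility check: the inner-shaft marker is observable when its angular position about the shaft axis falls within the window's extent. The angles and window width below are purely illustrative:

```python
def marker_visible(marker_angle_deg: float,
                   window_center_deg: float,
                   window_halfwidth_deg: float) -> bool:
    """Return True if an inner-shaft marker's angular position falls within
    the sheath's window/aperture (all angles measured about the shaft axis).
    """
    # Signed smallest difference between the two angles, in (-180, 180].
    diff = (marker_angle_deg - window_center_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= window_halfwidth_deg

# A 60-degree-wide window centered at 0 degrees:
marker_visible(20.0, 0.0, 30.0)    # True
marker_visible(200.0, 0.0, 30.0)   # False
marker_visible(350.0, 0.0, 30.0)   # True (wraps around to -10 degrees)
```

A transparent sheath corresponds to the degenerate case where the half-width covers the full circumference, so the marker is visible at every rotational position.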
  • FIGS. 8A, 8B, and 8C show other example arrangements of fiducial markers on a burr instrument 800 according to the present disclosure.
  • In various examples, the fiducial markers and/or planes containing the fiducial markers may have various orientations relative to a longitudinal axis of the burr instrument 800.
  • For example, the fiducial markers 804 can be positioned at any orientation relative to a longitudinal axis 808 of the burr instrument 800. One or more of the fiducial markers 804 can be skewed, rotated, etc. relative to the axis 808; in other words, lateral sides of the fiducial markers 804 are not required to be parallel or perpendicular to the axis 808.
  • Similarly, the fiducial planes (i.e., the planes on which the fiducial markers 804 are arranged) can be skewed, rotated, etc. relative to the axis 808.
  • In some examples, the fiducial markers 804 may be arranged on a curved outer sheath 812, and/or the fiducial markers 804 themselves may be curved (i.e., not planar).
  • In some examples, the fiducial markers 804 may be of different sizes on the same burr instrument 800 to facilitate viewing/detection from an endoscope positioned at different distances relative to the burr instrument 800. Further, the fiducial markers 804 may have non-uniform and/or non-symmetrical shapes.
  • In some examples, a pose of a rotating fiducial marker can be tracked (e.g., as a function of time) to determine the rotational axis or other positional/operational features of the burr tip, the cutting assembly, etc.
  • In some examples, fiducial markers are located at a most distal end of the shaver/burr to achieve the greatest likelihood of being tracked (e.g., within a capsule of a joint). Further, for examples where the fiducials are positioned directly on the inner cutting assembly, tolerance stack-up is minimized (e.g., relative to examples where the fiducials are located only on the outer sheath).
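The rotational-axis estimation mentioned above (tracking a rotating marker's pose over time) can be sketched as a plane fit over the sampled marker positions: a point fixed to the rotating assembly traces a circle, so the axis direction is the normal of the best-fit plane. This is one illustrative approach, not necessarily the disclosed system's method:

```python
import numpy as np

def estimate_rotation_axis(points: np.ndarray):
    """Estimate a rotation axis from marker positions sampled over time.

    The axis direction is taken as the normal of the best-fit plane through
    the samples (smallest right singular vector of the centered point cloud);
    the centroid of the samples lies on the axis.

    points: (N, 3) array of tracked marker positions.
    Returns (axis_direction, centroid).
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    axis = vt[-1]  # direction of least variance = plane normal
    return axis / np.linalg.norm(axis), centroid

# Synthetic check: samples on a circle of radius 5 mm in the z = 2 plane.
theta = np.linspace(0.0, 2.0 * np.pi, 50, endpoint=False)
pts = np.stack([5.0 * np.cos(theta),
                5.0 * np.sin(theta),
                np.full_like(theta, 2.0)], axis=1)
axis, center = estimate_rotation_axis(pts)
# axis is (up to sign) approximately [0, 0, 1]; center is near [0, 0, 2].
```

In practice the samples would come from successive video frames, and the fit could be made robust to detection noise (e.g., with outlier rejection).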
  • FIG. 9 shows an example method 900 for performing instrument tracking techniques in accordance with the principles of the present disclosure.
  • The method 900 may be performed by one or more processing devices or processors, computing devices, etc., such as the system 100 or another computing device executing instructions stored in memory.
  • One or more steps of the method 900 may be omitted in various examples, and/or may be performed in a different sequence than shown in FIG. 9. The steps may be performed sequentially or non-sequentially, two or more steps may be performed concurrently, etc.
  • The method 900 includes obtaining images (e.g., real-time or near real-time images) of a surgical environment including patient anatomy with one or more visual markers (e.g., a bone fiducial marker, which may be referred to as a base marker) fixed to the patient anatomy, as well as visual/fiducial markers associated with one or more surgical instruments.
  • Obtaining the images may include obtaining images using an endoscopic/arthroscopic camera or other imaging device configured to provide an image feed.
  • In some examples, an image scan of patient anatomy may be performed, such as by performing a pre-operative imaging scan (e.g., a CT scan), retrieving the stored image scan (as data) from memory, etc.
  • The images obtained from the image feed may be aligned with a model that is generated based on the image scan to provide visual guidance as described herein.
  • The obtained images may include a sparse or dense set of images of the surgical environment including the one or more visual markers (e.g., at least one visual marker).
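One common way to perform such an alignment is rigid point-based registration, e.g., the Kabsch/Procrustes method over corresponding landmarks identified both in the image feed and on the scan-derived model. The sketch below is illustrative and assumes corresponding 3-D point pairs are already available:

```python
import numpy as np

def kabsch(source: np.ndarray, target: np.ndarray):
    """Rigid registration: find rotation R and translation t minimizing
    ||R @ source_i + t - target_i|| over corresponding point pairs,
    via the Kabsch/Procrustes method.
    """
    sc, tc = source.mean(axis=0), target.mean(axis=0)
    h = (source - sc).T @ (target - tc)       # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))    # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = tc - r @ sc
    return r, t

# Recover a known transform from synthetic corresponding points.
rng = np.random.default_rng(0)
src = rng.normal(size=(10, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
R_est, t_est = kabsch(src, src @ R_true.T + t_true)
```

Once R and t are known, any point on the pre-operative model can be projected into the camera frame (or vice versa) to render the guidance overlays described herein.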
  • The method 900 includes detecting one or more fiducial markers arranged on the shaver/burr instrument.
  • The method 900 includes determining a position of one or more portions (e.g., a burr tip) of the shaver/burr instrument based on a location of the detected fiducial markers arranged on the shaver/burr instrument. For example, the position of the burr tip can be calculated based on a known relationship between the location of the burr tip and positions of the one or more fiducial markers.
  • The method 900 includes performing one or more actions based on the position of the burr tip (and/or other portions of the shaver/burr instrument).
  • The one or more actions may include various actions associated with the detection and tracking techniques described herein.
  • For example, actions may include providing visual or other guidance (e.g., via a display or other interface) indicating the position/orientation of the instrument, burr tip, etc.
  • The visual guidance may include a virtual model or other image of the instrument or burr tip overlaying the surgical/anatomical site.
  • The guidance may include instructions to the surgeon associated with the position of the burr tip.
  • The one or more actions may include control of the instrument based on the position of the burr tip (e.g., powering on or off, increasing or decreasing rotation speed, etc.).
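The steps of method 900 can be sketched as a per-frame loop. All helper names below (detect_markers, safe_zone, etc.) are hypothetical stand-ins for system-specific components, not the actual interfaces of the system 100:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

import numpy as np

@dataclass
class TrackingStep:
    tip_position: np.ndarray  # burr-tip position in the tracker frame
    in_safe_zone: bool        # whether the tip is inside the planned region

def track_frame(frame,
                detect_markers: Callable[[object], List[Tuple[np.ndarray, np.ndarray]]],
                marker_to_tip: np.ndarray,
                safe_zone: Callable[[np.ndarray], bool]) -> Optional[TrackingStep]:
    """One iteration of method 900: detect fiducials in an image frame,
    derive the burr-tip position from the known marker-to-tip offset,
    and evaluate whether an action (e.g., motor cutoff) is warranted."""
    markers = detect_markers(frame)   # [(R, t), ...] poses of detected fiducials
    if not markers:
        return None                   # marker occluded in this frame; no update
    r, t = markers[0]
    tip = r @ marker_to_tip + t       # known tip/marker relationship
    return TrackingStep(tip_position=tip, in_safe_zone=safe_zone(tip))

# Stubbed single-frame run: identity marker pose, spherical safe zone.
step = track_frame(
    frame=None,
    detect_markers=lambda _: [(np.eye(3), np.zeros(3))],
    marker_to_tip=np.array([0.0, 0.0, 12.5]),
    safe_zone=lambda p: float(np.linalg.norm(p)) < 20.0,
)
```

In a real system, the in_safe_zone flag could gate instrument control (e.g., powering the motor off when the tip leaves a planned resection volume) or drive the visual guidance overlay.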
  • FIG. 10 shows an example computer system or computing device 1000 configured to implement the various systems and methods of the present disclosure.
  • the computer system 1000 may correspond to one or more computing devices of the system 100, the surgical controller 118, a device that creates a patientspecific instrument, a tablet device within the surgical room, or any other system that implements any or all the various methods discussed in this specification.
  • the computer system 1000 may be configured to implement all or portions of the method 900.
  • the computer system 1000 may be connected (e.g., networked) to other computer systems in a local-area network (LAN), an intranet, and/or an extranet (e.g., device cart 102 network), or at certain times the Internet (e.g., when not in use in a surgical procedure).
  • the computer system 1000 may be a server, a personal computer (PC), a tablet computer or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device.
  • The term “computer” shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
  • the computer system 1000 includes a processing device 1002, a main memory 1004 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 1006 (e.g., flash memory, static random access memory (SRAM)), and a data storage device 1008, which communicate with each other via a bus 1010.
  • Processing device 1002 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 1002 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets.
  • the processing device 1002 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like.
  • The processing device 1002 is configured to execute instructions for performing any of the operations and steps discussed herein. Once programmed with specific instructions, the processing device 1002, and thus the entire computer system 1000, becomes a special-purpose device, such as the surgical controller 118.
  • The computer system 1000 may further include a network interface device 1012 for communicating with any suitable network (e.g., the device cart 102 network).
  • The computer system 1000 may also include a video display 1014 (e.g., the display device 114), one or more input devices 1016 (e.g., a microphone, a keyboard, and/or a mouse), and one or more speakers 1018.
  • The video display 1014 and the input device(s) 1016 may be combined into a single component or device (e.g., an LCD touch screen).
  • The data storage device 1008 may include a computer-readable storage medium 1020 on which the instructions 1022 (e.g., implementing any methods and any functions performed by any device and/or component described herein) embodying any one or more of the methodologies or functions described herein are stored.
  • The instructions 1022 may also reside, completely or at least partially, within the main memory 1004 and/or within the processing device 1002 during execution thereof by the computer system 1000. As such, the main memory 1004 and the processing device 1002 also constitute computer-readable media. In certain cases, the instructions 1022 may further be transmitted or received over a network via the network interface device 1012.
  • While the computer-readable storage medium 1020 is shown in the illustrative examples to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • The term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • The term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.

Abstract

A surgical instrument configured to perform a shaving function and/or a burr function of a surgical procedure includes an outer sheath having a proximal end coupled to a handle and a distal end opposite the proximal end, the outer sheath defining an inner channel having an opening defined at the distal end, an inner cutting assembly disposed within the inner channel of the outer sheath, the inner cutting assembly being configured to rotate within the outer sheath to perform the at least one of the shaving function and the burr function, and the inner cutting assembly including a cutting end exposed within the opening of the inner channel, and a fiducial marker located on a portion of the distal end of the outer sheath that overlaps the cutting end of the inner cutting assembly, the fiducial marker being configured to be detected by an optical tracking system.

Description

BURR TRACKING FOR SURGICAL NAVIGATION PROCEDURES
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional App. 63/646,105 filed May 13, 2024, the entire contents of which are incorporated herein by reference.
FIELD
[0002] The present disclosure relates to surgical navigation systems and methods, and more particularly to instrument tracking techniques for surgical navigation systems.
BACKGROUND
[0003] The background description provided here is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
[0004] Arthroscopic surgical procedures are minimally invasive surgical procedures in which access to the surgical site within the body is by way of small keyholes or ports through the patient’s skin. The various tissues within the surgical site are visualized by way of an arthroscope placed through a port or portal, and the internal scene is shown on an external display device. The tissue may be repaired or replaced through the same or additional ports. In computer-assisted surgical procedures (e.g., surgical procedures associated with a knee or knee joint, surgical procedures associated with a hip or hip joint, etc.), the location of various objects within the surgical site may be tracked relative to the bone by way of images captured by an arthroscope and a three-dimensional model of the bone.
SUMMARY
[0005] A surgical instrument configured to perform at least one of a shaving function and a burr function of a surgical procedure includes a handle, an outer sheath having a proximal end coupled to the handle and a distal end opposite the proximal end, the outer sheath defining an inner channel having an opening defined at the distal end, an inner cutting assembly disposed within the inner channel of the outer sheath, the inner cutting assembly being configured to rotate within the outer sheath to perform the at least one of the shaving function and the burr function, and the inner cutting assembly including a cutting end exposed within the opening of the inner channel, and at least one fiducial marker located on a portion of the distal end of the outer sheath that overlaps the cutting end of the inner cutting assembly, the at least one fiducial marker being configured to be detected by an optical tracking system.
[0006] In other features, the surgical instrument is a burr/shaver instrument. The cutting end includes a burr tip, and wherein the at least one fiducial marker extends over at least 50% of the burr tip in a distal direction. The outer sheath includes a cutaway exposing at least a portion of the burr tip. The cutaway is defined in two or more sides of the outer sheath. A gap between the cutaway and the at least one fiducial marker is less than 2.0 mm. The inner cutting assembly includes an inner shaft, the at least one fiducial marker includes a plurality of fiducial markers, and at least one of the plurality of fiducial markers is located on the inner shaft. The outer sheath includes at least one of a window and a transparent portion aligned with the at least one of the plurality of fiducial markers located on the inner shaft. The at least one fiducial marker is arranged on a plane that is not parallel to a longitudinal axis of the surgical instrument. The at least one fiducial marker includes first and second fiducial markers, and the first and second fiducial markers have different shapes or different sizes relative to one another.
[0007] In other features, a system includes a computing device configured to receive one or more images of a surgical site, the one or more images including the at least one fiducial marker of the surgical instrument of claim 1, detect the at least one fiducial marker, and determine a location of the surgical instrument based on the detected at least one fiducial marker.
[0008] A system for tracking a position of a shaver/burr instrument relative to patient anatomy includes memory storing instructions and one or more processing devices configured to execute the instructions. Executing the instructions causes the one or more processing devices to receive one or more images of a surgical site, the one or more images including at least one fiducial marker of the shaver/burr instrument, the at least one fiducial marker overlapping a cutting end of the shaver/burr instrument in a distal direction, detect the at least one fiducial marker of the shaver/burr instrument, and determine a location of the shaver/burr instrument within the surgical site based on the detected at least one fiducial marker.
[0009] The system further includes the shaver/burr instrument. The shaver/burr instrument includes a handle, an outer sheath having a proximal end coupled to the handle and a distal end opposite the proximal end, the outer sheath defining an inner channel having an opening defined at the distal end, and an inner cutting assembly disposed within the inner channel of the outer sheath, the inner cutting assembly being configured to rotate within the outer sheath to perform at least one of a shaving function and a burr function, and the inner cutting assembly including the cutting end. The at least one fiducial marker is located on a portion of the distal end of the outer sheath that overlaps the cutting end of the inner cutting assembly.
[0010] In other features, the cutting end includes a burr tip, and the at least one fiducial marker extends over at least 50% of the burr tip in the distal direction. The shaver/burr instrument includes an outer sheath and an inner cutting assembly configured to rotate within the outer sheath, the at least one fiducial marker includes a plurality of fiducial markers, and at least one of the plurality of fiducial markers is located on the inner cutting assembly. The outer sheath includes at least one of a window and a transparent portion aligned with the at least one of the plurality of fiducial markers located on the inner cutting assembly. The at least one fiducial marker is arranged on a plane that is not parallel to a longitudinal axis of the shaver/burr instrument. The at least one fiducial marker includes first and second fiducial markers, and wherein the first and second fiducial markers have different shapes or different sizes relative to one another.
[0011] A method for tracking a position of a shaver/burr instrument includes, using one or more processors configured to execute instructions stored in memory, receiving one or more images of a surgical site, the one or more images including at least one fiducial marker arranged on a portion of the shaver/burr instrument overlapping, in a distal direction, a cutting end of an inner cutting assembly of the shaver/burr instrument, detecting the at least one fiducial marker of the shaver/burr instrument, determining a location of the shaver/burr instrument within the surgical site based on the detected at least one fiducial marker, and at least one of displaying visual guidance based on the determined location of the shaver/burr instrument and controlling operation of the shaver/burr instrument based on the determined location of the shaver/burr instrument.
[0012] A surgical instrument configured to perform at least one of a shaving function and a burr function of a surgical procedure includes a handle, an outer sheath having a proximal end coupled to the handle and a distal end opposite the proximal end, the outer sheath defining an inner channel having an opening defined at the distal end, an inner cutting assembly disposed within the inner channel of the outer sheath, the inner cutting assembly being configured to rotate within the outer sheath to perform the at least one of the shaving function and the burr function, and the inner cutting assembly including a cutting end exposed within the opening of the inner channel, and at least one fiducial marker located on a portion of the distal end of the outer sheath located within 10.0 mm of the cutting end of the inner cutting assembly, the at least one fiducial marker being configured to be detected by an optical tracking system.
[0013] Further areas of applicability of the present disclosure will become apparent from the detailed description, the claims and the drawings. The detailed description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] For a detailed description of example embodiments, reference will now be made to the accompanying drawings in which:
[0015] FIG. 1 shows a surgical system in accordance with at least some embodiments;
[0016] FIG. 2 shows a conceptual drawing of a surgical site with various objects within the surgical site tracked, in accordance with at least some embodiments;
[0017] FIG. 3 shows a method in accordance with at least some embodiments;
[0018] FIG. 4 is an example video display showing portions of a femur and a bone fiducial during a registration procedure, in accordance with at least some embodiments;
[0019] FIG. 5 shows a method in accordance with at least some embodiments;
[0020] FIG. 6 shows an example burr instrument in accordance with at least some embodiments;
[0021] FIGS. 7A, 7B, and 7C show (in a perspective view) another example burr instrument in accordance with at least some embodiments;
[0022] FIGS. 8A, 8B, and 8C show other example arrangements of fiducial markers on a burr instrument in accordance with at least some embodiments;
[0023] FIG. 9 shows an example method for performing instrument tracking techniques in accordance with at least some embodiments; and
[0024] FIG. 10 shows an example computer system or computing device configured to implement the various systems and methods of the present disclosure.
[0025] In the drawings, reference numbers may be reused to identify similar and/or identical elements.
DEFINITIONS
[0026] Various terms are used to refer to particular system components. Different companies may refer to a component by different names - this document does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to... .” Also, the term “couple” or “couples” is intended to mean either an indirect or direct connection. Thus, if a first device couples to a second device, that connection may be through a direct connection or through an indirect connection via other devices and connections.
[0027] Similarly, spatial and functional relationships between elements (for example, between devices, modules, circuit elements, etc.) are described using various terms, including “connected,” “engaged,” “coupled,” “adjacent,” “next to,” “on top of,” “above,” “below,” and “disposed.” Unless explicitly described as being “direct,” when a relationship between first and second elements is described in the above disclosure, that relationship can be a direct relationship where no other intervening elements are present between the first and second elements, but can also be an indirect relationship where one or more intervening elements are present (either spatially or functionally) between the first and second elements. Nevertheless, this paragraph shall serve as antecedent basis in the claims for referencing any electrical connection as “directly coupled” for electrical connections shown in the drawing with no intervening element(s).
[0028] Terms of degree, such as “substantially” or “approximately,” are understood by those skilled in the art to refer to reasonable ranges around and including the given value and ranges outside the given value, for example, general tolerances associated with manufacturing, assembly, and use of the embodiments. The term “substantially,” when referring to a structure or characteristic, includes the characteristic that is mostly or entirely present in the characteristic or structure. As one example, numerical values that are described as “approximate” or “approximately” as used herein may refer to a value within +/- 5% of the stated value.
[0029] “A”, “an”, and “the” as used herein refers to both singular and plural referents unless the context clearly dictates otherwise. By way of example, “a processor” programmed to perform various functions refers to one processor programmed to perform each and every function, or more than one processor collectively programmed to perform each of the various functions. To be clear, an initial reference to “a [referent]”, and then a later reference for antecedent basis purposes to “the [referent]”, shall not obviate the fact the recited referent may be plural.
[0030] In general, terminology may be understood at least in part from usage in context. For example, terms, such as “and”, “or”, or “and/or,” as used herein may include a variety of meanings that may depend at least in part upon the context in which such terms are used. Typically, “or” if used to associate a list, such as A, B or C, is intended to mean A, B, and C, here used in the inclusive sense, as well as A, B or C, here used in the exclusive sense. In addition, the term “one or more” as used herein, depending at least in part upon context, may be used to describe any feature, structure, or characteristic in a singular sense or may be used to describe combinations of features, structures or characteristics in a plural sense. Similarly, terms, such as “a,” “an,” or “the,” again, may be understood to convey a singular usage or to convey a plural usage, depending at least in part upon context. In addition, the term “based on” may be understood as not necessarily intended to convey an exclusive set of factors and may, instead, allow for existence of additional factors not necessarily expressly described, again, depending at least in part on context. As used herein, the phrase at least one of A, B, and C should be construed to mean a logical (A OR B OR C), using a non-exclusive logical OR, and should not be construed to mean “at least one of A, at least one of B, and at least one of C.”
[0031] The terms “input” and “output” when used as nouns refer to connections (e.g., electrical, software) and/or signals, and shall not be read as verbs requiring action. For example, a timer circuit may define a clock output. The example timer circuit may create or drive a clock signal on the clock output. In systems implemented directly in hardware (e.g., on a semiconductor substrate), these “inputs” and “outputs” define electrical connections and/or signals transmitted or received by those connections. 
In systems implemented in software, these “inputs” and “outputs” define parameters read by or written by, respectively, the instructions implementing the function. In examples where used in the context of user input, “input” may refer to actions of a user, interactions with input devices or interfaces by the user, etc.
[0032] “Controller,” “module,” or “circuitry” shall mean, alone or in combination, individual circuit components, an application specific integrated circuit (ASIC), a microcontroller with controlling software, a reduced-instruction-set computer (RISC) with controlling software, a digital signal processor (DSP), a processor with controlling software, a programmable logic device (PLD), a field programmable gate array (FPGA), or a programmable system-on-a-chip (PSOC), configured to read inputs and drive outputs responsive to the inputs.
[0033] As used to describe various surgical instruments or devices, such as a probe, the term “proximal” refers to a point or direction nearest a handle of the probe (e.g., a direction opposite the probe tip). Conversely, the term “distal” refers to a point or direction nearest the probe tip (e.g., a direction opposite the handle).
[0034] For the purposes of this disclosure, a non-transitory computer readable medium (or computer-readable storage medium/media) stores computer data, which data can include computer program code (or computer-executable instructions) that is executable by a computer, in machine-readable form. By way of example, and not limitation, a computer readable medium may comprise computer readable storage media, for tangible or fixed storage of data, or communication media for transient interpretation of code-containing signals. Computer readable storage media, as used herein, refers to physical or tangible storage (as opposed to signals) and includes without limitation volatile and non-volatile, removable and non-removable media implemented in any method or technology for the tangible storage of information such as computer-readable instructions, data structures, program modules or other data. Computer readable storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, optical storage, cloud storage, magnetic storage devices, or any other physical or material medium which can be used to tangibly store the desired information or data or instructions and which can be accessed by a computer or processor.
[0035] For the purposes of this disclosure, the term “server” should be understood to refer to a service point that provides processing, database, and communication facilities. By way of example, and not limitation, the term “server” can refer to a single, physical processor with associated communications and data storage and database facilities, or it can refer to a networked or clustered complex of processors and associated network and storage devices, as well as operating software and one or more database systems and application software that support the services provided by the server. Cloud servers are examples.
[0036] For the purposes of this disclosure, a “network” should be understood to refer to a network that may couple devices so that communications may be exchanged, such as between a server and a client device or other types of devices, including between wireless devices coupled via a wireless network, for example. A network may also include mass storage, such as network attached storage (NAS), a storage area network (SAN), a content delivery network (CDN) or other forms of computer or machine- readable media, for example. A network may include the Internet, one or more local area networks (LANs), one or more wide area networks (WANs), wire-line type connections, wireless type connections, cellular or any combination thereof. Likewise, sub-networks, which may employ differing architectures or may be compliant or compatible with differing protocols, may interoperate within a larger network.
[0037] For purposes of this disclosure, a “wireless network” should be understood to couple client devices with a network. A wireless network may employ stand-alone ad- hoc networks, mesh networks, Wireless LAN (WLAN) networks, cellular networks, or the like. A wireless network may further employ a plurality of network access technologies, including Wi-Fi, Long Term Evolution (LTE), WLAN, Wireless Router (WR) mesh, or 2nd, 3rd, 4th or 5th generation (2G, 3G, 4G or 5G) cellular technology, mobile edge computing (MEC), Bluetooth, 802.11 b/g/n, or the like. Network access technologies may enable wide area coverage for devices, such as client devices with varying degrees of mobility, for example. In short, a wireless network may include virtually any type of wireless communication mechanism by which signals may be communicated between devices, such as a client device or a computing device, between or within a network, or the like.
[0038] A computing device may be capable of sending or receiving signals, such as via a wired or wireless network, or may be capable of processing or storing signals, such as in memory as physical memory states, and may, therefore, operate as a server. Thus, devices capable of operating as a server may include, as examples, dedicated rack-mounted servers, desktop computers, laptop computers, set top boxes, integrated devices combining various features, such as two or more features of the foregoing devices, or the like.
[0039] For purposes of this disclosure, a client (or consumer or user) device, referred to as user equipment (UE), may include a computing device capable of sending or receiving signals, such as via a wired or a wireless network. A client device may, for example, include a desktop computer or a portable device, such as a cellular telephone, a smart phone, a display pager, a radio frequency (RF) device, an infrared (IR) device, a Near Field Communication (NFC) device, a Personal Digital Assistant (PDA), a handheld computer, a tablet computer, a phablet, a laptop computer, a set top box, a wearable computer, a smart watch, an integrated or distributed device combining various features, such as features of the foregoing devices, or the like.
[0040] In some embodiments, as discussed below, the client device can also be, or can communicatively be coupled to, any type of known or to be known medical device (e.g., any type of Class I, II or III medical device), such as, but not limited to, an MRI machine, a CT scanner, an electrocardiogram (ECG or EKG) device, a photoplethysmograph (PPG), a Doppler and transit-time flow meter, a laser Doppler, an endoscopic device, a neuromodulation device, a neurostimulation device, and the like, or some combination thereof.
DETAILED DESCRIPTION
[0041] The present disclosure will now be described more fully hereinafter with reference to the accompanying drawings, which form a part hereof, and which show, by way of non-limiting illustration, certain example embodiments. Subject matter may, however, be embodied in a variety of different forms and, therefore, covered or claimed subject matter is intended to be construed as not being limited to any example embodiments set forth herein; example embodiments are provided merely to be illustrative. Likewise, a reasonably broad scope for claimed or covered subject matter is intended. Among other things, for example, subject matter may be embodied as methods, devices, components, or systems. Accordingly, embodiments may, for example, take the form of hardware, software, firmware or any combination thereof (other than software per se). The following detailed description is, therefore, not intended to be taken in a limiting sense.
[0042] Throughout the specification and claims, terms may have nuanced meanings suggested or implied in context beyond an explicitly stated meaning. Likewise, the phrase “in one embodiment” as used herein does not necessarily refer to the same embodiment and the phrase “in another embodiment” as used herein does not necessarily refer to a different embodiment. It is intended, for example, that claimed subject matter include combinations of example embodiments in whole or in part.
[0043] The present disclosure is described below with reference to block diagrams and operational illustrations of methods and devices. It is understood that each block of the block diagrams or operational illustrations, and combinations of blocks in the block diagrams or operational illustrations, can be implemented by means of analog or digital hardware and computer program instructions. These computer program instructions can be provided to a processor of a general purpose computer, a special purpose computer, an ASIC, or other programmable data processing apparatus to alter its function as detailed herein, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, implement the functions/acts specified in the block diagrams or operational block or blocks. In some alternate implementations, the functions/acts noted in the blocks can occur out of the order noted in the operational illustrations. For example, two blocks shown in succession can in fact be executed substantially concurrently or the blocks can sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[0044] Computer-Aided Surgery (CAS) and surgical navigation systems support surgeons in planning and performing complex surgical procedures with increased precision and accuracy. As one example surgical procedure, arthroscopy is a minimally invasive medical procedure for diagnosing and treating joint problems. An orthopedic surgeon makes a small incision in the skin of the patient and inserts a lens into the incision. The lens is attached to a camera (e.g., an endoscopic camera) and coupled to a light source, allowing the joint to be visualized and treated. Surgical navigation and CAS systems have had significant impact in minimally invasive surgeries (MIS) such as arthroscopic procedures because the increased difficulty in visualizing the anatomy of the patient further complicates the surgical workflow.
[0045] Video-based surgical navigation (VBSN) according to the principles of the present disclosure uses visual fiducials or markers (also called visual markers) attached to patient anatomy to guide the surgeon throughout the medical procedure. The video-based navigation process requires the precise registration of a pre-operative anatomical model with data acquired intra-operatively. The registration process requires the surgeon to digitize the surface of interest that corresponds to the preoperative model. The visual markers attached to the anatomies define reference frames to which the pre-operative model and the intra-operative acquired data are aligned.
[0046] In VBSN, fiducial markers with known visual patterns may be attached both to the targeted anatomy and to the instruments and subsequently tracked such that their relative poses can be accurately estimated (e.g., by applying 3D computer vision methods on the images/video acquired by a camera). These relative poses allow the instruments to be located with respect to the anatomy at every frame time instant. For example, VBSN facilitates the tracking of instruments with respect to the targeted anatomy to which a fiducial is rigidly attached (which may be referred to as a “base marker”).
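The relative-pose bookkeeping described above can be illustrated with a short sketch. This is not part of the disclosed system; it simply shows, under the assumption that each fiducial's pose has already been estimated in the camera frame as a 4x4 homogeneous transform, how an instrument's pose relative to the base marker could be composed. All function and variable names here are illustrative.

```python
# Hypothetical sketch of relative-pose composition in a video-based
# navigation pipeline. Each pose is a 4x4 homogeneous transform given
# as nested lists; names are illustrative, not from the disclosure.

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t):
    """Invert a rigid transform [R | p]: the inverse is [R^T | -R^T p]."""
    r_t = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [t[i][3] for i in range(3)]
    p_inv = [-sum(r_t[i][k] * p[k] for k in range(3)) for i in range(3)]
    return [r_t[0] + [p_inv[0]],
            r_t[1] + [p_inv[1]],
            r_t[2] + [p_inv[2]],
            [0.0, 0.0, 0.0, 1.0]]

def instrument_in_anatomy_frame(cam_T_anatomy, cam_T_instrument):
    """Pose of the instrument expressed in the base-marker (anatomy) frame."""
    return mat_mul(invert_rigid(cam_T_anatomy), cam_T_instrument)
```

Because both poses are estimated per frame, re-evaluating this composition each frame yields the instrument location with respect to the anatomy at every frame time instant, even as the camera itself moves.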
[0047] More specifically, surgical navigation systems and methods according to the present disclosure are configured to track instruments such as shavers or burrs (which may be referred to herein as a “shaver/burr” or “shaver/burr instrument”). Generally, a shaver/burr instrument includes a handpiece held by the surgeon to guide and control the instrument, an outer sheath extending distally from the handpiece, and an inner shaver or burr cutting assembly configured to rotate within the sheath (e.g., responsive to control signals received via the handpiece). The shaver/burr includes a corresponding distal tip (i.e., a blade or burr tip) configured to cut and remove tissue (e.g., using a shaver blade tip) or bone (e.g., using a burr tip).
[0048] Tracking locations of a shaver/burr may be difficult relative to other instruments used in arthroscopic procedures. For example, due to the relationship between multiple components (i.e., the outer sheath and the inner rotating cutting assembly), including both moving and stationary components, distances between and relative positions of the multiple components, manufacturing tolerances, geometric constraints, etc., placement of fiducial markings may be difficult and visibility of the fiducial markings may be reduced for certain types of procedures (e.g., hip procedures) and positions/orientations of the shaver/burr instrument within a joint or other surgical site.
[0049] Accordingly, the surgical navigation systems and methods according to the present disclosure are configured to implement tracking techniques for shaver/burr instruments. For example, a shaver/burr according to the present disclosure includes various configurations or arrangements of fiducial markers to facilitate tracking by a surgical navigation system. As one example, multiple fiducial markers are positioned on multiple planes at a distal end of the shaver/burr. In some examples, at least one fiducial may be positioned on the rotating cutting assembly to facilitate optical detection (e.g., by an optical tracking system). Various configurations of a shaver/burr instrument including fiducial markers in accordance with the principles of the present disclosure are described below in more detail.
[0050] Although described with respect to shaver/burr instruments, the principles of the present disclosure may be applied to other types of surgical instruments, including, but not limited to, radio frequency (RF) wands. Further, although described with respect to surgical procedures performed on joints (e.g., hips, knees, etc.), instruments configured according to the principles of the present disclosure may be used for other types of surgical procedures (e.g., non-joint procedures).
[0051] FIG. 1 shows an example surgical system (e.g., a system including or implementing an arthroscopic video-based navigation system) 100 in accordance with at least some embodiments of the present disclosure. In particular, the example surgical system 100 comprises a tower or device cart 102 and various tools or instruments, such as an example mechanical resection instrument 104, an example plasma-based ablation instrument (hereafter just ablation instrument 106), and an endoscope in the example form of an arthroscope 108 and attached camera head or camera 110. In the example systems, the arthroscope 108 may be a rigid device, unlike endoscopes for other procedures, such as upper-endoscopies. The resection instrument 104 may correspond to a shaver/burr instrument configured in accordance with the principles of the present disclosure as described below in more detail. The device cart 102 may comprise a display device 114, a resection controller 116, and a camera control unit (CCU) together with an endoscopic light source and video (e.g., a VBN) controller 118. In example cases the combined CCU and video controller 118 not only provides light to the arthroscope 108 and displays images received from the camera 110, but also implements various additional aspects, such as registering a three-dimensional bone model with the bone visible in the video images, and providing computer-assisted navigation during the surgery. Thus, the combined CCU and video controller are hereafter referred to as surgical controller 118. In other cases, however, the CCU and video controller may be a separate and distinct system from the controller that handles registration and computer-assisted navigation, yet the separate devices would nevertheless be operationally coupled.
[0052] The example device cart 102 further includes a pump controller 122 (e.g., single or dual peristaltic pump). Fluidic connections of the mechanical resection instrument 104 and ablation instrument 106 to the pump controller 122 are not shown so as not to unduly complicate the figure. Similarly, fluidic connections between the pump controller 122 and the patient are not shown so as not to unduly complicate the figure. In the example system, both the mechanical resection instrument 104 and the ablation instrument 106 are coupled to the resection controller 116, which is a dual-function controller. In other cases, however, there may be a mechanical resection controller separate and distinct from an ablation controller. The example devices and controllers associated with the device cart 102 are merely examples, and other examples include vacuum pumps, patient-positioning systems, robotic arms holding various instruments, ultrasonic cutting devices and related controllers, patient-positioning controllers, and robotic surgical systems.
[0053] FIGS. 1 and 2 further show additional instruments that may be present during an arthroscopic surgical procedure. In particular, an example probe 124 (e.g., shown as a touch probe, but which may be a touchless probe in other examples), a drill guide or aimer 126, and a bone fiducial 128 are shown. The probe 124 may be used during the surgical procedure to provide information to the surgical controller 118, such as information to register a three-dimensional bone model to an underlying bone visible in images captured by the arthroscope 108 and camera head 110. In some surgical procedures, the aimer 126 may be used as a guide for placement and drilling with a drill wire to create an initial or pilot tunnel through the bone. The bone fiducial 128 may be affixed or rigidly attached to the bone and serve as an anchor location for the surgical controller 118 to know the position and orientation of the bone (e.g., after registration of a three-dimensional bone model). Additional tools and instruments may be present, such as the drill wire, various reamers for creating the throughbore and counterbore aspects of a tunnel through the bone, and various tools, such as for suturing and anchoring a graft. These additional tools and instruments are not shown so as not to further complicate the figure.
[0054] Example workflow for a surgical procedure is described below. While described with respect to an example anterior cruciate ligament repair procedure, the below techniques may also be performed for other types of surgical procedures, such as hip procedures or other procedures that include joint distraction. A surgical procedure may begin with a planning phase. An example procedure may start with imaging (e.g., X-ray imaging, computed tomography (CT), magnetic resonance imaging (MRI)) of the anatomy of the patient, including the relevant anatomy (e.g., for a knee procedure the lower portion of the femur, the upper portion of the tibia, and the articular cartilage; for a hip procedure, an upper portion of the femur, the acetabulum/hip joint, pelvis, etc.). The imaging may be preoperative imaging, hours or days before the intraoperative repair, or the imaging may take place within the surgical setting just prior to the intraoperative repair. The discussion that follows assumes MRI imaging, but again many different types of imaging may be used. The image slices from the MRI imaging can be segmented such that a volumetric model or three-dimensional model of the anatomy is created. Any suitable currently available, or after developed, segmentation technology may be used to create the three-dimensional model. More specifically to the example of anterior cruciate ligament repair, a three-dimensional bone model of the lower portion of the femur, including the femoral condyles, is created. Conversely, for a hip procedure, a three-dimensional model of the upper portion of the femur and at least a portion of the pelvis (e.g., the acetabulum) is created.
[0055] Using the three-dimensional bone model, an operative plan is created. For a knee procedure, the results of the planning may include: a three-dimensional bone model of the distal end of the femur; a three-dimensional bone model for a proximal end of the tibia; an entry location and exit location through the femur and thus a planned-tunnel path for the femur; and an entry location and exit location through the tibia and thus a planned-tunnel path through the tibia. Other surgical parameters may also be selected during the planning, such as tunnel throughbore diameters, tunnel counterbore diameters and depth, desired post-repair flexion, and the like, but those additional surgical parameters are omitted so as not to unduly complicate the specification.
[0056] Conversely, for a hip procedure, the results of the planning may include a three-dimensional bone model of the proximal end of the femur; a three-dimensional bone model for at least a portion of the pelvis/hip joint (e.g., a region of the pelvis corresponding to the acetabulum); a surgical area of interest within the hip joint; and parameters associated with achieving an amount of distraction in the surgical area of interest to provide sufficient access to the surgical area of interest. For example, example hip procedures may include, but are not limited to, labral repair, femoroacetabular impingement (FAI) debridement (e.g., removal of bone spurs/growths), cartilage repair, and synovectomy (e.g., removal of inflamed tissue). These example procedures typically require access to a specific surgical area of interest within the hip joint (i.e., in a specific area within an interface between the pelvis and the femoral head, such as an area around/surrounding a bone spur or growth, cartilage or tissue to be repaired or removed, etc.).
[0057] The intraoperative aspects include steps and procedures for setting up the surgical system to perform the various repairs. It is noted, however, that some of the intraoperative aspects (e.g., optical system calibration) may take place before any portals or incisions are made through the patient’s skin, and in fact before the patient is wheeled into the surgical room. Nevertheless, such steps and procedures may be considered intraoperative as they take place in the surgical setting and with the surgical equipment and instruments used to perform the actual repair.
[0058] An example procedure can be conducted arthroscopically and is computer-assisted in the sense that the surgical controller 118 is used for arthroscopic navigation within the surgical site. More particularly, in example systems the surgical controller 118 provides computer-assisted navigation during the procedure by tracking locations of various objects within the surgical site, such as the location of the bone within the three-dimensional coordinate space of the view of the arthroscope, and the location of the various instruments within the three-dimensional coordinate space of the view of the arthroscope. Such tracking techniques are briefly described below.

[0059] FIG. 2 shows a conceptual drawing of a surgical site with various objects (e.g., surgical instruments/tools) within the surgical site. In particular, visible in FIG. 2 is a distal end of the arthroscope 108, a portion of a bone 200 (e.g., femur), the bone fiducial 128 within the surgical site, and the probe 124.
[0060] The arthroscope 108 illuminates the surgical site with visible light. In the example of FIG. 2, the illumination is illustrated by arrows 208. The illumination provided to the surgical site is reflected by various objects and tissues within the surgical site, and the reflected light that returns to the distal end enters the arthroscope 108, propagates along an optical channel within the arthroscope 108, and is eventually incident upon a capture array within the camera 110 (FIG. 1). The images detected by the capture array within the camera 110 are sent electronically to the surgical controller 118 (FIG. 1) and displayed on the display device 114 (FIG. 1). In one example, the arthroscope 108 is monocular or has a single optical path through the arthroscope for capturing images of the surgical site, notwithstanding that the single optical path may be constructed of two or more optical members (e.g., glass rods, optical fibers). That is to say, in example systems and methods the computer-assisted navigation provided by the arthroscope 108, the camera 110, and the surgical controller 118 is provided with the arthroscope 108 that is not a stereoscopic endoscope having two distinct optical paths separated by an interocular distance at the distal end of the endoscope.
[0061] During a surgical procedure, a surgeon selects an arthroscope with a viewing direction beneficial for the planned surgical procedure. Viewing direction refers to a line residing at the center of an angle subtended by the outside edges or peripheral edges of the view of an endoscope. The viewing direction for some arthroscopes is aligned with the longitudinal central axis of the arthroscope, and such arthroscopes are referred to as “zero degree” arthroscopes (e.g., the angle between the viewing direction and the longitudinal central axis of the arthroscope is zero degrees). The viewing direction of other arthroscopes forms a non-zero angle with the longitudinal central axis of the arthroscope. For example, for a 30° arthroscope the viewing direction forms a 30° angle to the longitudinal central axis of the arthroscope, the angle measured as an obtuse angle beyond the distal end of the arthroscope. In the example of FIG. 2, the view angle 210 of the arthroscope 108 forms a non-zero angle to the longitudinal central axis 212 of the arthroscope 108.

[0062] Still referring to FIG. 2, within the view of the arthroscope 108 is a portion of the bone 200 (in this example, within the intercondylar notch), along with the example bone fiducial 128, and the example probe 124. The example bone fiducial 128 is a multifaceted element, with each face or facet having a fiducial disposed or created thereon. However, the bone fiducial need not have multiple faces, and in fact may take any shape so long as that shape can be tracked within the video images. The bone fiducial, such as bone fiducial 128, may be attached to the bone 200 in any suitable form (e.g., via the screw portion of the bone fiducial 128 visible in FIG. 1). The patterns of the fiducials on each facet are designed to provide information regarding the position and orientation of the bone fiducial 128 in the three-dimensional coordinate space of the view of the arthroscope 108. More particularly, the pattern is selected such that the position and orientation of the bone fiducial 128 may be determined from images captured by the arthroscope 108 and attached camera (FIG. 1).
[0063] The probe 124 is also shown as partially visible within the view of the arthroscope 108. The probe 124 may be used, as discussed more below, to identify a plurality of surface features on the bone 200 as part of the registration of the bone 200 to the three-dimensional bone model. In some cases the probe 124 and/or the aimer 126 may carry their own, unique fiducials, such that their respective poses may be calculated from the one or more fiducials present in the video stream. However, in other cases, and as shown, the medical instrument used to help with registration of the three-dimensional bone model, be it the probe 124, the aimer 126, or any other suitable medical device, may omit carrying fiducials. Stated otherwise, in such examples the medical instrument has no fiducial markings. In such cases, the pose of the medical instrument may be determined by a machine learning model, discussed in more detail below.
[0064] The images captured by the arthroscope 108 and attached camera are subject to optical distortion in many forms. For example, the visual field between the distal end of the arthroscope 108 and the bone 200 within the surgical site is filled with fluid, such as bodily fluids and saline used to distend the joint. Many arthroscopes have one or more lenses at the distal end that widen the field of view, and the wider field of view causes a “fish eye” effect in the captured images. Further, the optical elements within the arthroscope (e.g., rod lenses) may have optical aberrations inherent to the manufacturing and/or assembly process. Further still, the camera may have various optical elements for focusing the images received onto the capture array, and the various optical elements may have aberrations inherent to the manufacturing and/or assembly process. In example systems, prior to use within each surgical procedure, the endoscopic optical system is calibrated to account for the various optical distortions. The calibration creates a characterization function that characterizes the optical distortion, and further analysis of the frames of the video stream may be, prior to further analysis, compensated using the characterization function.
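The disclosure does not prescribe a particular characterization function; a common choice for radial ("fish eye"-like) distortion is a polynomial model of the Brown-Conrady family. The sketch below is a minimal illustration of that idea, assuming example coefficients k1 and k2 as stand-ins for values a calibration step would produce; it is not the disclosed calibration procedure.

```python
# Illustrative sketch of compensating radial lens distortion with a
# simple two-coefficient polynomial model. The coefficients k1 and k2
# are assumed example values standing in for calibration output.

def distort(x, y, k1, k2):
    """Map an ideal normalized image point to its distorted location."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def undistort(xd, yd, k1, k2, iterations=20):
    """Invert the radial model by fixed-point iteration (no closed form)."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x, y
```

Applying the `undistort` compensation to each frame (or to tracked feature coordinates) before pose estimation is one way the characterization function could be used so that downstream geometry operates on undistorted coordinates.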
[0065] The next example step in the intraoperative procedure is the registration of the bone model created during the planning stage. During the intraoperative repair, the three-dimensional bone model is obtained by or provided to the surgical controller 118. Again using the example of anterior cruciate ligament repair, and specifically computer-assisted navigation for tunnel paths through the femur, the three-dimensional bone model of the lower portion of the femur is obtained by or provided to the surgical controller 118. Thus, the surgical controller 118 receives the three-dimensional bone model, and assuming the arthroscope 108 is inserted into the knee by way of a port or portal through the patient’s skin, the surgical controller 118 also receives video images of a portion of the lower end of the femur. In order to relate the three-dimensional bone model to the images received by way of the arthroscope 108 and camera 110, the surgical controller 118 registers the three-dimensional bone model to the images of the femur received by way of the arthroscope 108 and camera 110.
[0066] In order to perform the registration, and in accordance with example methods, the bone fiducial 128 is attached to the femur. The bone fiducial placement is such that the bone fiducial is within the field of view of the arthroscope 108. In examples for knee procedures, the bone fiducial 128 is placed within the intercondylar notch superior to the expected location of the tunnel through the lateral condyle. Conversely, in examples for hip procedures, the bone fiducial 128 is placed on the femoral head. To relate or register bone visible in the video images to the three-dimensional bone model, the surgical controller 118 (FIG. 1) is provided or determines a plurality of surface features of an outer surface of the bone. Identifying the surface features may take several forms, including a touch-based registration using the probe 124 without a carried fiducial, a touchless registration technique in which the surface features are identified after resolving the motion of the arthroscope 108 and camera relative to the bone fiducial 128, and a third technique which uses a patient-specific instrument.
[0067] In the example touch-based registration, the surgeon may touch a plurality of locations using the probe 124 (FIG. 1). In some cases, particularly when portions of the outer surface of the bone are exposed to view, receiving the plurality of surface features of the outer surface of the bone may involve the surgeon “painting” the outer surface of the bone. “Painting” is a term of art that does not involve application of color or pigment, but instead implies motion of the probe 124 when the distal end of the probe 124 is touching bone. In this example, the probe 124 does not carry or have a fiducial visible to the arthroscope 108 and the camera 110. It follows that the pose of the probe 124 and the location of the distal tip of the probe 124 needs to be determined in order to gather the surface features for purposes of registering the three-dimensional bone model.
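The disclosure leaves the numerical registration method open. One widely used building block for fitting digitized ("painted") surface points to a preoperative model is the least-squares rigid fit (the Kabsch/Procrustes step that sits inside ICP-style registration loops). The sketch below shows only that step, under the simplifying assumption that point correspondences are already known; in practice an iterative loop would re-estimate correspondences against the model surface. The algorithm choice and names are illustrative, not from the disclosure.

```python
# Minimal sketch of a least-squares rigid fit (Kabsch/Procrustes) of
# digitized surface points to corresponding model points. Assumes known
# correspondences; an ICP-style outer loop is omitted for brevity.

import numpy as np

def rigid_fit(source, target):
    """Return rotation R and translation t with R @ p + t ~= q for point pairs."""
    src = np.asarray(source, dtype=float)
    tgt = np.asarray(target, dtype=float)
    src_c = src - src.mean(axis=0)  # center both clouds
    tgt_c = tgt - tgt.mean(axis=0)
    u, _, vt = np.linalg.svd(src_c.T @ tgt_c)  # 3x3 cross-covariance
    d = np.sign(np.linalg.det(vt.T @ u.T))     # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = tgt.mean(axis=0) - r @ src.mean(axis=0)
    return r, t
```

The recovered transform maps probe-tip locations (expressed in the bone-fiducial frame) onto the coordinate frame of the three-dimensional bone model, which is the practical outcome of the registration step described above.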
[0068] FIG. 3 shows a method 300 in accordance with at least some embodiments of the present disclosure. The example method 300 may be implemented in software within a computer system, such as the surgical controller 118. In particular, the example method 300 comprises obtaining a three-dimensional bone model (block 302). That is to say, in the example method 300, what is obtained is the three-dimensional bone model that may be created by segmenting a plurality of non-invasive images (e.g., CT, MRI) taken preoperatively or intraoperatively. With the bone segmented from or within the images, the three-dimensional bone model may be created. The three-dimensional bone model may take any suitable form, such as a computer-aided design (CAD) model, a point cloud of data points with respect to an arbitrary origin, or a parametric representation of a surface expressed using analytical mathematical equations. Thus, the three-dimensional bone model is defined with respect to the origin and in any suitable orthogonal basis.
[0069] The next step in the example method 300 is capturing video images of the bone fiducial attached to the bone (block 304). The capturing is performed intraoperatively. In an example, the capturing of video images is by way of the arthroscope 108 and camera 110. Other endoscopes may be used, such as endoscopes in which the capture array resides at the distal end of the device (e.g., chip-on-the-tip devices). However, in open procedures where the skin is cut and pulled away, exposing the bone to the open air, the capturing may be by any suitable camera device, such as one or both cameras of a stereoscopic camera system, or a portable computing device, such as a tablet or smart-phone device. The video images may be provided to the surgical controller 118 in any suitable form.
[0070] The next step in the example method 300 is determining locations of a distal tip of the medical instrument visible within the video images (block 306), where the distal tip is touching the bone in at least some of the frames of the video images, and the medical instrument does not have a fiducial. Determining the locations of the distal tip of the medical instrument may take any suitable form. In one example, determining the locations may include segmenting the medical instrument in the frames of the video images (block 308). The segmenting may take any suitable form, such as applying the video images to a segmentation machine learning algorithm. The segmentation machine learning algorithm may take any suitable form, such as a neural network or a convolutional neural network trained with a training data set showing the medical instrument in a plurality of known orientations. The segmentation machine learning algorithm may produce segmented video images where the medical instrument is identified or highlighted in some way (e.g., box, brightness increased, other objects removed).
[0071] With the segmented video images, the example method 300 may estimate a plurality of poses of the medical instrument within a respective plurality of frames of the video images (block 310). The estimating the poses may take any suitable form, such as applying the video images to a pose machine learning algorithm. The pose machine learning algorithm may take any suitable form, such as a neural network or a convolutional neural network trained to perform six-dimensional pose estimation. The resultant of the pose machine learning algorithm may be, for at least some of the frames of the video image, an estimated pose of the medical instrument in the reference frame of the video images and/or in the reference frame provided by the bone fiducial. That is, the resultant of the pose machine learning algorithm may be a plurality of poses, one pose each for at least some of the frames of the segmented video images. While in many cases a pose may be determined for each frame, in other cases it may not be possible to make a pose estimation for at least some frames because of video quality issues, such as motion blur caused by electronic shutter operation.

[0072] The next step in the example method 300 is determining the locations based on the plurality of poses (block 312). In particular, for each frame for which a pose can be estimated, based on a model of the medical device the location of the distal tip can be determined in the reference frame of the video images and/or the bone fiducial. Thus, the resultant is a set of locations, at least some of which represent locations on the outer surface of the bone.
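The final step (block 312) reduces to simple geometry once a pose is available: applying the frame's estimated pose to a fixed tip offset known from the instrument's geometric model yields the distal-tip location. The sketch below assumes the pose is expressed as a 4x4 homogeneous transform and uses an assumed example tip offset; both conventions are illustrative, not specified by the disclosure.

```python
# Sketch of block 312: given a frame's estimated 6-DoF instrument pose
# (a 4x4 homogeneous transform, e.g., in the bone-fiducial frame) and a
# fixed tip offset from the instrument's geometric model, compute the
# distal-tip location. The offset value is an assumed example.

def tip_location(pose, tip_offset):
    """Apply a 4x4 pose to a 3D point given in the instrument frame."""
    x, y, z = tip_offset
    return tuple(pose[i][0] * x + pose[i][1] * y + pose[i][2] * z + pose[i][3]
                 for i in range(3))

# Example: an instrument whose tip sits 150 mm (0.15 m) along the
# instrument's local z-axis, with a pure-translation pose for clarity.
TIP_OFFSET_M = (0.0, 0.0, 0.15)
```

Collecting `tip_location(...)` over all frames where the tip touches bone produces exactly the set of surface locations that block 312 describes.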
[0073] FIG. 3 shows an example three-step process for determining the locations of the distal tip of the medical instrument. However, the method 300 is merely an example, and many variations are possible. For example, a single machine learning model, such as a convolutional neural network, may be set up and trained to perform all three steps as a single overall process, though there may be many hidden layers of the convolutional neural network. That is, the convolutional neural network may segment the medical instrument, perform the six-dimensional pose estimation, and determine the location of the distal tip in each frame. The training data set in such a situation would include a data set in which each frame has the medical device segmented, the six-dimensional pose identified, and the location of the distal tip identified. The output of the determining step 306 may be a segmented video stream distinct from the video images captured at step 304. In such cases, the later method steps may use both the segmented video stream and the video images to perform the further tasks. In other cases, the location information may be combined with the video images, such as being embedded in the video images, or added as metadata to each frame of the video images.
[0074] FIG. 4 is an example video display showing portions of a femur and a bone fiducial during a registration procedure. Although described with respect to a distal end of a femur, the principles and techniques described and shown in FIG. 4 can be applied to other anatomical structures/procedures, such as a femoral head for hip procedures as described herein. The display may be shown, for example, on the display device 114 associated with the device cart 102, or any other suitable location. In particular, visible in the main part of the display of FIG. 4 is an intercondylar notch 400, a portion of the lateral condyle 402, a portion of the medial condyle 404, and the example bone fiducial 128. Shown in the upper right corner of the example display is a depiction of the bone, which may be a rendering 406 of the bone created from the three-dimensional bone model. Shown on the rendering 406 is a recommended area 408, the recommended area 408 being portions of the surface of the bone to be “painted” as part of the registration process. Shown in the lower right corner of the example display is a depiction of the bone, which again may be a rendering 412 of the bone created from the three-dimensional bone model. Shown on the rendering 412 are a plurality of surface features 416 on the bone model that have been identified as part of the registration process. Further shown in the lower right corner of the example display is a progress indicator 418, showing the progress of providing and receiving of locations on the bone. The example progress indicator 418 is a horizontal bar having a length that is proportional to the number of locations received, but any suitable graphic or numerical display showing progress may be used (e.g., 0% to 100%).
[0075] Referring to both the main display and the lower right rendering, as the surgeon touches the outer surface of the bone within the images captured by the arthroscope 108 and camera 110, the surgical controller 118 receives the surface features on the bone, and may display each location both within the main display as dots or locations 416, and within the rendering shown in the lower right corner. More specifically, the example surgical controller 118 overlays indications of identified surface features 416 on the display of the images captured by the arthroscope 108 and camera 110, and in the example case shown, also overlays indications of identified surface features 416 on the rendering 412 of the bone model. Moreover, as the number of identified locations 416 increases, the surgical controller 118 also updates the progress indicator 418.
[0076] Still referring to FIG. 4, in spite of the diligence of the surgeon, not all locations identified by the surgical controller 118 based on the surgeon’s movement of the probe 124 result in valid locations on the surface of the bone. In the example of FIG. 4, as the surgeon moves the probe 124 from the inside surface of the lateral condyle 402 to the inside surface of the medial condyle 404, the surgical controller 118, based on the example six-dimensional pose estimation, receives several locations 420 that likely represent locations at which the distal end of the probe 124 was not in contact with the bone.
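The rejection of off-surface locations such as the locations 420 may, under one possible approach, be sketched as a simple distance test against a coarse surface model. The helper name and the 1.5 mm threshold below are illustrative assumptions, not taken from the disclosure:

```python
import numpy as np

def filter_surface_points(candidates, model_points, max_dist_mm=1.5):
    """Discard candidate tip locations (e.g., the locations 420) that lie
    farther than max_dist_mm from every point of a coarse bone model.
    Threshold is illustrative, not from the disclosure."""
    kept = []
    for p in candidates:
        d = np.min(np.linalg.norm(model_points - p, axis=1))
        if d <= max_dist_mm:
            kept.append(p)
    return np.array(kept)

# Toy bone surface: a grid of points on the plane z = 0.
model = np.array([[x, y, 0.0] for x in range(5) for y in range(5)], dtype=float)
cands = np.array([[1.0, 1.0, 0.2],    # probe in contact with the surface
                  [2.0, 2.0, 6.0]])   # probe not in contact
print(filter_surface_points(cands, model))  # keeps only the first point
```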
[0077] With reference to FIG. 3, the plurality of surface features 416 may be, or the example surgical controller 118 may generate, a registration model relative to the bone fiducial 128 (block 314). The registration model may take any suitable form, such as a computer-aided design (CAD) model or point cloud of data points in any suitable orthogonal basis. The registration model, regardless of the form, may have fewer overall data points or less “structure” than the bone model created by the non-invasive computer imaging (e.g., MRI). However, the goal of the registration model is to provide the basis for the coordinate transforms and scaling used to correlate the bone model to the registration model and relative to the bone fiducial 128. Thus, the next step in the example method 300 is registering the bone model relative to the location of the bone fiducial based on the registration model (block 316). Registration may conceptually involve testing a plurality of coordinate transformations and scaling values to find a transformation that has a sufficiently high correlation or confidence factor. Once a transformation is found with a sufficiently high confidence factor, the bone model is said to be registered to the location of the bone fiducial. Thereafter, the example registration method 300 may end (block 318); however, the surgical controller 118 may then use the registered bone model to provide computer-assisted navigation regarding a procedure involving the bone.
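The search over coordinate transformations and scaling values in block 316 can be illustrated with a closed-form similarity fit (the Umeyama method). This sketch, which is not part of the disclosure, assumes point correspondences between the registration model and the bone model are already known; a practical registration would typically iterate (e.g., ICP-style) to establish them:

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares scale s, rotation R, translation t mapping src -> dst
    (Umeyama method), assuming known point correspondences."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    A, B = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(B.T @ A / len(src))
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0, 1.0, d])          # guards against reflections
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / (A ** 2).sum() * len(src)
    t = mu_d - s * R @ mu_s
    return s, R, t

# Synthetic check: recover a known scale, rotation, and translation.
rng = np.random.default_rng(0)
src = rng.normal(size=(20, 3))
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
dst = 2.0 * src @ R_true.T + np.array([1.0, 2.0, 3.0])
s, R, t = similarity_transform(src, dst)
print(round(s, 6))  # 2.0
```

The confidence factor mentioned above could then be computed from the residual error of the fitted transformation.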
[0078] In the examples discussed to this point, registration of the bone model involves a touch-based registration technique using the probe 124 without a carried fiducial. However, other registration techniques are possible, such as a touchless registration technique. The example touchless registration technique again relies on placement of the bone fiducial 128. As before, when the viewing direction of the arthroscope 108 is relatively constant, the bone fiducial may have fewer faces with respective fiducials. Once placed, the bone fiducial 128 represents a fixed location on the outer surface of the bone in the view of the arthroscope 108, even as the position of the arthroscope 108 is moved and changed relative to the bone fiducial 128. Again, in order to relate or register the bone visible in the video images to the three-dimensional bone model, the surgical controller 118 (FIG. 1) determines a plurality of surface features of an outer surface of the bone, and in this example determining the plurality of surface features is based on a touchless registration technique in which the surface features are identified based on motion of the arthroscope 108 and camera 110 relative to the bone fiducial 128.
[0079] Another technique for registering the bone model to the bone uses a patient-specific instrument. In both touch-based and touchless registration techniques, a registration model is created, and the registration model is used to register the bone model to the bone visible in the video images. Conceptually, the registration model is used to determine a coordinate transformation and scaling to align the bone model to the actual bone. However, if the orientation of the bone in the video images is known or can be determined, use of the registration model may be omitted, and instead the coordinate transformations and scaling may be calculated directly.
[0080] FIG. 5 shows a method 500 in accordance with at least some embodiments. The example method may be implemented in software within one or more computer systems, such as, in part, the surgical controller 118. In particular, the example method 500 comprises obtaining a three-dimensional bone model (block 502). In the patient-specific instrument registration technique, what is obtained is the three-dimensional bone model that may be created by segmenting a plurality of non-invasive images (e.g., MRI) taken preoperatively or intraoperatively.
[0081] The method 500 further includes generating a patient-specific instrument that has a feature designed to couple to the bone represented in the bone model in only one orientation (block 504). Generating the patient-specific instrument may first involve selecting a location at which the patient-specific instrument will attach. For example, a device or computer system may analyze the bone model and select the attachment location. In various examples, the attachment location may be a unique location in the sense that, if a patient-specific instrument is made to couple to the unique location, the patient-specific instrument will not couple to the bone at any other location. In the example case of an anterior cruciate ligament repair, the location selected may be at or near the upper or superior portion of the intercondylar notch. If the bone model shows another location with a unique feature, such as a bone spur or other raised or sunken surface anomaly, such a unique location may be selected as the attachment location for the patient-specific instrument. For example, for hip procedures, the location may be selected based on a location, within the hip joint, of a bone spur or other anatomical feature associated with the hip procedure.
[0082] Moreover, forming the patient-specific instrument may take any suitable form. In one example, a device or computer system may directly print, such as using a 3D printer, the patient-specific instrument. In other cases, the device or computer system may print a model of the attachment location, and the model may then become the mold for creating the patient-specific instrument. For example, the model may be the mold for an injection-molded plastic or casting technique. In some examples, the patient-specific instrument carries one or more fiducials, but as mentioned above, in other cases the patient-specific instrument may itself be tracked and thus carry no fiducials.
[0083] The method 500 further includes coupling the patient-specific instrument to the bone, in some cases the patient-specific instrument having the fiducial coupled to an exterior surface (block 506). As described above, the attachment location for the patient-specific instrument can be selected to be unique such that the patient-specific instrument couples to the bone in only one location and in only one orientation. In the example case of an arthroscopic procedure, the patient-specific instrument may be inserted arthroscopically. That is, the attachment location may be selected such that a physical size of the patient-specific instrument enables insertion through the ports/portals in the patient’s skin. In other cases, the patient-specific instrument may be made or constructed of a flexible material that enables the patient-specific instrument to deform for insertion in the surgical site, yet return to the predetermined shape for coupling to the attachment location. However, in open procedures where the skin is cut and pulled away, exposing the bone to the open air, the patient-specific instrument may be a rigid device with fewer size restrictions.
[0084] The method 500 further includes capturing video images of the patient-specific instrument (block 508). Here again, the capturing may be performed intraoperatively. In the example case of an arthroscopic anterior cruciate ligament repair, the capturing of video images is by the surgical controller 118 by way of arthroscope 108 and camera 110. However, in open procedures where the skin is cut and pulled away, exposing the bone to the open air, the capturing may be by any suitable camera device, such as one or both cameras of a stereoscopic camera system, or a portable computing device, such as a tablet or smart-phone device. In such cases, the video images may be provided to the surgical controller 118 in any suitable form.
[0085] The example method 500 further includes registering the bone model based on the location of the patient-specific instrument (block 510). That is, given that the patient-specific instrument couples to the bone at only one location and in only one orientation, the location and orientation of the patient-specific instrument is directly related to the location and orientation of the bone, and thus the coordinate transformations and scaling for the registration may be calculated directly. Thereafter, the example method 500 may end; however, the surgical controller 118 may then use the registered bone model to provide computer-assisted navigation regarding a surgical task or surgical procedure involving the bone.
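The direct calculation described in block 510 can be sketched as a composition of homogeneous transforms. The transform names and values below are hypothetical; the key assumption is that the instrument-to-bone transform is fixed and known at design time because the instrument fits the bone in only one orientation:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# T_cam_inst: pose of the patient-specific instrument in the camera frame,
#             observed in the video images (hypothetical values).
# T_inst_bone: fixed pose of the bone model in the instrument frame, known
#              at design time.
T_cam_inst = make_T(np.eye(3), np.array([5.0, 0.0, 40.0]))
T_inst_bone = make_T(np.eye(3), np.array([0.0, -10.0, 0.0]))

# The bone-model pose in the camera frame follows by composition, with no
# registration model needed:
T_cam_bone = T_cam_inst @ T_inst_bone
print(T_cam_bone[:3, 3])  # [  5. -10.  40.]
```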
[0086] For example, with the registered bone model the surgical controller 118 may provide guidance regarding a surgical task of a surgical procedure. The specific guidance is dependent upon the surgical procedure being performed and the stage of the surgical procedure. A non-exhaustive list of guidance comprises: changing a drill path entry point; changing a drill path exit point; aligning an aimer along a planned drill path; showing location at which to cut and/or resect the bone; reaming the bone by a certain depth along a certain direction; placing a device (suture, anchor or other) at a certain location; placing a suture at a certain location; placing an anchor at a certain location; showing regions of the bone to touch and/or avoid; and identifying regions and/or landmarks of the anatomy. In yet still other cases, the guidance may include highlighting within a version of the video images displayed on a display device, which can be the arthroscopic display or a see-through display, or by communicating to a virtual reality device or a robotic tool.
[0087] The systems and methods described above in FIGS. 1-5 may be implemented with a shaver/burr instrument configured in accordance with the principles of the present disclosure. For example, tracking locations of a shaver/burr instrument using the systems and methods described above may be difficult (e.g., relative to other instruments used in arthroscopic procedures, such as the probe 124). Accordingly, a shaver/burr according to the present disclosure includes various configurations or arrangements of fiducial markers to facilitate tracking by a surgical navigation system (e.g., the system 100) as described below in more detail.
[0088] FIG. 6 shows an example shaver/burr instrument (in this example, a burr instrument 600) having a distal tip 602 configured in accordance with the principles of the present disclosure. Although described below with respect to the burr instrument 600, the principles of the present disclosure can be implemented with shaver instruments and/or other types of cutting instruments including stationary and rotating components similar to the burr instrument 600.
[0089] The burr instrument 600 includes an outer sheath 604 (e.g., extending distally from a handpiece, not shown in FIG. 6, held by a surgeon) and an inner burr cutting assembly 606 configured to rotate within the sheath 604 (e.g., responsive to control signals received via the handpiece). The cutting assembly 606 includes a corresponding cutting end (e.g., a burr tip 608) configured to, for example, cut and remove bone. The cutting assembly 606 passes through and extends from an opening of an inner channel 610 defined by the outer sheath 604. For example, the burr tip 608 is disposed at a distal end of an inner shaft 612 configured to rotate within the sheath 604. In examples where the instrument is a burr instrument, the cutting end may refer to any portion of the burr tip 608. In examples where the instrument is another type of cutting instrument, the cutting end may refer to a distal tip of a cutting member or assembly (e.g., a blade tip).
[0090] Due to an inclined trajectory of the burr instrument 600 into tissue or bone (i.e., a trajectory not perpendicular or normal to a surface of the bone), the outer sheath 604 includes a cutaway 614 (a window, opening, etc.) to provide clearance for the cutting assembly 606. For example, the cutaway 614 corresponds to a portion of the sheath 604 that does not extend over the burr tip 608, thereby exposing (i.e., leaving uncovered) the burr tip 608. As one example, the cutaway 614 is defined in two or more sides of the outer sheath 604 such that the cutaway completely exposes (i.e., does not cover any portion of) the burr tip 608 on the two or more sides. As shown, the burr tip 608 extends beyond the distal tip 602.
[0091] At or proximate the distal tip 602, the outer sheath 604 includes one or more planes or planar surfaces 616. As one example, as shown in FIG. 6, the outer sheath 604 includes four (4) planar surfaces 616 (although only two of the planar surfaces 616 are visible). In other words, the outer sheath 604 may be generally square or rectangular. In other examples, the outer sheath 604 may include fewer or more than four of the planar surfaces 616.
[0092] The burr instrument 600 includes one or more fiducial markers 620-1 and 620-2 (referred to collectively as the fiducial markers 620) positioned on respective planar surfaces 616. Each of the fiducial markers 620 may include one or more actual fiducial markings. For example, as shown, each of the fiducial markers 620 includes a pair of fiducial markings.
[0093] The fiducial markers 620 (and, more specifically, the actual fiducial markings) are located/positioned as close as possible to the distal tip 602 (and, accordingly, the burr tip 608) without intersecting the cutaway 614. In other words, a gap 622 between the fiducial markers 620 and the distal tip 602 and/or the cutaway 614 is minimized. As one example, the gap between at least one of the fiducial markings of each of the fiducial markers 620 and the cutaway 614 is less than 2.0 mm. In various examples, one or more of the fiducial markers 620 may be positioned within 10.0 mm of the cutting end (e.g., the burr tip 608). In various examples, one or more of the fiducial markers 620 may be positioned less than 10.0 mm from the cutting end (e.g., within 8.0 mm, 6.0 mm, 4.0 mm, or 2.0 mm of the cutting end).
[0094] In some examples, at least one of the fiducial markers 620 (as shown, the fiducial marker 620-1) extends over/overlaps the burr tip 608 in the distal direction. In other words, the fiducial marker 620-1 extends past a proximal end or edge of the burr tip 608 in the distal direction. As shown, the fiducial marker 620-1 extends over at least 50% of the diameter of the burr tip 608. In this manner, the fiducial marker 620-1 is positioned as close as possible to the location of the burr tip 608. Accordingly, a detected location of the fiducial marker 620-1 is similar to the actual location of the burr tip 608. In other examples, the fiducial marker 620-1 does not overlap the burr tip 608 and instead terminates within 2.0 mm of the burr tip 608.
[0095] Although only two of the fiducial markers 620 are shown in FIG. 6, the fiducial markers 620 may be positioned such that at least one of the fiducial markers 620 is visible (i.e., to an endoscopic camera) in a full 360° around a rotation axis of the burr instrument 600. For example, for an outer sheath having a generally square or rectangular shape, the burr instrument 600 may include four fiducial markers 620 arranged on respective sides/planar surfaces. In this manner, visibility of the fiducial markers 620 is increased (i.e., visibility is maximized regardless of a rotational position of the outer sheath 604 relative to an endoscope).
[0096] Systems and methods according to the present disclosure are configured (e.g., using the techniques described above in FIGS. 1 -5) to determine a location of the burr instrument 600 (and, more particularly, a location of the distal tip 602, the burr tip 608, etc.) based on locations of the fiducial markers 620. For example, for a given instrument, locations of various components or geometric entities of the burr instrument 600 relative to the fiducial markers 620 are known (e.g., as previously determined, such as during manufacturing, calibration, etc., and stored as data accessible by the system 100). Upon detection of the fiducial markers 620, locations of various portions of the burr instrument 600, including the burr tip 608, can be determined based on the locations of the fiducial markers and the stored data. In combination with techniques described herein, the location of the burr tip 608 relative to patient anatomy can also be determined.
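One possible form of the stored calibration data and the tip computation described above is sketched below. The marker identifiers, offsets, and poses are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

# Hypothetical calibration data for one instrument: the burr-tip offset
# expressed in each fiducial marker's local frame (mm), stored at
# manufacturing/calibration time and accessible to the navigation system.
TIP_OFFSETS = {
    "marker_620_1": np.array([0.0, 1.5, 6.0]),
    "marker_620_2": np.array([0.0, -1.5, 6.0]),
}

def burr_tip_estimate(detections):
    """detections: marker id -> (R, t) pose of that marker in the camera
    frame. Produces one tip estimate per visible marker and averages them,
    which can reduce per-marker detection noise."""
    est = [R @ TIP_OFFSETS[mid] + t for mid, (R, t) in detections.items()]
    return np.mean(est, axis=0)

detections = {
    "marker_620_1": (np.eye(3), np.array([10.0, -1.5, 20.0])),
    "marker_620_2": (np.eye(3), np.array([10.0, 1.5, 20.0])),
}
print(burr_tip_estimate(detections))  # [10.  0. 26.]
```

Combined with a registered bone model, the same tip location can then be expressed relative to patient anatomy.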
[0097] FIGS. 7A, 7B, and 7C show (in a perspective view) another example burr instrument 700 according to the principles of the present disclosure. Similar to the burr instrument 600, the burr instrument 700 includes a distal tip 702, an outer sheath 704, an inner cutting assembly 706, and a burr tip 708 disposed at a distal end of an inner shaft 712. FIGS. 7A, 7B, and 7C show the cutting assembly 706 at different rotational positions relative to the outer sheath 704. A cutaway 714 in the sheath 704 provides clearance for and facilitates visibility of the burr tip 708 and the cutting assembly 706.
[0098] The sheath 704 may include one or more planar surfaces 716. Although not shown in FIGS. 7A, 7B, and 7C, the planar surfaces 716 of the sheath 704 may include respective fiducial markers. However, in this example, the cutting assembly 706 (e.g., the inner shaft 712) includes one or more fiducial markers 720 arranged on a corresponding planar (or approximately planar) surface 724 of the shaft 712. In some examples, the surface 724 (i.e., a fiducial plane) is parallel to a rotation axis of the cutting assembly 706. In other examples, the surface 724 may not be parallel to the rotation axis of the cutting assembly 706.
[0099] Accordingly, in this example, the fiducial marker 720 is arranged directly on the cutting assembly 706 to facilitate optical detection and location of the cutting assembly 706. Optical detection/tracking techniques as described herein (e.g., techniques configured to detect and locate fiducial markers using images obtained using an endoscopic camera) can be configured to detect the fiducial marker 720 even when the cutting assembly 706 is rotating. In some examples, various mitigating technologies (e.g., strobes or tissue removers) may enhance the readability of the rotating fiducial marker 720.
[0100] In some examples, at least a portion of the outer sheath 704 may be transparent or include an aperture or window 726 to enable viewing of the fiducial marker 720 at additional rotational positions of the sheath 704 (e.g., outer sheath positions relative to the inner cutting assembly, the endoscopic camera, etc.).
[0101] FIGS. 8A, 8B, and 8C show other example arrangements of fiducial markers on a burr instrument 800 according to the present disclosure. In various examples, the fiducial markers and/or planes containing the fiducial markers may have various orientations relative to a longitudinal axis of the burr instrument 800. As one example as shown in FIG. 8A, fiducial markers 804 can be positioned at any orientation relative to a longitudinal axis 808 of the burr instrument 800. In other words, rather than simply being parallel to the axis 808 (as shown, for example, in FIG. 6), one or more of the fiducial markers 804 can be skewed, rotated, etc. relative to the axis 808. For example, for square or rectangular fiducial markers, lateral sides of the fiducial markers 804 are not required to be parallel or perpendicular to the axis 808.
[0102] As another example as shown in FIG. 8B, fiducial planes (i.e., the planes on which the fiducial markers 804 are arranged) can be positioned at any orientation relative to the longitudinal axis 808 of the burr instrument 800. In other words, rather than simply being parallel to the axis 808 (as shown, for example, in FIG. 6), the fiducial planes can be skewed, rotated, etc. relative to the axis 808. As another example also shown in FIG. 8B, the fiducial markers 804 may be arranged on a curved outer sheath 812, and/or the fiducial markers 804 themselves may be curved (i.e., not planar).
[0103] As still another example as shown in FIG. 8C, the fiducial markers 804 may be of different sizes on the same burr instrument 800 to facilitate viewing/detection from an endoscope positioned at different distances relative to the burr instrument 800. Further, the fiducial markers 804 may have non-uniform and/or non-symmetrical shapes.
[0104] In some examples, a pose of a rotating fiducial marker can be tracked (e.g., as a function of time) to determine the rotational axis or other positional/operational features of the burr tip, the cutting assembly, etc.
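One way such axis determination might work, sketched under the assumption that two full six-dimensional poses of the rotating marker are available, is to extract the axis of the relative rotation between frames from its skew-symmetric part (valid for relative rotation angles away from 0° and 180°). This is an illustrative approach, not a method specified by the disclosure:

```python
import numpy as np

def rotation_axis(R1, R2):
    """Axis of the relative rotation between two tracked poses of a
    spinning fiducial: the eigenvector of R2 @ R1.T with eigenvalue 1,
    extracted from the skew-symmetric part of the relative rotation."""
    R_rel = R2 @ R1.T
    axis = np.array([R_rel[2, 1] - R_rel[1, 2],
                     R_rel[0, 2] - R_rel[2, 0],
                     R_rel[1, 0] - R_rel[0, 1]])
    return axis / np.linalg.norm(axis)

def rot_z(a):
    """Rotation by angle a (radians) about the z-axis, for the example."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Two frames of a marker spinning about the z-axis:
print(rotation_axis(rot_z(0.1), rot_z(0.7)))  # [0. 0. 1.]
```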
[0105] Accordingly, in various examples of shaver/burr instruments of the present disclosure, fiducial markers are located in positions at a most distal end of the shaver/burr to achieve the greatest likelihood of being tracked (e.g., within a capsule of a joint). Further, for examples where the fiducials are positioned directly on the inner cutting assembly, tolerance stack-up is minimized (e.g., relative to examples where the fiducials are located only on the outer sheath).
[0106] FIG. 9 shows an example method 900 for performing instrument tracking techniques in accordance with the principles of the present disclosure. As described, the method 900 may be performed by one or more processing devices or processors, computing devices, etc., such as the system 100 or another computing device executing instructions stored in memory. One or more steps of the method 900 may be omitted in various examples, and/or may be performed in a different sequence than shown in FIG. 9. The steps may be performed sequentially or non-sequentially, two or more steps may be performed concurrently, etc.
[0107] At 904, the method 900 includes obtaining images (e.g., real-time or near real-time images) of a surgical environment including patient anatomy with one or more visual markers (e.g., a bone fiducial marker, which may be referred to as a base marker) fixed to patient anatomy as well as visual/fiducial markers associated with one or more surgical instruments. Obtaining the images may include obtaining images using an endoscopic/arthroscopic camera or other imaging device configured to provide an image feed. In some examples, prior to obtaining the images, an image scan of patient anatomy may be performed, such as by performing a pre-operative imaging scan (e.g., a CT scan), retrieving the stored image scan (as data) from memory, etc. In other examples, other imaging techniques may be used. The images obtained from the image feed may be aligned with a model that is generated based on the image scan to provide visual guidance as described herein. The obtained images may include a sparse or dense set of images of the surgical environment including the one or more visual markers (e.g., at least one visual marker).
[0108] At 908, the method 900 includes obtaining one or more images of the surgical environment while an instrument (e.g., a shaver/burr instrument) is present within the surgical environment. As one example, the images can be obtained using an endoscopic camera configured to provide a real-time or near real-time image feed of images to a surgical system. In various examples, the images may include the fiducial marker fixed to patient anatomy and one or more fiducial markers arranged on the shaver/burr instrument in accordance with the principles of the present disclosure.
[0109] At 912, the method 900 includes detecting one or more fiducial markers arranged on the shaver/burr instrument. At 916, the method 900 includes determining a position of one or more portions (e.g., a burr tip) of the shaver/burr instrument based on a location of the detected fiducial markers arranged on the shaver/burr instrument. For example, the position of the burr tip can be calculated based on a known relationship between the location of the burr tip and positions of the one or more fiducial markers. [0110] At 920, the method 900 includes performing one or more actions based on the position of the burr tip (and/or other portions of the shaver/burr instrument). For example, the one or more actions may include various actions associated with the detection and tracking techniques described herein. As one example, actions may include providing visual or other guidance (e.g., via a display or other interface) indicating the position/orientation of the instrument, burr tip, etc. For example, the visual guidance may include a virtual model or other image of the instrument or burr tip overlaying the surgical/anatomical site. As another example, the guidance may include instructions to the surgeon associated with the position of the burr tip. As still another example, the one or more actions may include control of the instrument based on the position of the burr tip (e.g., powering on or off, increasing or decreasing rotation speed, etc.).
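The action step at 920 could, as one hypothetical example, be sketched as a proximity check that slows the burr near a planned boundary. The helper names, boundary point, and radius below are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

SAFE_BOUNDARY = np.array([0.0, 0.0, 50.0])  # illustrative boundary point, mm
SLOW_RADIUS_MM = 5.0                        # illustrative proximity radius

def choose_action(tip_position):
    """Reduce burr speed when the tracked tip nears a planned boundary;
    otherwise allow normal operation."""
    if np.linalg.norm(tip_position - SAFE_BOUNDARY) < SLOW_RADIUS_MM:
        return "reduce_speed"
    return "normal_speed"

print(choose_action(np.array([0.0, 0.0, 48.0])))  # reduce_speed
print(choose_action(np.array([0.0, 0.0, 20.0])))  # normal_speed
```

In a real system this decision would run once per tracked frame, feeding the instrument control described above.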
[0111] FIG. 10 shows an example computer system or computing device 1000 configured to implement the various systems and methods of the present disclosure. In one example, the computer system 1000 may correspond to one or more computing devices of the system 100, the surgical controller 118, a device that creates a patient-specific instrument, a tablet device within the surgical room, or any other system that implements any or all the various methods discussed in this specification. For example, the computer system 1000 may be configured to implement all or portions of the method 900. The computer system 1000 may be connected (e.g., networked) to other computer systems in a local-area network (LAN), an intranet, and/or an extranet (e.g., device cart 102 network), or at certain times the Internet (e.g., when not in use in a surgical procedure). The computer system 1000 may be a server, a personal computer (PC), a tablet computer, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device. Further, while only a single computer system is illustrated, the term “computer” shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
[0112] The computer system 1000 includes a processing device 1002, a main memory 1004 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 1006 (e.g., flash memory, static random access memory (SRAM)), and a data storage device 1008, which communicate with each other via a bus 1010.
[0113] Processing device 1002 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 1002 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 1002 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 1002 is configured to execute instructions for performing any of the operations and steps discussed herein. Once programmed with specific instructions, the processing device 1002, and thus the entire computer system 1000, becomes a special-purpose device, such as the surgical controller 118.
[0114] The computer system 1000 may further include a network interface device 1012 for communicating with any suitable network (e.g., the device cart 102 network). The computer system 1000 also may include a video display 1014 (e.g., the display device 114), one or more input devices 1016 (e.g., a microphone, a keyboard, and/or a mouse), and one or more speakers 1018. In one illustrative example, the video display 1014 and the input device(s) 1016 may be combined into a single component or device (e.g., an LCD touch screen).
[0115] The data storage device 1008 may include a computer-readable storage medium 1020 on which the instructions 1022 (e.g., implementing any methods and any functions performed by any device and/or component depicted or described herein) embodying any one or more of the methodologies or functions described herein are stored. The instructions 1022 may also reside, completely or at least partially, within the main memory 1004 and/or within the processing device 1002 during execution thereof by the computer system 1000. As such, the main memory 1004 and the processing device 1002 also constitute computer-readable media. In certain cases, the instructions 1022 may further be transmitted or received over a network via the network interface device 1012. [0116] While the computer-readable storage medium 1020 is shown in the illustrative examples to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
[0117] The foregoing description is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses. The broad teachings of the disclosure can be implemented in a variety of forms. Therefore, while this disclosure includes particular examples, the true scope of the disclosure should not be so limited since other modifications will become apparent upon a study of the drawings, the specification, and the following claims. It should be understood that one or more steps within a method may be executed in a different order (or concurrently) without altering the principles of the present disclosure. Further, although each of the embodiments is described above as having certain features, any one or more of those features described with respect to any embodiment of the disclosure can be implemented in and/or combined with features of any of the other embodiments, even if that combination is not explicitly described. In other words, the described embodiments are not mutually exclusive, and permutations of one or more embodiments with one another remain within the scope of this disclosure.

Claims

What is claimed is:
1. A surgical instrument configured to perform at least one of a shaving function and a burr function of a surgical procedure, the surgical instrument comprising:
a handle;
an outer sheath having (i) a proximal end coupled to the handle and (ii) a distal end opposite the proximal end, wherein the outer sheath defines an inner channel having an opening defined at the distal end;
an inner cutting assembly disposed within the inner channel of the outer sheath, wherein the inner cutting assembly is configured to rotate within the outer sheath to perform the at least one of the shaving function and the burr function, and wherein the inner cutting assembly includes a cutting end exposed within the opening of the inner channel; and
at least one fiducial marker located on a portion of the distal end of the outer sheath that overlaps the cutting end of the inner cutting assembly, wherein the at least one fiducial marker is configured to be detected by an optical tracking system.
2. The surgical instrument of claim 1, wherein the surgical instrument is a burr/shaver instrument.
3. The surgical instrument of claim 1, wherein the cutting end includes a burr tip, and wherein the at least one fiducial marker extends over at least 50% of the burr tip in a distal direction.
4. The surgical instrument of claim 3, wherein the outer sheath includes a cutaway exposing at least a portion of the burr tip.
5. The surgical instrument of claim 4, wherein the cutaway is defined in two or more sides of the outer sheath.
6. The surgical instrument of claim 4, wherein a gap between the cutaway and the at least one fiducial marker is less than 2.0 mm.
7. The surgical instrument of claim 1, wherein the inner cutting assembly includes an inner shaft, wherein the at least one fiducial marker includes a plurality of fiducial markers, and wherein at least one of the plurality of fiducial markers is located on the inner shaft.
8. The surgical instrument of claim 7, wherein the outer sheath includes at least one of a window and a transparent portion aligned with the at least one of the plurality of fiducial markers located on the inner shaft.
9. The surgical instrument of claim 1, wherein the at least one fiducial marker is arranged on a plane that is not parallel to a longitudinal axis of the surgical instrument.
10. The surgical instrument of claim 1, wherein the at least one fiducial marker includes first and second fiducial markers, and wherein the first and second fiducial markers have different shapes or different sizes relative to one another.
11. A system, comprising: a computing device configured to (i) receive one or more images of a surgical site, the one or more images including the at least one fiducial marker of the surgical instrument of claim 1, (ii) detect the at least one fiducial marker, and (iii) determine a location of the surgical instrument based on the detected at least one fiducial marker.
12. A system for tracking a position of a shaver/burr instrument relative to patient anatomy, the system comprising:
memory storing instructions; and
one or more processing devices configured to execute the instructions, wherein executing the instructions causes the one or more processing devices to
receive one or more images of a surgical site, the one or more images including at least one fiducial marker of the shaver/burr instrument, wherein the at least one fiducial marker overlaps a cutting end of the shaver/burr instrument in a distal direction,
detect the at least one fiducial marker of the shaver/burr instrument, and
determine a location of the shaver/burr instrument within the surgical site based on the detected at least one fiducial marker.
13. The system of claim 12, further comprising the shaver/burr instrument.
14. The system of claim 13, wherein the shaver/burr instrument comprises:
a handle;
an outer sheath having (i) a proximal end coupled to the handle and (ii) a distal end opposite the proximal end, wherein the outer sheath defines an inner channel having an opening defined at the distal end; and
an inner cutting assembly disposed within the inner channel of the outer sheath, wherein the inner cutting assembly is configured to rotate within the outer sheath to perform at least one of a shaving function and a burr function, and wherein the inner cutting assembly includes the cutting end,
wherein the at least one fiducial marker is located on a portion of the distal end of the outer sheath that overlaps the cutting end of the inner cutting assembly.
15. The system of claim 13, wherein the cutting end includes a burr tip, and wherein the at least one fiducial marker extends over at least 50% of the burr tip in the distal direction.
16. The system of claim 13, wherein the shaver/burr instrument includes an outer sheath and an inner cutting assembly configured to rotate within the outer sheath, wherein the at least one fiducial marker includes a plurality of fiducial markers, and wherein at least one of the plurality of fiducial markers is located on the inner cutting assembly.
17. The system of claim 16, wherein the outer sheath includes at least one of a window and a transparent portion aligned with the at least one of the plurality of fiducial markers located on the inner cutting assembly.
18. The system of claim 13, wherein the at least one fiducial marker is arranged on a plane that is not parallel to a longitudinal axis of the shaver/burr instrument.
19. The system of claim 13, wherein the at least one fiducial marker includes first and second fiducial markers, and wherein the first and second fiducial markers have different shapes or different sizes relative to one another.
20. A surgical instrument configured to perform at least one of a shaving function and a burr function of a surgical procedure, the surgical instrument comprising:
a handle;
an outer sheath having (i) a proximal end coupled to the handle and (ii) a distal end opposite the proximal end, wherein the outer sheath defines an inner channel having an opening defined at the distal end;
an inner cutting assembly disposed within the inner channel of the outer sheath, wherein the inner cutting assembly is configured to rotate within the outer sheath to perform the at least one of the shaving function and the burr function, and wherein the inner cutting assembly includes a cutting end exposed within the opening of the inner channel; and
at least one fiducial marker located within 10.0 mm of the cutting end, wherein the at least one fiducial marker is configured to be detected by an optical tracking system.
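The receive/detect/determine sequence recited in claims 11 and 12 can be illustrated with a minimal sketch. This is not the applicant's implementation: the function name, marker offsets, and linear-extrapolation approach are all assumptions made for illustration. The sketch assumes two fiducial markers have already been detected in an image at known physical positions along the sheath axis, and extrapolates the burr-tip location in image coordinates from them.

```python
def estimate_tip_position(marker_a, marker_b, offset_a_mm, offset_b_mm, tip_offset_mm):
    """Estimate the 2D image position of the burr tip (illustrative only).

    marker_a, marker_b: (x, y) image coordinates of two detected fiducial markers.
    offset_a_mm, offset_b_mm: known distances of each marker from a common
        reference point along the instrument axis (offset_b_mm > offset_a_mm).
    tip_offset_mm: known distance of the cutting tip along the same axis.
    """
    ax, ay = marker_a
    bx, by = marker_b
    # Fraction of the a->b vector needed to reach the tip; t > 1 extrapolates
    # distally beyond the second marker.
    t = (tip_offset_mm - offset_a_mm) / (offset_b_mm - offset_a_mm)
    return (ax + t * (bx - ax), ay + t * (by - ay))

# Example: markers 10 mm apart along the axis, 50 px apart in the image;
# the tip sits 5 mm beyond the distal marker.
tip = estimate_tip_position((100.0, 200.0), (150.0, 200.0), 30.0, 40.0, 45.0)
# tip -> (175.0, 200.0)
```

A real tracking pipeline would additionally calibrate the camera and detect the markers themselves (e.g., by thresholding or template matching); this sketch covers only the final location-determination step.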
PCT/US2025/028579 2024-05-13 2025-05-09 Burr tracking for surgical navigation procedures Pending WO2025240248A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463646105P 2024-05-13 2024-05-13
US63/646,105 2024-05-13

Publications (1)

Publication Number Publication Date
WO2025240248A1 (en) 2025-11-20

Family

ID=97720614

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2025/028579 Pending WO2025240248A1 (en) 2024-05-13 2025-05-09 Burr tracking for surgical navigation procedures

Country Status (1)

Country Link
WO (1) WO2025240248A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100249817A1 (en) * 2008-12-16 2010-09-30 Mark Joseph L Positioning system for tissue removal device
US10307180B2 (en) * 2010-07-07 2019-06-04 Carevature Medical Ltd. Flexible surgical device for tissue removal
WO2023064433A1 (en) * 2021-10-13 2023-04-20 Smith & Nephew, Inc. Methods for surgical registration and tracking using hybrid imaging devices and systems thereof
US20230157764A1 (en) * 2010-04-14 2023-05-25 Smith & Nephew, Inc. Systems and methods for patient-based computer assisted surgical procedures
US20230240759A1 (en) * 2022-01-31 2023-08-03 Smith & Nephew, Inc. Modular and depth-sensing surgical handpiece

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25804131

Country of ref document: EP

Kind code of ref document: A1