
WO2024161274A1 - Localization and treatment of target tissue using markers coated with near-infrared fluorophores - Google Patents


Info

Publication number
WO2024161274A1
Authority
WO
WIPO (PCT)
Prior art keywords
nir
target
images
radiopaque marker
medical device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2024/050797
Other languages
French (fr)
Inventor
Scott E.M. Frushour
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Covidien LP
Original Assignee
Covidien LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Covidien LP filed Critical Covidien LP
Priority to EP24703245.1A (published as EP4658201A1)
Priority to CN202480010464.2A (published as CN120787144A)
Publication of WO2024161274A1
Anticipated expiration: legal status Critical
Current legal status: Ceased


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00743Type of operation; Specification of treatment sites
    • A61B2017/00809Lung operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00831Material properties
    • A61B2017/00867Material properties shape memory effect
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00831Material properties
    • A61B2017/00964Material properties composite
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00982General structural features
    • A61B2017/00995General structural features having a thin film
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/105Modelling of the patient, e.g. for ligaments or bones
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2051Electromagnetic tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2072Reference field transducer attached to an instrument or patient
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B2034/301Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • A61B2090/3764Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT] with a rotating C-arm having a cone beam emitting source
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937Visible markers
    • A61B2090/3941Photoluminescent markers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3966Radiopaque markers visible in an X-ray image
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3991Markers, e.g. radio-opaque or breast lesions markers having specific anchoring means to fixate the marker to the tissue, e.g. hooks
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3995Multi-modality markers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361Image-producing devices, e.g. surgical cameras

Definitions

  • the technology of the disclosure is generally related to the use of markers coated with a near-infrared (NIR) fluorophore to guide localization and treatment of target tissue.
  • NIR near-infrared
  • MRI magnetic resonance imaging
  • CT computed tomography
  • fluoroscopy a technique for identifying, and navigating to, areas of interest within a patient and to a target for biopsy or treatment.
  • preoperative scans may be utilized for target identification and intraoperative guidance.
  • real-time imaging or intraoperative imaging may also be required to obtain a more accurate and current image of the target area and an endoluminal medical tool used to biopsy or treat tissue in the target area.
  • real-time image data displaying the current location of a medical device with respect to the target and its surroundings may be needed to navigate the medical device to the target in a safe and accurate manner (e.g., without causing damage to other organs or tissue).
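The safety requirement described above — keeping the medical device a minimum distance from non-target structures while it is navigated — can be illustrated with a toy sketch. All names and the 5 mm margin below are hypothetical illustrations, not parameters from the disclosure:

```python
import math

def distance_mm(a, b):
    """Euclidean distance between two 3D points, in millimeters."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def check_clearance(tool_tip, critical_structures, margin_mm=5.0):
    """Return the names of structures the tool tip has approached
    closer than the safety margin (hypothetical helper)."""
    return [name for name, pos in critical_structures.items()
            if distance_mm(tool_tip, pos) < margin_mm]
```

A guidance system could run such a check on every position update and warn the clinician (or halt a robot) when the returned list is non-empty.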
  • Real-time, intraoperative imaging may expose the patient to unnecessary and/or potentially harmful amounts of X-ray radiation.
  • An endoscopic approach has proven useful in navigating to areas of interest within a patient, and particularly so for areas within luminal networks of the body such as the lungs.
  • endobronchial navigation systems have been developed that use previously acquired MRI data or CT image data to generate a three-dimensional (3D) rendering, model, or volume of the particular body part such as the lungs.
  • 3D three-dimensional
  • the resulting volume generated from the MRI scan or CT scan is then utilized to create a navigation plan to facilitate the advancement of a navigation catheter (or other suitable medical tool) through a bronchoscope and a branch of the bronchus of a patient to an area of interest.
  • a locating or tracking system such as an electromagnetic (EM) tracking system, may be utilized in conjunction with, for example, CT data, to facilitate guidance of the navigation catheter through branches of the bronchus to the area of interest.
  • the navigation catheter may be positioned within one of the airways of the branched luminal networks adjacent to, or within, the area of interest to provide access for one or more medical tools.
  • a 3D volume of a patient’s lungs generated from previously acquired scans, such as CT scans, may not provide a basis sufficient for accurate guiding of medical devices or tools to a target during a surgical procedure.
  • the inaccuracy is caused by deformation of the patient’s lungs during the procedure relative to the lungs at the time of the acquisition of the previously acquired CT data.
  • This deformation may be caused by many different factors including, for example, changes in the body when transitioning between a sedated state and a non-sedated state, the bronchoscope changing the patient’s pose, the bronchoscope pushing the tissue, different lung volumes (e.g., the CT scans are acquired during inhalation while navigation is performed during breathing), different beds, different days, etc.
  • This deformation may lead to significant motion of a target, making it challenging to align a medical tool with the target in order to safely and accurately resect or otherwise treat target tissue.
  • systems and methods are needed to provide real-time imaging during in-vivo navigation and resection or other treatment procedures. Furthermore, to navigate a medical tool safely and accurately to a remote target and resect or otherwise treat the remote target with the medical tool, either by a surgical robotic system or by a clinician using a guidance system, the systems and methods should minimize the clinician’s and patient’s exposure to intraoperative X-ray radiation.
  • the techniques of this disclosure generally relate to combining NICE technology with fiducial markers. This combination allows clinicians to place fiducial markers at or near a target with radiographic imaging. Then, clinicians can use NIR technology in surgery to visualize the target in real-time without exposing the patient and/or clinicians to radiation from radiographic imaging typically used to visualize fiducial markers.
  • this disclosure provides a method.
  • the method includes receiving preoperative radiographic images of a lung showing a target and receiving intraoperative radiographic images of the lung showing the target.
  • the method also includes displaying the preoperative radiographic images of the lung showing the target and displaying, in the intraoperative radiographic images, placement of at least one radiopaque marker coated with a near-infrared (NIR) fluorophore.
  • the method also includes intraoperatively capturing NIR images of the at least one radiopaque marker and displaying the NIR images of the at least one radiopaque marker.
  • NIR near-infrared
  • implementations of the method may include one or more of the following features.
  • Placing the at least one radiopaque marker may include placing the at least one radiopaque marker at or near the target.
  • the method may include illuminating, with NIR light, a medical device at least partially coated with the NIR fluorophore as it advances towards the at least one radiopaque marker, intraoperatively capturing NIR images of the medical device advancing towards the at least one radiopaque marker, and displaying the NIR images of the medical device advancing towards the at least one radiopaque marker.
  • the method may include displaying the NIR images of the medical device treating the target at or near the at least one radiopaque marker. Treating the target may include resecting the target.
  • the NIR fluorophore may include indocyanine green (ICG) dye, methylene blue (MB) dye, single-walled carbon nanotubes (SWCNTs), quantum dots, or rare-earth-doped nanoparticles (RENPs).
  • ICG indocyanine green
  • MB methylene blue
  • SWCNTs single-walled carbon nanotubes
  • RENPs rare-earth-doped nanoparticles
  • the preoperative radiographic images are computed tomography (CT) images, magnetic resonance imaging (MRI) images, fluoroscopic images, or cone-beam CT (CBCT) images.
  • CT computed tomography
  • MRI magnetic resonance imaging
  • CBCT cone-beam CT
  • the method may include illuminating the at least one radiopaque marker with NIR light.
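The ordering of the claimed steps — display preoperative radiographic images, display marker placement in intraoperative radiographic images, then switch to radiation-free NIR imaging — can be sketched as a minimal workflow. Function and variable names here are illustrative, not part of the disclosure:

```python
def marker_guided_workflow(preop_images, intraop_images, nir_stream):
    """Illustrative ordering of the claimed steps: show the target on
    preoperative images, confirm marker placement on intraoperative
    radiographic images, then display NIR frames of the marker with
    no further X-ray dose."""
    log = []
    log.append(("display_preop", len(preop_images)))      # target identified on CT/MRI/CBCT
    log.append(("display_intraop", len(intraop_images)))  # marker placement confirmed
    for frame in nir_stream:                              # real-time, radiation-free phase
        log.append(("display_nir", frame))
    return log
```

The point of the sketch is the hand-off: radiographic imaging ends once the marker is placed, and all subsequent visualization comes from the NIR stream.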
  • this disclosure provides another method.
  • the method includes receiving preoperative radiographic images of a lung showing a target and receiving intraoperative radiographic images of the lung showing the target.
  • the method also includes displaying the preoperative radiographic images of the lung showing the target and displaying, in the intraoperative radiographic images, placement of at least one radiopaque marker coated with a near-infrared (NIR) fluorophore at or near the target in the lung.
  • the method also includes intraoperatively capturing NIR images of the at least one radiopaque marker and controlling, with a robotic end effector, a medical device to advance towards the target based on the NIR images.
  • NIR near-infrared
  • implementations of the method may include one or more of the following features.
  • a distal portion of the medical device may be coated with the NIR fluorophore.
  • the method may include illuminating the at least one radiopaque marker and the distal portion of the medical device with NIR light, intraoperatively capturing NIR images of the at least one radiopaque marker and the distal portion of the medical device, and controlling the medical device to advance towards the target based on the NIR images of the at least one radiopaque marker and the distal portion of the medical device.
  • the method may include controlling the medical device to treat the target based on the NIR images. In aspects, the method may include controlling the medical device to biopsy or resect the target based on the NIR images. In aspects, the method may include illuminating the at least one radiopaque marker with NIR light.
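One way to picture the robotic end effector advancing the medical device based on NIR images is a bounded step toward the marker position reported by the NIR camera. This is a translation-only sketch under assumed names; a real controller would also handle orientation, airway constraints, and safety interlocks:

```python
def step_towards(tip, marker, step_mm=1.0):
    """Move the tool tip one bounded step towards the NIR-detected
    marker position (hypothetical, translation-only sketch)."""
    d = [m - t for m, t in zip(marker, tip)]
    norm = sum(c * c for c in d) ** 0.5
    if norm <= step_mm:                 # within one step: snap to the marker
        return tuple(marker)
    # otherwise advance by step_mm along the unit direction to the marker
    return tuple(t + step_mm * c / norm for t, c in zip(tip, d))
```

Calling this in a loop, with the marker position refreshed from each NIR frame, would converge the tip onto the marker even as tissue (and thus the marker) moves.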
  • this disclosure provides a system. The system includes at least one radiopaque marker coated with near-infrared (NIR) fluorophore, an endoluminal instrument that places the at least one radiopaque marker within a lung of a patient, an NIR camera that captures NIR video of the at least one radiopaque marker, and a treatment instrument.
  • NIR near-infrared
  • the system also includes a processor and a memory having stored thereon instructions, which, when executed by the processor, cause the processor to receive preoperative radiographic images of a lung showing a target and receive intraoperative radiographic images of the lung showing the target.
  • the instructions, when executed by the processor, also cause the processor to display the preoperative radiographic images of the lung showing the target, display, in the intraoperative radiographic images, the endoluminal instrument placing at least one radiopaque marker coated with an NIR fluorophore at or near the target in the lung, and intraoperatively capture NIR video of the at least one radiopaque marker to facilitate treatment of the target by the treatment instrument.
  • implementations of the system may include one or more of the following features.
  • the system may include a robotic end-effector that holds and manipulates the treatment instrument.
  • the instructions, when executed by the processor, may cause the processor to control the robotic end-effector to advance the treatment instrument towards the target and to treat the target with the treatment instrument based on the NIR video.
  • the system may include an NIR light source that illuminates the at least one radiopaque marker with NIR light.
  • the NIR light source may be coupled to the treatment instrument.
  • the treatment instrument may be a thoracoscopic surgical instrument.
  • the treatment instrument may be configured to perform a lobectomy, a segmentectomy, or a wedge resection.
  • the instructions, when executed by the processor, may cause the processor to display, in the intraoperative radiographic images, the endoluminal instrument placing two or more radiopaque markers coated with an NIR fluorophore in two or more air passages of the lung at or near the target.
  • FIG. 1 is a diagram of a system for navigating to targets via luminal networks in accordance with the disclosure;
  • FIG. 2 is a perspective view that illustrates a coil fiducial marker coated with an indocyanine green (ICG) fluorophore;
  • ICG indocyanine green
  • FIG. 3 is a perspective view that illustrates a band fiducial marker coated with the indocyanine green (ICG) fluorophore;
  • ICG indocyanine green
  • FIG. 4 is a perspective view that illustrates a guidewire and a catheter configured to place fiducial markers within lung tissue;
  • FIG. 5 is a diagram that illustrates a video assisted thoracoscopic surgery (VATS) after placement of fiducial markers using a bronchoscope;
  • FIG. 6 is a flowchart of an example of a method for visualizing a marker in accordance with the disclosure;
  • FIG. 7 is a block diagram that illustrates a robotic surgical system;
  • FIG. 8 is a system block diagram that illustrates a robotic surgical control system for controlling the robotic surgical system of FIG. 7;
  • FIG. 9 is a flowchart of an example of a method for controlling a robotic arm to advance a medical instrument towards a target based on a marker coated with an NIR fluorophore in accordance with the disclosure.
  • FIG. 10 is a block diagram that illustrates a system for visualizing placement of fiducial markers and guidance of a medical instrument towards a target to be treated in accordance with the disclosure.
  • NICE Near-infrared coating of equipment
  • a clinician may use a bronchoscope to install fiducial markers within the lung to guide physicians, using radiographic imaging, to a target to be treated, e.g., by resection.
  • a drawback of this technology is that it requires use of a radiation source (e.g., a fluoroscope), which typically does not provide real-time imaging because use of the radiation source is limited to minimize exposure of clinicians and patients to large amounts of radiation.
  • ICG indocyanine green
  • NIR near-infrared
  • VATS video-assisted thoracoscopic surgery
  • RATS robotic-assisted thoracoscopic surgery
  • aspects of the disclosure combine NICE technology with fiducial markers. This combination allows clinicians, e.g., pulmonologists, to place fiducial markers with radiographic imagery. Then, surgeons can use NIR technology in surgery to visualize the target in real-time.
  • This disclosure features systems and methods that involve placing fiducial markers coated with NIR fluorophores at or near target tissue and visualizing the fiducial markers in real time via an NIR camera while operating a medical instrument, e.g., via controlling a robotic arm holding the medical instrument, to ensure the medical instrument can safely and accurately treat, e.g., resect, the target tissue.
  • the systems and methods involve receiving and displaying high resolution preprocedural or preoperative 3D images, e.g., preoperative CT images, to enable a clinician to identify target tissue, e.g., a tumor, to be treated, e.g., resected, in the 3D images.
  • the systems and methods also involve receiving and displaying intraoperative radiographic images of the lung showing the target, and displaying, in the intraoperative radiographic images, placement of at least one radiopaque marker coated with an NIR fluorophore at or near the target in the lung. Then, NIR images of the at least one radiopaque marker are intraoperatively captured and displayed to enable a clinician or robot to visualize the at least one radiopaque marker and thus localize the target for treatment without the need for radiographic imaging.
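Localizing the fluorophore-coated marker in an NIR frame without radiography could, for example, use an intensity-weighted centroid of bright pixels — a common detection approach, offered here only as an assumed stand-in for whatever pipeline a real NIR camera system uses:

```python
def marker_centroid(frame, threshold=200):
    """Locate a bright NIR-fluorescing marker in a grayscale frame
    (rows of pixel intensities) via the intensity-weighted centroid
    of above-threshold pixels. Returns (x, y) or None if not in view."""
    sx = sy = sw = 0.0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v >= threshold:
                sx += x * v
                sy += y * v
                sw += v
    if sw == 0:
        return None          # marker occluded or outside the field of view
    return (sx / sw, sy / sw)
```

The returned pixel coordinates would then be what the display overlays, or what a robotic controller steers toward.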
  • FIG. 1 is a perspective view of an example of a system 100 for facilitating navigation and placement of medical tools, e.g., markers and resection tools, to a soft-tissue target via airways of the lungs.
  • the system 100 may optionally be configured to generate a three-dimensional (3D) reconstruction of the target area from 2D fluoroscopic images.
  • intraoperative 2D fluoroscopic images may be captured only during critical parts of a procedure, e.g., to confirm placement of a medical tool in a target, in order to minimize human exposure to X-ray radiation.
  • the system 100 may be further configured to facilitate approach of a medical tool to the target area by using Electromagnetic Navigation Bronchoscopy (ENB) and for determining the location of a medical tool with respect to the target.
  • ENB Electromagnetic Navigation Bronchoscopy
  • One aspect of the system 100 is a software component for reviewing computed tomography (CT) image data that has been acquired separately from system 100.
  • CT computed tomography
  • the review of the CT image data allows a user to identify one or more targets, plan a pathway to an identified target (planning phase), navigate a catheter 102 to the target (navigation phase) using a user interface, and confirm placement of an EM sensor 104 relative to the target.
  • One such EMN system is the ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY® (ENB) system currently sold by Medtronic PLC.
  • ENB ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY®
  • the target may be tissue of interest identified by review of the CT image data during the planning phase.
  • a medical tool such as a biopsy tool or other tool, may be inserted into the catheter 102 to obtain a tissue sample from the tissue located at, or proximate to, the target.
  • the catheter 102 is part of a catheter guide assembly 110, which also includes a handle 106 for controlling the catheter 102.
  • the catheter 102 is inserted into a bronchoscope 108 using the handle 106 for access to a luminal network of the patient P.
  • the catheter 102 of the catheter guide assembly 110 may be inserted into a working channel of the bronchoscope 108 for navigation through a patient’s luminal network.
  • a locatable guide (not shown), including an electromagnetic (EM) sensor 104 is inserted into the catheter 102 and locked into position such that the EM sensor 104 extends a desired distance beyond the distal tip of the catheter 102.
  • EM electromagnetic
  • the catheter guide assembly 110 is currently marketed and sold by Medtronic PLC under the brand names SUPERDIMENSION® Procedure Kits or EDGE™ Procedure Kits, and is contemplated as useable with aspects of the disclosure.
  • the system 100 generally includes an operating table 112 configured to support a patient P, a bronchoscope 108 configured for insertion through patient P’s mouth into patient P’s airways; monitoring equipment 114 coupled to the bronchoscope 108 (e.g., a video display, for displaying the video images received from the video imaging system of bronchoscope 108); a locating or tracking system 115 including a locating module 116, patient motion sensors 118, and a transmitter mat 120, which may include multiple markers; and a computer system 122 including software and/or hardware used to facilitate identification of a target, pathway planning to the target, navigation of a medical tool to the moving target, and/or confirmation and/or determination of placement of the catheter 102, or a suitable medical tool therethrough, relative to the target.
  • the computer system 122 may be similar to workstation 1001 of FIG. 10 and may be configured to execute the methods of the disclosure including the methods of FIGS. 6 and 9.
  • An imaging system 124 capable of acquiring fluoroscopic or x-ray images or video of the patient P is optionally included in some aspects of the system 100.
  • the images, sequence of images, or video captured by the imaging system 124 may be stored within the imaging system 124 or transmitted to the computer system 122 for storage, processing, and display. Additionally, imaging system 124 may move relative to the patient P so that images may be acquired from different angles or perspectives relative to patient P to create a sequence of 2D images, such as a video.
  • the pose of the imaging system 124 relative to the patient P while capturing the images may be estimated via markers incorporated with the transmitter mat 120.
  • the markers are positioned under patient P, between patient P and operating table 112, and between patient P and a radiation source or a sensing unit of the imaging system 124.
  • the markers incorporated with the transmitter mat 120 may be two separate elements which may be coupled in a fixed manner or alternatively may be manufactured as a single unit.
  • the imaging system 124 may include a single imaging system or more than one imaging system. As illustrated in FIG. 1, the imaging system 124 may include a fluoroscopic imaging system, which is merely used to confirm placement of a medical tool near a target prior to biopsy, treatment, or resection of a target.
  • the computer system 122 may be any suitable computer system including a processor and storage medium, wherein the processor is capable of executing instructions stored on the storage medium.
  • the computer system 122 may further include a database configured to store patient data, preoperative CT data sets, navigation plans, optionally fluoroscopic data sets including fluoroscopic images and video, optionally fluoroscopic 3D reconstruction, and any other such data.
  • the computer system 122 may include inputs for, or may otherwise be configured to receive, preoperative CT data sets, optional fluoroscopic images/video, and other data described herein.
  • the computer system 122 includes a display configured to display graphical user interfaces.
  • the computer system 122 may be connected to one or more networks through which one or more databases may be accessed.
  • the computer system 122 utilizes previously acquired CT image data for determining regular motion of a patient, e.g., motion caused by respiration, utilizes the same or different previously acquired CT image data for generating and viewing a three-dimensional model or rendering of patient P’s airways, enables the identification of a target on the three-dimensional model (automatically, semi-automatically, or manually), and allows for determining a pathway through patient P’s airways to tissue located at and around the target. More specifically, the CT images acquired from the previous CT scans are processed and assembled into a three-dimensional CT volume, which is then utilized to generate a three-dimensional model of patient P’s airways.
  • the three-dimensional model may be displayed on a display associated with the computer system 122, or in any other suitable fashion. Using the computer system 122, various views of the three-dimensional model or enhanced two-dimensional images generated from the three-dimensional model are presented. The enhanced two-dimensional images may possess some three-dimensional capabilities because they are generated from three-dimensional data.
  • the three-dimensional model may be manipulated to facilitate identification of a target on the three-dimensional model or two-dimensional images, and selection of a suitable pathway through patient P’s airways to access tissue located at the target can be made. Once selected, the pathway plan, three-dimensional model, and images derived therefrom, can be saved and exported to a navigation system for use during the navigation phase(s).
  • One such planning software is the ILLUMISITE® planning suite currently sold by Medtronic PLC.
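The assembly of CT slices into a volume and the identification of air-filled airway voxels described above can be sketched in miniature. The Hounsfield-unit threshold and data layout are assumptions for illustration, not values from the disclosure:

```python
def segment_airways(slices, air_hu=-900):
    """Stack 2D CT slices (Hounsfield units) into a (z, y, x) volume and
    mark voxels at or below an air threshold — a toy version of the airway
    segmentation that precedes 3D model generation."""
    volume = [[list(row) for row in s] for s in slices]          # z, y, x
    mask = [[[hu <= air_hu for hu in row] for row in s] for s in volume]
    n_air = sum(v for s in mask for row in s for v in row)       # count air voxels
    return mask, n_air
```

A planning application would then extract the connected airway tree from such a mask and render it as the navigable three-dimensional model.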
  • a six degrees-of-freedom electromagnetic locating or tracking system 115 is utilized for performing registration of the images and the pathway for navigation, although other configurations are also contemplated.
  • the tracking system 115 includes the tracking module 116, the patient motion sensors 118, and the transmitter mat 120 (including the markers).
  • the tracking system 115 is configured for use with a locatable guide and particularly EM sensor 104. As described above, the locatable guide and the EM sensor 104 are configured for insertion through the catheter 102 into patient P’s airways (either with or without the bronchoscope 108) and are selectively lockable relative to one another via a locking mechanism.
  • the transmitter mat 120 is positioned beneath patient P.
  • the transmitter mat 120 generates an electromagnetic field around at least a portion of the patient P within which the positions of the patient motion sensors 118 and the EM sensor 104 can be determined with use of a tracking module 116.
  • a second EM sensor 126 may also be incorporated into the end of the catheter 102.
  • the second EM sensor 126 may be a five degree-of-freedom sensor or a six degree-of-freedom sensor.
  • One or more of the patient motion sensors 118 are attached to the chest of the patient P or at suitable positions on the patient’s body that optimize the sensing of the motion of the patient.
  • the six degrees of freedom coordinates of the patient motion sensors 118 are sent to the computer system 122 (which includes the appropriate software) where they are used to calculate a patient coordinate frame of reference.
  • the six degrees of freedom coordinates of the patient motion sensors 118 are also sent to the computer system 122 where they are used to track the real-time motion of the patient, which may be caused by respiration cycles, e.g., inspiration and expiration, of the patient.
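As an illustration of the patient-frame computation described above, the following sketch builds an orthonormal frame of reference from three patient motion sensor positions and expresses a tracked point in that frame. Function and argument names are hypothetical; sensor positions are assumed to be available from the tracking module 116 in the transmitter mat's coordinate system.

```python
import numpy as np

def patient_frame(p1, p2, p3):
    """Build an orthonormal patient frame from three sensor positions.
    Hypothetical layout: p1 is the origin, p2 sets the x-axis, and p3
    lies roughly in the x-y plane."""
    x = p2 - p1
    x = x / np.linalg.norm(x)
    z = np.cross(x, p3 - p1)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack((x, y, z)), p1  # rotation matrix and origin

def to_patient_coords(frame, point):
    """Express a tracked point (e.g., the EM sensor 104) in the patient frame."""
    R, origin = frame
    return R.T @ (point - origin)
```

Because the frame moves with the sensors, re-evaluating it each tracking cycle also captures respiratory motion of the chest wall.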
  • Registration is generally performed to coordinate locations of the three-dimensional model and two-dimensional images from the planning phase, with the patient P’s airways as observed through the bronchoscope 108, and allow for the navigation phase to be undertaken with precise knowledge of the location of the EM sensor 104, even in portions of the airway where the bronchoscope 108 cannot reach.
  • Registration of the patient P’s location on the transmitter mat 120 may be performed by moving the EM sensor 104 through the airways of the patient P. More specifically, data pertaining to locations of the EM sensor 104, while the locatable guide is moving through the airways, is recorded using the transmitter mat 120, the patient motion sensors 118, and the tracking system 115. A shape resulting from this location data is compared to an interior geometry of passages of the three-dimensional model generated in the planning phase, and a location correlation between the shape and the three-dimensional model based on the comparison is determined, e.g., utilizing the software on the computer system 122. In addition, the software identifies non-tissue space (e.g., air-filled cavities) in the three-dimensional model.
  • the software aligns, or registers, an image representing a location of the EM sensor 104 with the three-dimensional model and/or two-dimensional images generated from the three-dimension model, which are based on the recorded location data and an assumption that the locatable guide remains located in non-tissue space in patient P’s airways.
  • a manual registration technique may be employed by navigating the bronchoscope 108 with the EM sensor 104 to pre-specified locations in the lungs of the patient P, and manually correlating the images from the bronchoscope to the model data of the three-dimensional model.
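The shape-to-model comparison described above is, in essence, a rigid point-set registration. The following is a minimal sketch assuming known point correspondences between the recorded sensor path and airway-centerline points (real systems iterate a closest-point search); it uses the standard Kabsch/Procrustes solution, and all names are illustrative:

```python
import numpy as np

def rigid_register(path_pts, model_pts):
    """Estimate the rigid transform aligning recorded EM sensor locations
    to corresponding airway-centerline points of the 3D model, such that
    model_pt ~= R @ path_pt + t (Kabsch algorithm)."""
    P = np.asarray(path_pts, float)
    M = np.asarray(model_pts, float)
    cp, cm = P.mean(axis=0), M.mean(axis=0)
    H = (P - cp).T @ (M - cm)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])         # guard against reflections
    R = Vt.T @ D @ U.T
    t = cm - R @ cp
    return R, t
```

In practice the correspondence step, outlier rejection, and the constraint that the locatable guide remains in non-tissue space make the full algorithm considerably more involved.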
  • the instant disclosure is not so limited and may be used in conjunction with flexible sensors, ultrasonic sensors, or other suitable motion sensors. Additionally, the methods described herein may be used in conjunction with robotic systems such that robotic actuators drive the catheter 102, bronchoscope 108, or other medical tool proximate the target.
  • An example of a robotic system is illustrated in FIGS. 7 and 8.
  • a user interface is displayed in the navigation software which sets forth the pathway that the clinician is to follow to reach the target.
  • the locatable guide may be unlocked from the catheter 102 and removed, leaving the catheter 102 in place as a guide channel for guiding medical tools including, without limitation, optical systems, ultrasound probes, NIR fluorophore-coated tools, radiopaque fiducial marker placement tools, resection tools, biopsy tools, ablation tools (i.e., microwave ablation tools), laser probes, cryogenic probes, sensor probes, and aspirating needles to the target.
  • a medical tool may then be inserted through the catheter 102 and navigated to the target or to a specific area adjacent to the target.
  • a local registration process may optionally be performed for each target to reduce the CT-to-body divergence.
  • a sequence of fluoroscopic images may be captured and acquired via the imaging system 124, optionally by a user and according to directions displayed via the computer system 122.
  • a fluoroscopic 3D reconstruction may then be generated via the computer system 122. The generation of the fluoroscopic 3D reconstruction is based on the sequence of fluoroscopic images and the projections of the structure of markers incorporated with the transmitter mat 120 on the sequence of fluoroscopic images.
  • One or more slices of the 3D reconstruction may then be generated based on the pre-operative CT scan and via the computer system 122.
  • the one or more slices of the 3D reconstruction and the fluoroscopic 3D reconstruction may then be displayed to the user on a display via the computer system 122, optionally simultaneously.
  • the slices of 3D reconstruction may be presented on the user interface in a scrollable format where the user is able to scroll through the slices in series.
  • the clinician may be directed to identify and mark the target while using the slices of the 3D reconstruction as a reference.
  • the user may also be directed to identify and mark the navigation catheter tip in the sequence of fluoroscopic 2D images. An offset between the location of the target and the navigation catheter tip may then be determined or calculated via the computer system 122.
  • the offset may then be utilized, via the computer system 122, to correct the location and/or orientation of the navigation catheter on the display with respect to the target and/or correct the registration between the three-dimensional model and the tracking system 115 in the area of the target and/or generate a local registration between the three-dimensional model and the fluoroscopic 3D reconstruction in the target area.
  • a fluoroscopic 3D reconstruction is displayed in a confirmation screen.
  • the confirmation screen may include a slider that may be selected and moved by the user to review a video loop of the fluoroscopic 3D reconstruction, which shows the marked target and navigation catheter tip from different perspectives.
  • the clinician may select an “Accept” button, at which point the local registration process ends and the position of the navigation catheter is updated. The clinician may then use the navigation views in, for example, the peripheral navigation screen to fine tune the alignment of the navigation catheter to the target before beginning an endoscopic procedure.
  • the clinician or robotic arm may insert a medical tool in the catheter 102 and advance the medical tool towards the target. While advancing the medical tool towards the target, the clinician may view a user interface screen which includes a 3D medical tool tip view of a 3D model of a target, in which the 3D model of the target is moved according to motion of the target determined from preoperative CT scans of a patient’s motion and the motion sensed by the patient motion sensors.
  • This user interface screen not only allows the clinician to see the medical tool in real-time, but also shows whether the medical tool is aligned with the moving target.
  • the user interface screen may also provide a graphical indication of whether the medical tool is aligned in three-dimensions with the target.
  • When the medical tool is aligned with the target in three dimensions, the user interface shows the target overlay in a first color, e.g., green.
  • When the medical tool is not aligned with the target, the user interface shows the target overlay in a second color different from the first color, e.g., orange or red.
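One plausible sketch of the alignment indication, under the assumption that alignment is judged by how closely the tool's axis, extended from its tip, passes to the target center; the function name and the green/orange convention follow the description above, and the distance test is illustrative:

```python
import numpy as np

def alignment_color(tool_tip, tool_dir, target_center, target_radius):
    """Return the overlay color: green when the tool's axis, extended
    from its tip, passes within the target radius in 3D; orange otherwise."""
    tip = np.asarray(tool_tip, float)
    d = np.asarray(tool_dir, float)
    d = d / np.linalg.norm(d)
    v = np.asarray(target_center, float) - tip
    # perpendicular distance from the target center to the tool axis
    miss = np.linalg.norm(v - np.dot(v, d) * d)
    return "green" if miss <= target_radius else "orange"
```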
  • FIG. 2 illustrates a nitinol coil fiducial marker 200, which is an example of a fluorophore-coated marker that may be placed at or near a target within an air passage of the lung.
  • the nitinol coil fiducial marker 200 includes a metallic seed 202 and a nitinol wire coil 204 secured to the metallic seed 202.
  • the metallic seed 202 may be made of pure gold, a gold alloy, or any other pure metal or metal alloy that is radiopaque.
  • the shape memory in the nitinol wire coil 204 may be designed to secure the nitinol coil fiducial marker in place and minimize migration.
  • the metallic seed 202 is coated with an indocyanine green (ICG) fluorophore 205.
  • the metallic seed 202 may be coated with any other NIR fluorophore suitable for use in a mammalian body.
  • the NIR fluorophore may include an inorganic fluorophore, an organic dye, and/or a fluorescent protein (FP).
  • the advantages of inorganic fluorophores include prolonged emission wavelength and high fluorescence quantum yield (QY), which is the ratio of the number of photons emitted by the fluorophore to the number of photons absorbed by the fluorophore. These advantages make inorganic fluorophores suitable for in vivo bioimaging in deeper tissue areas.
  • the inorganic fluorophore may include single-walled carbon nanotubes (SWCNTs), quantum dots, and rare-earth-doped nanoparticles (RENPs).
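The quantum yield (QY) defined above is a simple ratio; for illustration (photon counts invented):

```python
def quantum_yield(photons_emitted, photons_absorbed):
    """Fluorescence quantum yield: the ratio of the number of photons
    emitted by the fluorophore to the number of photons absorbed."""
    if photons_absorbed <= 0:
        raise ValueError("photons_absorbed must be positive")
    return photons_emitted / photons_absorbed
```

A fluorophore absorbing 100 photons and emitting 12 would thus have a QY of 0.12; higher values generally mean brighter in vivo imaging at a given excitation dose.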
  • FIG. 3 is a perspective view that illustrates band fiducial markers 300a, 300b coated with the indocyanine green (ICG) fluorophore 205.
  • the band fiducial marker 300a includes three metallic seeds 302a and two columns 304a connecting the three metallic seeds 302a.
  • the band fiducial marker 300b includes two metallic seeds 302b and one column 304b connecting the two metallic seeds 302b.
  • the metallic seeds 302a, 302b may be made of pure gold, a gold alloy, or any other pure metal or metal alloy that is radiopaque.
  • the columns 304a, 304b may be designed to be strong enough to allow the band fiducial markers 300a, 300b to be pushed into tissue.
  • the metallic seeds 302a, 302b are coated with the indocyanine green (ICG) fluorophore 205.
  • FIG. 4 is a perspective view that illustrates a guidewire 410 and a catheter 420 configured to place fiducial markers within lung tissue.
  • the curve 422 and the coating 424 of the catheter 420 facilitate the placement of fiducial markers in soft lung tissue.
  • the catheter 420 may be placed through a working channel of the bronchoscope 108 and the guidewire 410, in turn, may be placed through the catheter 420.
  • the guidewire 410 and the catheter 420 may be operated to place fiducial markers, e.g., the ICG fluorophore-coated fiducial markers illustrated in FIGS. 2 and 3.
  • FIG. 5 is a diagram that illustrates a video assisted thoracoscopic surgery (VATS) after placement of fiducial markers 525 using a bronchoscope 522.
  • the video assisted thoracoscopic surgery involves use of a thoracoscope 524 and at least two surgical instruments.
  • the thoracoscope 524 may include a flexible catheter, and a small video camera and visible and NIR light emitters disposed at the distal portion of the flexible catheter.
  • fiducial markers coated with ICG fluorophore 525a, 525b are placed in bronchi 514 surrounding a tumor 515.
  • the bronchoscope 522 which is navigated through the trachea 512 and bronchi 514 of the patient’s lung, may be used together with the guidewire 410 and/or catheter 420 of FIG. 4 to place the fiducial markers 525a, 525b.
  • a VATS procedure using the NIR emitters to excite fluorescence of the ICG fluorophore may be performed to locate and resect the tumor 515.
  • the clinician may operate the thoracoscope 524 to cause the NIR emitters to emit NIR radiation and excite fluorescence of the ICG fluorophore, which may be captured by the video camera. In this way the clinician is not only able to locate the tumor 515, but also visualize adjacent lung tissue that should not be resected or harmed in any way.
  • the bronchoscope 522 may also include an NIR emitter that emits NIR radiation to excite fluorescence of the ICG fluorophore from a different perspective to increase fluorescence and thus enhance visualization of the fiducial markers.
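Locating the fluorescing markers in a captured NIR frame could, in a simplified form, amount to thresholding and blob-centroid extraction. The following is a toy implementation (invented names and a plain flood fill); real systems use calibrated optics and more robust segmentation:

```python
import numpy as np

def marker_centroids(nir_image, threshold):
    """Find bright fluorescent blobs in an NIR frame: pixels above the
    threshold are grouped by 4-connected flood fill, and each blob's
    centroid (row, col) is returned."""
    img = np.asarray(nir_image)
    mask = img > threshold
    seen = np.zeros_like(mask, dtype=bool)
    blobs = []
    for r in range(mask.shape[0]):
        for c in range(mask.shape[1]):
            if mask[r, c] and not seen[r, c]:
                stack, pix = [(r, c)], []
                seen[r, c] = True
                while stack:
                    y, x = stack.pop()
                    pix.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                blobs.append(tuple(np.array(pix, float).mean(axis=0)))
    return blobs
```

Each returned centroid corresponds to one candidate fiducial marker; illuminating from a second perspective, as the bronchoscope 522 may do, raises the signal above the detection threshold.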
  • FIG. 6 is a flowchart of an example of a method for visualizing a marker.
  • preoperative radiographic images of a lung showing a target are received.
  • the preoperative radiographic images may be computed tomography (CT) images, cone beam computed tomography (CBCT) images, or 3D fluoroscopic images.
  • In some aspects, the preoperative radiographic images may be substituted with or augmented by preoperative magnetic resonance imaging (MRI) images.
  • intraoperative radiographic images of the lung showing the target are received.
  • the intraoperative radiographic images may be captured by a radiographic imaging system that minimizes radiation exposure to the patient and clinicians.
  • the radiographic imaging system may be a C-arm fluoroscope, a CBCT system, or a 3D fluoroscopic system.
  • the preoperative radiographic images of the lung showing the target are displayed.
  • the preoperative radiographic images may be displayed on a display or monitor arranged in an operation room such that clinicians can easily view the images while performing a procedure on the lungs.
  • placement of at least one radiopaque marker coated with an NIR fluorophore at or near the target in the lung is displayed in the intraoperative radiographic images.
  • NIR images of the at least one radiopaque marker are captured intraoperatively.
  • Block 608 may involve emitting or directing NIR radiation towards the at least one radiopaque marker coated with an NIR fluorophore at or near the target in the lung.
  • the NIR radiation may be emitted from an NIR emitter, e.g., a semiconductor emitter, disposed near the visible light source of the laparoscope, thoracoscope, bronchoscope, and/or any other type of endoscope used in the procedure.
  • the NIR images of the at least one radiopaque marker are displayed to the clinicians.
  • FIG. 7 is a block diagram that illustrates a robotic surgical system 700 that may be used to perform the bronchoscopic and/or thoracoscopic procedures in accordance with some aspects of this disclosure.
  • the robotic surgical system 700 includes a first robotic arm 702 and a second robotic arm 704 attached to robotic arm bases 706 and 708, respectively.
  • the first robotic arm 702 and the second robotic arm 704 include a first end effector 716 and a second end effector 718, respectively.
  • the end effector 716, 718 may include robotic manipulators or grippers suitable for operating endoscopic catheters and medical tools of this disclosure.
  • the first end effector 716 operates one or more medical tools 712, including a resection tool for resecting a lung tumor.
  • the second end effector 718 operates a sheath or catheter 710, e.g., a bronchoscope or a thoracoscope, which may include one or more working channels for receiving and guiding the one or more medical tools 712.
  • the robotic surgical system 700 may further include an electromagnetic (EM) generator 714, which is configured to generate an EM field, which is sensed by an EM sensor incorporated into or disposed on the one or more medical tools 712 and by EM patient motion sensors (not shown) disposed on and/or in the patient.
  • the EM generator 714 may be embedded in the operating table 715 or may be incorporated into a pad that may be placed between the operating table 715 and the patient 711.
  • the first and second robotic arms 702, 704 may be controlled to align the end effectors 716 and 718 such that proximal end portions of the catheters 710a, 710b are distal to the proximal end portions of the one or more tools 712, and such that the one or more tools 712 remain axially aligned with the catheters 710a, 710b.
  • the first robotic arm 702 inserts the bronchoscopic catheter 710a through, for example, a tracheal tube (not shown) in the mouth of the patient 711, and into the bronchial system of the patient 711. Then, the second robotic arm 704 inserts the one or more tools 712, e.g., a resection tool, through the bronchoscopic catheter 710a to reach a target within the bronchial system of the patient 711.
  • the first and second robotic arms 702, 704 may move the catheter 710 and one or more tools 712, e.g., a resection tool, axially relative to each other and into or out of the patient 711 under the control of a surgeon (not shown) at a control console (not shown).
  • a navigation phase may include advancing a bronchoscopic catheter 710a or a thoracoscopic catheter 710b along with the one or more tools 712 into the patient 711, and then advancing the one or more tools 712 beyond the distal end portions of the catheters 710a, 710b to reach a desired destination such as a location at or near a target, e.g., a lung tumor.
  • Other modes of navigation may be used, such as by using a guide wire through a working channel of the bronchoscopic catheter 710a.
  • the surgeon may use a visual guidance modality or a combination of visual guidance modalities to aid in navigating to a lung tumor, placing an NIR fluorophore-coated, radiopaque fiducial marker, and performing the lung tumor biopsy or resection procedure, such as fluoroscopy, video, computed tomography (CT), or magnetic resonance imaging (MRI).
  • the one or more medical tools 712 are deployed through longitudinally-aligned working channels within the bronchoscopic catheter 710a or the thoracoscopic catheter 710b.
  • the one or more medical tools 712 may be deployed to place the NIR fluorophore-coated fiducial markers in lung bronchi near the lung tumor.
  • the thoracoscopic catheter 710b having an NIR light source and an NIR camera may be used to locate and visualize the position and size of the lung tumor based on the fluorescence of the NIR fluorophore- coated fiducial markers.
  • the robotic arms 702, 704 may include three joints 701 and three arm segments 705. In other aspects, the robotic arms 702, 704 may include more or fewer than three joints 701 and three arm segments 705.
  • FIG. 8 is a block diagram that illustrates a robotic control system 800 for controlling the robotic surgical system 700 of FIG. 7.
  • the robotic control system 800 includes a control system 810, which controls the robotic surgical system 700.
  • the control system 810 may execute the method 900 of FIG. 9 described herein.
  • the control system 810 may interface with a display 822, a user controller 825, an NIR light source 821, an endoscopic NIR camera 830, and an endoscopic video camera 826.
  • the control system 810 may be coupled to the robotic surgical system 700, directly or indirectly, e.g., by wireless communication.
  • the control system 810 includes a processor 812, a memory 814 coupled to the processor 812, a random access memory (RAM) 816 coupled to the processor 812, and a communications interface 818 coupled to the processor 812.
  • the processor 812 may include one or more hardware processors.
  • the control system 810 may be a stationary computer, such as a personal computer, or a portable computer such as a tablet computer. Alternatively, the control system 810 may be incorporated into one of the robotic arm bases 706, 708.
  • the control system 810 may also interface with a user controller 825, which may be used by a surgeon to control the robotic arm system 824 to perform a lung tumor resection procedure.
  • the memory 814 may be any computer-readable storage media that can be accessed by the processor 812. That is, computer readable storage media may include non-transitory, volatile, and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
  • computer-readable storage media may include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD- ROM, DVD, Blu-Ray, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information, and which may be accessed by processor 812.
  • An application stored in the memory 814 may, when executed by processor 812, cause display 822 to present a user interface (not shown).
  • the user interface may be configured to present to the user images from the endoscopic video camera 826 and the endoscopic NIR camera 830.
  • the user interface may be further configured to direct the user to select the target by, among other things, identifying and marking the target in the image data.
  • Communications interface 818 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet.
  • Communications interface 818 may be used to connect the control system 810 with the endoscopic video camera 826 and the endoscopic NIR camera 830.
  • Communications interface 818 may be also used to receive image data from the memory 814 and path planning data.
  • Communications interface 818 may also be coupled to or in communication with one or more patient motion sensors (not shown).
  • the control system 810 may also include an input device (not shown), which may be any device through which a user may interact with the control system 810, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface.
  • the control system 810 may also include an output module (not shown), which may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
  • FIG. 9 is a flowchart of an example of a method for controlling one or more robotic arms, e.g., robotic arms 702, 704 of FIG. 7, to orient and advance a catheter, e.g., the bronchoscopic catheter or the thoracoscopic catheter relative to a target, e.g., a lung tumor, place NIR fluorophore-coated, radiopaque fiducial markers at or near the target, and perform a biopsy, treatment, or resection procedure.
  • Blocks 602-607 are performed as in FIG. 6.
  • the at least one NIR fluorophore-coated, radiopaque marker and the distal portion of the medical tool are illuminated with NIR light.
  • NIR images of the at least one radiopaque marker and the distal portion of the medical tool are intraoperatively captured. Then, at block 906, the medical tool is controlled by one or more robotic arms to advance towards the target based on the NIR images of the at least one radiopaque marker and the distal portion of the medical tool.
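The closed-loop advance of block 906 could be sketched as a simple proportional step with a safety clamp. Function names, the gain, and the step limit are hypothetical; positions of the tool tip and the marker/target are assumed to be derived from the NIR images in a common frame, in millimeters:

```python
import numpy as np

def servo_step(tool_tip_nir, target_nir, gain=0.5, max_step=2.0):
    """Command a bounded displacement of the robotic end effector that
    moves the NIR-tracked tool tip toward the NIR-tracked target."""
    error = np.asarray(target_nir, float) - np.asarray(tool_tip_nir, float)
    step = gain * error
    n = np.linalg.norm(step)
    if n > max_step:
        step = step * (max_step / n)  # clamp per-cycle motion for safety
    return step
```

Repeating this step each imaging cycle lets the controller track a target that moves with respiration, since both positions are re-measured from the latest NIR frame.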
  • FIG. 10 is a schematic diagram of a system 1000 configured for use with the methods of the disclosure including the methods of FIGS. 6 and 9.
  • the system 1000 may include a workstation 1001, and optionally an imaging system 1015, e.g., a fluoroscopic imaging system and/or a CT imaging system for capturing preoperative 3D images.
  • the workstation 1001 may be coupled with the imaging system 1015, directly or indirectly, e.g., by wireless communication.
  • the workstation 1001 may include a memory 1002, a processor 1004, a display 1006 and an input device 1010.
  • the processor 1004 may include one or more hardware processors.
  • the workstation 1001 may optionally include an output module 1012 and a network interface 1008.
  • the memory 1002 may store an application 1018 and image data 1014.
  • the application 1018 may include instructions executable by the processor 1004 for executing the methods of the disclosure including the methods of FIGS. 6 and 9.
  • the application 1018 may further include a user interface 1016.
  • the image data 1014 may include preoperative CT image data, fluoroscopic image data, or fluoroscopic 3D reconstruction data.
  • the processor 1004 may be coupled with the memory 1002, the display 1006, the input device 1010, the output module 1012, the network interface 1008, and the imaging system 1015.
  • the workstation 1001 may be a stationary computer system, such as a personal computer, or a portable computer system such as a tablet computer.
  • the workstation 1001 may include multiple computers.
  • the memory 1002 may include any non-transitory computer-readable storage media for storing data and/or software including instructions that are executable by the processor 1004 and which control the operation of the workstation 1001 and, in some aspects, may also control the operation of the imaging system 1015.
  • the imaging system 1015 may be used to capture a sequence of preoperative CT images of a portion of a patient’s body, e.g., the lungs, as the portion of the patient’s body moves, e.g., as the lungs move during a respiratory cycle.
  • the imaging system 1015 may include a fluoroscopic imaging system that captures a sequence of fluoroscopic images, based on which a fluoroscopic 3D reconstruction is generated, and that captures a live 2D fluoroscopic view to confirm placement of a medical tool.
  • the memory 1002 may include one or more storage devices such as solid-state storage devices, e.g., flash memory chips. Alternatively, or in addition to the one or more solid-state storage devices, the memory 1002 may include one or more mass storage devices connected to the processor 1004 through a mass storage controller (not shown) and a communications bus (not shown).
  • computer-readable media can be any available media that can be accessed by the processor 1004. That is, computer readable storage media may include non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
  • computer-readable storage media may include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information, and which may be accessed by workstation 1001.
  • the application 1018 may, when executed by the processor 1004, cause the display 1006 to present the user interface 1016.
  • the user interface 1016 may be configured to present to the user a single screen including a three-dimensional (3D) view of a 3D model of a target from the perspective of a tip of a medical tool, a live two-dimensional (2D) fluoroscopic view showing the medical tool, and a target mark, which corresponds to the 3D model of the target, overlaid on the live 2D fluoroscopic view.
  • the user interface 1016 may be further configured to display the target mark in different colors depending on whether the medical tool tip is aligned with the target in three dimensions.
  • the network interface 1008 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the Internet.
  • the network interface 1008 may be used to connect between the workstation 1001 and the imaging system 1015.
  • the network interface 1008 may be also used to receive the image data 1014.
  • the input device 1010 may be any device by which a user may interact with the workstation 1001, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface.
  • the output module 1012 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
  • the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit.
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
  • The instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • Accordingly, the term “processor” may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.

Abstract

Endoscopic systems and methods use near-infrared (NIR) fluorophore-coated markers to treat target tissue. The systems and methods involve receiving preoperative and intraoperative radiographic images of a lung showing a target, displaying the preoperative radiographic images, and displaying, in the intraoperative radiographic images, placement of at least one radiopaque marker coated with an NIR fluorophore at or near the target in the lung. The systems and methods also involve intraoperatively capturing NIR images of the at least one radiopaque marker and/or a medical device at least partially coated with the NIR fluorophore advancing towards the at least one radiopaque marker, and displaying the NIR images. The systems and methods may involve illuminating, with NIR light, the at least one radiopaque marker and/or the medical device coated with the NIR fluorophore. The systems and methods may involve controlling, with a robotic end effector, a medical device to advance towards the target based on the NIR images.

Description

LOCALIZATION AND TREATMENT OF TARGET TISSUE USING MARKERS COATED WITH NEAR-INFRARED FLUOROPHORES
[0001] This Application claims priority from U.S. Provisional Patent Application 63/442,667, filed 1 February 2023, the entire content of which is incorporated herein by reference.
FIELD
[0002] The technology of the disclosure is generally related to the use of markers coated with a near-infrared (NIR) fluorophore to guide localization and treatment of target tissue.
BACKGROUND
[0003] There are several commonly applied medical methods, such as endoscopic procedures or minimally invasive procedures, for treating various maladies affecting organs including the liver, brain, heart, lungs, gall bladder, kidneys, and bones. Often, one or more imaging modalities, such as magnetic resonance imaging (MRI), ultrasound imaging, computed tomography (CT), or fluoroscopy, are employed by clinicians to identify and navigate to areas of interest within a patient and to a target for biopsy or treatment. In some procedures, preoperative scans may be utilized for target identification and intraoperative guidance. In some cases, real-time imaging or intraoperative imaging may also be required to obtain a more accurate and current image of the target area and an endoluminal medical tool used to biopsy or treat tissue in the target area. Furthermore, real-time image data displaying the current location of a medical device with respect to the target and its surroundings may be needed to navigate the medical device to the target in a safe and accurate manner (e.g., without causing damage to other organs or tissue). Real-time, intraoperative imaging, however, may expose the patient to unnecessary and/or potentially unhealthy amounts of X-ray radiation.
[0004] An endoscopic approach has proven useful in navigating to areas of interest within a patient, and particularly so for areas within luminal networks of the body such as the lungs. To enable the endoscopic approach, and more particularly the bronchoscopic approach in the lungs, endobronchial navigation systems have been developed that use previously acquired MRI data or CT image data to generate a three-dimensional (3D) rendering, model, or volume of the particular body part such as the lungs.

[0005] The resulting volume generated from the MRI scan or CT scan is then utilized to create a navigation plan to facilitate the advancement of a navigation catheter (or other suitable medical tool) through a bronchoscope and a branch of the bronchus of a patient to an area of interest. A locating or tracking system, such as an electromagnetic (EM) tracking system, may be utilized in conjunction with, for example, CT data, to facilitate guidance of the navigation catheter through branches of the bronchus to the area of interest. In certain instances, the navigation catheter may be positioned within one of the airways of the branched luminal networks adjacent to, or within, the area of interest to provide access for one or more medical tools.
[0006] However, a 3D volume of a patient’s lungs, generated from previously acquired scans, such as CT scans, may not provide a basis sufficient for accurate guiding of medical devices or tools to a target during a surgical procedure. In some cases, the inaccuracy is caused by deformation of the patient’s lungs during the procedure relative to the lungs at the time of the acquisition of the previously acquired CT data. This deformation (CT-to-Body divergence) may be caused by many different factors including, for example, changes in the body when transitioning between a sedated state and a non-sedated state, the bronchoscope changing the patient’s pose, the bronchoscope pushing the tissue, different lung volumes (e.g., the CT scans are acquired during inhale while navigation is performed during breathing), different beds, different days, etc. This deformation may lead to significant motion of a target, making it challenging to align a medical tool with the target in order to safely and accurately resect or otherwise treat target tissue.
[0007] Thus, systems and methods are needed to provide real-time imaging during an in-vivo navigation and resection or other treatment procedures. Furthermore, to navigate a medical tool safely and accurately to a remote target and resect or otherwise treat the remote target with the medical tool, either by a surgical robotic system or by a clinician using a guidance system, the systems and methods should minimize the clinician’s and patient’s exposure to intraoperative X-ray radiation.
SUMMARY
[0008] The techniques of this disclosure generally relate to combining near-infrared coating of equipment (NICE) technology with fiducial markers. This combination allows clinicians to place fiducial markers at or near a target with radiographic imaging. Then, clinicians can use NIR technology in surgery to visualize the target in real-time without exposing the patient and/or clinicians to radiation from radiographic imaging typically used to visualize fiducial markers.
[0009] In one aspect, this disclosure provides a method. The method includes receiving preoperative radiographic images of a lung showing a target and receiving intraoperative radiographic images of the lung showing the target. The method also includes displaying the preoperative radiographic images of the lung showing the target and displaying, in the intraoperative radiographic images, placement of at least one radiopaque marker coated with a near-infrared (NIR) fluorophore. The method also includes intraoperatively capturing NIR images of the at least one radiopaque marker and displaying the NIR images of the at least one radiopaque marker.
[0010] In aspects, implementations of the method may include one or more of the following features. Placing the at least one radiopaque marker may include placing the at least one radiopaque marker at or near the target. The method may include illuminating a medical device at least partially coated with the NIR fluorophore advancing towards the at least one radiopaque marker with NIR light, intraoperatively capturing NIR images of the medical device advancing towards the at least one radiopaque marker, and displaying the NIR images of the medical device advancing towards the at least one radiopaque marker.
[0011] In aspects, the method may include displaying the NIR images of the medical device treating the target at or near the at least one radiopaque marker. Treating the target may include resecting the target.
[0012] In aspects, the NIR fluorophore may include indocyanine green (ICG) dye, methylene blue (MB) dye, single-walled carbon nanotubes (SWCNTs), quantum dots, or rare-earth-doped nanoparticles (RENPs). In aspects, the preoperative radiographic images are computed tomography (CT) images, magnetic resonance imaging (MRI) images, fluoroscopic images, or cone-beam CT (CBCT) images. In aspects, the method may include illuminating the at least one radiopaque marker with NIR light.
[0013] In another aspect, this disclosure provides another method. The method includes receiving preoperative radiographic images of a lung showing a target and receiving intraoperative radiographic images of the lung showing the target. The method also includes displaying the preoperative radiographic images of the lung showing the target and displaying, in the intraoperative radiographic images, placement of at least one radiopaque marker coated with a near-infrared (NIR) fluorophore at or near the target in the lung. The method also includes intraoperatively capturing NIR images of the at least one radiopaque marker and controlling, with a robotic end effector, a medical device to advance towards the target based on the NIR images.
[0014] In aspects, implementations of the method may include one or more of the following features. In aspects, a distal portion of the medical device may be coated with the NIR fluorophore, and the method may include illuminating the at least one radiopaque marker and the distal portion of the medical device with NIR light, intraoperatively capturing NIR images of the at least one radiopaque marker and the distal portion of the medical device, and controlling the medical device to advance towards the target based on the NIR images of the at least one radiopaque marker and the distal portion of the medical device.
[0015] In aspects, the method may include controlling the medical device to treat the target based on the NIR images. In aspects, the method may include controlling the medical device to biopsy or resect the target based on the NIR images. In aspects, the method may include illuminating the at least one radiopaque marker with NIR light.

In another aspect, this disclosure provides a system. The system includes at least one radiopaque marker coated with near-infrared (NIR) fluorophore, an endoluminal instrument that places the at least one radiopaque marker within a lung of a patient, an NIR camera that captures NIR video of the at least one radiopaque marker, and a treatment instrument.
[0016] The system also includes a processor and a memory having stored thereon instructions, which, when executed by the processor, cause the processor to receive preoperative radiographic images of a lung showing a target and receive intraoperative radiographic images of the lung showing the target. The instructions, when executed by the processor, also cause the processor to display the preoperative radiographic images of the lung showing the target, display, in the intraoperative radiographic images, the endoluminal instrument placing at least one radiopaque marker coated with a NIR fluorophore at or near the target in the lung, and intraoperatively capture NIR video of the at least one radiopaque marker to facilitate treatment of the target by the treatment instrument.

[0017] In aspects, implementations of the system may include one or more of the following features. The system may include a robotic end-effector that holds and manipulates the treatment instrument. The instructions, when executed by the processor, may cause the processor to control the robotic end-effector to advance the treatment instrument towards the target and to treat the target with the treatment instrument based on the NIR video.
[0018] In aspects, the system may include an NIR light source that illuminates the at least one radiopaque marker with NIR light. The NIR light source may be coupled to the treatment instrument. The treatment instrument may be a thoracoscopic surgical instrument. The treatment instrument may be configured to perform a lobectomy, a segmentectomy, or a wedge resection.
[0019] In aspects, the instructions, when executed by the processor, may cause the processor to display, in the intraoperative radiographic images, the endoluminal instrument placing two or more radiopaque markers coated with an NIR fluorophore in two or more air passages of the lung at or near the target.
[0020] The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF DRAWINGS
[0021] FIG. 1 is a diagram of a system for navigating to targets via luminal networks in accordance with the disclosure;
[0022] FIG. 2 is a perspective view that illustrates a coil fiducial marker coated with an indocyanine green (ICG) fluorophore;
[0023] FIG. 3 is a perspective view that illustrates a band fiducial marker coated with the indocyanine green (ICG) fluorophore;
[0024] FIG. 4 is a perspective view that illustrates a guidewire and a catheter configured to place fiducial markers within lung tissue;
[0025] FIG. 5 is a diagram that illustrates a video-assisted thoracoscopic surgery (VATS) after placement of fiducial markers using a bronchoscope;

[0026] FIG. 6 is a flowchart of an example of a method for visualizing a marker in accordance with the disclosure;
[0027] FIG. 7 is a block diagram that illustrates a robotic surgical system;
[0028] FIG. 8 is a system block diagram that illustrates a robotic surgical control system for controlling the robotic surgical system of FIG. 7;
[0029] FIG. 9 is a flowchart of an example of a method for controlling a robotic arm to advance a medical instrument towards a target based on a marker coated with an NIR fluorophore in accordance with the disclosure; and
[0030] FIG. 10 is a block diagram that illustrates a system for visualizing placement of fiducial markers and guidance of a medical instrument towards a target to be treated in accordance with the disclosure.
DETAILED DESCRIPTION
[0031] Near-infrared coating of equipment (NICE) is a technology that allows for the fluorescent labeling of surgical or medical devices for image-guided surgery. NICE can be used to coat a variety of substrates, including metals.
[0032] A clinician may use a bronchoscope to install fiducial markers within the lung that show physicians, under radiographic imaging, where to find a target to be treated, e.g., resected. A drawback of this technology is that it requires a radiation source (e.g., a fluoroscope), which typically does not provide real-time imaging because its use is limited to minimize the exposure of clinicians and patients to large amounts of radiation.
[0033] Clinicians also use indocyanine green (ICG) dye and near-infrared (NIR) cameras without fiducial markers to localize anatomy during video-assisted thoracoscopic surgery (VATS) or robotic-assisted thoracoscopic surgery (RATS) resection of the lung. These procedures typically involve sequestering a pulmonary vein feeding a given lobe or segment of the lung, introducing the ICG dye systemically, and watching for flow of the ICG dye into the target anatomy.
[0034] Aspects of the disclosure combine NICE technology with fiducial markers. This combination allows clinicians, e.g., pulmonologists, to place fiducial markers with radiographic imagery. Then, surgeons can use NIR technology in surgery to visualize the target in real-time.

[0035] This disclosure features systems and methods that involve placing fiducial markers coated with NIR fluorophores at or near target tissue and visualizing the fiducial markers in real time via an NIR camera while operating a medical instrument, e.g., via controlling a robotic arm holding the medical instrument, to ensure the medical instrument can safely and accurately treat, e.g., resect, the target tissue.
[0036] The systems and methods involve receiving and displaying high resolution preprocedural or preoperative 3D images, e.g., preoperative CT images, to enable a clinician to identify target tissue, e.g., a tumor, to be treated, e.g., resected, in the 3D images. The systems and methods also involve receiving and displaying intraoperative radiographic images of the lung showing the target, and displaying, in the intraoperative radiographic images, placement of at least one radiopaque marker coated with an NIR fluorophore at or near the target in the lung. Then, NIR images of the at least one radiopaque marker are intraoperatively captured and displayed to enable a clinician or robot to visualize the at least one radiopaque marker and thus localize the target for treatment without the need for radiographic imaging.
[0037] In accordance with aspects of the disclosure, the systems and methods of the disclosure using markers coated with an NIR fluorophore to guide treatment of target tissue may be a portion of a larger workflow of a navigation system, such as an electromagnetic navigation system. FIG. 1 is a perspective view of an example of a system 100 for facilitating navigation and placement of medical tools, e.g., markers and resection tools, to a soft-tissue target via airways of the lungs. The system 100 may optionally be configured to generate a three-dimensional (3D) reconstruction of the target area from 2D fluoroscopic images. In the case where the system 100 is configured to generate a 3D reconstruction, intraoperative 2D fluoroscopic images may be captured only during critical parts of a procedure, e.g., to confirm placement of a medical tool in a target, in order to minimize human exposure to X-ray radiation. The system 100 may be further configured to facilitate the approach of a medical tool to the target area by using Electromagnetic Navigation Bronchoscopy (ENB) and to determine the location of the medical tool with respect to the target.
[0038] One aspect of the system 100 is a software component for reviewing computed tomography (CT) image data that has been acquired separately from system 100. The review of the CT image data allows a user to identify one or more targets, plan a pathway to an identified target (planning phase), navigate a catheter 102 to the target (navigation phase) using a user interface, and confirm placement of an EM sensor 104 relative to the target. One such EMN system is the ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY® system, which is referred to as ENB, currently sold by Medtronic PLC. The target may be tissue of interest identified by review of the CT image data during the planning phase. Following navigation, a medical tool, such as a biopsy tool or other tool, may be inserted into the catheter 102 to obtain a tissue sample from the tissue located at, or proximate to, the target.
[0039] As shown in FIG. 1, the catheter 102 is part of a catheter guide assembly 110, which also includes a handle 106 for controlling the catheter 102. In practice, the catheter 102 is inserted into a bronchoscope 108 using the handle 106 for access to a luminal network of the patient P. Specifically, the catheter 102 of the catheter guide assembly 110 may be inserted into a working channel of the bronchoscope 108 for navigation through a patient’s luminal network. A locatable guide (not shown), including an electromagnetic (EM) sensor 104 is inserted into the catheter 102 and locked into position such that the EM sensor 104 extends a desired distance beyond the distal tip of the catheter 102. The position and orientation of the EM sensor 104 relative to the reference coordinate system, and thus the distal portion of the catheter 102, within an electromagnetic field can be derived. The catheter guide assembly 110 is currently marketed and sold by Medtronic PLC under the brand names SUPERDIMENSION® Procedure Kits, or EDGE™ Procedure Kits, and is contemplated as usable with aspects of the disclosure.
[0040] The system 100 generally includes an operating table 112 configured to support a patient P, a bronchoscope 108 configured for insertion through patient P’s mouth into patient P’s airways; monitoring equipment 114 coupled to the bronchoscope 108 (e.g., a video display, for displaying the video images received from the video imaging system of bronchoscope 108); a locating or tracking system 115 including a locating module 116, patient motion sensors 118, and a transmitter mat 120, which may include multiple markers; and a computer system 122 including software and/or hardware used to facilitate identification of a target, pathway planning to the target, navigation of a medical tool to the moving target, and/or confirmation and/or determination of placement of the catheter 102, or a suitable medical tool therethrough, relative to the target. The computer system 122 may be similar to workstation 1001 of FIG. 10 and may be configured to execute the methods of the disclosure including the methods of FIGS. 6 and 9.
[0041] An imaging system 124 capable of acquiring fluoroscopic or x-ray images or video of the patient P is optionally included in some aspects of the system 100. The images, sequence of images, or video captured by the imaging system 124 may be stored within the imaging system 124 or transmitted to the computer system 122 for storage, processing, and display. Additionally, imaging system 124 may move relative to the patient P so that images may be acquired from different angles or perspectives relative to patient P to create a sequence of 2D images, such as a video. The pose of the imaging system 124 relative to the patient P and while capturing the images may be estimated via markers incorporated with the transmitter mat 120. The markers are positioned under patient P, between patient P and operating table 112 and between patient P and a radiation source or a sensing unit of the imaging system 124. The transmitter mat 120 and the markers incorporated therewith may be two separate elements which may be coupled in a fixed manner or alternatively may be manufactured as a single unit. The imaging system 124 may include a single imaging system or more than one imaging system. As illustrated in FIG. 1, the imaging system 124 may include a fluoroscopic imaging system, which is merely used to confirm placement of a medical tool near a target prior to biopsy, treatment, or resection of a target.
[0042] The computer system 122 may be any suitable computer system including a processor and storage medium, wherein the processor is capable of executing instructions stored on the storage medium. The computer system 122 may further include a database configured to store patient data, preoperative CT data sets, navigation plans, optionally fluoroscopic data sets including fluoroscopic images and video, optionally fluoroscopic 3D reconstruction, and any other such data. Although not explicitly illustrated, the computer system 122 may include inputs for, or may otherwise be configured to receive, preoperative CT data sets, optional fluoroscopic images/video, and other data described herein. Additionally, the computer system 122 includes a display configured to display graphical user interfaces. The computer system 122 may be connected to one or more networks through which one or more databases may be accessed.
[0043] With respect to the planning phase, the computer system 122 utilizes previously acquired CT image data for determining regular motion of a patient, e.g., motion caused by respiration, utilizes the same or different previously acquired CT image data for generating and viewing a three-dimensional model or rendering of patient P’s airways, enables the identification of a target on the three-dimensional model (automatically, semi-automatically, or manually), and allows for determining a pathway through patient P’s airways to tissue located at and around the target. More specifically, the CT images acquired from the previous CT scans are processed and assembled into a three-dimensional CT volume, which is then utilized to generate a three-dimensional model of patient P’s airways. The three-dimensional model may be displayed on a display associated with the computer system 122, or in any other suitable fashion. Using the computer system 122, various views of the three-dimensional model or enhanced two-dimensional images generated from the three-dimensional model are presented. The enhanced two-dimensional images may possess some three-dimensional capabilities because they are generated from three-dimensional data. The three-dimensional model may be manipulated to facilitate identification of a target on the three-dimensional model or two-dimensional images, and selection of a suitable pathway through patient P’s airways to access tissue located at the target can be made. Once selected, the pathway plan, three-dimensional model, and images derived therefrom, can be saved and exported to a navigation system for use during the navigation phase(s). One such planning software is the ILLUMISITE® planning suite currently sold by Medtronic PLC.
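The pathway determination described above can be treated as a search through the branching airway tree, from the trachea to the airway segment nearest the target. The following is a minimal sketch only; the airway graph, segment names, and the use of breadth-first search are illustrative assumptions, not details of the ILLUMISITE® software:

```python
from collections import deque

def plan_pathway(airway_tree, start, target):
    """Breadth-first search for a branch sequence from `start` to `target`.

    `airway_tree` maps each airway segment to its child branches.
    Returns the ordered list of segments to traverse, or None if unreachable.
    """
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            return path
        for child in airway_tree.get(node, []):
            if child not in visited:
                visited.add(child)
                queue.append(path + [child])
    return None

# Hypothetical airway tree: trachea -> main bronchi -> lobar -> segmental.
airways = {
    "trachea": ["right_main", "left_main"],
    "right_main": ["RUL", "bronchus_intermedius"],
    "bronchus_intermedius": ["RML", "RLL"],
    "RLL": ["RB9", "RB10"],
}
print(plan_pathway(airways, "trachea", "RB10"))
# -> ['trachea', 'right_main', 'bronchus_intermedius', 'RLL', 'RB10']
```

In a real planner the search would run over a segmented CT airway tree and weight branches by length, diameter, and bend angle rather than hop count.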
[0044] With respect to the navigation phase, a six degrees-of-freedom electromagnetic locating or tracking system 115, or other suitable system for determining location, is utilized for performing registration of the images and the pathway for navigation, although other configurations are also contemplated. The tracking system 115 includes the tracking module 116, the patient motion sensors 118, and the transmitter mat 120 (including the markers). The tracking system 115 is configured for use with a locatable guide and particularly the EM sensor 104. As described above, the locatable guide and the EM sensor 104 are configured for insertion through the catheter 102 into patient P’s airways (either with or without the bronchoscope 108) and are selectively lockable relative to one another via a locking mechanism.
[0045] The transmitter mat 120 is positioned beneath patient P. The transmitter mat 120 generates an electromagnetic field around at least a portion of the patient P within which the positions of the patient motion sensors 118 and the EM sensor 104 can be determined with use of the tracking module 116. A second EM sensor 126 may also be incorporated into the end of the catheter 102. The second EM sensor 126 may be a five degree-of-freedom sensor or a six degree-of-freedom sensor. One or more of the patient motion sensors 118 are attached to the chest of the patient P or at suitable positions on the patient’s body that optimize the sensing of the motion of the patient. The six degrees of freedom coordinates of the patient motion sensors 118 are sent to the computer system 122 (which includes the appropriate software) where they are used to calculate a patient coordinate frame of reference. The six degrees of freedom coordinates of the patient motion sensors 118 are also sent to the computer system 122 where they are used to track the real-time motion of the patient, which may be caused by respiration cycles, e.g., inspiration and expiration, of the patient. Registration is generally performed to coordinate locations of the three-dimensional model and two-dimensional images from the planning phase, with the patient P’s airways as observed through the bronchoscope 108, and allow for the navigation phase to be undertaken with precise knowledge of the location of the EM sensor 104, even in portions of the airway where the bronchoscope 108 cannot reach.

[0046] Registration of the patient P’s location on the transmitter mat 120 may be performed by moving the EM sensor 104 through the airways of the patient P. More specifically, data pertaining to locations of the EM sensor 104, while the locatable guide is moving through the airways, is recorded using the transmitter mat 120, the patient motion sensors 118, and the tracking system 115.
A shape resulting from this location data is compared to an interior geometry of passages of the three-dimensional model generated in the planning phase, and a location correlation between the shape and the three-dimensional model based on the comparison is determined, e.g., utilizing the software on the computer system 122. In addition, the software identifies non-tissue space (e.g., air-filled cavities) in the three-dimensional model. The software aligns, or registers, an image representing a location of the EM sensor 104 with the three-dimensional model and/or two-dimensional images generated from the three-dimensional model, which are based on the recorded location data and an assumption that the locatable guide remains located in non-tissue space in patient P’s airways. Alternatively, a manual registration technique may be employed by navigating the bronchoscope 108 with the EM sensor 104 to pre-specified locations in the lungs of the patient P, and manually correlating the images from the bronchoscope to the model data of the three-dimensional model.

[0047] Though described herein with respect to EMN systems using EM sensors, the instant disclosure is not so limited and may be used in conjunction with flexible sensors, ultrasonic sensors, or other suitable motion sensors. Additionally, the methods described herein may be used in conjunction with robotic systems such that robotic actuators drive the catheter 102, bronchoscope 108, or other medical tool proximate the target. An example of a robotic system is illustrated in FIGS. 7 and 8.
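The location-correlation step described above, in which the shape traced by the EM sensor is fitted to the model's airway geometry, is at its core a rigid point-set registration. The following sketch uses the Kabsch algorithm under the simplifying assumption that corresponding point pairs are already established; the actual software must also solve the harder correspondence problem (e.g., matching the sensed path to airway centerlines):

```python
import numpy as np

def rigid_register(sensor_pts, model_pts):
    """Find rotation R and translation t minimizing ||R @ sensor + t - model||.

    Both inputs are (N, 3) arrays of corresponding points (Kabsch algorithm).
    """
    cs, cm = sensor_pts.mean(axis=0), model_pts.mean(axis=0)
    H = (sensor_pts - cs).T @ (model_pts - cm)      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cm - R @ cs
    return R, t

# Synthetic check: rotate and translate a sensed path, then recover the transform.
rng = np.random.default_rng(0)
path = rng.random((20, 3))                          # stand-in EM sensor positions
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
model = path @ R_true.T + np.array([5.0, -2.0, 1.0])
R, t = rigid_register(path, model)
print(np.allclose(path @ R.T + t, model))
```

Because the example is noise-free, the recovered transform reproduces the model points exactly; real sensor data would additionally require outlier handling and respiratory-motion compensation.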
[0048] Following registration of the patient P to the image data and pathway plan, a user interface is displayed in the navigation software which sets forth the pathway that the clinician is to follow to reach the target. Once the catheter 102 has been successfully navigated proximate the target as depicted on the user interface, the locatable guide may be unlocked from the catheter 102 and removed, leaving the catheter 102 in place as a guide channel for guiding medical tools including without limitation, optical systems, ultrasound probes, NIR fluorophore-coated tools, radiopaque fiducial marker placement tools, resection tools, biopsy tools, ablation tools (e.g., microwave ablation tools), laser probes, cryogenic probes, sensor probes, and aspirating needles to the target. A medical tool may be then inserted through the catheter 102 and navigated to the target or to a specific area adjacent to the target.
[0049] Prior to inserting the medical tool through the catheter 102, a local registration process may optionally be performed for each target to reduce the CT-to-body divergence. In a capture phase of the local registration process, a sequence of fluoroscopic images may be captured and acquired via the imaging system 124, optionally by a user and according to directions displayed via the computer system 122. A fluoroscopic 3D reconstruction may be then generated via the computer system 122. The generation of the fluoroscopic 3D reconstruction is based on the sequence of fluoroscopic images and the projections of structure of markers incorporated with transmitter mat 120 on the sequence of fluoroscopic images. One or more slices of the 3D reconstruction may be then generated based on the pre-operative CT scan and via the computer system 122. The one or more slices of the 3D reconstruction and the fluoroscopic 3D reconstruction may be then displayed to the user on a display via the computer system 122, optionally simultaneously. The slices of the 3D reconstruction may be presented on the user interface in a scrollable format where the user is able to scroll through the slices in series.

[0050] In a marking phase of the local registration process, the clinician may be directed to identify and mark the target while using the slices of the 3D reconstruction as a reference. The user may also be directed to identify and mark the navigation catheter tip in the sequence of fluoroscopic 2D images. An offset between the location of the target and the navigation catheter tip may be then determined or calculated via the computer system 122.
The offset may be then utilized, via the computer system 122, to correct the location and/or orientation of the navigation catheter on the display with respect to the target and/or correct the registration between the three-dimensional model and tracking system 115 in the area of the target and/or generate a local registration between the three-dimensional model and the fluoroscopic 3D reconstruction in the target area.
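Once the target and catheter tip have been marked in the fluoroscopic 3D reconstruction, the offset correction reduces to simple vector arithmetic. The sketch below covers only the translational component; the function name and coordinate values are illustrative assumptions, not taken from the described system:

```python
import numpy as np

def corrected_tip_display(tracked_tip, marked_tip, marked_target):
    """Shift the tracked tip position by the fluoroscopically observed offset.

    `tracked_tip`  : tip position reported by the EM tracking system.
    `marked_tip`   : tip position the clinician marked in the 3D reconstruction.
    `marked_target`: target position marked in the 3D reconstruction.
    Returns the corrected tip position and the tip-to-target vector.
    """
    divergence = marked_tip - tracked_tip   # local CT-to-body divergence estimate
    corrected = tracked_tip + divergence    # displayed tip position
    to_target = marked_target - corrected   # remaining vector to the target
    return corrected, to_target

# Illustrative coordinates (mm) in a common reference frame.
tip_em = np.array([10.0, 42.0, -3.0])
tip_fluoro = np.array([11.5, 40.0, -2.0])
target_fluoro = np.array([14.5, 38.0, -1.0])
corrected, offset = corrected_tip_display(tip_em, tip_fluoro, target_fluoro)
print(offset, np.linalg.norm(offset))  # tip-to-target vector and distance
```

In the actual workflow, this correction would also update the catheter orientation and the local registration around the target, not just the displayed tip position.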
[0051] In an optional confirmation phase of the local registration process, a fluoroscopic 3D reconstruction is displayed in a confirmation screen. The confirmation screen may include a slider that may be selected and moved by the user to review a video loop of the fluoroscopic 3D reconstruction, which shows the marked target and navigation catheter tip from different perspectives. After confirming that there are marks on the target and navigation catheter tip throughout the video, the clinician may select an “Accept” button, at which point the local registration process ends and the position of the navigation catheter is updated. The clinician may then use the navigation views in, for example, the peripheral navigation screen to fine tune the alignment of the navigation catheter to the target before beginning an endoscopic procedure.
[0052] After the local registration process, the clinician or robotic arm may insert a medical tool in the catheter 102 and advance the medical tool towards the target. While advancing the medical tool towards the target, the clinician may view a user interface screen which includes a 3D medical tool tip view of a 3D model of a target, in which the 3D model of the target is moved according to motion of the target determined from pre-operative CT scans of a patient’s motion and the motion sensed by the patient motion sensors. This user interface screen allows the clinician not only to see the medical tool in real-time, but also to see whether the medical tool is aligned with the moving target. The user interface screen may also provide a graphical indication of whether the medical tool is aligned in three dimensions with the target. For example, when the medical tool is aligned in three dimensions with the target, the user interface shows the target overlay in a first color, e.g., green. On the other hand, when the medical tool is not aligned with the target in three dimensions, the user interface shows the target overlay in a second color different from the first color, e.g., orange or red.
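The three-dimensional alignment indicator described above can be sketched as a geometric test: project the vector from the tool tip to the target center onto the tool axis and compare the perpendicular miss distance to the target radius. The function name, threshold logic, and coordinate values below are illustrative assumptions, not the actual user-interface implementation:

```python
import numpy as np

def alignment_color(tip, direction, target_center, target_radius):
    """Return 'green' if the tool axis passes through the target, else 'orange'.

    `tip` is the tool tip position and `direction` its pointing vector.
    """
    d = direction / np.linalg.norm(direction)
    v = target_center - tip
    along = v @ d                       # distance to target along the tool axis
    if along < 0:                       # target is behind the tool tip
        return "orange"
    miss = np.linalg.norm(v - along * d)    # perpendicular miss distance
    return "green" if miss <= target_radius else "orange"

tip = np.array([0.0, 0.0, 0.0])
print(alignment_color(tip, np.array([0.0, 0.0, 1.0]),
                      np.array([1.0, 0.0, 30.0]), 5.0))   # aligned
print(alignment_color(tip, np.array([0.0, 1.0, 1.0]),
                      np.array([1.0, 0.0, 30.0]), 5.0))   # misaligned
```

A real implementation would evaluate this test continuously against the motion-compensated target model so the overlay color updates as the target moves with respiration.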
[0053] FIG. 2 illustrates a nitinol coil fiducial marker 200, which is an example of a fluorophore-coated marker that may be placed at or near a target within an air passage of the lung. The nitinol coil fiducial marker 200 includes a metallic seed 202 and a nitinol wire coil 204 secured to the metallic seed 202. The metallic seed 202 may be made of pure gold, a gold alloy, or any other pure metal or metal alloy that is radiopaque. The shape memory in the nitinol wire coil 204 may be designed to secure the nitinol coil fiducial marker in place and minimize migration. As shown in FIG. 2, the metallic seed 202 is coated with an indocyanine green (ICG) fluorophore 205.
[0054] In aspects, the metallic seed 202 may be coated with any other NIR fluorophore suitable for use in a mammalian body. The NIR fluorophore may include an inorganic fluorophore, an organic dye, and/or a fluorescent protein (FP). The advantages of inorganic fluorophores include prolonged emission wavelength and high fluorescence quantum yield (QY), which is the ratio of the number of photons emitted by the fluorophore to the number of photons absorbed by the fluorophore. These advantages make inorganic fluorophores suitable for in vivo bioimaging in deeper tissue areas. The inorganic fluorophore may include single-walled carbon nanotubes (SWCNTs), quantum dots, and rare-earth-doped nanoparticles (RENPs). The advantages of quantum dots are that they provide high QY (e.g., 15% versus 0.5% for SWCNTs), high resistance to photobleaching, and long emission wavelengths.
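As a quick numeric illustration of quantum yield (QY) as defined above, the ratio of emitted to absorbed photons, the helper below is a sketch with hypothetical photon counts chosen to match the example percentages:

```python
def quantum_yield(photons_emitted, photons_absorbed):
    """Fluorescence quantum yield: emitted photons / absorbed photons."""
    if photons_absorbed <= 0:
        raise ValueError("photons_absorbed must be positive")
    return photons_emitted / photons_absorbed

# Comparing the example figures above: quantum dots (~15%) vs. SWCNTs (~0.5%).
qd_qy = quantum_yield(15, 100)     # 0.15, i.e. 15%
swcnt_qy = quantum_yield(1, 200)   # 0.005, i.e. 0.5%
```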
[0055] FIG. 3 is a perspective view that illustrates band fiducial markers 300a, 300b coated with the indocyanine green (ICG) fluorophore 205. The band fiducial marker 300a includes three metallic seeds 302a and two columns 304a connecting the three metallic seeds 302a. The band fiducial marker 300b includes two metallic seeds 302b and one column 304b connecting the two metallic seeds 302b. The metallic seeds 302a, 302b may be made of pure gold, a gold alloy, or any other pure metal or metal alloy that is radiopaque. The columns 304a, 304b may be designed to be strong enough to allow the band fiducial markers 300a, 300b to be pushed into tissue. The metallic seeds 302a, 302b are coated with the indocyanine green (ICG) fluorophore 205.
[0056] FIG. 4 is a perspective view that illustrates a guidewire 410 and a catheter 420 configured to place fiducial markers within lung tissue. The curve 422 and the coating 424 of the catheter 420 facilitate the placement of fiducial markers in soft lung tissue. The catheter 420 may be placed through a working channel of the bronchoscope 108 and the guidewire 410, in turn, may be placed through the catheter 420. In this manner, the guidewire 410 and the catheter 420 may be operated to place fiducial markers, e.g., the ICG fluorophore-coated fiducial markers illustrated in FIGS. 2 and 3.
[0057] FIG. 5 is a diagram that illustrates a video assisted thoracoscopic surgery (VATS) after placement of fiducial markers 525 using a bronchoscope 522. The video assisted thoracoscopic surgery (VATS) involves the use of at least two surgical instruments. The thoracoscope 524 may include a flexible catheter, and a small video camera and visible and NIR light emitters disposed at the distal portion of the flexible catheter. As shown in FIG. 5, fiducial markers coated with ICG fluorophore 525a, 525b are placed in bronchi 514 surrounding a tumor 515. The bronchoscope 522, which is navigated through the trachea 512 and bronchi 514 of the patient’s lung, may be used together with the guidewire 410 and/or catheter 420 of FIG. 4 to place the fiducial markers 525a, 525b.
[0058] Once the fiducial markers coated with ICG fluorophore 525a, 525b are placed in bronchi 514, a VATS procedure using the NIR emitters to excite fluorescence of the ICG fluorophore may be performed to locate and resect the tumor 515. For example, as shown in FIG. 5, the clinician may operate the thoracoscope 524 to cause the NIR emitters to emit NIR radiation and excite fluorescence of the ICG fluorophore, which may be captured by the video camera. In this way, the clinician is not only able to locate the tumor 515, but also visualize adjacent lung tissue that should not be resected or harmed in any way. In some aspects, the bronchoscope 522 may also include an NIR emitter that emits NIR radiation to excite fluorescence of the ICG fluorophore from a different perspective to increase fluorescence and thus enhance visualization of the fiducial markers.
[0059] FIG. 6 is a flowchart of an example of a method for visualizing a marker. At block 602, preoperative radiographic images of a lung showing a target are received. The preoperative radiographic images may be computed tomography (CT) images, cone beam computed tomography (CBCT) images, or 3D fluoroscopic images. In some aspects, the preoperative radiographic images may be substituted with or augmented by preoperative magnetic resonance imaging (MRI) images. At block 604, intraoperative radiographic images of the lung showing the target are received. The intraoperative radiographic images may be captured by a radiographic imaging system that minimizes radiation exposure to the patient and clinicians. For example, the radiographic imaging system may be a C-arm fluoroscope, a CBCT system, or a 3D fluoroscopic system.
[0060] At block 606, the preoperative radiographic images of the lung showing the target are displayed. The preoperative radiographic images may be displayed on a display or monitor arranged in an operation room such that clinicians can easily view the images while performing a procedure on the lungs. At block 607, placement of at least one radiopaque marker coated with an NIR fluorophore at or near the target in the lung is displayed in the intraoperative radiographic images.
[0061] At block 608, NIR images of the at least one radiopaque marker are captured intraoperatively. Block 608 may involve emitting or directing NIR radiation towards the at least one radiopaque marker coated with an NIR fluorophore at or near the target in the lung. The NIR radiation may be emitted from an NIR emitter, e.g., a semiconductor emitter, disposed near the visible light source of the laparoscope, thoracoscope, bronchoscope, and/or any other type of endoscope used in the procedure. At block 610, the NIR images of the at least one radiopaque marker are displayed to the clinicians.
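The workflow of blocks 602 through 610 can be sketched schematically as below. The callables standing in for the NIR emitter, NIR camera, and operating-room display are hypothetical placeholders, not an actual device API:

```python
def visualize_marker_workflow(preop_images, intraop_images,
                              emit_nir, capture_nir, display):
    """Schematic sketch of the FIG. 6 method (blocks 602-610).

    preop_images / intraop_images: already-received image sets (blocks 602/604).
    emit_nir, capture_nir, display: stand-ins for the NIR emitter, the
    NIR camera, and the display or monitor in the operating room.
    """
    display("preoperative", preop_images)   # block 606: show preop images
    display("placement", intraop_images)    # block 607: show marker placement
    emit_nir()                              # block 608: excite the fluorophore
    nir_images = capture_nir()              # block 608: capture NIR images
    display("nir", nir_images)              # block 610: show NIR images
    return nir_images
```

With simple stand-ins for the hardware, the function returns whatever the NIR camera produced after the display calls have been made in block order.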
[0062] FIG. 7 is a block diagram that illustrates a robotic surgical system 700 that may be used to perform the bronchoscopic and/or thoracoscopic procedures in accordance with some aspects of this disclosure. The robotic surgical system 700 includes a first robotic arm 702 and a second robotic arm 704 attached to robotic arm bases 706 and 708, respectively. The first robotic arm 702 and the second robotic arm 704 include a first end effector 716 and a second end effector 718, respectively. The end effectors 716, 718 may include robotic manipulators or grippers suitable for operating endoscopic catheters and medical tools of this disclosure. The first end effector 716 operates one or more medical tools 712, including a resection tool for resecting a lung tumor. The second end effector 718 operates a sheath or catheter 710, e.g., a bronchoscope or a thoracoscope, which may include one or more working channels for receiving and guiding the one or more medical tools 712.
[0063] The robotic surgical system 700 may further include an electromagnetic (EM) generator 714, which is configured to generate an EM field, which is sensed by an EM sensor incorporated into or disposed on the one or more medical tools 712 and by EM patient motion sensors (not shown) disposed on and/or in the patient. In aspects, the EM generator 714 may be embedded in the operating table 715 or may be incorporated into a pad that may be placed between the operating table 715 and the patient 711.
[0064] The first and second robotic arms 702, 704 may be controlled to align the end effectors 716 and 718 such that proximal end portions of the catheters 710a, 710b are distal to the proximal end portions of the one or more tools 712, and such that the one or more tools 712 remain axially aligned with the catheters 710a, 710b.
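The alignment constraint described above, shared axes with the catheter's proximal end distal to the tool's, can be sketched as a geometric check. The angle tolerance and the coordinate convention (positions measured along the insertion direction, increasing distally) are assumptions for illustration:

```python
import numpy as np

def catheters_aligned(tool_axis, catheter_axis,
                      tool_proximal_z, catheter_proximal_z,
                      angle_tol_deg=1.0):
    """Check the FIG. 7 alignment constraint (assumed tolerance).

    tool_axis, catheter_axis: direction vectors of tool and catheter.
    *_proximal_z: proximal-end positions along the insertion direction,
    increasing distally (an assumed convention).
    """
    a = np.asarray(tool_axis, dtype=float)
    b = np.asarray(catheter_axis, dtype=float)
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    axially_aligned = angle_deg <= angle_tol_deg
    # The catheter's proximal end must sit distal to the tool's proximal end.
    ordering_ok = catheter_proximal_z > tool_proximal_z
    return bool(axially_aligned and ordering_ok)
```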
[0065] In one aspect, the first robotic arm 702 inserts the bronchoscopic catheter 710a through, for example, a tracheal tube (not shown) in the mouth of the patient 711, and into the bronchial system of the patient 711. Then, the second robotic arm 704 inserts the one or more tools 712, e.g., a resection tool, through the bronchoscopic catheter 710a to reach a target within the bronchial system of the patient 711. The first and second robotic arms 702, 704 may move the catheter 710 and one or more tools 712, e.g., a resection tool, axially relative to each other and into or out of the patient 711 under the control of a surgeon (not shown) at a control console (not shown).
[0066] A navigation phase may include advancing a bronchoscopic catheter 710a or a thoracoscopic catheter 710b along with the one or more tools 712 into the patient 711, and then advancing the one or more tools 712 beyond the distal end portions of the catheters 710a, 710b to reach a desired destination such as a location at or near a target, e.g., a lung tumor. Other modes of navigation may be used, such as by using a guide wire through a working channel of the bronchoscopic catheter 710a. The surgeon may use a visual guidance modality or a combination of visual guidance modalities to aid in navigating to a lung tumor, placing an NIR fluorophore-coated, radiopaque fiducial marker, and performing the lung tumor biopsy or resection procedure, such as fluoroscopy, video, computed tomography (CT), or magnetic resonance imaging (MRI).
[0067] In aspects, the one or more medical tools 712 are deployed through longitudinally-aligned working channels within the bronchoscopic catheter 710a or the thoracoscopic catheter 710b. In the case of the bronchoscopic catheter 710a, the one or more medical tools 712 may be deployed to place the NIR fluorophore-coated fiducial markers in lung bronchi near the lung tumor. Thereafter, the thoracoscopic catheter 710b having an NIR light source and an NIR camera may be used to locate and visualize the position and size of the lung tumor based on the fluorescence of the NIR fluorophore-coated fiducial markers. Then, one or more medical tools 712 may be deployed via the thoracoscopic catheter 710b to perform a biopsy or resection procedure. In aspects, the robotic arms 702, 704 may include three joints 701 and three arm segments 705. In other aspects, the robotic arms 702, 704 may include more or fewer than three joints 701 and three arm segments 705.
[0068] FIG. 8 is a block diagram that illustrates a robotic control system 800 for controlling the robotic surgical system 700 of FIG. 7. The robotic control system 800 includes a control system 810, which controls the robotic surgical system 700. For example, the control system 810 may execute the method 900 of FIG. 9 described herein. The control system 810 may interface with a display 822, a user controller 825, an NIR light source 821, an endoscopic NIR camera 830, and an endoscopic video camera 826. The control system 810 may be coupled to the robotic surgical system 700, directly or indirectly, e.g., by wireless communication. The control system 810 includes a processor 812, a memory 814 coupled to the processor 812, a random access memory (RAM) 816 coupled to the processor 812, and a communications interface 818 coupled to the processor 812. The processor 812 may include one or more hardware processors. The control system 810 may be a stationary computer, such as a personal computer, or a portable computer such as a tablet computer. Alternatively, the control system 810 may be incorporated into one of the robotic arm bases 706, 708. The control system 810 may also interface with a user controller 825, which may be used by a surgeon to control the robotic arm system 824 to perform a lung tumor resection procedure.
[0069] It should be appreciated by those skilled in the art that the memory 814 may be any computer-readable storage media that can be accessed by the processor 812. That is, computer-readable storage media may include non-transitory, volatile, and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media may include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, Blu-Ray, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information, and which may be accessed by processor 812.
[0070] An application stored in the memory 814 may, when executed by processor 812, cause display 822 to present a user interface (not shown). The user interface may be configured to present to the user images from the endoscopic video camera 826 and the endoscopic NIR camera 830. Optionally, the user interface may be further configured to direct the user to select the target by, among other things, identifying and marking the target in the image data.
[0071] Communications interface 818 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet. Communications interface 818 may be used to connect the control system 810 with the endoscopic video camera 826 and the endoscopic NIR camera 830. Communications interface 818 may be also used to receive image data from the memory 814 and path planning data. Communications interface 818 may also be coupled to or in communication with one or more patient motion sensors (not shown). The control system 810 may also include an input device (not shown), which may be any device through which a user may interact with the control system 810, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface. The control system 810 may also include an output module (not shown), which may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
[0072] FIG. 9 is a flowchart of an example of a method for controlling one or more robotic arms, e.g., robotic arms 702, 704 of FIG. 7, to orient and advance a catheter, e.g., the bronchoscopic catheter or the thoracoscopic catheter relative to a target, e.g., a lung tumor, place NIR fluorophore-coated, radiopaque fiducial markers at or near the target, and perform a biopsy, treatment, or resection procedure. Blocks 602-607 are performed as in FIG. 6. At block 902, the at least one NIR fluorophore-coated, radiopaque marker and the distal portion of the medical tool are illuminated with NIR light. At block 904, NIR images of the at least one radiopaque marker and the distal portion of the medical tool are intraoperatively captured. Then, at block 906, the medical tool is controlled by one or more robotic arms to advance towards the target based on the NIR images of the at least one radiopaque marker and the distal portion of the medical tool.
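The closed-loop advance of block 906 can be sketched as repeated steps of the tool tip toward the marker position segmented from the NIR images. Segmentation and the robot command interface are abstracted away, and the function name, step size, and stopping threshold are assumed values for illustration:

```python
import numpy as np

def advance_step(tool_tip_px, marker_px, step_px=5.0, stop_px=3.0):
    """One iteration of block 906: return the next commanded tip position,
    or None when the tip is close enough to hand off to biopsy/resection.

    tool_tip_px, marker_px: positions of the coated tool tip and the
    fluorescing marker segmented from the intraoperative NIR images.
    """
    tip = np.asarray(tool_tip_px, dtype=float)
    marker = np.asarray(marker_px, dtype=float)
    error = marker - tip
    dist = np.linalg.norm(error)
    if dist <= stop_px:
        return None  # at the target; stop advancing
    # Take a fixed-length step along the unit vector toward the marker.
    return tip + step_px * error / dist
```

In a real system each returned position would be converted into joint commands for the robotic arm, and the loop would re-run on every new NIR frame.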
[0073] Reference is now made to FIG. 10, which is a schematic diagram of a system 1000 configured for use with the methods of the disclosure including the methods of FIGS. 6 and 9. The system 1000 may include a workstation 1001, and optionally an imaging system 1015, e.g., a fluoroscopic imaging system and/or a CT imaging system for capturing preoperative 3D images. In some aspects, the workstation 1001 may be coupled with the imaging system 1015, directly or indirectly, e.g., by wireless communication. The workstation 1001 may include a memory 1002, a processor 1004, a display 1006 and an input device 1010. The processor 1004 may include one or more hardware processors. The workstation 1001 may optionally include an output module 1012 and a network interface 1008. The memory 1002 may store an application 1018 and image data 1014. The application 1018 may include instructions executable by the processor 1004 for executing the methods of the disclosure including the methods of FIGS. 6 and 9.
[0074] The application 1018 may further include a user interface 1016. The image data 1014 may include preoperative CT image data, fluoroscopic image data, or fluoroscopic 3D reconstruction data. The processor 1004 may be coupled with the memory 1002, the display 1006, the input device 1010, the output module 1012, the network interface 1008, and the imaging system 1015. The workstation 1001 may be a stationary computer system, such as a personal computer, or a portable computer system such as a tablet computer. The workstation 1001 may embed multiple computers.
[0075] The memory 1002 may include any non-transitory computer-readable storage media for storing data and/or software including instructions that are executable by the processor 1004 and which control the operation of the workstation 1001 and, in some aspects, may also control the operation of the imaging system 1015. The imaging system 1015 may be used to capture a sequence of preoperative CT images of a portion of a patient’s body, e.g., the lungs, as the portion of the patient’s body moves, e.g., as the lungs move during a respiratory cycle. Optionally, the imaging system 1015 may include a fluoroscopic imaging system that captures a sequence of fluoroscopic images, based on which a fluoroscopic 3D reconstruction is generated, and that captures a live 2D fluoroscopic view to confirm placement of a medical tool. In one aspect, the memory 1002 may include one or more storage devices such as solid-state storage devices, e.g., flash memory chips. Alternatively, or in addition to the one or more solid-state storage devices, the memory 1002 may include one or more mass storage devices connected to the processor 1004 through a mass storage controller (not shown) and a communications bus (not shown).
[0076] Although the description of computer-readable media contained herein refers to solid-state storage, it should be appreciated by those skilled in the art that computer-readable storage media can be any available media that can be accessed by the processor 1004. That is, computer-readable storage media may include non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media may include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information, and which may be accessed by workstation 1001.
[0077] The application 1018 may, when executed by the processor 1004, cause the display 1006 to present the user interface 1016. The user interface 1016 may be configured to present to the user a single screen including a three-dimensional (3D) view of a 3D model of a target from the perspective of a tip of a medical tool, a live two-dimensional (2D) fluoroscopic view showing the medical tool, and a target mark, which corresponds to the 3D model of the target, overlaid on the live 2D fluoroscopic view. The user interface 1016 may be further configured to display the target mark in different colors depending on whether the medical tool tip is aligned with the target in three dimensions.
[0078] The network interface 1008 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the Internet. The network interface 1008 may be used to connect between the workstation 1001 and the imaging system 1015. The network interface 1008 may be also used to receive the image data 1014. The input device 1010 may be any device by which a user may interact with the workstation 1001, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface. The output module 1012 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
From the foregoing and with reference to the various figures, those skilled in the art will appreciate that certain modifications can be made to the disclosure without departing from the scope of the disclosure.
[0079] It should be understood that various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the techniques). In addition, while certain aspects of this disclosure are described as being performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a medical device.
[0080] In one or more examples, the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
[0081] Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.

Claims

WHAT IS CLAIMED IS:
1. A method comprising:
receiving preoperative radiographic images of a lung showing a target;
receiving intraoperative radiographic images of the lung showing the target;
displaying the preoperative radiographic images of the lung showing the target;
displaying, in the intraoperative radiographic images, placement of at least one radiopaque marker coated with a near-infrared (NIR) fluorophore;
intraoperatively capturing NIR images of the at least one radiopaque marker; and
displaying the NIR images of the at least one radiopaque marker.
2. The method of claim 1, wherein placing the at least one radiopaque marker includes placing the at least one radiopaque marker at or near the target.
3. The method of claim 2, further comprising:
illuminating, with NIR light, a medical device at least partially coated with the NIR fluorophore and advancing towards the at least one radiopaque marker;
intraoperatively capturing NIR images of the medical device advancing towards the at least one radiopaque marker; and
displaying the NIR images of the medical device advancing towards the at least one radiopaque marker.
4. The method of claim 3, further comprising displaying the NIR images of the medical device treating the target at or near the at least one radiopaque marker.
5. The method of claim 4, wherein treating the target includes resecting the target.
6. The method of claim 1, wherein the NIR fluorophore includes indocyanine green (ICG) dye, methylene blue (MB) dye, single-walled carbon nanotubes (SWCNTs), quantum dots, or rare-earth-doped nanoparticles (RENPs).
7. The method of claim 1, wherein the preoperative radiographic images are computed tomography (CT) images, magnetic resonance imaging (MRI) images, fluoroscopic images, or cone-beam CT (CBCT) images.
8. The method of claim 1, further comprising illuminating the at least one radiopaque marker with NIR light.
9. A method comprising:
receiving preoperative radiographic images of a lung showing a target;
receiving intraoperative radiographic images of the lung showing the target;
displaying the preoperative radiographic images of the lung showing the target;
displaying, in the intraoperative radiographic images, placement of at least one radiopaque marker coated with a near-infrared (NIR) fluorophore at or near the target in the lung;
intraoperatively capturing NIR images of the at least one radiopaque marker; and
controlling, with a robotic end effector, a medical device to advance towards the target based on the NIR images.
10. The method of claim 9, wherein a distal portion of the medical device is coated with the NIR fluorophore, and wherein the method further comprises:
illuminating the at least one radiopaque marker and the distal portion of the medical device with NIR light;
intraoperatively capturing NIR images of the at least one radiopaque marker and the distal portion of the medical device; and
controlling the medical device to advance towards the target based on the NIR images of the at least one radiopaque marker and the distal portion of the medical device.
11. The method of claim 9, further comprising controlling the medical device to treat the target based on the NIR images.
12. The method of claim 11, further comprising controlling the medical device to biopsy or resect the target based on the NIR images.
13. The method of claim 9, further comprising illuminating the at least one radiopaque marker with NIR light.
14. A system comprising:
at least one radiopaque marker coated with a near-infrared (NIR) fluorophore;
an endoluminal instrument configured to place the at least one radiopaque marker within a lung of a patient;
an NIR camera configured to capture NIR video of the at least one radiopaque marker;
a treatment instrument;
a processor; and
a memory having stored thereon instructions, which, when executed by the processor, cause the processor to:
receive preoperative radiographic images of a lung showing a target;
receive intraoperative radiographic images of the lung showing the target;
display the preoperative radiographic images of the lung showing the target;
display, in the intraoperative radiographic images, the endoluminal instrument placing the at least one radiopaque marker coated with the NIR fluorophore at or near the target in the lung; and
intraoperatively capture NIR video of the at least one radiopaque marker to facilitate treatment of the target by the treatment instrument.
15. The system of claim 14, further comprising a robotic end-effector configured to hold and manipulate the treatment instrument, wherein the instructions, when executed by the processor, further cause the processor to control the robotic end-effector to advance the treatment instrument towards the target and to treat the target with the treatment instrument based on the NIR video.
16. The system of claim 14, further comprising an NIR light source configured to illuminate the at least one radiopaque marker with NIR light.
17. The system of claim 16, wherein the NIR light source is coupled to the treatment instrument.
18. The system of claim 14, wherein the treatment instrument is a thoracoscopic surgical instrument.
19. The system of claim 14, wherein the treatment instrument is configured to perform a lobectomy, a segmentectomy, or a wedge resection.
20. The system of claim 14, wherein the instructions, when executed by the processor, further cause the processor to display, in the intraoperative radiographic images, the endoluminal instrument placing two or more radiopaque markers coated with a NIR fluorophore in two or more air passages of the lung at or near the target.
PCT/IB2024/050797 2023-02-01 2024-01-29 Localization and treatment of target tissue using markers coated with near-infrared fluorophores Ceased WO2024161274A1 (en)


Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363442667P 2023-02-01 2023-02-01
US63/442,667 2023-02-01

Publications (1)

Publication Number Publication Date
WO2024161274A1 true WO2024161274A1 (en) 2024-08-08

Family

ID=89834235


Country Status (3)

Country Link
EP (1) EP4658201A1 (en)
CN (1) CN120787144A (en)
WO (1) WO2024161274A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170020628A1 (en) * 2013-11-25 2017-01-26 Body Vision Medical Ltd. Surgical devices and methods of use thereof
US20210077195A1 (en) * 2018-05-16 2021-03-18 University Of Maryland, College Park Confidence-based robotically-assisted surgery system


Also Published As

Publication number Publication date
EP4658201A1 (en) 2025-12-10
CN120787144A (en) 2025-10-14


Legal Events

121 Ep: the EPO has been informed by WIPO that EP was designated in this application (ref document number 24703245; country of ref document: EP; kind code of ref document: A1)
WWE Wipo information: entry into national phase (ref document number 202480010464.2; country of ref document: CN)
NENP Non-entry into the national phase (ref country code: DE)
WWP Wipo information: published in national office (ref document number 202480010464.2; country of ref document: CN)
WWP Wipo information: published in national office (ref document number 2024703245; country of ref document: EP)