WO2001065490A2 - Method and system for selecting and displaying a portion of an image of a body - Google Patents
Method and system for selecting and displaying a portion of an image of a body
- Publication number
- WO2001065490A2 (PCT/IL2001/000183)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pointer
- viewpoint
- stylus
- target point
- image volume
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
Definitions
- the present invention relates to interactive displays and, more particularly, to a method and system for selecting and displaying a portion of an image of a body.
- Slices of these image volumes commonly are used to guide surgical procedures, but in a manner derived from the old days of X-ray film: many successive slices of the image volume, taken in a predetermined orientation relative to the image volume, are posted on a chart in the operating room, and the surgeon performs surgery on the patient with reference to these static images.
- the digital image volume is registered to the body of the patient.
- the medical instrument is provided with a magnetic field sensor that is used to measure the position and orientation of the medical instrument.
- the fiducial marker includes a similar magnetic field sensor, so that an icon representing the medical instrument can be displayed, superposed on a display of the 3D digital image, to represent the true position and orientation of the medical instrument.
- a slice of the 3D digital image may be extracted for display in any convenient (vertical, horizontal, oblique) orientation relative to the (typically parallelepipedal) 3D data volume.
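As an illustrative aside (not part of the patent text): extracting such an arbitrarily oriented slice from a voxel array amounts to interpolating the volume on a regular grid of points spanning the requested plane. The sketch below assumes a NumPy volume indexed (z, y, x) with unit voxel spacing; the function name is invented.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def extract_oblique_slice(volume, center, u, v, size=256, spacing=1.0):
    """Resample a size x size slice of `volume`, centred on `center` (z, y, x)
    and spanned by the orthonormal in-plane directions `u` and `v`."""
    u = np.asarray(u, float) / np.linalg.norm(u)
    v = np.asarray(v, float) / np.linalg.norm(v)
    offsets = (np.arange(size) - (size - 1) / 2.0) * spacing
    gu, gv = np.meshgrid(offsets, offsets, indexing="ij")
    # voxel coordinates of every pixel of the requested plane
    coords = (np.asarray(center, float)[:, None, None]
              + u[:, None, None] * gu
              + v[:, None, None] * gv)
    # trilinear interpolation; points outside the volume read as 0
    return map_coordinates(volume, coords, order=1, mode="constant", cval=0.0)
```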
- a method of displaying a portion of an image volume of a body having a surface, including the steps of: (a) registering the image volume with the body;
- a memory for storing the image volume; (c) a processor for retrieving the portion of the image volume from the memory, with reference to the viewpoint; and (d) a monitor for displaying the retrieved portion of the image volume;
- a device for input by a user, including: (a) a handle; (b) a stylus, operationally connected to the handle; and (c) a mechanism for providing signals representative of a location and an orientation of an element of the device selected from the group consisting of the stylus and the handle.
- the primary application of the present invention is to the interactive display of slices of a 3D image volume of the body of a surgical patient, and the present invention is described herein with reference to this primary application. Nevertheless, it is to be understood that the scope of the present invention includes the interactive display of any body, animate or inanimate, for which a 3D image volume has been acquired.
- a "viewpoint" is an abstract geometric entity, in the space occupied by and surrounding the body, that defines the portion of the image volume to be displayed.
- the term "viewpoint" is patterned after the use of this term to describe the relationship of a camera to an object that the camera is photographing.
- the viewpoint may include as many
- the "viewpoints" of the present invention generally include a target point that may be
- the "viewpoints" of the present invention may also include orientations with respect to the body and planes defined with reference to the body.
- the method of the present invention begins by registering the image volume with the body.
- fiducial points on the body are matched to corresponding reference points in the image volume. According to another preferred embodiment of the present invention, coordinates of a feature of the body are sampled and matched to a representation of that feature in the image volume.
- the feature may be the surface of the body, or alternatively a feature internal to the body.
- the coordinates of the surface of the skeletal bone are sampled using an echolocation procedure such as ultrasound.
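The patent does not spell out the matching algorithm; a standard choice for paired fiducial points is a least-squares rigid fit (Kabsch/Umeyama). The sketch below is offered only as an illustration under that assumption; the function name is invented.

```python
import numpy as np

def fit_rigid_transform(body_points, image_points):
    """Least-squares rigid transform (R, t) mapping measured fiducial
    coordinates onto their reference points in the image volume.
    body_points, image_points: (N, 3) arrays of paired coordinates."""
    P = np.asarray(body_points, float)
    Q = np.asarray(image_points, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # proper rotation, det(R) = +1
    t = cq - R @ cp
    return R, t                               # image_point ≈ R @ body_point + t
```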
- the term "viewpoint style" is used herein to refer to the qualitative aspects of the viewpoint. These qualitative aspects include, but are not necessarily limited to, the type of coordinates included in the viewpoint.
- once the viewpoint style is chosen, the remaining aspects of the viewpoint, including specific values of the coordinates, are selected.
- One preferred viewpoint of the present invention includes a target point on the surface of the body, together with a display orientation.
- One or more slices of the image volume are displayed with reference to the target point and the display orientation.
- the stylus is perpendicular to the handle.
- the tip of the stylus is pointed at the target point and the handle is pivoted to indicate the display orientation.
- This hand-held pointer constitutes an invention in its own right: a six-degree-of-freedom input/control device.
- the image volume is stored in a memory, and slices of the image volume are retrieved from the memory and displayed on a monitor, under the control of a processor.
- the pointer includes a mechanism for providing signals representative both of the location of the tip of the stylus and of the spatial orientation of the pointer itself. These signals are sent to the processor, which retrieves and displays the slices of the image volume in accordance with these signals.
- a preferred magnetic embodiment of this mechanism includes coils rigidly mounted in the handle of the pointer.
- a preferred optical embodiment of this mechanism includes light emitting diodes mounted on the pointer.
- the display orientation of the slices of the image volume usually coincides with the pivot orientation of the pointer.
- other viewpoint styles are also defined with reference to the body using the hand-held pointer.
- the pointer functions as a virtual camera, with the tip of the stylus defining the three spatial coordinates of the virtual camera and with the pivot orientation of the pointer defining the pitch, yaw and roll of the virtual camera.
- the stylus defines an axis that runs through the tip of the stylus, and pointing the stylus at the body defines, as a viewpoint, a point along the axis within the body.
- the display consists of three mutually perpendicular planes of the image volume whose common intersection point is the viewpoint.
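For illustration only: with an axis-aligned voxel array, the three mutually perpendicular slices through a viewpoint reduce to simple array indexing. The sketch assumes (z, y, x) voxel indices; the function name is invented.

```python
import numpy as np

def orthogonal_slices(volume, viewpoint):
    """Return the three axis-aligned slices of `volume` whose common
    intersection point is `viewpoint` = (z, y, x) in voxel indices."""
    z, y, x = (int(round(c)) for c in viewpoint)
    return volume[z, :, :], volume[:, y, :], volume[:, :, x]
```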
- both the stylus and the handle define respective axes.
- the display again consists of three mutually perpendicular slices of the image volume: a first slice in a plane that includes the stylus axis and that is perpendicular to the body axis, a second slice in a plane parallel to the plane defined
- the pointer functions in two modes, a control mode and a viewpoint definition mode.
- in control mode, the pointer is used to control the operation of the system: the pointer sends control signals to the processor, and these control signals include, inter alia, signals that define the desired viewpoint style.
- the signals are sent from the pointer to the processor via a datalink.
- FIG. 1 is an illustration of a system of the present invention
- FIG. 2 illustrates the selection of slices of the image volume for display
- FIG. 3 shows two preferred embodiments of the pointer of FIG. 1
- FIGS. 4 and 5 illustrate two other methods of selecting slices of the image volume for display.
- the present invention is of a method and system for interactively displaying selected portions of an image of a body.
- the present invention can be used to display slices of an image volume of the body of a patient immediately prior to or during a surgical procedure.
- Figure 1 illustrates a system of the present invention.
- Image volume 14 is a three-dimensional parallelepipedal array of voxels. The portion of body 10 imaged in image volume 14 is shown as a dashed box 14' in Figure 1. Note that part of box 14' extends past body 10, so that part of the surface 12 of body 10 is included in image volume 14.
- a processor 16 retrieves selected slices of image volume 14 from a memory for display on a monitor.
- the slices to be displayed are selected using a pointer 26.
- at one end of pointer 26 is a pointed tip 28, which is pointed at a target point 30 on surface 12 of body 10.
- Figure 2 is an enlargement of a portion of Figure 1 that shows how pointer 26 is used to select slices of image volume 14 for display.
- pointer 26 is shown pointing at target point 30.
- Pointer 26 defines an axis 34 that intersects tip 28.
- Pointer 26 is pivoted about target point 30 to select the slice of image volume 14 to be displayed.
- processor 16 computes the pixels of the slice to be displayed.
- pointer 26 includes buttons 36 and 38: pushing button 36 moves the display depth to a deeper slice, and pushing button 38 moves the display depth to a shallower slice.
- pivoting pointer 26 about target point 30, as shown in Figure 2B, changes the slice that is displayed: when pointer 26 is oriented at angle θd from vertical, slice 40d is displayed, and when pointer 26 is pivoted to a different angle, the correspondingly oriented slice is displayed.
- signals from pointer 26 are sent to processor 16 by a datalink that is represented symbolically in Figure 1 by an arrow 32; preferably, datalink 32 is a wireless datalink.
- the lateral positioning of the displayed slice is determined by target point 30: specifically, the point on the slice that coincides with target point 30 is displayed at a fixed position on the screen.
- Processor 16 immediately alters both the lateral positioning and the orientation of the displayed slice as pointer 26 is moved.
- As noted above, the orientation of pointer 26 is referred to herein as the "pivot orientation". A suitable mechanism is provided to indicate the pivot orientation of pointer 26.
- a transmitter 22 drives AC currents in three antennas, generating magnetic fields that are sensed by pointer 26.
- Processor 16 infers, from these signals, the location of tip 28 and the pivot orientation of pointer 26.
- Preferably, the location of tip 28 is expressed in Cartesian coordinates and the pivot orientation of pointer 26 is expressed as angles.
- Figures 3A and 3B are side and top views, respectively, of a preferred embodiment of pointer 26.
- This embodiment of pointer 26 includes a lozenge-shaped handle 50 that defines a handle axis 35, and from which emerges a stylus 48 that includes tip 28 and that defines axis 34. To distinguish axis 34 from axis 35, axis 34 is referred to herein as the "stylus axis" and axis 35 as the "handle axis".
- Stylus 48 is rigidly attached to handle 50, with stylus axis 34 substantially perpendicular to handle axis 35.
- Handle 50 includes buttons 36 and 38 as described above.
- Buttons 42 and 44 are used to toggle pointer 26 between viewpoint definition mode and control mode.
- In control mode, trackball 46 is used to move a cursor on the monitor screen, and buttons 42 and 44 are used to select and/or activate items on the screen.
- magnification of the displayed slices may be adjusted using a slider, as may the default 45-degree orientations of the three slices of the "3D textured plane" display.
- Alternatively, a force sensor, such as the TrackPoint™ sensor used in IBM laptop computers, may be used. Such a force sensor is sensitive to the direction of the force applied to it, and replaces both trackball 46 and one of buttons 36 and 38.
- Preferably, the force sensor is mounted on a rotary dialer knob, such as the dialer knob used in the ABA2020 VoicePod™ record/playback device. Twisting the force sensor counterclockwise represents a -z displacement (and twisting it clockwise a +z displacement), so that buttons 36 and 38 are not needed.
- handle 50 is partially cut away to show three mutually perpendicular sensor coils 52 that are used to sense the fields generated by the antennas driven by transmitter 22.
- Coils 52 are rigidly attached to handle 50, so that a simple rigid translation and rotation relates the sensed position and orientation of coils 52 to the location of tip 28 and the pivot orientation of pointer 26.
- Alternatively, a sensor is embedded inside stylus 48 or is rigidly attached to stylus 48, near tip 28.
- The location of sensor coils 52 in handle 50 is illustrative, rather than limitative.
- Appropriate field sensors may be positioned anywhere on or in pointer 26.
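Because the sensor is rigidly fixed to the pointer, the tip location follows from the sensed pose by one constant rigid transform. A minimal sketch with invented names, assuming the tracker reports a rotation matrix and a position:

```python
import numpy as np

def tip_from_sensor_pose(sensor_rotation, sensor_position, tip_offset):
    """sensor_rotation: 3x3 sensed orientation of the mounted sensor.
    sensor_position: sensed sensor location in tracker/body space.
    tip_offset: tip coordinates in the sensor's own frame (a calibration constant)."""
    R = np.asarray(sensor_rotation, float)
    p = np.asarray(sensor_position, float)
    return R @ np.asarray(tip_offset, float) + p
```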
- the NOGA™ Cardiac Mechanics and Hemodynamic Mapping System, produced by Biosense Webster (a subsidiary of Johnson and Johnson of New Brunswick, NJ), includes such a sensor, mounted within a catheter, that is small enough to be embedded inside stylus 48.
- Figure 3C is a front view of an alternative preferred embodiment of pointer 26.
- this embodiment of pointer 26 includes four infrared light emitting diodes 54, for use in conjunction with an optical tracking system such as the Optotrak system available from Northern Digital Inc. of Toronto, Ontario, Canada.
- tip 28 is placed in contact with the fiducial points, and the coordinates of the fiducial points are thereby measured.
- features of surface 12 may be used as fiducial points.
- Alternatively, small blocks of a material that has a high contrast in the imaging modality used to create image volume 14 are attached to surface 12 before imaging. For example, if image volume 14 is a CAT scan, then the material of the blocks is radio-opaque. The images of the blocks in image volume 14 serve as the reference points.
- a second registration method is based on fitting a mathematical surface to a set of points sampled on a surface of the body, and matching this surface to the corresponding surface as represented in image volume 14.
- Some parts of body 10, such as a leg, are too symmetrical for the second registration method to be applied to surface 12 alone. In such cases, an interior feature, such as the surface of a skeletal bone, is used for registration.
- the mathematical surface is fitted to a portion of the surface of the skeletal bone.
- Preferably, for this purpose, and as illustrated in Figure 3B, distal end 58 of stylus 48 includes a piezoelectric sensor 56 that serves as both an ultrasound transmitter and an ultrasound receiver. Sensor 56 is connected to appropriate circuitry.
- image volume 14 is registered to body 10 by matching the sampled surface of the skeletal bone to the corresponding surface as represented in image volume 14.
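The patent does not name a particular surface-fitting algorithm; a common way to match sampled bone-surface points to the surface extracted from the image volume is iterative closest point (ICP). The sketch below is a basic point-to-point ICP offered only as an illustration, with invented names.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_surface_registration(sampled_points, surface_points, iterations=50):
    """Fit stylus-sampled bone-surface points to the surface in image-volume space.
    Returns R (3x3) and t (3,) such that image_point ≈ R @ body_point + t."""
    src = np.asarray(sampled_points, float)
    surf = np.asarray(surface_points, float)
    tree = cKDTree(surf)
    R_total, t_total = np.eye(3), np.zeros(3)
    moved = src.copy()
    for _ in range(iterations):
        # pair every sampled point with its nearest neighbour on the image surface
        _, idx = tree.query(moved)
        target = surf[idx]
        # best rigid transform for the current pairing (Kabsch step)
        cp, cq = moved.mean(axis=0), target.mean(axis=0)
        H = (moved - cp).T @ (target - cq)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = cq - R @ cp
        moved = moved @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```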
- the viewpoint style based on target point 30 is only one of many possible viewpoint styles, each of which defines a different display mode.
- a second viewpoint style is defined by using pointer 26 as a virtual camera.
- the spatial coordinates of the virtual camera are the spatial coordinates of tip 28, at the end of stylus 48.
- Pitch is defined as rotation in the plane of axes 34 and 35.
- Yaw is defined as rotation about the body axis.
- Roll is defined as rotation about stylus axis 34.
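For illustration only (the axis conventions below are assumptions, not taken from the patent): the virtual-camera orientation can be composed from the three pivot angles as a product of elementary rotations, with the tip location supplying the camera position.

```python
import numpy as np

def virtual_camera_pose(tip, pitch, yaw, roll):
    """Compose a camera pose from the tip location and three angles (radians).
    Conventions assumed here: yaw about z, pitch about y, roll about x."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    R = Rz @ Ry @ Rx                       # camera orientation in body space
    return R, np.asarray(tip, float)       # rotation matrix and camera position
```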
- surfaces of internal organs of patient 10, as represented in image volume 14, are contoured and shaded by methods that are well-known in the art, and are displayed as a three-dimensional rendition of these organs as these organs would be photographed by the virtual camera.
- Alternatively, a projection of image volume 14 parallel to stylus axis 34 is displayed, to simulate the output of an x-ray imager.
- a third viewpoint style is defined as illustrated in Figure 4.
- Point 30' is defined to be at a distance d from tip 28 along stylus axis 34.
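In other words, point 30' is simply the tip translated by d along the unit vector of the stylus axis; a one-line sketch with invented names:

```python
import numpy as np

def point_along_stylus(tip, axis_direction, d):
    """Viewpoint at distance d from the stylus tip along the stylus axis."""
    u = np.asarray(axis_direction, float)
    return np.asarray(tip, float) + d * u / np.linalg.norm(u)
```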
- Pointer 26 is pointed at body 10, so that point 30' lies within body 10.
- the display consists of three orthogonal planes 60, 62 and 64, of the voxels of image volume 14, that all intersect at point 30'.
- a fourth viewpoint style is defined as illustrated in Figure 5. This viewpoint style is based on three viewpoint planes 66, 68 and 70.
- Viewpoint planes 66 and 68 are defined with reference to the axes of pointer 26, and viewpoint plane 70 is perpendicular to viewpoint planes 66 and 68.
- Image slice 72 is the intersection of viewpoint plane 66 with imaged volume 14'.
- Image slice 74 is parallel to viewpoint plane 68.
- slice 76 is parallel to viewpoint plane 70.
- the lateral position of image slice 74 within imaged volume 14' is controlled using trackball 46.
- For clarity, imaged volume 14' is not shown explicitly in Figure 5, and image slices 72, 74 and 76 are shown by themselves.
- stylus 48 need not be rigidly attached to handle 50.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- General Physics & Mathematics (AREA)
- Processing Or Creating Images (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2001235951A AU2001235951A1 (en) | 2000-02-29 | 2001-02-27 | Method and system for selecting and displaying a portion of an image of a body |
EP01908094A EP1259940A2 (en) | 2000-02-29 | 2001-02-27 | Method and system for selecting and displaying a portion of an image of a body |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US51562400A | 2000-02-29 | 2000-02-29 | |
US09/515,624 | 2000-02-29 | | |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2001065490A2 true WO2001065490A2 (en) | 2001-09-07 |
WO2001065490A3 WO2001065490A3 (en) | 2002-03-28 |
Family
ID=24052105
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2001/000183 WO2001065490A2 (en) | 2000-02-29 | 2001-02-27 | Method and system for selecting and displaying a portion of an image of a body |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP1259940A2 (en) |
AU (1) | AU2001235951A1 (en) |
WO (1) | WO2001065490A2 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8256430B2 (en) | 2001-06-15 | 2012-09-04 | Monteris Medical, Inc. | Hyperthermia treatment and probe therefor |
2001
- 2001-02-27 AU AU2001235951A patent/AU2001235951A1/en not_active Abandoned
- 2001-02-27 EP EP01908094A patent/EP1259940A2/en not_active Withdrawn
- 2001-02-27 WO PCT/IL2001/000183 patent/WO2001065490A2/en not_active Application Discontinuation
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0908849A1 (en) * | 1996-06-25 | 1999-04-14 | Hitachi Medical Corporation | Method and apparatus for determining visual point and direction of line of sight in three-dimensional image construction method |
EP0919956A2 (en) * | 1997-11-26 | 1999-06-02 | Picker International, Inc. | Image display |
Non-Patent Citations (1)
Title |
---|
MORI K ET AL: "VIRTUALIZED ENDOSCOPE SYSTEM - AN APPLICATION OF VIRTUAL REALITY TECHNOLOGY TO DIAGNOSTIC AID" IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, INSTITUTE OF ELECTRONICS INFORMATION AND COMM. ENG. TOKYO, JP, vol. E79-D, no. 6, 1 June 1996 (1996-06-01), pages 809-819, XP000595187 ISSN: 0916-8532 * |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10313829A1 (en) * | 2003-03-21 | 2004-10-07 | Aesculap Ag & Co. Kg | Medical navigation system and method for use thereof, whereby the surgical instrument being used has an attached marker element so that the imaged operation area can be centered on the instrument |
DE10313829B4 (en) * | 2003-03-21 | 2005-06-09 | Aesculap Ag & Co. Kg | Method and device for selecting an image section from an operating area |
US10188462B2 (en) | 2009-08-13 | 2019-01-29 | Monteris Medical Corporation | Image-guided therapy of a tissue |
US10610317B2 (en) | 2009-08-13 | 2020-04-07 | Monteris Medical Corporation | Image-guided therapy of a tissue |
US10548678B2 (en) | 2012-06-27 | 2020-02-04 | Monteris Medical Corporation | Method and device for effecting thermal therapy of a tissue |
US9700342B2 (en) | 2014-03-18 | 2017-07-11 | Monteris Medical Corporation | Image-guided therapy of a tissue |
US10092367B2 (en) | 2014-03-18 | 2018-10-09 | Monteris Medical Corporation | Image-guided therapy of a tissue |
US10342632B2 (en) | 2014-03-18 | 2019-07-09 | Monteris Medical Corporation | Image-guided therapy of a tissue |
US10675113B2 (en) | 2014-03-18 | 2020-06-09 | Monteris Medical Corporation | Automated therapy of a three-dimensional tissue region |
US10327830B2 (en) | 2015-04-01 | 2019-06-25 | Monteris Medical Corporation | Cryotherapy, thermal therapy, temperature modulation therapy, and probe apparatus therefor |
US11672583B2 (en) | 2015-04-01 | 2023-06-13 | Monteris Medical Corporation | Cryotherapy, thermal therapy, temperature modulation therapy, and probe apparatus therefor |
Also Published As
Publication number | Publication date |
---|---|
EP1259940A2 (en) | 2002-11-27 |
WO2001065490A3 (en) | 2002-03-28 |
AU2001235951A1 (en) | 2001-09-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11504095B2 (en) | Three-dimensional imaging and modeling of ultrasound image data | |
US6991605B2 (en) | Three-dimensional pictograms for use with medical images | |
US6049622A (en) | Graphic navigational guides for accurate image orientation and navigation | |
US7072707B2 (en) | Method and apparatus for collecting and processing physical space data for use while performing image-guided surgery | |
US5823958A (en) | System and method for displaying a structural data image in real-time correlation with moveable body | |
US6379302B1 (en) | Navigation information overlay onto ultrasound imagery | |
US7831096B2 (en) | Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use | |
US9320569B2 (en) | Systems and methods for implant distance measurement | |
US7643862B2 (en) | Virtual mouse for use in surgical navigation | |
US6442417B1 (en) | Method and apparatus for transforming view orientations in image-guided surgery | |
AU2008249201B2 (en) | Flashlight view of an anatomical structure | |
EP2096523A1 (en) | Location system with virtual touch screen | |
US20080154120A1 (en) | Systems and methods for intraoperative measurements on navigated placements of implants | |
US20080089566A1 (en) | Systems and methods for implant virtual review | |
JP2009090120A (en) | Improved system and method for positive displacement registration | |
WO2010056561A1 (en) | Systems and methods for image presentation for medical examination and interventional procedures | |
WO2002024094A2 (en) | Non-ivasive system and device for locating a surface of an object in a body | |
CN109833092A (en) | Internal navigation system and method | |
JP6548110B2 (en) | Medical observation support system and 3D model of organ | |
US20080240534A1 (en) | Method and Device For Navigating and Measuring in a Multidimensional Image Data Set | |
WO2001065490A2 (en) | Method and system for selecting and displaying a portion of an image of a body | |
RU2735068C1 (en) | Body cavity map | |
US6028912A (en) | Apparatus and method for point reconstruction and metric measurement on radiographic images | |
US20240394996A1 (en) | Method for analysing 3d medical image data, computer program and 3d medical image data evaluation device | |
EP3637374A1 (en) | Method and system for visualising a spatial surface curvature of a 3d-object, computer program product, and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
AK | Designated states |
Kind code of ref document: A3 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A3 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2001908094 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2001908094 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2001908094 Country of ref document: EP |
|
NENP | Non-entry into the national phase in: |
Ref country code: JP |