WO2003077758A1 - System and method for analyzing and displaying computed tomography data - Google Patents
System and method for analyzing and displaying computed tomography data
- Publication number
- WO2003077758A1 (PCT/US2003/007996)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- colon
- data
- view
- ray
- wall
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/42—Detecting, measuring or recording for evaluating the gastrointestinal, the endocrine or the exocrine systems
- A61B5/4222—Evaluating particular parts, e.g. particular organs
- A61B5/4255—Intestines, colon or appendix
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/48—Diagnostic techniques
- A61B6/481—Diagnostic techniques involving the use of contrast agents
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/50—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/055—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/021—Flattening
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/028—Multiple view windows (top-side-front-sagittal-orthogonal)
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Definitions
- the present invention generally relates to analysis and display of data acquired using computed tomography representing various organs of the human body. More specifically, the present invention relates to a computerized system for analyzing data acquired from tubular structures of the body, for example, the colon or gastro-intestinal tract, and displaying that data in a manner providing for rapid evaluation of the tissue structures of the tubular structure and the accurate determination of any abnormalities present.
- Description of Related Art:
- In industrialized countries, colorectal cancer is the second leading cause of deaths from malignancy. In the United States, almost 150,000 people are diagnosed with colon cancer annually. Unfortunately, colon cancer also causes approximately 60,000 deaths annually, with only lung cancer causing more deaths among the population of America. What makes these statistics truly unfortunate is that colon cancers are perhaps one of the most preventable of cancers because they usually begin as benign polyps which grow slowly for five to ten years before becoming cancerous. If these polyps are detected and removed, the risk of developing colon cancer is greatly reduced. Regrettably, widespread colorectal screening and preventive efforts are hampered by several practical impediments, including limited resources, methodologic inadequacies, and poor patient acceptance leading to poor compliance.
- a fecal occult blood test fails to detect the majority of cancers and pre-cancerous polyps. Since sigmoidoscopy only examines a portion of the colon, it also misses many polyps. The accuracy of barium enema varies among centers and is therefore not always reliable.
- a technique using helical (also known as spiral) computed tomography (CT) to create computer simulated intraluminal flights through the colon was proposed as a novel approach for detecting colorectal neoplasms by Vining D J, Shifrin R Y, Grishaw E K, Liu K, Gelfand D W, Virtual colonoscopy (Abst), Radiology Scientific Prgm 1994; 193(P):446.
- This technique was first described by Vining et al. in an earlier abstract by Vining D J, Gelfand DW, Noninvasive colonoscopy using helical CT scanning, 3D reconstruction, and virtual reality (Abst), SGR Scientific Program, 1994.
- This technique, referred to as “virtual colonoscopy”, requires a cleansed colon insufflated with air, a helical CT scan of approximately 30-60 seconds, and specialized three-dimensional (3D) imaging software to extract and display the mucosal surface.
- the resulting endoluminal images generated from the CT scan data are displayed to a medical practitioner for diagnostic purposes.
- the technique of reformatting 2D cross sections perpendicular to the colon midline is also described in U.S. Pat. No. 5,458,111, issued Oct. 17, 1995 to Coin.
- direct interpretation of the cross-sectional images is difficult because a scan of the colon consists of several hundred axial tomograms.
- the prior techniques typically required massive computing capacity and specialized work stations. Moreover, the prior techniques have also been limited to providing views of the colorectal pathway that may occlude the presence of polyps, and which do not provide any information regarding the subsurface morphology of the colorectal pathway, or other tubular structure being analyzed. Typically, the only views provided to the physician or technician viewing the data display are two- and three-dimensional endoluminal, coronal, sagittal and axial views of the tubular structure being evaluated.
- a CT scanner and computer workstation are used to image tubular structures of the human body, such as the digestive tract of a living person.
- the CT scanner is used to generate cross-sectional axial images of a human colon which are then transferred to the computer workstation.
- a colon midline is defined which follows the colon lumen.
- the computer workstation supports colon midline definition by generating and displaying reformatted cross-sectional images, volume rendered scouts, and interluminal views. Semi-automatic midline defining tools may also be included. After the midline is defined, a montage of images is displayed for diagnostic purposes.
- the images include axial sections, transluminal cross section, and intraluminal volume rendered images.
- Semiautomatic methods are used to delineate the three-dimensional shape of the colon by identifying its approximate midline. This is supported by the display of coronal, sagittal, and axial views as well as an off-axis plane orthogonal to the midline. Straightened curved sections along the midline and a rendered view from the end of the midline may also be displayed. Editing the midline in any of these views will typically cause the other views to update accordingly.
- data are extracted so the colon can be examined efficiently.
- a medical practitioner such as a radiologist, examines a tubular structure of a patient, such as the patient's colon, by moving a virtual view point along the delineated midline.
- Three-dimensional orthogonal off-axis cross sections, volume rendered extraluminal scouts or the original axial 2D images, and a high resolution perspective rendering of the colon's inner surface may be displayed so that the practitioner can observe the structures of the tissues of the tubular structure being analyzed.
- the perspective views can typically be re-oriented in any direction.
- a rotatable longitudinal sectioning along the colon's midline may also be displayed. This view, which may be advanced along the entire length of the colon, enhances both navigation of the path forward or backward along the 3D images and interpretation of the 3D images.
- volume rendering of the CT data is performed by custom algorithms that use pre-computation of texture and other techniques to achieve interactive performance on moderately priced workstations.
- the first view is typically a forward intra-luminal view which encompasses a view of the colon from the view point looking away from a terminating location of the colon, such as the anal verge.
- the second view may be a backward intra-luminal view which encompasses a view of the colon from the view point looking toward a terminating location of the colon, such as the anal verge. Displaying both views makes it less likely that a feature of interest will be obscured due to the topology of the colon.
- By pointing the cursor on the image and simultaneously pressing a key or otherwise issuing an appropriate instruction to the computer work station, a practitioner can move the view point off the previously defined midline, and all images, including 2D reformatted images, 3D intra-luminal images, and the 2D axial image may be updated to views corresponding to the designated position.
- the practitioner can randomly move the view point by using the pointing device, or return to the nearest point on the predetermined midline.
- an unfolded or opened view of the colon may also be displayed to the observer.
- the opened view of the colon corresponds to a view of the entire colon as if it had been physically divided and opened for inspection, just like cutting a garden hose in half along its longitudinal axis, maintaining the curvature of the colon.
- this "unfolding" technique only allows a view of the internal structure of the tubular structure or colon, and does not provide a truly “flattened” view of the internal wall of the colon, wherein the inherent curvature of the colon is "flattened”, thus allowing structures that protrude from the inner wall of the colon, such as Haustral ridges, to hide structures such as small polyps or other anomalies from the practitioners view.
- the present invention is embodied in a system and method of rendering three-dimensional images that provide a user with views of data generated by a scanner of selected portions of the human body that are designed to assist the user, or another person, such as a surgeon or physician, in diagnosing disease conditions and in prescribing treatment for those conditions. Also disclosed is a system and method for further processing three-dimensional volume data using appropriate pattern recognition and analysis software to identify abnormalities in imaged bodily structures, particularly in tubular structures, such as the colon, of the body.
- the present invention is embodied in a method of imaging a tubular body structure having a lumen defined by a wall, comprising the steps of providing a data set containing data representing a plurality of cross-sectional images of a tubular structure of the body taken along a longitudinal axis of the tubular body, processing the data set to reconstruct a three-dimensional image of the tubular body, identifying a central pathway through a lumen of the three- dimensional image, selecting a starting point along the central pathway, processing the data beginning at the starting point and continuing along the longitudinal length of the three-dimensional image of the colon and rendering a flattened view of the three-dimensional image, storing data representing the flattened view in an image buffer, and displaying the flattened view of the image.
- the present invention is embodied in a method that further includes selecting a point along the central pathway, processing the data at the selected point and rendering an image of a cross-section of the wall of the tubular structure at the selected point, and displaying the image of the cross-section of the wall of the tubular structure.
- the invention includes processing the data and rendering the flattened image, projecting a ray from the starting point to the wall, the direction of the ray corresponding to an angle of view, and adding a voxel value to an image buffer.
- the method may also include shifting the angle of view by one degree, and projecting another ray from the starting point to the wall, the direction of the ray corresponding to the shifted angle of view if the angle of view has not been shifted a total of 360 degrees from the initial starting point.
- the method also includes advancing the starting point along the longitudinal axis of the lumen by a selected value if the angle of view has been shifted a total of 360 degrees from the initial starting point, and repeating the steps of projecting the ray and storing voxel values in an image buffer until the entire length of the lumen has been processed.
- the present invention includes a method for generating a view of the interior of a wall of tubular structure of a body including tissue adjacent the exterior of the wall, comprising:
- the present invention includes comparing a three- dimensional volume data set to a library of geometrical patterns representative of predetermined abnormalities, and identifying a structure contained in the three- dimensional volume data set as abnormal if the structure is determined to match at least one of the library of geometrical patterns within a predetermined tolerance.
- the present invention includes further processing the identified abnormal structure to determine if the identified structure is not abnormal.
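The patent text above does not specify a particular matching algorithm. As a purely illustrative, hedged sketch, a candidate sub-volume might be scored against a small library of shape templates with normalized cross-correlation and flagged when the score exceeds a tolerance; the function names, the metric, and the tolerance are assumptions made for illustration, not the claimed method.

```python
import numpy as np

def best_template_match(candidate, templates, tolerance=0.8):
    """Score a 3D sub-volume against a library of shape templates (all the same
    shape as the candidate) and report the best match, if any.

    Purely illustrative: the patent does not prescribe a matching metric; this
    sketch uses a simple normalized cross-correlation at zero lag.
    """
    cand = (candidate - candidate.mean()) / (candidate.std() + 1e-9)
    best_idx, best_score = None, -1.0
    for i, tmpl in enumerate(templates):
        t = (tmpl - tmpl.mean()) / (tmpl.std() + 1e-9)
        score = float((cand * t).mean())          # normalized cross-correlation
        if score > best_score:
            best_idx, best_score = i, score
    if best_score >= tolerance:                   # within the predetermined tolerance
        return best_idx, best_score
    return None, best_score                       # no library pattern matched
```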
- the present invention is embodied in a computer having a processor, a program memory and a semi-permanent storage memory.
- the processor is programmed by a software program designed to receive data sets generated by a scanner, process the data in the data sets in its memory to render three-dimensional views of the scanned data, and also render flattened views of the scanned data. Alternatively, the processor processes the scanned data to render subsurface views of a structure of interest.
- FIGURE 1 is a graphical representation of a system for taking CT scans of a patient, including communication means for transmitting the CT scan data to a remote computer for processing and display in accordance with embodiments of the present invention
- FIG. 2 is a block diagram illustrating a generalized process using embodiments of the present invention for diagnosis
- FIG. 3 is a view of a two-dimensional image of a patient's colon showing identification of luminal center points of a fly by wire pathway;
- FIG. 4 is a view of a representative display showing various views of a patient's colon derived from CT scan data using the methods of the present invention;
- FIG. 5 is a schematic view showing the general processes of a software program incorporating methods of the present invention
- FIG. 6 is a view of a two-dimensional image of a patient's colon rendered in accordance with the methods of the present invention
- FIG. 7 is a schematic view of a section of a colon showing the drawing of a ray from a luminal center point to the wall of the colon;
- FIG. 8 is a side view of a flattened section of the colon section of FIG. 7;
- FIG. 9 is a graphical display showing the relationship of a flattened section and an overhead view of a section of the colon;
- FIG. 10 is a flowchart representing the steps involved in flattening a three- dimensional image of a colon section
- FIG. 11 is a composite view of a cross-sectional view of a colon including points representing the stepping of a ray from a luminal center point to and through the colon wall, coordinated with two views representative of the display of images rendered using the methods of the present invention.
- FIG. 12 is a flowchart representing the steps involved in calculating and displaying a subsurface view of a section of a colon wall
- FIG. 13 is a flowchart representing an embodiment of the present invention that includes computer aided diagnosis performed using pattern recognition software analysis of the stored data generated using aspects of the present invention
- FIG. 14 is a graphical representation of one embodiment of a report that may be generated using the system and methods of the present invention.
- FIG. 15 is a graphical representation of another embodiment of a report generated using the system and methods of the present invention that is coordinated with the embodiment of FIG. 14.
- the present invention generally relates to a method and system for generating and displaying interactive two- and three-dimensional figures representative of structures of the human body.
- the three-dimensional structures are in the general form of selected regions of the body and, in particular, body organs with hollow lumens such as colons, tracheobronchial airways, blood vessels, and the like.
- interactive, three-dimensional renderings of a selected body organ are generated from a series of acquired two-dimensional images generated from data acquired by a computerized tomographic scanner, or CAT scan.
- the data set representative of the scan of the patient is extracted and stored with appropriate identifying information.
- the data set may then be transferred to a computer work station for analysis.
- the scan data may be compressed so that it can be transmitted to an analysis site by a variety of means.
- the present invention is particularly suited for use on computer work stations based upon a personal computer, such as a Macintosh computer or IBM clone computer, that has appropriate amounts of working memory, processing power and hard drive storage capacity.
- the inventors have operated software embodying the present invention on a Macintosh G4, sold by Apple Computer, Inc., as well as on other computers.
- CT scan data may be sent over the internet, local area network or telephone lines using suitable communications devices and techniques well known in the art to computers and workstations located remotely from the scanner location operating under the control of software incorporating aspects of the present invention for further processing and analysis.
- a patient is initially prepared for imaging by cleansing the patient's colon.
- the purpose of the cleansing procedure is to eliminate feces from the colon.
- an absolutely clean colon is desired prior to CT scanning, as any retained feces or fluid can simulate or mask small polyps.
- the colon is insufflated with gas to distend the colon. Distending the colon assures that the interior surface of the colon will be clearly visible in the final image display.
- a rectal catheter, i.e., a standard barium enema catheter (0.5 inch (1.27 cm) diameter), is inserted and gas is introduced into the colon.
- the colon is filled to a predetermined pressure or volume by introducing the gas either as discrete puffs or at a constant flow rate.
- While air can be used, CO2 may be preferred since CO2 passes through the colonic mucosa into the bloodstream and is subsequently exhaled, thereby decreasing the amount of bloating and cramping experienced by a patient after the examination.
- the colon is then scanned by, for example, a helical CT scanner to produce a series of two-dimensional images of the colon.
- the picture elements, or pixels, in the images represent at least one physical property associated with the colon.
- the physical property for example, may be the x-ray attenuation of the colon wall or of the air column within the colon.
- the images are generally taken at regularly spaced locations throughout the abdominal region of the patient. The smaller the spacing between successive images, the better the resolution in the final displayed imagery. Preferably, the spacing between successive images is approximately 1 mm to produce isotropic voxels since the X and Y dimensions of each voxel are each approximately 1 mm.
- the scanning parameters may consist, for example, of a 0.20 inch (5 mm) x-ray beam collimation, 0.4 inch/sec (10 mm/sec) table speed to provide 2:1 pitch, and a 0.04 inch (1 mm) image reconstruction interval.
- the x-ray beam collimation is the width of the x-ray beam which thereby establishes the CT slice thickness and Z-axis spatial resolution.
- the pitch is the CT table speed divided by the collimation.
- the image reconstruction interval reflects the interval at which two-dimensional images are reconstructed.
- a 50 second scan at a table speed of 10 mm per second in a Z-axis direction creates a volume of data having a Z-axis length of 500 mm.
- Using a 1 mm reconstruction interval over the length of the Z-axis of the volume of data produces 500 images with each image representing a 5 mm thick section along the Z-axis.
- the scan is done twice, once with the patient in a prone position, and once with the patient in a supine position, thus generating approximately 1000 images.
- Newer multi-slice CT systems, including those having 4 and 16 channels of data acquisition bandwidth, can obtain 5 mm thick slices every 2 mm in about 30-40 seconds, yielding data sets having 200-250 images. Scanning 2.5 mm thick slices every 1.0 mm increases that number to 400-500 images per data set.
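The scan-geometry arithmetic above (pitch, Z-axis coverage, and image count) follows directly from the stated parameters. The short Python sketch below simply recomputes the worked single-detector example from the description.

```python
# Worked example using the parameters given above (single-detector helical CT).
collimation_mm = 5.0           # x-ray beam collimation -> slice thickness
table_speed_mm_s = 10.0        # CT table speed
scan_time_s = 50.0
recon_interval_mm = 1.0        # image reconstruction interval

pitch = table_speed_mm_s / collimation_mm                  # 2.0, i.e. 2:1 pitch
z_coverage_mm = table_speed_mm_s * scan_time_s             # 500 mm along the Z-axis
images_per_scan = int(z_coverage_mm / recon_interval_mm)   # 500 overlapping 5 mm thick slices

# Scanning both prone and supine roughly doubles the data set.
total_images = 2 * images_per_scan                         # ~1000 images
print(pitch, z_coverage_mm, images_per_scan, total_images)
```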
- the rectal catheter is removed, the gas is expelled, and the patient is discharged.
- FIG. 1 generally depicts a system that may be used to render images in accordance with the principles of the present invention.
- System 10 includes a scanner 12 that is connected to a computer 14.
- Computer 14 is used to control the operation of the scanner 12, and is also used to acquire data generated by the scanner 12.
- the computer 14 may be any computer having sufficient computational power, storage and memory so that it is capable of running customized software for controlling the scanner and acquiring and storing the data generated by the scanner during a scan of a patient's body, or selected portion of a patient's body.
- the computer 14 also includes, besides a processor or microprocessor capable of being programmed using custom software, a memory 18 which may be random access memory or read only memory, and suitable storage media 20, which may include, for example, a hard drive, a CD or DVD drive, or other storage media suitable for storing the data generated by the scanner.
- a display 16 is connected to computer 14 to allow an operator to view displays of data and operating parameters for the scanner, as well as images rendered from the data generated by the scanner.
- computer 14 will also be operably connected to a device that allows input of commands by the user of the scanner. Such input devices include a mouse or other pointing device, a keyboard, bar code scanner or other device.
- the computer 14 may also be operably connected to a modem or other communication device that provides connectivity to a communication means 22, such as a telephone line, cable, DSL connection, local area network, or the Internet, allowing computer 14 to communicate with other suitably programmed computer systems or workstations 24 and providing for the transmission of data generated by the scanner to the other computer systems or workstations 24.
- Computer or workstation 24 is similarly equipped to communicate via communications means 22 with computer 14, providing for communication of analysis and reports generated on the computer 24 to a user of computer 14.
- Computer 24 also includes a processor or microprocessor, memory 26 and storage media 26. Additionally, a mouse or other pointing device 32 may also be connected, either through a wired or wireless connection, to allow input of commands to the software program running on computer 24 to control the operation of aspects of the software program.
- the processor of computer 24 will typically be programmed with custom software incorporating the methods of the present invention to provide various views of the data generated by the scan which are displayed on the display 30. While the methods of the present invention will be described below as if the software operating in accordance with those methods is controlling computer 24, it will be understood by those skilled in the art that the same software may operate and control computer 14, providing a user of computer 14 with the same analysis, views and reports that are generated by computer 24 without the need to transmit data generated during the scan of a patient to a remote location.
- Referring to FIG. 2, a generalized flow chart illustrating the flow of data through a system operating software programmed to carry out the methods of the present invention is depicted.
- a database containing CT scan data is acquired, for example by receiving the database over a wire-based or wireless communication system.
- Alternatively, the database may be manually provided to the system, such as through the use of a portable hard drive, Zip disk, CD or DVD, or other portable storage media having sufficient capacity.
- the scan data typically consists of a series of two-dimensional images that represent slices taken through the structure that was the subject of the scan.
- the two-dimensional images are converted or transformed into a three-dimensional image.
- the two-dimensional images may be typically obtained from a helical CT scanner operated by a computer console.
- the scanner may be a General Electric HiSpeed Advantage Helical CT Scanner connected with an optional General Electric Independent computer console or physician's console.
- the computer console may, however, be an integral part of the helical CT scanner instead of a separate independent console.
- the two-dimensional images can also be obtained from ultrasound, positron emission tomography, emission computed tomography, and magnetic resonance imaging.
- the physical property measured is directly associated with the scanning technique used to produce the two- dimensional images.
- for CT images, the physical property measured is typically x-ray attenuation, while for magnetic resonance images (MRI) a different physical property is measured.
- Each of the individual two-dimensional images defines a two-dimensional (X-Y) matrix of picture elements, or pixels, with each pixel representing a predetermined physical property associated with the three-dimensional structure or organ at a particular location within the three-dimensional structure.
- Successive two-dimensional images are spaced in a third-dimensional direction (Z) throughout the three-dimensional structure.
- the analysis continues by stacking the series of two-dimensional images to form a three-dimensional volume 13, thereby defining a three- dimensional matrix (X, Y and Z axes) that represents at least one physical property associated with the three-dimensional structure at coordinates positioned throughout the three-dimensional volume.
- the three-dimensional matrix is composed of three- dimensional volume elements, or voxels, which are analogous to two-dimensional pixels.
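As a minimal sketch of the stacking step described above, assuming the slices are already available as equally spaced 2D arrays (the use of NumPy and the demo dimensions are assumptions for illustration only):

```python
import numpy as np

def stack_slices(slices):
    """Stack a list of 2D (X-Y) pixel arrays, ordered along Z, into a 3D volume.

    Each voxel then holds the measured physical property (e.g. x-ray attenuation)
    at one (x, y, z) location within the scanned structure.
    """
    volume = np.stack(slices, axis=0)   # shape: (Z, Y, X)
    return volume

# e.g. 500 slices of 512x512 pixels would give a 500x512x512 voxel volume; tiny demo:
demo = stack_slices([np.zeros((4, 4), dtype=np.int16) for _ in range(3)])
print(demo.shape)   # (3, 4, 4)
```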
- a targeted volume may be selected for three-dimensional rendering.
- the targeted volume may include a selected organ or region of interest that is to be isolated from the original three-dimensional volume.
- the targeted volume may include an entire organ or, alternatively, may include an air column or volume confined within an organ or a lumen of the organ.
- a dataset representing the original three-dimensional volume may be optionally subjected to a dataset reduction process to decrease image or spatial resolution or to divide the original volume into smaller subvolumes of data prior to isolating the targeted volume.
- Dataset reduction and/or subvolume selection are only necessary if the capabilities of the graphics computer are inadequate to process the full three-dimensional volume for effective or efficient three-dimensional rendering.
- the amount of reduction may vary depending upon the size of the original dataset and the processing speed and capability of the computer used for three-dimensional rendering.
- the three-dimensional image of the organ or region of interest is segmented or isolated from the volume of data.
- a range of tissue values which depends on the physical property measured, for example, x-ray attenuation, may be selected to designate the organ or region of interest.
- the organ or region of interest is then isolated from the volume of data.
- the region of interest may, for example, be the air column comprising the colon or tracheobronchial airways, or some other body part having a lumen filled with a homogeneous substance such as air, blood, urine, contrast agent and the like.
- the region of interest may be a section of bone.
- the segmentation process may be effected, for example, by designating a range of physical property values bounded by threshold limits that function to designate the organ of interest.
- voxels falling within a selected thresholding range are assigned a single value, for example, 255 to provide a white color
- voxels falling outside of the selected thresholding range are assigned a different single value, for example, 0 to provide a black color.
- the selected volume is thresholded by comparing each voxel in the selected volume to the threshold limits and by assigning the appropriate color value 0 or 255 to each such voxel depending on whether each such voxel falls inside or outside the threshold range defined by the threshold limits.
- the target volume is formed having equal voxel values. More specifically, a target organ is produced having a white color while all volumes outside the threshold limits are produced having a black color. With the exception of various artifacts, which can be eliminated using image-processing techniques, only the thresholded target organ is colored white and everything else is colored black.
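A minimal sketch of the thresholding just described, assuming the volume holds CT attenuation values; the Hounsfield-unit limits shown for isolating an air-filled lumen are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def threshold_segment(volume, lower, upper):
    """Binarize a voxel volume: voxels inside [lower, upper] become 255 (white),
    everything else becomes 0 (black), as described in the text."""
    inside = (volume >= lower) & (volume <= upper)
    return np.where(inside, 255, 0).astype(np.uint8)

# Illustrative limits for an air-filled lumen in Hounsfield units (assumption):
# air is roughly -1000 HU, so a wide negative range isolates the air column.
# segmented = threshold_segment(ct_volume, lower=-1024, upper=-800)
```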
- the data may be input into a sub-program that segments the data and allows a radiologist or technician to annotate the data set and prepare an overall report, such as is represented by box 55. This step is optional, and may be omitted without departing from the principles of the present invention.
- the scan data is modified and output as two-dimensional images, which are stored in a predetermined memory location, such as a folder identified with pertinent patient identifying information, in a suitable file format such as, for example, a "pict" file.
- the data may be imported into a sub-program that allows the data to be manipulated as described above and rendered into three-dimensional images, as in box 60.
- One program that can be used has been developed by the inventors and titled VISUALIZE, although any rendering program that can transform or convert two-dimensional images into three-dimensional images having appropriate resolution may be used.
- the rendering process can be accomplished using a general purpose volume visualization program, such as IRIS Explorer™.
- Images of three-dimensional objects are projected to two-dimensional screen coordinates for displaying the data to an observer using ray-tracing techniques.
- Imaginary rays sent from a user's viewpoint pass through a viewing plane, referenced to the user's monitor screen, and into the object. If a ray intersects an object, the corresponding viewing plane pixel is painted with a color different from the background color. If no intersection is found, the pixel is painted as background.
- the criteria used to stop a ray determine what value is projected onto the viewing plane pixel. This process is different from surface rendering, which typically uses polygon rendering to generate an image; in contrast, volume rendering projects a weighted average of all voxels along a ray to generate an image.
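To illustrate the distinction drawn above, the sketch below casts one parallel ray per screen pixel straight through the volume and produces either a hit/no-hit image (surface-style) or a simple average of the voxels along each ray (a simplified stand-in for the weighted average used in volume rendering). Perspective rays, interpolation and opacity transfer functions are omitted; this is only the core idea, not the patent's renderer.

```python
import numpy as np

def render_parallel(volume, threshold=200, mode="volume"):
    """Cast one ray per (y, x) screen pixel straight through the volume along Z.

    mode="surface": paint the pixel if any voxel along the ray exceeds the threshold
                    (ray-object intersection), otherwise leave it as background.
    mode="volume":  paint the pixel with the average of all voxels along the ray,
                    a simplified stand-in for a weighted (opacity-based) average.
    """
    if mode == "surface":
        hit = (volume > threshold).any(axis=0)
        return np.where(hit, 255, 0).astype(np.uint8)
    return volume.mean(axis=0)

# vol = stacked CT volume, shape (Z, Y, X)
# surface_img = render_parallel(vol, mode="surface")
# volume_img  = render_parallel(vol, mode="volume")
```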
- the three-dimensional images may now be displayed on a monitor in a form that provides the user with the ability to "fly" through the tubular structure, for example, the colon.
- a central pathway through the colon is determined in box 65. This step will be discussed in more detail below.
- the data is further processed in box 70 to provide a flattened view of the colon.
- the flattened view of the colon provides a view that may be likened to carefully slitting the colon and spreading the colon on a board so that the curvature of the colonic wall no longer obscures structures such as polyps, that may otherwise be hidden when viewed using more traditional multi-planar views.
- further processing of the data may be performed to generate a subsurface view of the wall of the colon, synchronized with the flattened view, to assist in diagnosis of colonic wall abnormalities, such as flat tumors or wall constrictions, that might otherwise not be apparent when viewing only the flattened image.
- data sets containing the processed image data are stored. It will be understood that these data sets contain data arranged in a form that allows a user to view images in the data set as if the data set were a movie. This arrangement allows a physician evaluating the scan data to fly through the colon in any direction, and to jump to a selected portion of the colon where abnormalities have been detected and marked for further analysis.
- the data sets may be transmitted to a radiologist who receives the flattened volume view, 3D volume and subsurface view data for analysis, as indicated in box 75. The radiologist then views the data, and renders his diagnosis of the case in box 80. The diagnosis report is then sent to a surgeon and/or the patient's physician in box 85.
- Referring to FIG. 3, the determination of a central pathway in the colon will now be described.
- a section of the colon, here the sigmoid colon, is shown.
- Center points 100 along the lumen of the colon are identified, using either specialized software, or manually by an appropriately trained technician. While automated center “fly by wire” pathway generation may be used, often manual determination is advantageous to ensure that an accurate pathway through what is sometimes a tortuous, constricted or collapsed pathway is obtained.
- a series of points have been identified within the central lumen of the colon by a technician.
- the center line of the colon can be determined in one of several ways. For example, in one method of determining the central path through the lumen of the colon, a first point is selected which lies within the lumen of the colon at a desired starting location. The plane through the starting point that has the minimum cross-sectional area is determined and the center of such area is marked. A new point is then selected which lies, for example, 1 cm away from the center point in a perpendicular direction relative to the surface area in the plane. A new plane of minimum area that dissects the colon and passes through the new point is determined and the corresponding center of that new area is marked. This iterative process is continued until a central path connecting the center points is determined.
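The minimum-area-plane search described above is fairly involved; as a hedged, simplified stand-in, the sketch below re-centers on the centroid of the lumen cross-section in successive axial slices of a binary lumen mask and advances a fixed number of slices per step. It only approximates the iterative idea: the patent's procedure uses planes of minimum cross-sectional area rather than fixed axial slices, and all parameters here are assumptions.

```python
import numpy as np

def trace_centerline(lumen_mask, start_zyx, step_vox=10, window=40, max_points=500):
    """Trace an approximate luminal centerline through a binary lumen mask.

    At each step the current point is re-centered on the centroid of nearby lumen
    voxels in the current axial (Z) slice, then advanced `step_vox` slices. The
    `window` keeps the search near the previous point so the trace stays within
    one colonic segment. Simplified stand-in only.
    """
    z, y, x = (int(v) for v in start_zyx)
    path = []
    for _ in range(max_points):
        if z < 0 or z >= lumen_mask.shape[0]:
            break
        ys, xs = np.nonzero(lumen_mask[z])
        if len(ys) == 0:                                  # lumen lost (collapse or end)
            break
        near = (np.abs(ys - y) < window) & (np.abs(xs - x) < window)
        if not near.any():
            break
        y, x = int(ys[near].mean()), int(xs[near].mean())  # re-center on the cross-section
        path.append((z, y, x))
        z += step_vox                                      # advance along the colon
    return path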
- the data may be processed in real time to provide a movie-like fly through of the entire length of the colon, or selected portion of the colon.
- the system and method of the present invention allows the data display to be halted whenever a user desires with an appropriate mouse click or keyboard command. This allows the user to halt the fly through if the user observes an area of the colon that requires closer evaluation, such as for the presence of a polyp or tumor.
- the location within the colon where an anomaly is identified by the user may be displayed to the user in a variety of ways.
- Animation of three-dimensional objects, such as that described above as a “fly through”, is achieved by rapidly displaying multiple three-dimensional views of the volume.
- Surface rendering calculations using presently available computer platforms are fast enough to allow interactive manipulation of the volume.
- the rendering software typically includes modules that provide the ability to enhance an object's features by altering the shade of color of each viewing plane pixel. For example, “shaded surface” images add an illusion of depth by coloring pixels closer to the camera viewpoint with lighter shades. Pixels in a shaded surface image reflect the distance between the anatomical structure and the user's viewpoint. Information on the relative "position" of the user and a virtual light source, along with the information about the rendered structure, is used to appropriately shade the rendered volume to produce a realistic impression of the anatomy.
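A small sketch of the depth-based shading idea mentioned above: given a per-pixel distance from the viewpoint to the rendered surface, nearer pixels are drawn in lighter shades. The linear mapping is an assumption; a full renderer would combine this depth cue with a lighting model and virtual light-source position.

```python
import numpy as np

def shade_by_depth(depth_map, near, far):
    """Map per-pixel surface depth to gray shades: closer pixels become lighter.

    depth_map: 2D array of distances from the viewpoint to the rendered surface;
    pixels with no surface (infinite depth) are left black.
    """
    shade = np.zeros_like(depth_map, dtype=np.uint8)
    hit = np.isfinite(depth_map)
    t = np.clip((depth_map[hit] - near) / (far - near), 0.0, 1.0)
    shade[hit] = (255 * (1.0 - t)).astype(np.uint8)   # near -> bright, far -> dark
    return shade
```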
- the rendering or converting step 60 occurs rapidly and interactively, thus giving the user the ability to "fly" through the volume of data.
- the direction of "flight" is controlled through directional pointing of the cursor using the mouse or commands entered through the keyboard, and the speed (both backwards and forwards) may also be controlled by pressing buttons on the mouse or keyboard.
- the speed of interactive three-dimensional rendering produces a "virtual reality" environment and allows the user to examine the image data in a manner that is analogous to real endoscopy.
- the path of each simulated flight, whether relatively short and local or a long-distance flight throughout the whole or a major portion of the colon, can be recorded and used in a "playback" mode to retrace the flight path.
- Individual three- dimensional scenes may also be stored on the computer like photographs. Each simulated "flight" through the colon can be recorded on VHS videotape on a video recorder, or on other suitable storage media, for archival purposes if desired. Additionally, the data may be stored so that it may be transmitted to gastroenterologists and surgeons for further review and analysis.
- An example of such a display is depicted in FIG. 4.
- FIG. 4 shows six separate synchronized views of the colon that were rendered using the methods of the present invention.
- reference line 210 points to a particular location within the colon that has been identified during the fly through as having structure that required further evaluation.
- the user flies through the colon using the mouse to control the speed and direction of flight.
- the images may be advanced or backed up, allowing the user to fly both upstream and downstream.
- the reference line moves on the image titled "Variable Width Flattened View", to be described in more detail below, indicating the location within the colon of the displayed views.
- Each of the other views such as the Mag Axial, Mag Sagittal (MPR), Endoluminal and Mag Coronal (MPR) views are synchronized with the reference line, allowing a practitioner to study an anomalous structure from a variety of view points.
- the display shown in FIG. 4 includes a SubSurface View, which will be described in more detail below. This view allows a physician or trained technician to also inspect anomalies in the wall of the colon. This ability to synchronously display the wall thickness of the colon allows for diagnoses of tumors and other diseases that may not manifest as polyps or other anomalous structures that may be seen on the interior surface of the colon.
- the subsurface view is capable of being "windowed", that is, the parameters used to generate the view may be altered as desired by the user.
- the available contrast limits may be adjusted to optimize display of small abnormalities, or the resolution of the view may be adjusted.
- limits on the degree of parameter adjustment or windowing may be programmed into the software to prevent the high-contrast, narrow "lung" settings that are commonly used in two-dimensional axial, sagittal and coronal polyp searches. If the "lung" windows used are too narrow, a user viewing two-dimensional multiplanar views may not observe a constricting tumor if the user fails to adjust the windowing parameters to evaluate the thickness of the wall of the colon.
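For reference, window width and window level map measured attenuation values onto display gray levels; the sketch below shows that standard mapping. The "lung" and soft-tissue settings in the comments are common illustrative values, not ones prescribed by the patent.

```python
import numpy as np

def apply_window(hu_values, level, width):
    """Map Hounsfield units to 0-255 display values using a window level/width."""
    lo, hi = level - width / 2.0, level + width / 2.0
    clipped = np.clip(hu_values, lo, hi)
    return ((clipped - lo) / (hi - lo) * 255).astype(np.uint8)

# Typical settings (assumptions, for illustration only):
# lung_view = apply_window(ct_slice, level=-600, width=1500)   # high-contrast "lung" window
# soft_view = apply_window(ct_slice, level=40, width=400)      # soft-tissue window for wall thickness
```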
- FIG. 5 is a flow chart which sets forth in general the hierarchical structure and flow of information through the system and method of the present invention.
- System 300 begins, as is typical for software programs, by calling and initializing various variables, objects, look up tables and other user definable sub-processes in box 310. After the initialization is completed in box 310, the monitor displays a default window layout in step 320. Main loop 330 may then be initialized, and awaits user input to determine which portion of the image to be displayed is rendered on the monitor.
- the user may use a mouse, or keyboard command, or another pointing device, to generate signals that direct the computer upon which the software is operating to scroll through the stacked three- dimensional images that form the three-dimensional volume to be displayed.
- various sub-programs are initiated, such as sub-program 350, which opens the three-dimensional volume and provides a pre-flattened large axial multi-planar window layout in step 360.
- this step provides rapid rendering of two-dimensional multiplanar images which may be used by the user to manually designate center points of the luminal image.
- the data comprising these views may be analyzed using software designed to automatically determine and designate luminal center points, and automatically advance, slice by slice, through the entire length of the colonic image, or a designated portion of the colonic image, to determine the fly by wire pathway through the colon.
- the software embodying the present invention performs the calculations necessary to flatten the segment of the tubular structure being rendered and creates a flattened volume in box 380, and then displays the flattened view in step 390. While the various views are being calculated, an overview of the colon may also be generated in steps 400 through 420 and displayed alongside the various other views.
- box 430 provides for further analysis of the flattened view to identify abnormalities in the colon. For example, when a user observes an abnormality in a particular flattened section of the colon, that abnormality may be marked as in box 440, and measured as in box 450. Additionally, local voxel density measurements may be made of a suspected abnormality to attempt to distinguish the suspected abnormality from, for example, fecal material or an air filled diverticulum. These measurements may be recorded for further analysis, and the location of the abnormality, established by marking the abnormality in box 440, may also be stored, or provided in a menu of marked abnormalities or provided as a print out for further analysis by a user.
- FIG. 5 also depicts the various sub-processes that can be undertaken from the multi-planar (MPR) views that are displayed in FIG. 4. For example, if an abnormality in a particular MPR view is observed, it may be marked as in box 470. Additionally, as the fly through of the colon proceeds, the abnormality may be tracked to determine its extent and general shape, as indicated in box 480.
- the images provided to the user comprise, in essence, an endoluminal movie created along the fly by wire pathway from one end of the colon, for example, the rectum, to the other end, or cecum, of the colon. The movie may also run in reverse to allow a flight from the cecum to the rectum.
- the fly through includes a pointer that can be locked onto the location of an abnormality by the user, and that allows the movie to be played in order to display various aspects of the abnormality, assisting in location, identification, observation and diagnosis of the abnormality.
- the various MPR views also provide the user with the ability to alter the windowing of the view, that is, adjusting the size of the window so that abnormalities or features of interest of various size may be included within a single window, in a sort of zoom in or zoom out function, as in box 490. Moreover, the position of the multi-planar view may be adjusted as necessary to ensure that, to the extent possible, any abnormalities that might be hidden by structures such as Haustral ridges will be visible, as indicated by box 500.
- the MPR views may also include the ability to show a 3-D view of the colon, as indicated by box 510, may be windowed for contrast adjustments, and may be used to trigger the fly over local movie, described above, to localize the viewing of selected abnormalities.
- the present invention includes a method for "flattening" the rendered volume representative of a tubular structure.
- Previous attempts at providing a flattened view have used various methods to display a longitudinal section of the colon, such as would be obtained by slicing through a chord of the tube. The disadvantage of these methods is that the overall view of the colon remained curved, unless such a small chordal diameter for the section was used as to provide no useful data.
- the inventors have developed a method of truly flattening the image of the colon in a manner that may be likened to making a single slice through the wall of the colon and unrolling the tubular structure of the colon until it lies flat. By flattening the colon in this manner, surface abnormalities are highlighted and made visible, even if previously hidden by structures of the colon, such as Haustral ridges.
- FIG. 6 depicts a view of a colon 600 showing a fly by wire pathway through the luminal center of the colon 605. Also shown in FIG. 6 are colon segments 610 and 620, which represent thin slices of the colon which may be converted from a three-dimensional tubular structure into a flattened view in accordance with principles of the present invention.
- Figs. 7-9 illustrate the general principles of the flattening process.
- the colon 650 is a tubular structure having an interior and an exterior defined by a wall.
- the tubular structure of the colon can be thought of as a wall that begins at position zero (not shown) and extends clockwise around FIG. 7 for 360 degrees.
- FIG. 8 depicts a slice of colon wall 655 that has been taken at a desired location.
- the flattened colon wall extends from zero degrees to 360 degrees; the flattened image is formed by unwrapping the wall of the colon (FIG. 7).
- the width of slice 655 may be any width desired by the user, but typically represents a 1 mm wide section of the colon.
- FIG. 9 shows this relationship in more detail.
- a slice 680 of the colon is converted into a flattened image 685.
- the edge of polyp "P" just intrudes upon slice 685, and can be seen from a view that sets the observer's viewpoint directly overhead. This allows smaller structures such as polyps or other abnormalities to be viewed when they otherwise would be hidden by protruding structures of the colon, such as Haustral ridges.
- the process used to render flattened slices of the colon will now be described with reference to Figs. 7-10.
- the first step in the overall display process is loading a three-dimensional volume or a series of two- dimensional images that are used to generate a three-dimensional volume into the memory of the computer or processor that is operating in accordance with a software program embodying the principles of the present invention.
- Using the multi-planar views of the data set that are displayed by the program, a user defines a starting point, typically near the anus or rectal area, of a pathway that extends through the colon, defining a fly by wire pathway through the luminal center of the colon.
- the identified points may also be used to identify key frames, which may be stacked to form the three- dimensional volume.
- the program includes a sub-process or sub-processes that automatically define a directional vector and curved pathway between the points.
- the program provides the user with the ability to automatically center the fly by wire point and to automatically advance the position along the fly by wire pathway.
- the same data may be further processed to create a flattened volume.
- additional processing of the data may be performed to fine tune, optimize and/or smooth the fly by wire pathway.
- the flattening process typically begins with a pre-sampling analysis of the data in order to determine the length of the fly by wire pathway, from which the height of the flattened image may be determined.
- the pre-sampling process moves the camera/user viewpoint along the fly by wire pathway. At each point or step along the pathway, the camera is rotated 90 degrees about the Y-axis to face the colon wall. Next, the sample is rotated 360 degrees about the original Z-axis, typically in 5 degree increments during the pre-sampling round to reduce analysis time. At each step of rotation, a ray is projected into the colon wall, and the distance to the wall is measured. The average distance measured is used to determine the colon radius at the particular point in the fly by wire pathway. The distance along the fly by wire pathway is then incremented, and the process is repeated.
- an image buffer may be created.
- the height of the image buffer is determined by the length of the fly by wire pathway through the colon.
- the width of the image buffer is typically 360 to account for the full 360 degree rotation used to retrace the colon walls.
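A sketch of the pre-sampling pass and buffer sizing described above, assuming the pathway is supplied as (point, local axis, radial direction) triples and that a caller-provided `distance_to_wall(point, direction)` helper returns the ray length to the first wall voxel; those inputs, the Rodrigues rotation helper, and the float buffer type are all assumptions made for illustration.

```python
import numpy as np

def rotate_about_axis(vec, axis, deg):
    """Rotate `vec` about unit vector `axis` by `deg` degrees (Rodrigues' formula)."""
    axis = axis / np.linalg.norm(axis)
    theta = np.deg2rad(deg)
    return (vec * np.cos(theta)
            + np.cross(axis, vec) * np.sin(theta)
            + axis * np.dot(axis, vec) * (1 - np.cos(theta)))

def presample(pathway, distance_to_wall, angle_step_deg=5):
    """Pre-sampling pass: estimate the local colon radius at every pathway point
    and size the flattened image buffer (height = pathway steps, width = 360).

    `pathway` is a sequence of (point, z_axis, radial) triples, where `radial`
    is any direction perpendicular to the local path axis (the camera after its
    90-degree turn toward the wall). `distance_to_wall(point, direction)` is a
    caller-supplied ray-casting helper; both are assumptions for this sketch.
    """
    radii = []
    for point, z_axis, radial in pathway:
        dists = [distance_to_wall(point, rotate_about_axis(radial, z_axis, deg))
                 for deg in range(0, 360, angle_step_deg)]    # coarse 5-degree sampling
        radii.append(float(np.mean(dists)))                   # average distance = local radius
    image_buffer = np.zeros((len(pathway), 360), dtype=np.float32)
    return radii, image_buffer
```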
- the process may be continued to generate the final flattened image.
- the flattened final image is created by once again stepping through the fly by wire pathway.
- the camera is rotated 90 degrees about the Y-axis to face the colon wall.
- the camera (or virtual observer's viewpoint) is rotated 180 degrees, in 1 degree increments during final flattened image generation, about the original Z-axis.
- a ray is projected into the colon wall.
- the ray tracing or projection process occurs by incrementing the camera position by approximately one voxel along the z-vector towards the lumen wall.
- the opacity of the given voxel is determined by comparing the value of the voxel found at the position of the step with an opacity table that is defined by the user. If the voxel that is being stepped through is not air, which is typically defined as zero in the opacity table, its value is added to a voxel value counter based on the weight of its opacity, relative to a pre-defined tolerance. It will be understood that while some voxels are transparent, they affect the image, but have very little relative weight. Once the pre-defined tolerance, which is determined by the user depending on the resolution and contrast, or other windowing parameters selected to optimize the image, has been exceeded, the cumulative voxel value is written to the flattened image buffer.
- the first voxel drawn uses the last vertical line in the flattened image buffer.
- the image buffer y offset is the height of the flattened image buffer minus one.
- the image buffer x offset would then be 180.
- the flattened image is actually drawn into the image buffer starting at zero degrees and rotating to 180 degrees.
- the camera is reset to its original viewpoint facing the colon wall and the process is repeated by rotating the camera 180 degrees in the opposite direction.
- the image buffer offset is reset to 180 and incremented at each rotation since the right side of that horizontal line of the image is drawn first.
- the y-offset of the flattened image buffer is decremented to draw the next row down.
- the process is continued, stepping forward along the fly by wire pathway and repeating the flattening process at each incremental step along the fly by wire pathway until a composite flattened view has been drawn.
- Once the flattened image is completed, it is stored to disk and may then be displayed in a separate window, as depicted in FIG. 4.
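Pulling the steps above together, here is a hedged sketch of the flattening loop: for each pathway point the view direction is swept through 360 one-degree increments, each ray is stepped roughly one voxel at a time toward the wall, non-air voxel values are accumulated against a user-defined opacity table until a tolerance is exceeded, and the accumulated value is written into that point's row of the flattened image. For simplicity the sketch writes columns 0-359 left to right instead of reproducing the patent's two-half (180-degrees-each-way) drawing order, and it expects the per-degree ray directions to be precomputed (for example with the rotation helper from the pre-sampling sketch); those simplifications are assumptions.

```python
import numpy as np

def flatten_step(volume, point, ray_directions, opacity, tolerance, max_steps=512):
    """Render one row of the flattened image for a single pathway point.

    `ray_directions` holds 360 unit vectors, one per degree of rotation about the
    local path axis. `opacity(value)` is the user-defined opacity table (0 = air).
    """
    row = np.zeros(len(ray_directions), dtype=np.float32)
    for deg, direction in enumerate(ray_directions):
        pos = np.array(point, dtype=float)
        accumulated = 0.0
        for _ in range(max_steps):
            pos += direction                                  # step ~one voxel toward the wall
            z, y, x = (int(round(c)) for c in pos)
            if not (0 <= z < volume.shape[0] and
                    0 <= y < volume.shape[1] and
                    0 <= x < volume.shape[2]):
                break
            value = float(volume[z, y, x])
            alpha = opacity(value)
            if alpha > 0:
                accumulated += value * alpha                  # weight the voxel by its opacity
                if accumulated > tolerance:                   # wall reached -> stop this ray
                    break
        row[deg] = accumulated
    return row

def flatten(volume, pathway, opacity, tolerance):
    """Build the full flattened image, one 360-pixel row per pathway point.

    The first pathway point is written to the last (bottom) row of the buffer,
    matching the y-offset bookkeeping described in the text.
    """
    rows = [flatten_step(volume, point, dirs, opacity, tolerance)
            for point, dirs in pathway]
    return np.flipud(np.array(rows, dtype=np.float32))
```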
- FIG. 10 provides a generalized flow chart illustrating the above described flattening process.
- the first step in the flattening process is to initialize variables and set up appropriate buffers and look up tables that are used by the process, as indicated in box 710.
- the process of initializing variables in box 710 also includes picking a starting point along the fly by wire pathway.
- the process increments one step forward along the fly by wire pathway.
- the length of the incremental step is user definable, and is generally in the range of 1.0-2.0 mm.
- the camera, which represents the user's viewpoint, is rotated one degree in box 740.
- a ray is projected to the lumen wall in box 750, as is shown by ray 652 in FIG. 7.
- While ray 652 is shown pointing to 360 degrees in FIG. 7, the ray pointing to 1 degree, the position of the first increment, has been omitted for clarity; its position will be readily understood by a skilled person.
- a voxel value is added to a flattened image buffer, as in box 760.
- the processor tests the opacity value of the voxel along the length of the ray until it determines an opacity value that is greater than zero, indicating that the ray has encountered the inner surface of the colon wall.
- the processor determines whether the camera has been rotated 360 degrees. If the camera has not been rotated 360 degrees, the program branches back to box 740, where the camera is rotated one degree and the processes of boxes 750 and 760 are repeated. If the camera has been rotated 360 degrees, the flattened frame process has been completed for that point on the fly by wire pathway and the process proceeds to box 780. In box 780, the program determines whether or not the end of the fly by wire pathway has been reached. If the end of the fly by wire pathway has not been reached, the process branches back to 720, wherein the location along the fly by wire pathway is incremented one unit (which may be, for example, 1.0 mm).
- the process is repeated for each incremental step until the entire length, or a selected portion, of the fly by wire pathway has been processed.
- the process then encounters the flattened frame sub-process 730 again, and repeats that as necessary. If it is determined that the end of the fly by wire pathway has been reached in box 780, the flattened image is stored to disk or other appropriate storage media in box 790, and then the flattened image is displayed in box 800.
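Putting the steps of FIG. 10 together, the overall loop could be outlined as below. This is a hypothetical outline only: the helpers are trivial stand-ins so the sketch compiles, and none of the names come from the patent.

```cpp
#include <cstdint>
#include <vector>

namespace flatten_sketch {

struct Camera { double positionMm = 0.0; double angleDeg = 0.0; };

const double kStepMm       = 1.0;     // box 720: user-definable, typically 1.0-2.0 mm
const double kPathLengthMm = 1500.0;  // assumed length of the fly by wire pathway

bool endOfPathway(const Camera& c)       { return c.positionMm >= kPathLengthMm; }
void stepForward(Camera& c)              { c.positionMm += kStepMm; }   // box 720
void rotateOneDegree(Camera& c)          { c.angleDeg += 1.0; }         // box 740
uint16_t projectRayToWall(const Camera&) { return 0; /* boxes 750-760: stand-in */ }

// Boxes 710-800 of FIG. 10: one flattened row of 360 pixels per pathway step.
std::vector<uint16_t> flattenColon()
{
    std::vector<uint16_t> flattenedImage;   // box 710: buffers and tables
    Camera camera;                          // box 710: starting point on the pathway

    while (!endOfPathway(camera)) {         // box 780
        stepForward(camera);                // box 720
        camera.angleDeg = 0.0;
        for (int degree = 0; degree < 360; ++degree) {          // boxes 740-770
            rotateOneDegree(camera);                            // box 740
            flattenedImage.push_back(projectRayToWall(camera)); // boxes 750-760
        }
    }
    // box 790: store to disk; box 800: display (omitted in this sketch)
    return flattenedImage;
}

} // namespace flatten_sketch
```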
- An example of the display of the flattened image synchronized with other views of the colon is depicted in the view titled "Variable Width Flattened View" shown in FIG. 4.
- the file containing the flattened view data may be played like a movie; that is, it may be displayed showing forward or backward flight through the virtual colon, to provide the user with the sensation of flying over the surface of the colon, much like an aircraft flight simulation program provides the sensation of flying over a simulated terrain.
- the terrain is the surface of the colon.
- Each horizontal line in a flattened image is linked to a three-dimensional position in the acquired colon volume data set. Since the flattened image was created by stepping along a fly by wire pathway through the volume, the user can jump back to the exact place in the volume from a given position in a flattened image. Vertical position (from rectum to cecum) in the flattened image provides the three-dimensional position along the fly by wire pathway in the center of the lumen of the colon.
- a three-dimensional position on the lumen wall may be obtained. This value may be used to rotate the camera to the angle that was used to project a ray to create that pixel in the flattened image. From this position, a ray is projected just as in the flattening process to move the camera to the actual position in the three-dimensional volume. Projecting the ray in this manner, a sub-surface view of the colon wall may be generated by allowing the ray to continue to increment into the colon wall through solid tissue.
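A simplified sketch of this reverse mapping is given below, assuming the pathway is stored as a list of centerline points and that each flattened column corresponds to one degree of camera rotation; all names are hypothetical.

```cpp
#include <cstddef>
#include <vector>

struct Point3D { double x, y, z; };

// Reverse mapping sketch: a pixel (column, row) in the flattened image is
// traced back to the volume. The row selects the centerline point used for
// that scan line; the column gives the rotation angle (in degrees) at which
// the ray for that pixel was originally projected. From that point and angle
// the ray can be re-projected exactly as in the flattening process.
Point3D rowToCenterlinePoint(const std::vector<Point3D>& pathway, std::size_t row)
{
    return pathway[row];                 // one pathway step per flattened row
}

double columnToAngleDegrees(int column)
{
    return static_cast<double>(column);  // 360 columns, one per degree
}
```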
- Generating a sub-surface view of the colon wall in this manner allows for the assessment of mucosal and submucosal wall thickness for areas of abnormality.
- a localized abnormality may be one of the roughly 7 percent of colorectal cancers that arise not from slow growing visible polyps, but from fast growing, aggressive flat tumors which are difficult to detect by any screening procedure.
- assessing wall thickness in a manner independent of prior window width and window level settings minimizes the possibility of missing constricting tumors, which can be overlooked using only three-dimensional endoluminal views or two-dimensional multiplanar views that rely on narrow high contrast window settings to view and analyze CT image data.
- the cause may be inadequate air insufflation, spasm, or some structure compressing the lumen that is either intrinsic to the colon wall, such as inflammatory bowel disease or a tumor, or extrinsic to the colon, such as an adjacent organ or mass in the immediate vicinity of the colon.
- such a finding may be due to a standing artifact, such as inadequate air insufflation, or to some underlying disease or structural abnormality.
- the circumferential tumor would not readily be seen from the three-dimensional endoluminal view presented in FIG. 4. If the mag axial, mag sagittal, or mag coronal images also shown in FIG. 4 had been presented using traditional "lung" windows (in FIG. 4 these images are properly windowed), the constricting tumor would also have been missed. However, the sub-surface view shows a thickened rind of colon wall that is indicative of an advanced malignant colon tumor.
- FIG. 11 includes an illustration of how the sub-surface structure is determined by stepping the ray through the colon wall and into the structures beyond it. Also included in FIG. 11 are a variable width flattened view and a sub-surface view of a particular section of that flattened view. The sub-surface view is keyed to the display of the graphic illustrating the retracing process.
- Figure 12 is a generalized flow diagram showing the various steps included in rendering the sub-surface view of the volume set for a selected scan line of a flattened image.
- the process used to generate the sub-surface view begins.
- a ray is projected into the lumen wall, as in box 950.
- the ray is stepped x voxels along the vector into the wall, as in box 970.
- the program determines in box 980 whether or not the projection has traveled x pixels, the value of which is chosen by the user. If the projection has not traveled x pixels, the process branches back to box 960, and is repeated.
- the program branches to box 990, where the camera (or viewpoint) is rotated or shifted one degree.
- a pictorial example of this process is shown in FIG. 11.
- the starting point of the ray is depicted by point A where y, the distance along the Y-axis, is arbitrarily set equal to zero. This point is typically selected to be the center of the air filled colonic lumen.
- the processor then projects the ray outwards towards the colon wall. In general, the ray will be projected from the center point a distance of typically 90 pixels, although other projection distances may be selected by the user for this analysis.
- voxel values are calculated based upon an opacity table in box 990 for any structures encountered by the ray and are added to the sub-surface image buffer in step 1000.
- the ray continues to be incrementally projected to point B, where y equals 35, then to point C where y equals 65.
- the units of y may not be an actual measurement of distance, although that is possible.
- the units of y may indicate a proportional step determined from the desired analysis distance, starting from the center point and extending into the wall a desired distance.
- the ray encounters the wall of the colon.
- the ray is then allowed to project a further distance, to point D, where y equals 100, thus rendering an image that includes not only the thickness of the colon wall, but also some distance into the tissue immediately beyond the colon wall.
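The stepping from point A through point D could be sketched as follows; treating y as a simple step counter from the lumen center and using an opacity table to distinguish air from tissue are assumptions made for illustration.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Sketch of sub-surface sampling along one ray: start at the lumen center
// (y = 0, point A), step outward through the air-filled lumen (points B, C),
// cross the colon wall, and keep stepping a fixed distance past the wall
// (to y = 100, point D) so that sub-surface tissue is included.
std::vector<uint16_t> sampleSubSurfaceRay(const std::vector<uint16_t>& voxelsAlongRay,
                                          const std::vector<double>& opacityTable,
                                          int maxSteps /* e.g. 100 */)
{
    std::vector<uint16_t> samples;
    const int limit = std::min<int>(maxSteps, static_cast<int>(voxelsAlongRay.size()));
    for (int y = 0; y < limit; ++y) {
        const uint16_t v = voxelsAlongRay[y];
        if (opacityTable[v] == 0.0)
            continue;            // still inside the air-filled lumen
        samples.push_back(v);    // wall and sub-surface tissue are recorded
    }
    return samples;              // values added to the sub-surface image buffer
}
```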
- the subsurface view, in coordination with the flattened view, provides the user with two different ways of viewing the colon.
- the flattened view may be likened to a stretched aeronautical surface topography map of the colon that contains surface elevation data of structures, like polyps, that extend from the wall of the colon.
- the subsurface view, in contrast, provides a view of the structure above and below the surface of the colonic wall.
- the sub-surface view generated in accordance with the principles of the present invention offers substantial advantages to the user by providing the ability to observe abnormalities or disease related artifacts that would otherwise not be viewable in any other view presently available for analysis of CT scan data.
- the process returns to box 990, where the camera, or user viewpoint is rotated 1 degree.
- the process determines in box 1000 whether the camera has been rotated 360 times, thus rotating about the entire inner surface of the colon wall. If the camera has been rotated 360 times, the program branches to box 1010 and the sub-surface volume is displayed. If the camera has not been rotated 360 times, the program branches back to box 950, where the projection of the ray into the lumen wall is repeated until the entire image has been rendered.
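Taken together, the sub-surface rendering of FIG. 12 for a single scan line could be outlined as below; the helper is a trivial stand-in so the sketch compiles, and the names are not the patent's.

```cpp
#include <cstdint>
#include <vector>

namespace subsurface_sketch {

// Stand-in for boxes 950-980: project a ray into the lumen wall at the given
// angle and step depthVoxels ("x") into it, returning a rendered sample.
uint16_t sampleWallAndBeyond(int angleDeg, int depthVoxels)
{
    (void)angleDeg; (void)depthVoxels;
    return 0;
}

// Boxes 950-1010 of FIG. 12: one sub-surface scan line, one sample per degree.
std::vector<uint16_t> renderSubSurfaceScanLine(int depthVoxels /* chosen by the user */)
{
    std::vector<uint16_t> scanLine;
    scanLine.reserve(360);
    for (int angle = 0; angle < 360; ++angle)   // boxes 990-1000: full rotation
        scanLine.push_back(sampleWallAndBeyond(angle, depthVoxels));
    return scanLine;                            // box 1010: display
}

} // namespace subsurface_sketch
```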
- the isotropic data sets generated using the methods of the present invention may also be used as input for a computer aided diagnostic system.
- the data sets may be further processed using pattern recognition software.
- abnormalities may be identified by computation of three-dimensional features that identify polyps or other abnormal structures.
- Various techniques such as applying quadratic discriminant analysis, hysteresis thresholding and fuzzy clustering may be used to reduce the occurrence of false positive identification of abnormal structures.
- FIG. 13 is a flowchart describing in general the steps used to render a diagnosis using pattern recognition to identify structural abnormalities in a colon.
- the CT scan data is processed and three-dimensional volume data sets are generated in box 1110.
- Data representing the entire colon, or a selected portion of the colon, may be extracted from the three-dimensional volume data set, as indicated in box 1120.
- the extracted data set is then processed in box 1130 using a pattern recognition program that has been designed to compare three-dimensional features of the volume data set with a set of pre-determined geometrical shapes that are representative of the geometric shape of abnormalities found in the colon, such as polyps.
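For illustration only, a shape comparison of this kind might score each segmented candidate with a simple geometric feature and compare it against a threshold derived from known polyp shapes; the sphericity feature below is an assumption, not the patent's algorithm.

```cpp
#include <cmath>

// Sphericity compares a candidate's surface area with that of a sphere of
// equal volume: 1.0 for a perfect sphere, lower for flat or elongated shapes.
// Polyp-like candidates tend to score closer to 1.0 than, say, haustral folds.
double sphericity(double volumeMm3, double surfaceAreaMm2)
{
    const double pi = 3.14159265358979323846;
    const double equivalentSphereArea =
        std::pow(pi, 1.0 / 3.0) * std::pow(6.0 * volumeMm3, 2.0 / 3.0);
    return equivalentSphereArea / surfaceAreaMm2;
}

bool looksLikePolyp(double volumeMm3, double surfaceAreaMm2,
                    double threshold /* tuned on known polyps, e.g. 0.7 */)
{
    return sphericity(volumeMm3, surfaceAreaMm2) >= threshold;
}
```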
- the program generates a listing of abnormalities, including the coordinates within the three-dimensional data set where they are located, which may be used to visually assess each abnormality to determine whether it has been properly identified, or whether it is simply a structural feature that resembles an abnormality but is otherwise benign.
- the data representing the abnormality may also be extracted from the data set and stored separately for further analysis.
- the further processing may be carried out on the entire data set, or on an extracted data set containing volume data representative of the area of the colon surrounding the location of the identified abnormality.
- Such further processing includes applying various techniques known in the art, such as quadratic discriminant analysis, hysteresis thresholding and fuzzy clustering to the data to improve the identification of abnormalities and reduce the occurrence of false positives.
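As one hedged example of such a technique, hysteresis thresholding keeps weak candidate responses only when they are connected to strong ones; the one-dimensional sketch below shows the idea, which is applied per voxel in three dimensions.

```cpp
#include <cstddef>
#include <vector>

// 1-D hysteresis thresholding over a sequence of detection scores: a sample
// is kept if it exceeds the high threshold, or if it exceeds the low threshold
// and is adjacent to a kept sample. Isolated weak responses (likely false
// positives) are suppressed.
std::vector<bool> hysteresisThreshold(const std::vector<double>& scores,
                                      double low, double high)
{
    const std::size_t n = scores.size();
    std::vector<bool> keep(n, false);

    for (std::size_t i = 0; i < n; ++i)        // seed with strong responses
        if (scores[i] >= high)
            keep[i] = true;

    bool changed = true;                       // grow into adjacent weak responses
    while (changed) {
        changed = false;
        for (std::size_t i = 0; i < n; ++i) {
            if (keep[i] || scores[i] < low)
                continue;
            const bool neighborKept = (i > 0 && keep[i - 1]) || (i + 1 < n && keep[i + 1]);
            if (neighborKept) {
                keep[i] = true;
                changed = true;
            }
        }
    }
    return keep;
}
```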
- a report is generated in box 1150 which is communicated to a surgeon and/or the patient's physician.
- This report may contain various information concerning the scan and the abnormalities identified during the scan, and is intended to aid the physician in diagnosis and treatment of the patient.
- An example of such a report is shown in FIGS. 14 and 15.
- FIG. 14 depicts a view of the entire colon of a patient, with particular segments identified.
- FIG. 15 shows a computer generated report keyed to the segments identified in FIG. 14.
- a particularly advantageous application of the system and methods of various embodiments of the present invention is the ability to automatically, or semi-automatically, render volume data sets, diagnose abnormalities and generate reports using relatively inexpensive computers and displays, as well as the ability to do so using data that is communicated to a remote analysis site from one or more scanning centers.
- one embodiment of the present invention contemplates a centralized system for analyzing large volumes of CT scan data communicated to a central analysis location. Such a system would allow the scanning centers to scan patients without being burdened with the overhead of having to also analyze the data and generate reports.
- a single analysis center using the system and method of the present invention would receive and analyze CT scan data from 100 or more scanning centers.
- the data is either processed by the center, or alternatively, could be sent to secondary centers for processing. If the data is sent to secondary centers, the reports and/or processed volume data sets generated by the secondary center are sent back to the analysis center for communication either to the originating scanning center or to surgeons and/or physicians identified by the scanning center.
- Such a system also provides for analysis of the data using skilled technologists who report their findings to radiologists, who then may provide a report containing their opinion after reading the scan data and their suggestions for treatment, which is then communicated to the appropriate surgeon or physician.
- volBuffer = volume->f3dVolumeBuffer;
- zOffsetTable = (Uint32*) volume->fZoffsetTable;
- yOffsetTable = (Uint32*) volume->fYoffsetTable;
- gDestVolIndex.x = 359; // Reset the vectors
- Point3dLongToDouble(&gxVectorDouble, gxVectorLong);
- Point3dLongToDouble(&gyVectorDouble, gyVectorLong);
- theRotAngle = gzVectorDouble;
- gzVectorDouble = gxVectorDouble;
- gxVectorDouble = theRotAngle;
- theRotAngle.z = -gxVectorDouble.z;
- voxelValue = SamplePixelFastDouble2(tempPosition, &gzVectorDouble, gSlopFactor, 0);
- voxelValue = SamplePixelFastDouble2(tempPosition, &gzVectorDouble, gSlopFactor, 0); Update3dVectorsDoubleNEW(theRotAngle); gDestVolIndex.x++; // rotate 360 degrees about z-axis, 1 degree at a time
- Point3dDouble tempPosition, startPosition; char firstVoxel = TRUE; double vertOverSampling;
- volumeBufferMax = gVolumeBufferMax;
- volumeOffsetMax = g3dVolumeBufferSize;
- tempVolumeSizeMax = gTempVolumeSizeMax; // The loop
- startPosition = tempVectorIndex;
- if (yCount > 128) { // VERT_OVERSAMPLE
- volBuffer = volume->f3dVolumeBuffer;
- zOffsetTable = (Uint32*) volume->fZoffsetTable;
- yOffsetTable = (Uint32*) volume->fYoffsetTable;
- voxelValue = SampleXRayFastDouble2(tempPosition, &gzVectorDouble, gSlopFactor, 0);
- voxelValue = SampleXRayFastDouble2(tempPosition, &gzVectorDouble, gSlopFactor, 0);
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Public Health (AREA)
- Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Theoretical Computer Science (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Radiology & Medical Imaging (AREA)
- Optics & Photonics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- High Energy & Nuclear Physics (AREA)
- Computer Graphics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Endocrinology (AREA)
- Gastroenterology & Hepatology (AREA)
- Physiology (AREA)
- Pulmonology (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Abstract
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/507,337 US7640050B2 (en) | 2002-03-14 | 2003-03-14 | System and method for analyzing and displaying computed tomography data |
AU2003218182A AU2003218182B2 (en) | 2002-03-14 | 2003-03-14 | System and method for analyzing and displaying computed tomography data |
EP03714173.6A EP1487333B1 (en) | 2002-03-14 | 2003-03-14 | System and method for analyzing and displaying computed tomography data |
JP2003575817A JP4648629B2 (en) | 2002-03-14 | 2003-03-14 | Computer-linked tomography data analysis and display system and method |
AU2009201075A AU2009201075B2 (en) | 2002-03-14 | 2009-03-18 | System and method for analyzing and displaying computed tomography data |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US36426902P | 2002-03-14 | 2002-03-14 | |
US60/364,269 | 2002-03-14 | ||
US42982702P | 2002-11-27 | 2002-11-27 | |
US60/429,827 | 2002-11-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003077758A1 true WO2003077758A1 (en) | 2003-09-25 |
Family
ID=28045383
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2003/007996 WO2003077758A1 (en) | 2002-03-14 | 2003-03-14 | System and method for analyzing and displaying computed tomography data |
Country Status (7)
Country | Link |
---|---|
US (1) | US7640050B2 (en) |
EP (1) | EP1487333B1 (en) |
JP (1) | JP4648629B2 (en) |
CN (2) | CN101219058B (en) |
AU (2) | AU2003218182B2 (en) |
HK (1) | HK1122974A1 (en) |
WO (1) | WO2003077758A1 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006127875A2 (en) * | 2005-05-26 | 2006-11-30 | Siemens Corporate Research, Inc. | Method and system for guided two dimensional colon screening |
JP2007537770A (en) * | 2003-11-03 | 2007-12-27 | ブラッコ イメージング エス.ピー.エー. | A dynamic crop box determination method for display optimization of luminal structures in endoscopic images |
DE102006018348B4 (en) * | 2005-04-19 | 2010-05-12 | Siemens Corp. Research, Inc. | System and method for fused PET-CT visualization for unfolding the heart |
US7860283B2 (en) | 2006-10-25 | 2010-12-28 | Rcadia Medical Imaging Ltd. | Method and system for the presentation of blood vessel structures and identified pathologies |
US7873194B2 (en) | 2006-10-25 | 2011-01-18 | Rcadia Medical Imaging Ltd. | Method and system for automatic analysis of blood vessel structures and pathologies in support of a triple rule-out procedure |
US7940970B2 (en) | 2006-10-25 | 2011-05-10 | Rcadia Medical Imaging, Ltd | Method and system for automatic quality control used in computerized analysis of CT angiography |
US7940977B2 (en) | 2006-10-25 | 2011-05-10 | Rcadia Medical Imaging Ltd. | Method and system for automatic analysis of blood vessel structures to identify calcium or soft plaque pathologies |
US7984929B2 (en) | 2006-03-13 | 2011-07-26 | Renishaw Plc | Fluid connector for fluid delivery apparatus |
US8147402B2 (en) | 2007-11-29 | 2012-04-03 | Olympus Medical Systems Corp. | Endoscope system |
CN102110309B (en) * | 2004-06-23 | 2013-06-26 | 皇家飞利浦电子股份有限公司 | Virtual endoscopy |
US8747371B2 (en) | 2006-03-13 | 2014-06-10 | Renishaw Plc | Method and apparatus for fluid delivery |
EP1761898B1 (en) * | 2004-06-23 | 2019-03-27 | Koninklijke Philips N.V. | Image processing system for displaying information relating to parameters of a 3-d tubular object |
Families Citing this family (84)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7194117B2 (en) * | 1999-06-29 | 2007-03-20 | The Research Foundation Of State University Of New York | System and method for performing a three-dimensional virtual examination of objects, such as internal organs |
CA2642135C (en) | 2001-11-21 | 2013-04-09 | E-Z-Em, Inc. | Device, system, kit or method for collecting effluent from an individual |
JP4421203B2 (en) * | 2003-03-20 | 2010-02-24 | 株式会社東芝 | Luminous structure analysis processing device |
US7274811B2 (en) * | 2003-10-31 | 2007-09-25 | Ge Medical Systems Global Technology Company, Llc | Method and apparatus for synchronizing corresponding landmarks among a plurality of images |
US20060119622A1 (en) * | 2004-11-23 | 2006-06-08 | General Electric Company | Method and apparatus for volume rendering display protocol |
US9289267B2 (en) * | 2005-06-14 | 2016-03-22 | Siemens Medical Solutions Usa, Inc. | Method and apparatus for minimally invasive surgery using endoscopes |
JP5123182B2 (en) * | 2005-08-17 | 2013-01-16 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Method and apparatus featuring simple click-style interaction with clinical work workflow |
US7806850B2 (en) | 2005-10-24 | 2010-10-05 | Bracco Diagnostics Inc. | Insufflating system, method, and computer program product for controlling the supply of a distending media to an endoscopic device |
US8452061B2 (en) * | 2005-11-30 | 2013-05-28 | The Research Foundation Of State University Of New York | Electronic colon cleansing method for virtual colonoscopy |
WO2007064981A2 (en) * | 2005-11-30 | 2007-06-07 | The Research Foundation Of State University Of New York | Reducing false positives of polyp in cad |
US8135453B2 (en) * | 2005-12-07 | 2012-03-13 | Siemens Corporation | Method and apparatus for ear canal surface modeling using optical coherence tomography imaging |
JP4920260B2 (en) * | 2006-01-25 | 2012-04-18 | 株式会社東芝 | Image diagnostic apparatus, image display apparatus, and image data generation method |
US20070197895A1 (en) | 2006-02-17 | 2007-08-23 | Sdgi Holdings, Inc. | Surgical instrument to assess tissue characteristics |
US7783097B2 (en) * | 2006-04-17 | 2010-08-24 | Siemens Medical Solutions Usa, Inc. | System and method for detecting a three dimensional flexible tube in an object |
JP5345934B2 (en) * | 2006-08-11 | 2013-11-20 | コーニンクレッカ フィリップス エヌ ヴェ | Data set selection from 3D rendering for viewing |
NL1032602C2 (en) * | 2006-09-29 | 2008-04-01 | Koninkl Philips Electronics Nv | Methods, system and computer program product for detecting protrusion. |
US10795457B2 (en) | 2006-12-28 | 2020-10-06 | D3D Technologies, Inc. | Interactive 3D cursor |
US7941213B2 (en) | 2006-12-28 | 2011-05-10 | Medtronic, Inc. | System and method to evaluate electrode position and spacing |
JP4545169B2 (en) * | 2007-04-12 | 2010-09-15 | 富士フイルム株式会社 | Image display method, apparatus and program |
JP4909792B2 (en) * | 2007-04-12 | 2012-04-04 | 富士フイルム株式会社 | Image interpretation support apparatus, method, and program |
CN101711125B (en) | 2007-04-18 | 2016-03-16 | 美敦力公司 | For the active fixing medical electrical leads of long-term implantable that non-fluorescence mirror is implanted |
JP4563421B2 (en) * | 2007-05-28 | 2010-10-13 | ザイオソフト株式会社 | Image processing method and image processing program |
WO2008149274A1 (en) * | 2007-06-07 | 2008-12-11 | Koninklijke Philips Electronics N.V. | Inspection of tubular-shaped structures |
US7978191B2 (en) * | 2007-09-24 | 2011-07-12 | Dolphin Imaging Systems, Llc | System and method for locating anatomies of interest in a 3D volume |
US20100228100A1 (en) | 2007-10-15 | 2010-09-09 | Vining David J | Apparatus and method for use in analyzing a patient's bowel |
US8839798B2 (en) | 2008-04-18 | 2014-09-23 | Medtronic, Inc. | System and method for determining sheath location |
US8340751B2 (en) | 2008-04-18 | 2012-12-25 | Medtronic, Inc. | Method and apparatus for determining tracking a virtual point defined relative to a tracked member |
US8494608B2 (en) | 2008-04-18 | 2013-07-23 | Medtronic, Inc. | Method and apparatus for mapping a structure |
US8532734B2 (en) | 2008-04-18 | 2013-09-10 | Regents Of The University Of Minnesota | Method and apparatus for mapping a structure |
US8663120B2 (en) | 2008-04-18 | 2014-03-04 | Regents Of The University Of Minnesota | Method and apparatus for mapping a structure |
US8260395B2 (en) | 2008-04-18 | 2012-09-04 | Medtronic, Inc. | Method and apparatus for mapping a structure |
CA2665215C (en) * | 2008-05-06 | 2015-01-06 | Intertape Polymer Corp. | Edge coatings for tapes |
US8643641B2 (en) * | 2008-05-12 | 2014-02-04 | Charles G. Passmore | System and method for periodic body scan differencing |
JP5231901B2 (en) * | 2008-08-29 | 2013-07-10 | 株式会社東芝 | Image processing device |
JP5238440B2 (en) * | 2008-10-02 | 2013-07-17 | 株式会社東芝 | Image display device and image display method |
US10345996B2 (en) | 2008-10-22 | 2019-07-09 | Merge Healthcare Solutions Inc. | User interface systems and methods |
US20100100849A1 (en) | 2008-10-22 | 2010-04-22 | Dr Systems, Inc. | User interface systems and methods |
US10768785B2 (en) | 2008-10-22 | 2020-09-08 | Merge Healthcare Solutions Inc. | Pressure sensitive manipulation of medical image data |
JP5575388B2 (en) * | 2008-12-03 | 2014-08-20 | 株式会社東芝 | Image display apparatus and X-ray CT apparatus |
US8175681B2 (en) | 2008-12-16 | 2012-05-08 | Medtronic Navigation Inc. | Combination of electromagnetic and electropotential localization |
US8487933B2 (en) * | 2009-03-31 | 2013-07-16 | General Electric Company | System and method for multi-segment center point trajectory mapping |
US8494613B2 (en) | 2009-08-31 | 2013-07-23 | Medtronic, Inc. | Combination localization system |
US8446934B2 (en) * | 2009-08-31 | 2013-05-21 | Texas Instruments Incorporated | Frequency diversity and phase rotation |
US8494614B2 (en) | 2009-08-31 | 2013-07-23 | Regents Of The University Of Minnesota | Combination localization system |
US8355774B2 (en) | 2009-10-30 | 2013-01-15 | Medtronic, Inc. | System and method to evaluate electrode position and spacing |
US9826958B2 (en) * | 2009-11-27 | 2017-11-28 | QView, INC | Automated detection of suspected abnormalities in ultrasound breast images |
US8854355B2 (en) * | 2009-12-11 | 2014-10-07 | General Electric Company | System and method of visualizing features in an image |
CN101766491B (en) * | 2010-02-04 | 2013-02-27 | 华祖光 | Method and system for virtual colonoscopy without making preparation of cleaning intestinal tract |
US9401047B2 (en) * | 2010-04-15 | 2016-07-26 | Siemens Medical Solutions, Usa, Inc. | Enhanced visualization of medical image data |
US8922546B2 (en) * | 2010-09-30 | 2014-12-30 | Siemens Aktiengesellschaft | Dynamic graphical user interfaces for medical workstations |
WO2012071399A1 (en) | 2010-11-24 | 2012-05-31 | Bracco Diagnostics Inc. | System, device, and method for providing and controlling the supply of a distending media for ct colonography |
US8797350B2 (en) * | 2010-12-20 | 2014-08-05 | Dr Systems, Inc. | Dynamic customizable human-computer interaction behavior |
CN102141380B (en) * | 2010-12-28 | 2012-05-23 | 天津钢管集团股份有限公司 | Method for processing size and image of pipe end of non-contact measuring steel pipe |
EP2699166B1 (en) * | 2011-04-21 | 2019-09-04 | Koninklijke Philips N.V. | Mpr slice selection for visualization of catheter in three-dimensional ultrasound |
BR112014012955A2 (en) * | 2011-12-03 | 2017-06-13 | Koninklijke Philips Nv | planning system, system having operatively coupled viewports, method for planning a procedure, and method for operationally coupling viewports |
US20130328874A1 (en) * | 2012-06-06 | 2013-12-12 | Siemens Medical Solutions Usa, Inc. | Clip Surface for Volume Rendering in Three-Dimensional Medical Imaging |
US9498188B2 (en) * | 2012-07-20 | 2016-11-22 | Fujifilm Sonosite, Inc. | Enhanced ultrasound imaging apparatus and associated methods of work flow |
JP6391922B2 (en) * | 2012-08-08 | 2018-09-19 | キヤノンメディカルシステムズ株式会社 | Medical image diagnostic apparatus, image processing apparatus, and image processing method |
US9095315B2 (en) * | 2012-11-21 | 2015-08-04 | Mckesson Financial Holdings | Method and apparatus integrating clinical data with the review of medical images |
EP2948044A4 (en) | 2013-01-24 | 2016-03-09 | Tylerton Internat Holdings Inc | Body structure imaging |
DE102013215807A1 (en) * | 2013-08-09 | 2015-02-12 | Siemens Aktiengesellschaft | Method for spiral recording with variable table speed at constant pitch and computed tomography device for performing such a method |
US10052495B2 (en) | 2013-09-08 | 2018-08-21 | Tylerton International Inc. | Detection of reduced-control cardiac zones |
US10646183B2 (en) | 2014-01-10 | 2020-05-12 | Tylerton International Inc. | Detection of scar and fibrous cardiac zones |
KR102364490B1 (en) * | 2014-12-15 | 2022-02-18 | 삼성메디슨 주식회사 | Untrasound dianognosis apparatus, method and computer-readable storage medium |
JP2016143194A (en) * | 2015-01-30 | 2016-08-08 | ザイオソフト株式会社 | Medical image processing apparatus, medical image processing method, and medical image processing program |
US10242488B1 (en) * | 2015-03-02 | 2019-03-26 | Kentucky Imaging Technologies, LLC | One-sided transparency: a novel visualization for tubular objects |
US10163262B2 (en) | 2015-06-19 | 2018-12-25 | Covidien Lp | Systems and methods for navigating through airways in a virtual bronchoscopy view |
CN106338423B (en) | 2015-07-10 | 2020-07-14 | 三斯坎公司 | Spatial multiplexing of histological staining |
US10565774B2 (en) * | 2015-09-03 | 2020-02-18 | Siemens Healthcare Gmbh | Visualization of surface-volume hybrid models in medical imaging |
JP6555056B2 (en) * | 2015-09-30 | 2019-08-07 | アイシン精機株式会社 | Perimeter monitoring device |
CN105261052B (en) | 2015-11-03 | 2018-09-18 | 沈阳东软医疗系统有限公司 | Method for drafting and device is unfolded in lumen image |
WO2017114479A1 (en) * | 2015-12-31 | 2017-07-06 | 上海联影医疗科技有限公司 | Image processing method and system |
US10332305B2 (en) * | 2016-03-04 | 2019-06-25 | Siemens Healthcare Gmbh | Cinematic rendering of unfolded 3D volumes |
US10275130B2 (en) * | 2017-05-12 | 2019-04-30 | General Electric Company | Facilitating transitioning between viewing native 2D and reconstructed 3D medical images |
WO2019045144A1 (en) | 2017-08-31 | 2019-03-07 | (주)레벨소프트 | Medical image processing apparatus and medical image processing method which are for medical navigation device |
KR102084251B1 (en) * | 2017-08-31 | 2020-03-03 | (주)레벨소프트 | Medical Image Processing Apparatus and Medical Image Processing Method for Surgical Navigator |
CN109003471A (en) * | 2018-09-16 | 2018-12-14 | 山东数字人科技股份有限公司 | A kind of 3 D human body supersonic anatomy tutoring system and method |
JP7239362B2 (en) * | 2019-03-20 | 2023-03-14 | ソニー・オリンパスメディカルソリューションズ株式会社 | Medical image processing device and medical observation system |
US11151789B1 (en) | 2019-03-25 | 2021-10-19 | Kentucky Imaging Technologies | Fly-in visualization for virtual colonoscopy |
US11044400B1 (en) | 2019-04-03 | 2021-06-22 | Kentucky Imaging Technologies | Frame stitching in human oral cavity environment using intraoral camera |
US11521316B1 (en) | 2019-04-03 | 2022-12-06 | Kentucky Imaging Technologies | Automatic extraction of interdental gingiva regions |
KR102208577B1 (en) * | 2020-02-26 | 2021-01-27 | (주)레벨소프트 | Medical Image Processing Apparatus and Medical Image Processing Method for Surgical Navigator |
JP2023538578A (en) * | 2020-08-20 | 2023-09-08 | コーニンクレッカ フィリップス エヌ ヴェ | Rendering 2D datasets |
WO2022244002A1 (en) * | 2021-05-18 | 2022-11-24 | Ramot At Tel-Aviv University Ltd. | System and method for analyzing abdominal scan |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1998032371A1 (en) | 1997-01-24 | 1998-07-30 | Mayo Foundation For Medical Education And Research | System for two-dimensional and three-dimensional imaging of tubular structures in the human body |
US6283918B1 (en) * | 1997-09-30 | 2001-09-04 | Kabushiki Kaisha Toshiba | Medical image diagnostic apparatus |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4998972A (en) | 1988-04-28 | 1991-03-12 | Thomas J. Fogarty | Real time angioscopy imaging system |
US5699798A (en) | 1990-08-10 | 1997-12-23 | University Of Washington | Method for optically imaging solid tumor tissue |
US5734384A (en) * | 1991-11-29 | 1998-03-31 | Picker International, Inc. | Cross-referenced sectioning and reprojection of diagnostic image volumes |
JP2801452B2 (en) * | 1991-12-25 | 1998-09-21 | アロカ株式会社 | Ultrasonic image forming device |
US5359637A (en) | 1992-04-28 | 1994-10-25 | Wake Forest University | Self-calibrated tomosynthetic, radiographic-imaging system, method, and device |
US5987149A (en) | 1992-07-08 | 1999-11-16 | Uniscore Incorporated | Method for scoring and control of scoring open-ended assessments using scorers in diverse locations |
US5381786A (en) | 1993-02-11 | 1995-01-17 | Wayne State University | Method and apparatus for measurement of luminal dimensions |
US5469353A (en) | 1993-11-26 | 1995-11-21 | Access Radiology Corp. | Radiological image interpretation apparatus and method |
US5782762A (en) | 1994-10-27 | 1998-07-21 | Wake Forest University | Method and system for producing interactive, three-dimensional renderings of selected body organs having hollow lumens to enable simulated movement through the lumen |
US5920319A (en) | 1994-10-27 | 1999-07-06 | Wake Forest University | Automatic analysis in virtual endoscopy |
US5790086A (en) * | 1995-01-04 | 1998-08-04 | Visualabs Inc. | 3-D imaging system |
US5687259A (en) * | 1995-03-17 | 1997-11-11 | Virtual Eyes, Incorporated | Aesthetic imaging system |
WO1997028845A1 (en) | 1996-02-09 | 1997-08-14 | Mayo Foundation For Medical Education And Research | Radiotherapy treatment using medial access transformation |
JP3555912B2 (en) | 1996-07-04 | 2004-08-18 | 富士写真フイルム株式会社 | Image reading transmission system |
JPH1031761A (en) * | 1996-07-17 | 1998-02-03 | Ge Yokogawa Medical Syst Ltd | Image display method and image display device |
US6349373B2 (en) | 1998-02-20 | 2002-02-19 | Eastman Kodak Company | Digital image management system having method for managing images according to image groups |
US6289235B1 (en) | 1998-03-05 | 2001-09-11 | Wake Forest University | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
JP4200546B2 (en) * | 1998-03-09 | 2008-12-24 | 株式会社日立メディコ | Image display device |
AU3086099A (en) * | 1998-03-13 | 1999-09-27 | University Of Iowa Research Foundation, The | A curved cross section based system and method for gastrointestinal tract unraveling |
US6081577A (en) | 1998-07-24 | 2000-06-27 | Wake Forest University | Method and system for creating task-dependent three-dimensional images |
JP4334037B2 (en) | 1998-08-13 | 2009-09-16 | 株式会社東芝 | Medical image processing device |
JP2000051202A (en) * | 1998-08-14 | 2000-02-22 | Ge Yokogawa Medical Systems Ltd | Image displaying method and image displaying device |
US6381557B1 (en) | 1998-11-25 | 2002-04-30 | Ge Medical Systems Global Technology Company, Llc | Medical imaging system service evaluation method and apparatus |
US6381029B1 (en) | 1998-12-23 | 2002-04-30 | Etrauma, Llc | Systems and methods for remote viewing of patient images |
US6342884B1 (en) * | 1999-02-03 | 2002-01-29 | Isurftv | Method and apparatus for using a general three-dimensional (3D) graphics pipeline for cost effective digital image and video editing, transformation, and representation |
JP2000237184A (en) * | 1999-02-19 | 2000-09-05 | Ge Yokogawa Medical Systems Ltd | Photographing method in x-ray ct device and x-ray ct device |
US6510340B1 (en) | 2000-01-10 | 2003-01-21 | Jordan Neuroscience, Inc. | Method and apparatus for electroencephalography |
CN1274898A (en) * | 2000-04-30 | 2000-11-29 | 重庆炜迪科技发展有限公司 | Method of obtaining three-dimensional transparent digital x-ray image |
-
2003
- 2003-03-14 CN CN2008100034318A patent/CN101219058B/en not_active Expired - Fee Related
- 2003-03-14 EP EP03714173.6A patent/EP1487333B1/en not_active Expired - Lifetime
- 2003-03-14 AU AU2003218182A patent/AU2003218182B2/en not_active Ceased
- 2003-03-14 JP JP2003575817A patent/JP4648629B2/en not_active Expired - Fee Related
- 2003-03-14 WO PCT/US2003/007996 patent/WO2003077758A1/en active Application Filing
- 2003-03-14 CN CNB038087952A patent/CN100405973C/en not_active Expired - Fee Related
- 2003-03-14 US US10/507,337 patent/US7640050B2/en not_active Expired - Fee Related
-
2009
- 2009-01-02 HK HK09100041.0A patent/HK1122974A1/en not_active IP Right Cessation
- 2009-03-18 AU AU2009201075A patent/AU2009201075B2/en not_active Ceased
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1998032371A1 (en) | 1997-01-24 | 1998-07-30 | Mayo Foundation For Medical Education And Research | System for two-dimensional and three-dimensional imaging of tubular structures in the human body |
US6283918B1 (en) * | 1997-09-30 | 2001-09-04 | Kabushiki Kaisha Toshiba | Medical image diagnostic apparatus |
Non-Patent Citations (1)
Title |
---|
See also references of EP1487333A4 |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007537770A (en) * | 2003-11-03 | 2007-12-27 | ブラッコ イメージング エス.ピー.エー. | A dynamic crop box determination method for display optimization of luminal structures in endoscopic images |
CN102110309B (en) * | 2004-06-23 | 2013-06-26 | 皇家飞利浦电子股份有限公司 | Virtual endoscopy |
EP1761898B1 (en) * | 2004-06-23 | 2019-03-27 | Koninklijke Philips N.V. | Image processing system for displaying information relating to parameters of a 3-d tubular object |
DE102006018348B4 (en) * | 2005-04-19 | 2010-05-12 | Siemens Corp. Research, Inc. | System and method for fused PET-CT visualization for unfolding the heart |
WO2006127875A3 (en) * | 2005-05-26 | 2007-01-18 | Siemens Corp Res Inc | Method and system for guided two dimensional colon screening |
US7711163B2 (en) | 2005-05-26 | 2010-05-04 | Siemens Medical Solutions Usa, Inc. | Method and system for guided two dimensional colon screening |
WO2006127875A2 (en) * | 2005-05-26 | 2006-11-30 | Siemens Corporate Research, Inc. | Method and system for guided two dimensional colon screening |
US8747371B2 (en) | 2006-03-13 | 2014-06-10 | Renishaw Plc | Method and apparatus for fluid delivery |
US7984929B2 (en) | 2006-03-13 | 2011-07-26 | Renishaw Plc | Fluid connector for fluid delivery apparatus |
US9132265B2 (en) | 2006-03-13 | 2015-09-15 | Renishaw (Ireland) Limited | Method and apparatus for fluid delivery |
US10518075B2 (en) | 2006-03-13 | 2019-12-31 | Renishaw Plc | Method and apparatus for fluid delivery |
US7940977B2 (en) | 2006-10-25 | 2011-05-10 | Rcadia Medical Imaging Ltd. | Method and system for automatic analysis of blood vessel structures to identify calcium or soft plaque pathologies |
US7940970B2 (en) | 2006-10-25 | 2011-05-10 | Rcadia Medical Imaging, Ltd | Method and system for automatic quality control used in computerized analysis of CT angiography |
US7873194B2 (en) | 2006-10-25 | 2011-01-18 | Rcadia Medical Imaging Ltd. | Method and system for automatic analysis of blood vessel structures and pathologies in support of a triple rule-out procedure |
US7860283B2 (en) | 2006-10-25 | 2010-12-28 | Rcadia Medical Imaging Ltd. | Method and system for the presentation of blood vessel structures and identified pathologies |
US8147402B2 (en) | 2007-11-29 | 2012-04-03 | Olympus Medical Systems Corp. | Endoscope system |
Also Published As
Publication number | Publication date |
---|---|
CN100405973C (en) | 2008-07-30 |
AU2003218182A1 (en) | 2003-09-29 |
US7640050B2 (en) | 2009-12-29 |
HK1122974A1 (en) | 2009-06-05 |
CN101219058B (en) | 2012-01-11 |
EP1487333B1 (en) | 2020-07-01 |
AU2003218182B2 (en) | 2008-12-18 |
US20050245803A1 (en) | 2005-11-03 |
AU2009201075B2 (en) | 2012-04-19 |
EP1487333A1 (en) | 2004-12-22 |
CN101219058A (en) | 2008-07-16 |
JP4648629B2 (en) | 2011-03-09 |
EP1487333A4 (en) | 2016-09-21 |
CN1646059A (en) | 2005-07-27 |
AU2009201075A1 (en) | 2009-04-09 |
JP2005520590A (en) | 2005-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7640050B2 (en) | System and method for analyzing and displaying computed tomography data | |
US6694163B1 (en) | Method and system for producing interactive, three-dimensional renderings of selected body organs having hollow lumens to enable simulated movement through the lumen | |
US6272366B1 (en) | Method and system for producing interactive three-dimensional renderings of selected body organs having hollow lumens to enable simulated movement through the lumen | |
AU759501B2 (en) | Virtual endoscopy with improved image segmentation and lesion detection | |
US8682045B2 (en) | Virtual endoscopy with improved image segmentation and lesion detection | |
JP4088348B2 (en) | System for 2D and 3D imaging of tubular structures in the human body | |
JP4359647B2 (en) | Automatic analysis in endoscopy that cannot be detected directly | |
WO2000032106A1 (en) | Virtual endoscopy with improved image segmentation and lesion detection | |
WO1998032371A9 (en) | System for two-dimensional and three-dimensional imaging of tubular structures in the human body | |
WO2006112895A1 (en) | Display and analysis of vessel lumen interiors | |
Valev et al. | Techniques of CT colonography (virtual colonoscopy) | |
Williams | Visualisation of curved tubular structures in medical databases: An application to virtual colonoscopy |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2003575817 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003218182 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003714173 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 20038087952 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 2003714173 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10507337 Country of ref document: US |