WO2014166415A1 - Image guidance method employing two-dimensional imaging - Google Patents
Image guidance method employing two-dimensional imaging
- Publication number
- WO2014166415A1 (PCT/CN2014/075126)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- dimensional
- feature
- real
- feature area
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 49
- 238000003384 imaging method Methods 0.000 title abstract description 14
- 239000003550 marker Substances 0.000 claims abstract description 13
- 238000012216 screening Methods 0.000 claims abstract description 11
- 238000004364 calculation method Methods 0.000 claims abstract description 7
- 238000012360 testing method Methods 0.000 claims description 21
- 210000003484 anatomy Anatomy 0.000 claims description 10
- 230000008569 process Effects 0.000 claims description 9
- 238000012545 processing Methods 0.000 claims description 6
- 230000000717 retained effect Effects 0.000 claims description 6
- 238000001914 filtration Methods 0.000 claims description 3
- 238000012935 Averaging Methods 0.000 claims 1
- 238000002513 implantation Methods 0.000 abstract description 3
- 238000001514 detection method Methods 0.000 abstract 1
- 210000001519 tissue Anatomy 0.000 description 10
- 238000001959 radiotherapy Methods 0.000 description 8
- 230000008859 change Effects 0.000 description 7
- 238000006073 displacement reaction Methods 0.000 description 6
- 239000007943 implant Substances 0.000 description 6
- 238000002474 experimental method Methods 0.000 description 5
- 238000010586 diagram Methods 0.000 description 4
- 210000000988 bone and bone Anatomy 0.000 description 3
- PCHJSUWPFVWCPO-UHFFFAOYSA-N gold Chemical compound [Au] PCHJSUWPFVWCPO-UHFFFAOYSA-N 0.000 description 3
- 229910052737 gold Inorganic materials 0.000 description 3
- 239000010931 gold Substances 0.000 description 3
- 210000004872 soft tissue Anatomy 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 2
- 230000003902 lesion Effects 0.000 description 2
- 238000010845 search algorithm Methods 0.000 description 2
- 238000002591 computed tomography Methods 0.000 description 1
- 230000007812 deficiency Effects 0.000 description 1
- 238000004980 dosimetry Methods 0.000 description 1
- 239000003814 drug Substances 0.000 description 1
- 229940079593 drug Drugs 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 230000004927 fusion Effects 0.000 description 1
- 238000007429 general method Methods 0.000 description 1
- 229910001385 heavy metal Inorganic materials 0.000 description 1
- 238000007917 intracranial administration Methods 0.000 description 1
- 210000004185 liver Anatomy 0.000 description 1
- 210000004072 lung Anatomy 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 229910052751 metal Inorganic materials 0.000 description 1
- 239000002184 metal Substances 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 210000000056 organ Anatomy 0.000 description 1
- 230000008520 organization Effects 0.000 description 1
- 230000008855 peristalsis Effects 0.000 description 1
- 230000002980 postoperative effect Effects 0.000 description 1
- 230000001144 postural effect Effects 0.000 description 1
- 238000007781 pre-processing Methods 0.000 description 1
- 238000002360 preparation method Methods 0.000 description 1
- 238000003672 processing method Methods 0.000 description 1
- 230000005855 radiation Effects 0.000 description 1
- 238000001454 recorded image Methods 0.000 description 1
- 230000029058 respiratory gaseous exchange Effects 0.000 description 1
- 238000010187 selection method Methods 0.000 description 1
- 229910001220 stainless steel Inorganic materials 0.000 description 1
- 239000010935 stainless steel Substances 0.000 description 1
- 238000001356 surgical procedure Methods 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 238000013519 translation Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5223—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data generating planar views from image data, e.g. extracting a coronal view from a 3D image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
Definitions
- the present invention relates to the field of image guidance, and more particularly to an image guidance method using two-dimensional images.
- the basic purpose of image guidance is to reproduce, during actual treatment, the preset patient body position, referred to as the preset body position.
- the doctor collects the patient's 3D CT image, develops a radiotherapy plan based on the image, and selects a point in the image as a treatment center, called the isocenter.
- the anatomical location represented by this center needs to be placed at the center of the treatment device, usually the center of the medical electron linear accelerator (also known as the isocenter, i.e., the device isocenter).
- the preset position refers to the patient position in which the anatomical location represented by the isocenter set in the radiotherapy plan is located at the center of the device.
- image-guided technology determines the deviation (or the desired position adjustment parameters) between the patient's current position and the preset position by acquiring intraoperative images (hereinafter referred to as real-time images) before or during treatment.
- the doctor or radiotherapy technician can adjust the patient's position based on this deviation to improve the placement accuracy and achieve precise radiotherapy.
- the basic steps of implementing image guidance with implanted markers may include: A1) obtaining in advance, from a three-dimensional image (such as CT), the three-dimensional positions of the markers (relative to the preset body position), R1, R2, ..., Rn, called the preset positions; A2) finding the locations of these markers in the acquired real-time images (methods of finding them are described in references [A] and [B]); if two-dimensional images are used for guidance, their positions are found in images taken from at least two angles; A3) obtaining, through a back projection operation, their positions in three-dimensional space at this time, S1, S2, ..., Sn; the positional deviation is then calculated from these positions and the preset positions R1, ..., Rn.
- This positional deviation includes displacement in three-dimensional space and may also include rotation about three degrees of freedom.
- This is a general method with an extensive literature and has been used in commercial products for many years, so it is not elaborated further here. The operator adjusts the treatment couch according to the deviation of the intraoperative position from the preset position, or according to the desired position adjustment.
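The patent treats this deviation calculation as standard and does not spell it out. Purely as an illustrative sketch (our own, not the patent's implementation), the rigid displacement and rotation that best map the preset positions R1...Rn onto the measured positions S1...Sn can be obtained with the usual SVD (Kabsch) alignment; with fewer than three non-collinear points only the translation part is meaningful:

```python
import numpy as np

def rigid_deviation(R, S):
    """R, S: (n, 3) arrays of preset and measured 3D marker/feature positions.
    Returns (rotation, translation) such that S ≈ R @ rotation.T + translation."""
    R, S = np.asarray(R, float), np.asarray(S, float)
    rc, sc = R.mean(axis=0), S.mean(axis=0)
    H = (R - rc).T @ (S - sc)                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    rotation = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    translation = sc - rotation @ rc
    return rotation, translation
```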
- Implanted markers are usually made of heavy metals such as gold or stainless steel; in X-ray images they exhibit higher contrast than the surrounding tissue and are therefore easier to distinguish in a two-dimensional image (Figure 5).
- implanting markers is, however, an invasive clinical procedure that increases patient risk and cost, and its scope of application is limited.
- in addition, the preparation for radiotherapy, such as three-dimensional image acquisition, can only be performed after the markers have been implanted, and thus the treatment has a certain time delay.
- the implanted marker is prone to positional displacement or shedding in the tissue, i.e., deviating from its implantation position.
- displacement or shedding of the markers introduces large positioning errors and is not easily detected. This can be avoided by marking, on the three-dimensional image taken of the patient or on a plurality of two-dimensional DRRs generated from that three-dimensional image, feature areas that reflect the patient's own anatomical features.
- the present invention provides an image guidance method using two-dimensional images that does not require implanted markers.
- the DRR is generated by a projection imaging process that simulates a two-dimensional radiological imaging system, producing the image that would be obtained when the object or patient anatomy represented by the three-dimensional image is placed at a specific position relative to the imaging system.
- the DRRs in the above steps may represent a plurality of imaging systems and acquisition angles, and may be a plurality of sets of DRRs generated by placing the patient's three-dimensional image at a plurality of known body positions (relative to the imaging system).
- the shape of a feature region can be square, circular, spherical, or another shape determined according to a certain template; the candidate regions may comprise feature regions formed by markers together with feature regions formed near anatomical structures, or only feature regions formed near anatomical structures; reference feature regions are then selected from these, wherein at least one of the reference feature regions is near an anatomical structure.
- the feature area is defined on the three-dimensional image. It can be selected on the 3D image or selected on the 2D DRR, and the selected area has obvious features on both the 3D and 2D images.
- step 2) searching for the real-time feature areas: searching in the real-time image for the real-time feature areas corresponding to the reference feature areas of step 1); the real-time feature areas found may correspond to all of the reference feature areas or only to a part of them.
- the positional deviation used for image guidance is determined by comparing the position of the real-time feature area in the real-time image with the position of the reference feature area in the two-dimensional DRR.
- This positional deviation includes displacement in three-dimensional space and may also include rotation about three degrees of freedom.
- the rotation error is estimated mainly by calculating the possible error of the isocenter position.
- the possible error of the isocenter position is L·ε, where L is the distance between the center of the reference feature region and the isocenter, and ε is the rotation error.
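In symbols, with an illustrative numerical example (the numbers are ours, not from the source):

```latex
e \approx L \cdot \varepsilon ,\qquad
\text{e.g. } L = 80\ \text{mm},\ \varepsilon = 1^{\circ} \approx 0.017\ \text{rad}
\;\Rightarrow\; e \approx 1.4\ \text{mm}.
```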
- the center of the feature area is the center point of the feature area.
- the deformation error is estimated as follows: according to the position of each reference feature area in the three-dimensional image, the deformation degree of the tissue along the line connecting that reference feature area and the isocenter is integrated to obtain the possible degree of deformation of the reference feature area relative to the isocenter; these deformation degrees are then averaged or weighted-averaged to obtain an estimate of the deformation degree of the reference feature regions; based on this estimate, an estimate of the isocenter position error caused by the deformation is given, calculated by a look-up table or an empirical formula.
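A minimal sketch of the integration step described above, assuming the three-dimensional image has already been converted into a per-voxel deformation-coefficient map (the function names, the sampling step, and the voxel-size handling are our assumptions, not the patent's):

```python
import numpy as np

def deformation_degree(deform_map, voxel_size_mm, region_center_mm, isocenter_mm, step_mm=1.0):
    """Integrate a per-voxel deformation coefficient along the straight line from a
    reference feature region center to the isocenter (positions given in mm)."""
    p0, p1 = np.asarray(region_center_mm, float), np.asarray(isocenter_mm, float)
    length = np.linalg.norm(p1 - p0)
    n = max(int(length / step_mm), 1)
    total = 0.0
    for t in np.linspace(0.0, 1.0, n):
        point = p0 + t * (p1 - p0)                         # sample point along the line
        idx = np.clip(np.round(point / voxel_size_mm).astype(int),
                      0, np.array(deform_map.shape) - 1)
        total += deform_map[tuple(idx)] * (length / n)     # piecewise line-integral term
    return total

def estimated_deformation(deform_map, voxel_size_mm, region_centers_mm, isocenter_mm, weights=None):
    """Average (or weighted-average) of the per-region deformation degrees."""
    degrees = [deformation_degree(deform_map, voxel_size_mm, c, isocenter_mm)
               for c in region_centers_mm]
    return float(np.average(degrees, weights=weights))
```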
- the selection and screening of the feature regions in the above step 1) can be performed on the three-dimensional image.
- the user can mark a feature point in the three-dimensional image; the feature point is located near an anatomical structure with obvious features, and the area centered on the marked feature point is used as a candidate feature area with obvious features;
- the projection position of the feature point is calculated according to the projection relationship and marked out; it is then judged whether the image near that position has a sufficient degree of feature: if so, the point is retained; if not, it is discarded and re-selected.
- the "significant feature” in this application refers to a more significant change in the gray level of the image near the feature point, rather than a flat, untextured area. These features are usually formed by bony tissue.
- This selection principle is easy for operators familiar with X-ray images to understand and master, and the subsequent real-time feature area search algorithm has high fault tolerance. Therefore, this method can be used to select the reference feature areas for image guidance.
- the positional deviation obtained by the method can fully meet the requirements of image guidance.
- the degree of feature can be calculated in a variety of ways, such as local grayscale changes (which can be expressed by variance, standard deviation), information entropy (entropy), contrast, and so on.
- the specific choice may also need to match the search algorithm of the feature area.
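As a sketch of two of the feature-degree measures mentioned above (local grayscale variance and information entropy); the patch size and bin count are illustrative choices, not values from the source:

```python
import numpy as np

def grayscale_variance(patch):
    """Local grayscale change expressed as variance."""
    return float(np.var(patch.astype(float)))

def entropy(patch, bins=32):
    """Shannon entropy of the patch's gray-level histogram."""
    hist, _ = np.histogram(patch, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def feature_degree(image, center, half_size=16):
    """Score a candidate feature point by the variance of its square neighborhood."""
    r, c = center
    patch = image[max(r - half_size, 0):r + half_size, max(c - half_size, 0):c + half_size]
    return grayscale_variance(patch)
```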
- the selection and screening of the reference feature regions can be performed on the two-dimensional image.
- the steps of selecting, as described in step 1), a plurality of feature regions having distinct features on the two-dimensional DRRs are: a) selecting, from a plurality of two-dimensional DRRs that are generated from the patient's three-dimensional image and represent the same patient position, a first two-dimensional DRR, and selecting on it a point A with strong features as the center point of a feature area with obvious features; b) marking on a second two-dimensional DRR the projection line corresponding to point A (this step can be implemented by computer software); c) selecting, on the second two-dimensional DRR, a point B with strong features from that projection line as the center point of a feature area having significant features; d) using the back projection relationship and the two center points A and B from steps a) and c) to determine the position of the reference feature area in the three-dimensional image (this step can be implemented by computer software).
- the process of selecting and screening the reference feature areas in step 1) may be carried out by the operator based on visual evaluation or automatically by the device responsible for image processing.
- the automatic selection method may be: calculating, according to the foregoing feature degree calculation methods, the feature degree of each point within a certain range around the center of the three-dimensional image, and selecting several local-maximum points as the candidate feature areas.
- a certain preset ratio such as 80% or 100%
- the above similarity can be calculated in many ways, and may include the correlation coefficient, mutual information, and the like.
- the value of the above-mentioned threshold is also related to the method of calculating the similarity.
- for example, when the correlation coefficient is used, the threshold can be set to 0.7.
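A minimal sketch of similarity-based screening using the correlation coefficient; the 0.7 threshold is the example value given above, while the function and variable names are our own:

```python
import numpy as np

def correlation_coefficient(a, b):
    """Pearson correlation coefficient between two equally sized image patches."""
    a = a.astype(float).ravel() - a.mean()
    b = b.astype(float).ravel() - b.mean()
    denom = np.sqrt((a @ a) * (b @ b))
    return float((a @ b) / denom) if denom > 0 else 0.0

def keep_reference_region(ref_patch, test_patches, threshold=0.7):
    """Retain a reference feature region only if it stays similar in every test DRR."""
    return all(correlation_coefficient(ref_patch, p) >= threshold for p in test_patches)
```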
- Another processing method is to select a larger number of reference feature regions and, after the real-time image is obtained, keep the reference feature regions whose corresponding regions in the real-time image have high similarity, while discarding those reference feature regions whose similarity is low and which are therefore difficult to find in the real-time image.
- the purpose of searching for the real-time feature areas in the above step 2) is to find, in the real-time image, the positions of the reference feature areas selected in step 1).
- from the reference feature area center point, that is, the position of the feature point in the three-dimensional image, and the projection relationship of the imaging system, the projection position of the reference feature area in the DRR can be calculated.
- an area of 0.5-6 cm centered on the feature point can be defined in the DRR, and the size and shape of the area (usually square or rectangular) can be selected and adjusted; this area serves as the template for the search.
- the search for the real-time feature area is to find the location of the small area most similar to the template in the corresponding real-time map. There are various methods.
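One simple baseline for such a search (not the specific algorithms of references [A] and [B], which are summarized next) is exhaustive template matching over a window around the expected position, scored with normalized cross-correlation; the window size and names below are our assumptions:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def search_feature(realtime_image, template, expected_top_left, search_radius=40):
    """Slide the template over a window around the expected position and return
    the top-left corner with the highest NCC score."""
    th, tw = template.shape
    r0, c0 = expected_top_left
    best_score, best_pos = -1.0, None
    for r in range(max(r0 - search_radius, 0), r0 + search_radius + 1):
        for c in range(max(c0 - search_radius, 0), c0 + search_radius + 1):
            patch = realtime_image[r:r + th, c:c + tw]
            if patch.shape != template.shape:
                continue
            score = ncc(patch, template)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```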
- one method includes multi-value processing to extract "suspected" regions (blobs), filtering by patterns of shape, size, brightness, etc., vertical-axis (superior-inferior axis) decision, layout (configuration) determination, and other steps; finally, the blobs in the layout giving the best position match are determined as the corresponding feature areas in the real-time image.
- Another method for searching for real-time feature regions is described in [B], including pre-processing, correlation processing, extracting local maximum points to form a candidate region list, and using the CVA algorithm to select the best-matching candidate region as the corresponding feature area in the real-time image.
- This assumption is basically applicable to feature areas located in the central region, but a large error is produced for feature areas located in the edge region.
- the degree of correlation can be calculated according to the principle of the epipolar line. Assume that the center coordinate of a candidate region in real-time image A is (x_A, y_A) and the center coordinate of a candidate region in real-time image B is (x_B, y_B). When calculating the degree of association between the two, the epipolar line in real-time image B corresponding to (x_A, y_A) is found first.
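A sketch of one way to score this association, assuming the geometry of the two imaging systems is summarized by a fundamental matrix F (F, the coordinate convention, and the use of point-to-line distance are our assumptions, not details from the source):

```python
import numpy as np

def epipolar_distance(pt_a, pt_b, F):
    """Distance from candidate center pt_b (image B) to the epipolar line of
    candidate center pt_a (image A); a smaller distance means a stronger association."""
    xa = np.array([pt_a[0], pt_a[1], 1.0])
    xb = np.array([pt_b[0], pt_b[1], 1.0])
    line = F @ xa                         # epipolar line in image B: ax + by + c = 0
    return float(abs(line @ xb) / np.hypot(line[0], line[1]))
```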
- there may be various methods of calculating the positioning in the above step 3) based on the reference feature areas and the real-time feature areas.
- in one method, the image guidance system uses a real-time image taken at a single imaging angle to achieve guidance.
- the second method is to calculate, as in the above-mentioned step A3, the three-dimensional coordinates S1, ..., Sn of the real-time feature regions obtained from the real-time images, and then to calculate the deviation from the preset positions R1, ..., Rn of the corresponding reference feature regions.
- Image guidance using this method eliminates the need for implanted markers, making the procedure non-invasive, reducing patient suffering, and reducing patient care costs. At the same time, because there is no need to wait for an implant to be placed and settle in the body, the method can also shorten the waiting period before treatment, and the patient can start treatment as soon as possible.
- Figure 1 is a schematic diagram of the selected reference feature area.
- in the figure, the "+" mark in the left image indicates the reference feature area; the right image shows the reference feature area.
- FIG. 2 is a schematic diagram showing a reference feature area whose feature information is weak.
- in the figure, the "+" mark in the left image indicates the reference feature area; the right image shows the reference feature area.
- FIG. 3 is a schematic diagram of the process of selecting a reference feature region from the DRR images; S1 and S2 represent the positions of the radiation sources; D1 and D2 represent the imaging detectors corresponding to the two sources;
- FIG. 4 is an example of a reference feature area, selected on the two-dimensional images, that is not suitable for use: it has strong features in both DRRs, but in the three-dimensional image it lies in a region where the grayscale is gentle and there is no obvious feature. It is shown in the three views of the CT image: the left picture is the transverse section, the upper right is the coronal plane, and the lower right is the sagittal plane;
- Figure 5 is a schematic diagram of a reference feature area and a real-time feature area formed by implanted markers. The two DRRs are on the left side; the corresponding real-time images are on the right side, with the implanted markers in the boxes on the right.
Detailed description
- the positional deviation is determined by searching for real-time feature regions in the real-time images and comparing their positions in the real-time images with the positions of the corresponding reference feature regions in the DRRs.
- These reference feature regions are small regions in a three-dimensional image whose projection in a two-dimensional image (including a real-time map and a two-dimensional DRR) is a small region having features different from the surrounding region.
- the specific feature description depends on the image mode and image alignment (also called registration or fusion) algorithm.
- the reference feature area can be a small area with a large and distinctive grayscale change, as shown in Figure 1.
- the reference feature regions selected here are not feature regions formed by artificially implanted markers.
- the reference feature region in the present invention refers to a feature region formed by the body's own anatomical structure rather than by such implanted markers, usually formed near bony structures. At least one such reference feature area is selected, and the size and shape of the reference feature areas are adjustable.
- the selection of these reference feature zones can be done manually. It can be selected by the operator in a three-dimensional image (such as CT) used to generate the DRR, or directly on the DRR.
- the operator selects several feature points in the three-dimensional space as the center of several reference feature areas (which can be implemented by software assistance).
- the positions of these points in the two-dimensional DRRs can be calculated and the selected reference feature regions marked on the two-dimensional DRRs; the operator can then visually judge whether the feature information of these regions is strong enough and decide, one by one, whether to retain or delete each region. It may happen that a reference feature area is barely acceptable in one of the projected DRRs but weaker in the other DRR, as shown in FIG. 2.
- This process can also be done directly on the 2D DRR, as shown in Figure 3.
- the specific steps are as follows: select a point with strong features on one DRR as the reference feature area center point P1; then mark on another DRR the projection line corresponding to the point P1 (as an auxiliary means, this can be implemented by software), that is, the epipolar line L2; the operator selects a point with strong features on the projection line L2 as the center point P2 of the reference feature region in that DRR; the positions of these two points determine the center point position P of the reference feature area in the three-dimensional image.
- one of the above DRRs corresponds to one of the imaging systems and the other DRR corresponds to the other imaging system.
- the information at a point in a DRR (e.g., P1) represents the integrated effect of the medium along the line between the source S1 and the pixel of detector D1 that collects that point's information.
- this line is called the projection line of P1.
- the projection line S1_P1 is projected onto D2 in the other imaging system, consisting of S2 and D2; the line formed by the projections of all points of S1_P1 onto D2 is called the epipolar line corresponding to P1.
- determining the back projection lines from the projection points P1 and P2 on the two DRRs and then calculating their spatial intersection point P is a back projection process and a basic technique in computer vision.
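A minimal sketch of this back projection step. Each imaging system is assumed to be described by its source position and the 3D position of the detector pixel holding P1 or P2; since the two back projection rays rarely intersect exactly, the midpoint of their common perpendicular is used as P (all names are illustrative, not from the source):

```python
import numpy as np

def backproject_point(source1, pixel1, source2, pixel2):
    """Each ray runs from a source through its detector pixel (all 3D points).
    Returns the 3D point closest to both rays (midpoint of the common perpendicular)."""
    s1, d1 = np.asarray(source1, float), np.asarray(pixel1, float)
    s2, d2 = np.asarray(source2, float), np.asarray(pixel2, float)
    u, v, w0 = d1 - s1, d2 - s2, s1 - s2
    a, b, c = u @ u, u @ v, v @ v
    d, e = u @ w0, v @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:                 # rays (nearly) parallel: degenerate fallback
        t, s = 0.0, (e / c if c else 0.0)
    else:
        t = (b * e - c * d) / denom        # parameter along ray 1
        s = (a * e - b * d) / denom        # parameter along ray 2
    return 0.5 * ((s1 + t * u) + (s2 + s * v))
```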
- the reference feature areas selected according to the above steps may be further screened to ensure that they meet the requirements for feature areas.
- a feature area selected on the two-dimensional images may have obvious features in both two-dimensional images but no significant feature in the three-dimensional image, as at the "+" mark in Figure 4; some feature areas may change greatly in the real-time image because of the difference between the intraoperative position and the preset position (especially when there is a large angular difference), which makes them difficult to find in the real-time images. Such feature areas do not meet the requirements.
- the screening methods for the reference feature area are:
- the feature degree of a feature area in the three-dimensional image may be measured by the variation of gray values in its three-dimensional neighborhood or by a measure of information content; if the feature degree does not reach a certain threshold, the area is deleted.
- generate test DRRs representing body positions that differ from the preset body position (i.e., larger body position differences, including larger angles and larger displacements); according to the projection relationship, calculate the positions of the feature points, that is, of the reference feature areas, in these test DRRs; for each reference feature region, calculate its similarity between the test DRRs and the DRR representing the preset body position. The operator can decide whether to retain or delete the reference feature area based on the degree of similarity: if the similarity is high in all test DRRs, the area can be retained.
- the above screening process can also be done automatically by the image processing device.
- the test DRR is generated, the position of the feature point in the test DRR is calculated, and the similarity of the reference feature area is calculated automatically.
- since the selected reference feature areas and real-time feature areas are non-implanted feature areas, the projection of the three-dimensional structure onto the two-dimensional imaging plane changes with the position and orientation of the imaged body.
- consequently, a reference feature area in the DRR representing the preset body position may decrease in similarity to the corresponding real-time feature area in the real-time image. This change is gradual and is mainly affected by orientation deviations.
- a series of DRRs can be generated representing the expected X-ray images when the imaged body has certain positional deviations from the preset body position, and the feature areas whose projections do not change greatly across these DRRs are then selected as the reference feature areas.
- the magnitude of the error is related to the distance between the center of the reference feature area and the isocenter.
- this applies to the positioning error calculated from the positions of the feature areas, especially to the rotation error.
- the rotation error here refers to the error in the displacement result caused by a possible error in the rotation part of the positional deviation, which is amplified by the above distance.
- the estimation of the rotation error can be based on experimental data.
- the rotation error ε can be estimated experimentally, usually as the root mean square of the errors between the test results and the known positions, measured at multiple known positions.
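In symbols (the index notation is ours): if the guidance result is compared with N known test positions, the rotation error can be taken as

```latex
\varepsilon \;=\; \sqrt{\frac{1}{N}\sum_{i=1}^{N}\bigl(\theta_i^{\text{measured}} - \theta_i^{\text{known}}\bigr)^{2}},
```

where \theta_i^{\text{measured}} is the rotation estimated at the i-th test position and \theta_i^{\text{known}} is its true value.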
- soft tissue, lung tissue, and bone have different gray values in CT and X-ray images and can be segmented accordingly, and a corresponding degree of deformation can be assigned to each (the degree of deformation can be set according to the elastic coefficient of the tissue).
- the deformation degree of the tissue along the line between a feature point and the isocenter can then be integrated to obtain a measure of the possible deformation of that point relative to the isocenter.
- the deformation degrees of all the feature points are averaged or weighted-averaged to obtain an estimate of the deformation degree of the currently selected feature regions.
- This deformation estimate can be provided directly to the operator as a reference to the isocenter position error, or an estimate of the isocenter position error caused by the deformation can be given based on this estimate.
- the final step of estimating the isocenter position error from the deformation degree estimate may be a look-up table or an empirical formula calculation. For example, there is a large amount of literature available on the measurement of tissue elastic coefficients, such as [ ⁇ ]. Both the table and the empirical formula can be determined experimentally.
- This kind of error is also related to the specific anatomical location; the error is more likely to be large in parts affected by breathing and peristalsis.
- the error caused by motion here is mainly related to the tissue between the isocenter and the selected feature areas. This can be handled by assigning different motion coefficients to the various anatomical parts, for example, a large motion coefficient to tissues and organs near the diaphragm, such as the liver, and a small motion coefficient to areas with less motion, such as the intracranial region. This coefficient can be presented directly to the user as an estimate of the motion error, or it can be converted into an error value by an empirical formula.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Heart & Thoracic Surgery (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Biophysics (AREA)
- High Energy & Nuclear Physics (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- General Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Quality & Reliability (AREA)
- Computer Graphics (AREA)
- Pulmonology (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Image Analysis (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310127363.7 | 2013-04-12 | ||
CN201310127363.7A CN103876763A (en) | 2012-12-21 | 2013-04-12 | Image guide method implemented by aid of two-dimensional images |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014166415A1 true WO2014166415A1 (en) | 2014-10-16 |
Family
ID=51690142
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2014/075126 WO2014166415A1 (en) | 2013-04-12 | 2014-04-10 | Image guidance method employing two-dimensional imaging |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2014166415A1 (en) |
-
2014
- 2014-04-10 WO PCT/CN2014/075126 patent/WO2014166415A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0429148A1 (en) * | 1989-11-15 | 1991-05-29 | George S. Allen | Method and apparatus for imaging the anatomy |
CN101076282A (en) * | 2004-09-30 | 2007-11-21 | 安科锐公司 | Dynamic tracking of moving targets |
CN101032650A (en) * | 2006-03-10 | 2007-09-12 | 三菱重工业株式会社 | Radiotherapy device control apparatus and radiation irradiation method |
CN101478918A (en) * | 2006-06-28 | 2009-07-08 | 艾可瑞公司 | Parallel stereovision geometry in image-guided radiosurgery |
WO2012119649A1 (en) * | 2011-03-09 | 2012-09-13 | Elekta Ab (Publ) | System and method for image-guided radio therapy |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11257241B2 (en) | System and method for component positioning by registering a 3D patient model to an intra-operative image | |
Bert et al. | Clinical experience with a 3D surface patient setup system for alignment of partial-breast irradiation patients | |
Tomazevic et al. | 3-D/2-D registration of CT and MR to X-ray images | |
Aubry et al. | Measurements of intrafraction motion and interfraction and intrafraction rotation of prostate by three-dimensional analysis of daily portal imaging with radiopaque markers | |
CN101443816B (en) | Image deformable registration for image-guided radiation therapy | |
US11911110B2 (en) | System and method for registration between coordinate systems and navigation of selected members | |
US20200237445A1 (en) | System and Method for Registration Between Coordinate Systems and Navigation of Selected Members | |
JP5243754B2 (en) | Image data alignment | |
US20150150523A1 (en) | On-site verification of implant positioning | |
US20080037843A1 (en) | Image segmentation for DRR generation and image registration | |
Schmid et al. | A phantom study to assess accuracy of needle identification in real-time planning of ultrasound-guided high-dose-rate prostate implants | |
EP2032039A2 (en) | Parallel stereovision geometry in image-guided radiosurgery | |
JP7513980B2 (en) | Medical image processing device, treatment system, medical image processing method, and program | |
Russakoff et al. | Intensity-based 2D-3D spine image registration incorporating a single fiducial marker1 | |
Huang et al. | Rapid dynamic image registration of the beating heart for diagnosis and surgical navigation | |
US10376712B2 (en) | Real-time applicator position monitoring system | |
Chaoui et al. | Recognition-based segmentation and registration method for image guided shoulder surgery | |
WO2002061680A2 (en) | Surface imaging | |
CN101632570B (en) | Calibration method of medical endoscope | |
CN103876763A (en) | Image guide method implemented by aid of two-dimensional images | |
US20050288574A1 (en) | Wireless (disposable) fiducial based registration and EM distoration based surface registration | |
Shin et al. | Markerless registration for intracerebral hemorrhage surgical system using weighted iterative closest point (ICP) | |
CN113545848B (en) | Registration method and registration device of navigation guide plate | |
CN116452755B (en) | Skeleton model construction method, system, medium and equipment | |
CN116035832A (en) | Apparatus and method for registering live and scanned images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14782375 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14782375 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 03/05/2016) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14782375 Country of ref document: EP Kind code of ref document: A1 |