CN112494029A - Real-time MR cine data reconstruction method and system
- Publication number: CN112494029A (application CN202011346985.5A)
- Authority: CN (China)
- Legal status: Granted
Classifications
- A61B5/055: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
- A61B5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267: Classification of physiological signals or data involving training the classification device
Abstract
A method comprises the following steps: training an algorithm using fully sampled retrospective MR cine data; and applying the trained algorithm to real-time MR cine data to produce a reconstructed MR image.
Description
Cross Reference to Related Applications
This application claims the benefit of U.S. provisional application No. 62/941,904, filed on November 29, 2019, and U.S. application No. 17/060,988, filed on October 1, 2020, which are hereby incorporated by reference in their entirety.
Technical Field
Aspects of the present disclosure relate generally to Magnetic Resonance Imaging (MRI), and in particular to the use of parallel imaging for real-time MR cine image reconstruction.
Background
MRI is a widely used medical technique that uses magnetic fields and radio frequency energy to generate images of a region of interest. During an MRI scan, volume coils (e.g., body coils) and local coils (e.g., surface coils) may acquire MR signals resulting from nuclear relaxation inside the subject being examined. Cardiovascular MRI (CMRI) is widely recognized as the gold standard for clinical assessment of cardiac structure and function. Standard CMR applications such as retrospective cine (retro-cine) rely on ECG gating plus several breath holds to provide high diagnostic image quality. This can present difficulties because a typical patient who needs an assessment of cardiac structure and function may have an irregular heartbeat and have difficulty holding their breath. Real-time cardiac cine MRI uses relatively fast image acquisition and, compared to retrospective cine MRI, therefore requires neither ECG gating nor breath holding during data acquisition. Thus, real-time cine may be more useful for patients who have difficulty holding their breath during an MRI examination or who have irregular cardiac signals. However, to achieve the required image acquisition speed, real-time cine typically acquires highly undersampled data (often at acceleration factors greater than 10X) while using parallel imaging techniques. This presents computational challenges for MRI image reconstruction, both in reconstructing the undersampled data and in reconstructing data from the multiple coils used in parallel imaging.
Compressed-sensing based methods have been proposed for real-time cine reconstruction. In addition, several deep learning based methods have been proposed for MRI reconstruction. For example, Qin et al. ("Convolutional recurrent neural networks for dynamic MR image reconstruction," IEEE Transactions on Medical Imaging 38.1 (2018): 280-290) developed a convolutional recurrent neural network for cardiac MRI image reconstruction. However, these studies have several limitations. In a traditional machine learning or deep learning framework, a gold standard dataset is required to teach the deep learning model how to reconstruct the image. However, given the sampling time and the number of coils, it is almost impossible to acquire fully sampled real-time cine data between heartbeats. As a result, the previously proposed methods are evaluated on simulated undersampled data derived from retrospective cine data rather than on actual real-time cine data; the acceleration rates used are lower than 10X; the methods were designed only for single-coil image reconstruction rather than multi-coil (parallel imaging) reconstruction; and evaluation is performed using only image quality metrics rather than clinical usefulness.
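By way of non-limiting illustration of the data issue described above, the following sketch (Python/NumPy; the mask pattern, acceleration factor, and array shapes are assumptions chosen only for illustration) shows how undersampled k-space resembling a real-time acquisition can be simulated from fully sampled retrospective cine data:

```python
# Non-limiting sketch: simulate undersampled "real-time-like" k-space from
# fully sampled retrospective cine data. Shapes and mask design are
# illustrative assumptions, not taken from the patent.
import numpy as np

def undersample_kspace(kspace, acceleration=10, num_low_freq=8, seed=0):
    """kspace: complex array [frames, coils, ky, kx], fully sampled."""
    rng = np.random.default_rng(seed)
    frames, _, ky, _ = kspace.shape
    mask = np.zeros((frames, 1, ky, 1), dtype=kspace.dtype)
    center = ky // 2
    for t in range(frames):
        # Always keep a small, fully sampled band of low spatial frequencies.
        mask[t, 0, center - num_low_freq // 2:center + num_low_freq // 2, 0] = 1
        # Randomly keep roughly ky / acceleration phase-encoding lines per frame.
        picks = rng.choice(ky, size=max(ky // acceleration, 1), replace=False)
        mask[t, 0, picks, 0] = 1
    return kspace * mask, mask

def to_image(kspace):
    """Inverse 2D Fourier transform along the last two (spatial) axes."""
    return np.fft.fftshift(
        np.fft.ifft2(np.fft.ifftshift(kspace, axes=(-2, -1)), axes=(-2, -1)),
        axes=(-2, -1))
```

Because each frame keeps a different random subset of phase-encoding lines, the resulting aliasing is largely incoherent along the time dimension, which is the redundancy a temporal reconstruction network can exploit.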
Disclosure of Invention
It would be advantageous to provide methods and systems that enable high quality reconstruction of real-time cardiac cine MRI. The disclosed embodiments relate to an algorithm for real-time cardiac cine MR image reconstruction from undersampled real-time cine data acquired with parallel imaging.
According to one aspect of the disclosure, a method comprises: training an algorithm using fully sampled retrospective cine data; and applying the trained algorithm to the real-time MR cine data to produce a reconstructed MR image.
The method may include training the algorithm using one or more of the downsampled retrospective cine data and the downsampling mask.
The method may further include training the algorithm using retrospective cine data from individual coils of the multi-coil MR scanner.
The real-time MR cine data may include real-time MR cine data from individual coils of a multi-coil MR scanner.
The fully sampled retrospective cine data may be used to calculate losses during training, where the losses may include one or more of a mean square error loss, an L1 loss, a Structural Similarity Index (SSIM) loss, or a Huber loss.
The algorithm may include a residual convolutional recurrent neural network.
The real-time MR cine data may include undersampled multi-coil real-time MR cine data.
The real-time MR cine data may include real-time MR cine data from individual coils of a multi-coil MR scanner, and the algorithm may include a plurality of algorithms, each algorithm configured to be applied to data from a different individual coil of the multi-coil MR scanner.
The method may include combining reconstructed images from the plurality of algorithms using a root sum of squares (RSS) combination or coil sensitivity maps to generate a final combined image.
The real-time MR cine data may include real-time MR cine data from individual coils of the multi-coil MR scanner and the algorithm may include a single algorithm configured to be applied to the data from the individual coils of the multi-coil MR scanner.
According to another aspect of the disclosure, a system comprises: a real-time MR cine data source; and a computing circuit implementing an algorithm trained using the fully sampled retrospective cine data, wherein the trained algorithm is configured to produce reconstructed MR images when applied to the real-time MR cine data.
These and other aspects, embodiments, and advantages of the exemplary embodiments will become apparent from the embodiments described herein, when considered in conjunction with the accompanying drawings. It is to be understood, however, that the description and drawings are designed solely for purposes of illustration and not as a definition of the limits of the disclosed invention, for which reference should be made to the appended claims. Additional aspects and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. Furthermore, the aspects and advantages of the invention may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims.
Drawings
In the following detailed part of the disclosure, the invention will be explained in more detail with reference to exemplary embodiments shown in the drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and in which:
FIG. 1 illustrates an exemplary process flow in accordance with aspects of the disclosed embodiments;
FIG. 2 illustrates an embodiment of an exemplary system incorporating aspects of the disclosed embodiments;
FIG. 3 illustrates an embodiment of another exemplary system incorporating aspects of the disclosed embodiments;
FIG. 4 illustrates a schematic block diagram of an exemplary multi-coil MRI data source in accordance with the disclosed embodiments;
FIGS. 5A and 5B illustrate different MRI multi-coil arrangements in accordance with the disclosed embodiments;
FIG. 6 illustrates an exemplary architecture of a computing circuit in which an algorithm may be implemented in accordance with the disclosed embodiments;
FIG. 7 illustrates an example of an algorithm in the form of a deep residual convolutional neural network; and
fig. 8 illustrates an example of a bi-directional convolutional recurrent neural network element.
Detailed Description
In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure. However, it should be apparent to one skilled in the art that the present disclosure may be practiced without such details. In other instances, well known methods, procedures, systems, components, and/or circuits have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present disclosure. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Thus, the present disclosure is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
It will be understood that the terms "system," "unit," "module," and/or "block" as used herein are a means for distinguishing different components, elements, parts, portions, or assemblies at different levels. However, these terms may be replaced by other expressions that achieve the same purpose.
It will be understood that when an element, module or block is referred to as being "on," "connected to" or "coupled to" another element, module or block, it can be directly on, connected or coupled to the other element, module or block or intervening elements, modules or blocks may be present, unless the context clearly dictates otherwise. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Generally, the words "module," "unit," or "block" as used herein refer to logic embodied in hardware or firmware, or to a collection of software instructions. The modules, units, or blocks described herein may be implemented as software and/or hardware and may be stored in any type of non-transitory computer readable medium or another storage device. In some embodiments, software modules/units/blocks may be compiled and linked into an executable program. It should be understood that software modules may be invoked from other modules/units/blocks or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules/units/blocks configured for execution on a computing device may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disk, or any other tangible medium, or as digitally downloaded material (and may be initially stored in a compressed or installable format requiring installation, decompression, or decryption prior to execution). Such software code may be stored, partially or completely, on a storage device of the executing computing device, for execution by the computing device. The software instructions may be embedded in firmware, such as an erasable programmable read-only memory (EPROM). It should also be understood that hardware modules/units/blocks may be included in connected logic components, such as gates and flip-flops, and/or may be included in programmable units, such as programmable gate arrays or processors. The modules/units/blocks or computing device functions described herein may be implemented as software modules/units/blocks, but may also be represented in hardware or firmware. In general, a module/unit/block described herein refers to a logical module/unit/block that may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks, regardless of their physical organization or storage. The description may apply to the system, the engine, or a portion thereof.
The terminology used herein is for the purpose of describing particular examples and embodiments only and is not intended to be limiting. As used herein, the singular forms "a", "an" and "the" may also be intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this disclosure, specify the presence of integers, means, acts, features, steps, elements, operations, and/or components, but do not preclude the presence or addition of one or more other integers, means, acts, features, steps, elements, operations, components, and/or groups thereof.
These and other features and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description with reference to the accompanying drawings, all of which form a part of this disclosure. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the disclosure. It should be understood that the drawings are not to scale.
The disclosed embodiments relate to a method comprising: training an algorithm using fully sampled retrospective cine data; and applying the trained algorithm to multi-coil real-time MR cine data to produce reconstructed MR images.
The disclosed embodiments also relate to a system comprising: a source of fully sampled retrospective cine MR data; an algorithm configured to be trained using fully sampled retrospective cine MR data; and a source of multi-coil real-time MR cine data, wherein the trained algorithm may be applied to the multi-coil real-time MR cine data to generate reconstructed MR images.
Referring to FIG. 1, a schematic block diagram of an exemplary system 100 incorporating aspects of the disclosed embodiments is illustrated. The system may include a source 102 of multi-coil real-time MR cine data, an algorithm 104, and a source 106 of fully sampled retrospective cine MR data. The fully sampled retrospective cine data 106 may be used to train the algorithm 104, and the trained algorithm 104 may be applied to the multi-coil real-time MR cine data 102 to produce reconstructed MR images 108.
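By way of non-limiting illustration, the following PyTorch-style sketch outlines the flow of FIG. 1; the model interface, the data loaders, and the loss function are assumptions made only for illustration and are not the patent's required implementation:

```python
# Non-limiting sketch of the FIG. 1 flow. The model interface, the loaders,
# and the loss function are illustrative assumptions.
import torch

def train(model, retro_loader, optimizer, loss_fn, epochs=50):
    # retro_loader yields (undersampled_input, mask, fully_sampled_target)
    # pairs simulated from the fully sampled retrospective cine data (106).
    model.train()
    for _ in range(epochs):
        for x_us, mask, target in retro_loader:
            pred = model(x_us, mask)        # reconstructed cine frames
            loss = loss_fn(pred, target)    # target = fully sampled cine
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model

@torch.no_grad()
def reconstruct(model, realtime_batches):
    # Apply the trained algorithm (104) to real-time cine data (102)
    # to produce reconstructed MR images (108).
    model.eval()
    return [model(x_us, mask) for x_us, mask in realtime_batches]
```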
Figs. 2 and 3 illustrate embodiments of exemplary systems 200, 300 incorporating aspects of the disclosed embodiments. Fig. 2 illustrates an exemplary embodiment that provides an independent multi-coil reconstruction by providing an algorithm, in the form of a deep learning model, for each individual coil of a multi-coil MR system. An Inverse Fourier Transform (iFT) may be performed on the downsampled k-space real-time multi-coil data 202 to produce a multi-coil image 204 that includes aliasing and artifacts due to the downsampling. The image 204 with aliasing and artifacts may be split into separate images 206₁-206ₙ according to the separate coils from which they were obtained. Associated algorithms 208₁-208ₙ may be applied to each image 206₁-206ₙ to generate corresponding reconstructed images 210₁-210ₙ, and the reconstructed images 210₁-210ₙ may be combined to produce a combined reconstructed image 212. The reconstructed images from the individual coils may be combined using a Root Sum of Squares (RSS) or coil sensitivity maps to generate the final combined reconstructed image 212. Root Sum of Squares (RSS) refers to a calculation that squares each coil image, sums over the coils, and then takes the square root.
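A minimal sketch of the independent per-coil reconstruction and RSS combination of FIG. 2 (PyTorch; the per-coil model interface, tensor shapes, and helper names are illustrative assumptions):

```python
# Non-limiting sketch of FIG. 2: one network per coil, combined by RSS.
# coil_kspace: complex tensor [frames, coils, ky, kx]; models: list of
# per-coil networks (208_1..208_n). Interfaces are illustrative assumptions.
import torch

def ifft2c(kspace):
    return torch.fft.ifftshift(
        torch.fft.ifft2(torch.fft.fftshift(kspace, dim=(-2, -1)), dim=(-2, -1)),
        dim=(-2, -1))

def rss(coil_images, dim=1):
    # Root sum of squares: square each coil image, sum over coils, take the root.
    return torch.sqrt((coil_images.abs() ** 2).sum(dim=dim))

def reconstruct_per_coil(coil_kspace, mask, models):
    aliased = ifft2c(coil_kspace)                    # 204: images with aliasing
    recon = [models[c](aliased[:, c:c + 1], mask)    # 208_c applied to image 206_c
             for c in range(coil_kspace.shape[1])]   # -> reconstructions 210_1..210_n
    return rss(torch.cat(recon, dim=1), dim=1)       # 212: RSS-combined image
```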
Fig. 3 illustrates an exemplary embodiment that provides a parallel multi-coil reconstruction by applying a single algorithm to the images from all of the coils. An Inverse Fourier Transform (iFT) may be performed on the downsampled k-space real-time multi-coil data 302 to produce a multi-coil image 304 that includes aliasing and artifacts due to the downsampling. The image 304 with aliasing and artifacts may be split into separate images 306₁-306ₙ according to the separate coils from which they were obtained. The images 306₁-306ₙ may be provided together as input to a single algorithm 308 to produce a reconstructed image 310.
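The parallel (single-algorithm) variant of FIG. 3 can be sketched in the same style, reusing the ifft2c helper from the previous sketch; stacking the real and imaginary parts of the coil images along the channel dimension is an assumption about the input layout:

```python
# Non-limiting sketch of FIG. 3: all coil images are fed jointly to a single
# network 308. The real/imaginary channel stacking is an assumption.
import torch

def reconstruct_joint(coil_kspace, mask, model):
    aliased = ifft2c(coil_kspace)                             # 304: [frames, coils, ky, kx]
    stacked = torch.cat([aliased.real, aliased.imag], dim=1)  # 2 * coils input channels
    return model(stacked, mask)                               # 310: reconstructed image
```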
Fig. 4 shows a schematic block diagram of an exemplary multi-coil MRI data source in the form of an MRI apparatus 402 for providing multi-coil MRI data in accordance with the disclosed embodiments. The MRI device 402 may include an MRI scanner 404, control circuitry 406, and a display 408. The function, size, type, geometry, location, number, or magnitude of the MRI scanner 404 may be determined or changed according to one or more particular conditions. For example, the MRI scanner 404 may be designed to surround a subject (or region of a subject) to form a tunnel MRI scanner (referred to as a closed bore MRI scanner) or an open MRI scanner (referred to as an open bore MRI scanner). As another example, an MRI scanner may be portable and may be transported along a hallway and through a doorway to a patient, thereby providing MR scanning services to the patient, as opposed to transporting the patient to the MRI scanner. In some examples, the portable MRI scanner may be configured to scan a region of interest of the subject, for example, the brain, spinal cord, limbs, heart, blood vessels, and internal organs of the subject.
As shown in phantom in fig. 4, the MRI scanner 404 may include a magnetic field generator 410, a gradient magnetic field generator 412, and a Radio Frequency (RF) generator 414, all of which surround a table 416 on which a subject under study may be positioned. The MRI scanner 404 may also include: one or more coil arrays 418; an ECG signal sensor 420 for capturing MRI data in the form of an ECG signal from a subject under study during an MRI scan; and a camera 422 for capturing MRI data in the form of video images of the subject under study during an MRI scan.
In some embodiments, the MRI scanner 404 may perform a scan of the subject or a region of the subject. The subject may be, for example, a human or other animal body. In some embodiments, the subject may be a patient. The region of the subject may comprise a portion of the subject. For example, the region of the subject may include a tissue of a patient. The tissue may include, for example, lung, prostate, breast, colon, rectum, bladder, ovary, skin, liver, spine, bone, pancreas, cervix, lymph, thyroid, spleen, adrenal gland, salivary gland, sebaceous gland, testis, thymus, penis, uterus, trachea, skeletal muscle, smooth muscle, heart, and the like. In some embodiments, the scan may be a pre-scan for a calibration imaging scan. In some embodiments, the scan may be an imaging scan used to generate an image.
The main magnetic field generator 410 can generate a static magnetic field B0 and may comprise, for example, a permanent magnet, a superconducting magnet, a resistive electromagnet, or any magnetic field generating device suitable for generating a static magnetic field. The gradient magnetic field generator 412 may use coils to generate magnetic fields in the same direction as B0 but with a gradient in one or more directions (e.g., along the X, Y, or Z axis in the coordinate system of the MRI scanner 404).
In some embodiments, RF generator 414 may use an RF coil to transmit RF energy through the subject or a region of interest of the subject to induce electrical signals in the region of interest. The resulting RF field is commonly referred to as the B1 field, and it combines with the B0 field to generate MR signals that are spatially localized and encoded by the gradient magnetic fields. The coil array 418 is generally operable to sense the resulting RF field and deliver a corresponding output to the control circuitry 406. In some embodiments, the coil array may be operable to both transmit and receive RF energy, while in other embodiments, the coil array may be operable to receive only.
Fig. 5A and 5B illustrate different MRI multi-coil arrangements. The multi-coil arrangement may include a phased array coil arrangement and a parallel array coil arrangement. Fig. 5A shows an exemplary phased array coil arrangement in which coils are overlapped and coupled together to enhance gain and signal-to-noise characteristics. Fig. 5B shows an exemplary parallel array arrangement in which the coils are separated and optimized for parallel imaging. The coil arrangement may include any number of coils, depending on the particular application. Exemplary numbers of coils may include 12, 16, 24, 32, 64, or more.
Returning to fig. 4, the control circuitry 406 may control the overall operation of the MRI scanner 404, and in particular, the overall operation of the magnetic field generator 410, the gradient magnetic field generator 412, the RF generator 414, and the coil array 418. For example, the control circuitry 406 may control the magnetic field gradient generator to produce a gradient field along one or more of X, Y and the Z-axis, and control the RF generator to generate an RF field. In some embodiments, the control circuitry 406 may receive commands from, for example, a user or another system, and control the magnetic field generator 410, the gradient magnetic field generator 412, the RF generator 414, and the coil array 418 accordingly.
The control circuitry 406 may be connected to the MRI scanner 404 through a network 424. The network 424 may include any suitable network that may facilitate information and/or data exchange for the MRI scanner 404. The network 424 may be and/or include a public network (e.g., the internet), a private network (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), etc.), a wired network (e.g., an Ethernet network), a wireless network (e.g., an 802.11 network, a Wi-Fi network, etc.), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network ("VPN"), a satellite network, a telephone network, a router, a hub, a switch, a server computer, and/or any combination thereof. By way of example only, network 424 may include a cable network, a wireline network, a fiber optic network, a telecommunications network, an intranet, a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, or the like, or any combination thereof. In some embodiments, network 424 may include one or more network access points. For example, the network 424 may include wired and/or wireless network access points, such as base stations and/or internet exchange points, through which one or more components of the MRI apparatus 402 may connect with the network 424 to exchange data and/or information.
According to some embodiments, the algorithm may be implemented in a computing circuit of the control circuit 406, while in other embodiments, the algorithm may be implemented in a computing circuit remote from the control circuit 406.
Fig. 6 illustrates an exemplary architecture of a computing circuit 600 in accordance with the disclosed embodiments. The computing circuitry 600 may include computer readable program code stored on at least one computer readable medium 602 for performing and executing the process steps described herein. Computer readable program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language (such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, etc.), a conventional programming language (such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP), a dynamic programming language (such as Python, Ruby, and Groovy), or other programming languages. The computer readable program code may execute entirely on the computing circuitry 600, partly on the computing circuitry 600, as a stand-alone software package, partly on the computing circuitry 600 and partly on a remote computer or server, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the computing circuitry 600 via any type of network, including the networks mentioned above with respect to network 424.
The computer readable medium 602 may be the memory of the computing circuit 600. In alternative aspects, the computer readable program code may be stored in a memory external to the computing circuit 600 or remote from the computing circuit 600. The memory may include magnetic media, semiconductor media, optical media, or any media that is readable and executable by the computer. The computing circuit 600 may also include a computer processor 604 for executing computer readable program code stored on at least one computer readable medium 602. In some embodiments, the computer processor may be a Graphics Processing Unit (GPU). In at least one aspect, the computing circuitry 600 may include one or more input or output devices, commonly referred to as a user interface 606, which may be operable to allow input to the computing circuitry 600 or provide output from the computing circuitry 600, respectively. The computing circuit 600 may be implemented in hardware, software, or a combination of hardware and software.
Fig. 7 illustrates an example of an algorithm in the form of a deep residual convolutional recurrent neural network (Res-CRNN) 700, and fig. 8 illustrates an example of a bi-directional convolutional recurrent neural network unit 800. One feature of the deep residual convolutional recurrent neural network is that the state neurons of a conventional Recurrent Neural Network (RNN) are divided into a part responsible for the positive time direction (forward states) and a part responsible for the negative time direction (reverse states). In addition, the output from the forward states is not connected to the input of the reverse states, and the output from the reverse states is not connected to the input of the forward states.
A bi-directional convolutional RNN can be applied to model the dynamic information in the cardiac cine data, a data consistency layer ensures consistency between the reconstructed data and the observed data, and residual connections help the network learn high-frequency details and increase the stability of the training process. The Res-CRNN 700 may include three such building blocks 702₁-702₃ and additional residual connections 704₁-704₃. The complex values are represented as two-channel tensors and fed into the network. The deep residual convolutional recurrent neural network can be trained with the same algorithm as a regular unidirectional RNN, since there is no interaction between the two types of state neurons.
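By way of non-limiting illustration, the data consistency layer and the bi-directional convolutional recurrence described above might be sketched as follows (PyTorch; the simple convolutional state update, channel counts, and tensor layout are assumptions rather than the exact construction of the Res-CRNN 700):

```python
# Non-limiting sketch of two building blocks: a data consistency layer and a
# bi-directional convolutional recurrence over the cine frames.
import torch
import torch.nn as nn

class DataConsistency(nn.Module):
    """Replace predicted k-space samples with acquired ones wherever sampled."""
    def forward(self, image, kspace_acquired, mask):
        k_pred = torch.fft.fft2(image, dim=(-2, -1))
        k_dc = (1 - mask) * k_pred + mask * kspace_acquired
        return torch.fft.ifft2(k_dc, dim=(-2, -1))

class BiConvRNN(nn.Module):
    """Forward-time and reverse-time convolutional states with no cross-connection."""
    def __init__(self, channels=64):
        super().__init__()
        self.fwd = nn.Conv2d(2 * channels, channels, 3, padding=1)
        self.bwd = nn.Conv2d(2 * channels, channels, 3, padding=1)

    def _scan(self, frames, cell):
        h = torch.zeros_like(frames[0])          # hidden state, updated frame by frame
        states = []
        for x in frames:                         # x is assumed to have `channels` features
            h = torch.relu(cell(torch.cat([x, h], dim=1)))
            states.append(h)
        return states

    def forward(self, x):                        # x: [T, B, channels, H, W]
        fwd = self._scan(list(x), self.fwd)                 # forward (positive time) states
        bwd = self._scan(list(x.flip(0)), self.bwd)[::-1]   # reverse (negative time) states
        return torch.stack([f + b for f, b in zip(fwd, bwd)])
```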
As shown in fig. 7, multi-coil down-sampled real-time cine data 706 from the MRI scanner 404 may be provided to the algorithm 700. The reconstructed cine images from the various stages may be provided to the next stage until a final reconstructed cine image 708 is achieved.
The algorithm 700 may be trained using retrospective cine data and the images reconstructed by the algorithm 700. Fully sampled retrospective cine data may be downsampled with sampling masks similar to those used for real-time cine during MRI image acquisition. One or more of the downsampled retrospective cine data, the sampling mask, and the fully sampled retrospective cine data may also be used to train the algorithm 700. Retrospective cine training data may include fully sampled images from the individual coils of the MRI scanner 404. To reduce memory consumption, the retrospective cine images may be cropped to a smaller size for training, for example, where the algorithm computation circuitry includes a graphics processing unit.
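A minimal sketch of this training-data preparation for a single coil, assuming complex k-space input and a center crop whose size is chosen only for illustration:

```python
# Non-limiting sketch of the training-data preparation: retrospectively
# undersample fully sampled single-coil cine k-space and crop the images.
import numpy as np

def make_training_pair(full_kspace, mask, crop=128):
    """full_kspace: complex [frames, ky, kx]; mask broadcasts over it."""
    ifft2c = lambda k: np.fft.fftshift(
        np.fft.ifft2(np.fft.ifftshift(k, axes=(-2, -1)), axes=(-2, -1)),
        axes=(-2, -1))
    target = ifft2c(full_kspace)        # fully sampled reference used for the loss
    x_us = ifft2c(full_kspace * mask)   # aliased input simulating real-time cine
    cy, cx = target.shape[-2] // 2, target.shape[-1] // 2
    ys = slice(cy - crop // 2, cy + crop // 2)
    xs = slice(cx - crop // 2, cx + crop // 2)
    return x_us[..., ys, xs], target[..., ys, xs]
```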
During training, the fully sampled retrospective cine data can be used to calculate the loss. The training loss may be implemented as one, or any combination, of a mean square error loss, an L1 loss, a Structural Similarity Index (SSIM) loss, a Huber loss, or any loss used for image quality assessment.
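A sketch of one possible combined training loss computed on magnitude images (the weights and the particular combination of terms are assumptions; an SSIM or Huber term could be added in the same way):

```python
# Non-limiting sketch of a combined training loss on magnitude images.
import torch.nn.functional as F

def training_loss(pred, target, w_mse=1.0, w_l1=0.1):
    pred_m, target_m = pred.abs(), target.abs()   # magnitude images
    mse = F.mse_loss(pred_m, target_m)            # mean square error loss
    l1 = F.l1_loss(pred_m, target_m)              # L1 loss
    return w_mse * mse + w_l1 * l1
```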
Once trained, the algorithm 700 may be applied to undersampled real-time cine data for image reconstruction, including data from multi-coil acquisitions and full-size, uncropped images.
Thus, while there have been shown, described, and pointed out fundamental novel features of the invention as applied to exemplary embodiments thereof, it will be understood that various omissions and substitutions and changes in the form and details of the devices and methods illustrated, and in their operation, may be made by those skilled in the art without departing from the spirit and scope of the presently disclosed invention. Further, it is expressly intended that all combinations of those elements that perform substantially the same function in substantially the same way to achieve the same results are within the scope of the invention. Moreover, it should be recognized that structures and/or elements shown and/or described in connection with any disclosed form or embodiment of the invention may be incorporated in any other disclosed or described or suggested form or embodiment as a general matter of design choice. It is the intention, therefore, to be limited only as indicated by the scope of the claims appended hereto.
Claims (10)
1. A real-time MR cine data reconstruction method comprising:
training an algorithm using fully sampled retrospective cine data; and
applying the trained algorithm to the real-time MR cine data to produce reconstructed MR images.
2. The method of claim 1, further comprising training the algorithm using one or more of downsampled retrospective cine data and a downsampling mask.
3. The method of claim 1, further comprising training the algorithm using retrospective cine data from individual coils of a multi-coil MR scanner.
4. The method of claim 1, wherein the real-time MR cine data comprises real-time MR cine data from individual coils of a multi-coil MR scanner.
5. The method of claim 1, wherein the algorithm comprises a residual convolutional recurrent neural network.
6. The method of claim 1, wherein the real-time MR cine data comprises undersampled multi-coil real-time MR cine data.
7. The method of claim 1, wherein the real-time MR cine data comprises real-time MR cine data from individual coils of a multi-coil MR scanner and the algorithm comprises a plurality of algorithms, each algorithm configured to be applied to data from a different individual coil of the multi-coil MR scanner.
8. The method of claim 7, comprising combining reconstructed images from the plurality of algorithms using a root sum of squares or coil sensitivity maps to generate a final combined image.
9. The method of claim 1, wherein the real-time MR cine data comprises real-time MR cine data from individual coils of a multi-coil MR scanner and the algorithm comprises a single algorithm configured to be applied to data from the individual coils of the multi-coil MR scanner.
10. A real-time MR cine data reconstruction system comprising:
a real-time MR cine data source; and
a computing circuit that implements an algorithm trained using fully sampled retrospective cine data,
wherein the trained algorithm is configured to produce reconstructed MR images when applied to real-time MR cine data.
Applications Claiming Priority (4)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201962941904P | 2019-11-29 | 2019-11-29 | |
| US62/941,904 | 2019-11-29 | | |
| US17/060,988 | 2020-10-01 | | |
| US17/060,988 (US11992289B2) | 2019-11-29 | 2020-10-01 | Fast real-time cardiac cine MRI reconstruction with residual convolutional recurrent neural network |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN112494029A | 2021-03-16 |
| CN112494029B | 2025-04-11 |
Family
- ID: 74966166

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202011346985.5A (granted as CN112494029B, active) | Real-time MR cine data reconstruction method and system | 2019-11-29 | 2020-11-26 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN112494029B (en) |
Patent Citations (14)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070096733A1 * | 2005-10-27 | 2007-05-03 | Arjun Arunachalam | Parallel magnetic resonance imaging method using a radial acquisition trajectory |
| US20080279433A1 * | 2007-05-07 | 2008-11-13 | General Electric Company | Method and apparatus for multi-coil magnetic resonance imaging |
| US20110241669A1 * | 2010-03-31 | 2011-10-06 | Weitian Chen | System and method of parallel imaging for magnetic resonance imaging near metallic implants |
| US20130285662A1 * | 2012-04-27 | 2013-10-31 | Toshiba Medical Systems Corporation | Magnetic resonance imaging apparatus and image processing apparatus |
| US20150374237A1 * | 2013-01-31 | 2015-12-31 | The Regents Of The University Of California | Method for accurate and robust cardiac motion self-gating in magnetic resonance imaging |
| US20190325621A1 * | 2016-06-24 | 2019-10-24 | Rensselaer Polytechnic Institute | Tomographic image reconstruction via machine learning |
| US20180285695A1 * | 2017-03-28 | 2018-10-04 | Siemens Healthcare Gmbh | Magnetic resonance image reconstruction system and method |
| US20190172230A1 * | 2017-12-06 | 2019-06-06 | Siemens Healthcare Gmbh | Magnetic resonance image reconstruction with deep reinforcement learning |
| US20190236817A1 * | 2018-01-30 | 2019-08-01 | The Board Of Trustees Of The Leland Stanford Junior University | Generalized multi-channel MRI reconstruction using deep neural networks |
| US20190266761A1 * | 2018-02-28 | 2019-08-29 | General Electric Company | System and method for sparse image reconstruction |
| CN108335339A * | 2018-04-08 | 2018-07-27 | 朱高杰 | A magnetic resonance reconstruction method based on deep learning and convex set projection |
| US20190353741A1 * | 2018-05-16 | 2019-11-21 | Siemens Healthcare Gmbh | Deep learning reconstruction of free breathing perfusion |
| CN109001660A * | 2018-06-12 | 2018-12-14 | 上海联影医疗科技有限公司 | Cine imaging method and magnetic resonance imaging system |
| CN109993809A * | 2019-03-18 | 2019-07-09 | 杭州电子科技大学 | Fast magnetic resonance imaging method based on residual U-net convolutional neural network |
Also Published As

| Publication Number | Publication Date |
|---|---|
| CN112494029B | 2025-04-11 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |