
Predicting CT Image from MRI Data through Feature Matching with Learned Nonlinear

Local Descriptors

Abstract

Attenuation correction for PET/MR hybrid imaging systems and dose planning for MR-based
radiation therapy remain challenging due to insufficient high-energy photon attenuation
information. We present a novel approach that uses learned nonlinear local descriptors and
feature matching to predict pseudo CT images from T1-weighted and T2-weighted MRI data. The
nonlinear local descriptors are obtained by projecting linear descriptors into a high-dimensional
nonlinear space using an explicit feature map and a low-rank approximation with supervised
manifold regularization. For each local descriptor in the input MR images, the nearest neighbors
are searched within a constrained spatial range of the MR images in the training dataset, and the
corresponding pseudo CT patches are then estimated through k-nearest-neighbor regression. The
proposed method is quantitatively evaluated on a dataset of paired brain MRI and CT images
from 13 subjects. Compared with the true CT, our method generates pseudo CT images with a
mean absolute error of 75.25 ± 18.05 Hounsfield units, a peak signal-to-noise ratio of
30.87 ± 1.15 dB, a relative mean absolute error of 1.56 ± 0.50% in PET attenuation correction,
and a dose relative structure volume difference of 0.055 ± 0.107% in D98%. The experimental
results also show that our method outperforms four state-of-the-art methods.

Chapter I

Introduction

Magnetic resonance imaging (MRI) is desirable both for attenuation correction (AC) in
positron emission tomography (PET) [1] and for dose calculation in modern radiotherapy (RT)
treatment planning, owing to its non-ionizing nature and superior soft-tissue characterization.
Traditionally, AC maps are obtained by converting computed tomography (CT) images to
attenuation coefficients in cm⁻¹ at 511 keV. The electron densities derived from these CT images
can also be used for dose calculation in an MR-based RT workflow. However, particularly in
PET/MRI studies, additional CT scans are undesirable because they increase the radiation dose to
patients. Therefore, pseudo CT (pCT) images, accurately synthesized from MRI data, can be
useful for clinical applications where real CT information is not available.
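
To make the conversion step concrete, the following MATLAB sketch shows a commonly used
bilinear scaling from CT numbers (HU) to linear attenuation coefficients at 511 keV. The break
point and slope values are illustrative assumptions only, not the calibration used in this work or
on any particular scanner.

function mu = hu_to_mu511(ct_hu)
% Illustrative bilinear scaling from CT numbers (HU) to linear attenuation
% coefficients (cm^-1) at 511 keV. All constants are assumed example values.
    mu_water  = 0.096;       % approximate attenuation of water at 511 keV (cm^-1)
    breakHU   = 0;           % assumed break point separating soft tissue from bone
    boneSlope = 5.1e-5;      % assumed slope for HU above the break point

    mu = zeros(size(ct_hu));
    soft = ct_hu <= breakHU;                          % air and soft tissue segment
    mu(soft) = mu_water * (ct_hu(soft) + 1000) / 1000;
    mu(~soft) = mu_water + boneSlope * ct_hu(~soft);  % bone segment, reduced slope
    mu(mu < 0) = 0;                                   % clamp non-physical values
end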

Many innovative methods have been proposed for predicting CT images from MRI data
and can be classified into four categories: segmentation-, atlas-, specific sequence- and patch-
based methods.

In the segmentation-based methods, MR images are segmented into different tissue
classes (e.g., soft tissue, fat, air, and bone), and each class is then assigned a pre-defined linear
attenuation coefficient (LAC) or CT value. To obtain accurate segmentations, a fuzzy clustering
technique and the SPM8 software have been utilized, respectively. The accuracy of the
segmentation-based methods for pCT prediction is limited because all voxels within a segmented
tissue region share the same pre-defined CT value, so the variation of the true CT values within
the same tissue is ignored.
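
For illustration only, the following minimal MATLAB sketch captures the general idea of this
family of methods, assuming the Image Processing Toolbox function imsegkmeans is available;
the number of classes and the class-to-HU assignments are arbitrary example values, not those
used by the cited methods.

% Segmentation-based pCT sketch (mrSlice is an assumed 2-D MR image).
mr = im2single(mrSlice);
numClasses = 4;                           % e.g., air, fat, soft tissue, bone
labels = imsegkmeans(mr, numClasses);     % unsupervised intensity clustering

% Pre-defined CT values (HU) per class; in practice the cluster labels must
% first be matched to tissue types, and these numbers are purely illustrative.
classHU = [-1000, -100, 40, 700];
pct = zeros(size(mr), 'single');
for c = 1:numClasses
    pct(labels == c) = classHU(c);
end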

The basic idea of the atlas-based methods is straightforward. A dataset that contains
many paired MR/CT images is required. First, the atlas MR images are registered to the input
MR image by computing the deformation field between each atlas and the input image. The
corresponding atlas CT images are then warped to the input MR image using a multi-atlas
information propagation scheme. Lastly, the warped CT images are fused into the final CT
prediction. In the image fusion step, Gaussian process regression, local-image-similarity-based
approaches and voxel-wise maximum-probability intensity averaging have been used and
validated. The performance of the atlas-based methods depends strongly on the registration
accuracy and on the patient populations encompassed by the atlas.
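
To make the fusion step concrete, here is a minimal MATLAB sketch of one common variant,
local-similarity-weighted averaging of warped atlas CT values. The variable names, the 5x5 local
window and the Gaussian weighting of local mean-squared differences are illustrative
assumptions rather than the exact schemes used in the cited works.

% warpedCT:  H x W x N stack of atlas CT slices already warped to the input MR
% warpedMR:  H x W x N stack of the corresponding warped atlas MR slices
% inputMR:   H x W input MR slice (all assumed variables)
N = size(warpedCT, 3);
sigma = 0.05;                                     % assumed similarity bandwidth
weights = zeros(size(warpedCT));
for n = 1:N
    % Local mean-squared difference between the input MR and each warped atlas MR.
    d2 = imboxfilt((inputMR - warpedMR(:, :, n)).^2, 5);
    weights(:, :, n) = exp(-d2 / (2 * sigma^2));  % higher similarity -> larger weight
end
weights = weights ./ max(sum(weights, 3), eps);   % normalize per voxel
fusedCT = sum(weights .* warpedCT, 3);            % similarity-weighted fusion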

Chapter II

System Analysis

System analysis is the overall analysis of a system before implementation, carried out to arrive at
a precise solution. Careful analysis of a system before implementation prevents post-
implementation problems that might arise from a poorly understood problem statement, which
justifies the need for systems analysis. Analysis is the first crucial step: a detailed study of the
various operations performed by a system and of their relationships within and outside the
system. Analysis also defines the boundaries of the system and is followed by design and
implementation.

Existing System

Due to the distinctively high attenuation values of bone, accurate prediction of bone in the
pCT is desired. This requires that the MR image supply enough cues to identify bone. However,
conventional MRI signals are intrinsically unsuitable for depicting bone and air structures, owing
to the proton-density dominance and the MRI signal relaxation mechanism. Recently, several
methods based on specific imaging sequences have been proposed: ultrashort echo time (UTE)
and zero echo time (ZTE) sequences have been used to improve bone identification in previous
work. The main limitation of these methods is the additional acquisition time required by such
sequences in clinical applications.

Proposed System

In this study, a patch-based method called feature matching with learned nonlinear descriptors
(FMLND) is proposed for predicting pCT images from MRI data. To improve bone
identification, a combination of dense scale-invariant feature transform (SIFT) descriptors with
normalized raw patches is used as the primary descriptor of the MR images, rather than MR raw
patches or voxels alone. The SIFT feature captures structural information, which is valuable for
identifying bone tissue and air in MRI data. To better handle the nonlinearity of the mapping
between the primary descriptors and the CT raw patches, the primary descriptors are projected
into a high-dimensional space using explicit feature maps to obtain richer MRI information.

Chapter III

Feasibility Study

The preliminary investigation examines project feasibility, that is, the likelihood that the system
will be useful to the organization. The main objective of the feasibility study is to test the
technical, operational and economic feasibility of adding new modules and debugging the
existing running system. All systems are feasible if they are given unlimited resources and
infinite time. The following aspects are considered in the feasibility study portion of the
preliminary investigation:

Operational Feasibility

The proposed application does not require additional manual involvement or labor for the
maintenance of the system. The cost of training is minimized by the user-friendliness of the
developed application, and recurring expenditure on consumables and materials is minimized.

Technical Feasibility

Keeping in mind the existing network, software and hardware already available, the application
is developed in MATLAB and runs on a standard Windows system as specified in the system
requirements. No additional hardware or software is required, which makes the proposed system
technically feasible.

Economic Feasibility

The system is economically feasible keeping in mind:

 Lower investment towards training.
 One-time investment towards development.
 Minimized recurring expenditure on training, facilities and consumables.
 The system as a whole is economically feasible over a period of time.

Chapter IV
System Design

System design concentrates on moving from the problem domain to the solution domain. This
important phase is composed of several steps and provides the understanding and procedural
details necessary for implementing the system recommended in the feasibility study. The
emphasis is on translating the performance requirements into design specifications. The design of
any software involves mapping the software requirements into functional modules. Developing a
real-time application or any system utility involves two processes: the first is to design the
system, and the second is to construct the executable code.

Software design has evolved from an intuitive art dependent on experience into a discipline that
provides systematic techniques for software definition. Software design is the first step in the
development phase of the software life cycle. Before design, the user requirements are identified,
information is gathered to verify the problem, and the existing system is evaluated. A feasibility
study is conducted to review alternative solutions and to provide cost and benefit justification, on
the basis of which the proposed system is recommended. At this point the design phase begins.

The process of design involves conceiving and planning out a solution and expressing it in a
workable form. In software design there are three distinct activities: external design, architectural
design and detailed design. Architectural design and detailed design are collectively referred to
as internal design. External design involves conceiving, planning out and specifying the
externally observable characteristics of a software product.

INPUT DESIGN:

Systems design is the process of defining the architecture, components, modules, interfaces, and
data for a system to satisfy specified requirements. Systems design could be seen as the
application of systems theory to product development. There is some overlap with the disciplines
of systems analysis, systems architecture and systems engineering.

Input design is the process of converting a user-oriented description of the inputs to a computer-
based business system into a programmer-oriented specification.

• Input data must be available for establishing and maintaining master and transaction files
and for creating output records.

• The most suitable types of input media, for either off-line or on-line devices, are selected
after a study of alternative data capture techniques.

INPUT DESIGN CONSIDERATIONS

• The field length must be documented.

• The sequence of fields should match the sequence of the fields on the source document.

• The data format must be identified to the data entry operator.

Design input requirements must be comprehensive; product complexity and the risk associated
with its use dictate the amount of detail required.

• Functional requirements specify what the product does, focusing on its operational
capabilities and the processing of inputs into resultant outputs.

• Performance requirements specify how much or how well the product must perform,
addressing issues such as speed, strength, response times, accuracy and limits of operation.

OUTPUT DESIGN:

A quality output is one that meets the requirements of the end user and presents the
information clearly. In any system, the results of processing are communicated to the users and
to other systems through outputs.

In output design it is determined how the information is to be displayed for immediate need and
also as hard-copy output. Output is the most important and direct source of information for the
user, and efficient, intelligent output design improves the system's ability to support user
decision-making.

1. Designing computer output should proceed in an organized, well-thought-out manner;
the right output must be developed while ensuring that each output element is designed so
that people can use the system easily and effectively. When analysts design computer
output, they should identify the specific output that is needed to meet the requirements.

2. Select methods for presenting the information.

3. Create documents, reports, or other formats that contain information produced by the
system.

The output form of an information system should accomplish one or more of the following
objectives:

• Convey information about past activities, current status or projections of the future.

• Signal important events, opportunities, problems, or warnings.

• Trigger an action.

• Confirm an action.

Architecture Diagram
Dataflow Diagram

The data flow diagram (DFD) is a graphical model showing the inputs, processes, storage and
outputs of a system procedure in structured analysis. A DFD is also known as a bubble chart. The
data flow diagram provides additional information that is used during the analysis of the
information domain and serves as a basis for the modeling of functions. The description of each
function presented in the DFD is contained in a process specification called a PSPEC.

DFD Symbols

Data Flow

Data flows are indicated by arrows marking the movement of data through the system; a data
flow is the pipeline carrying packets of data from an identified point of origin to a specific
destination.

Process

Bubbles or circles are used to indicate where incoming data flows are processed and
transformed into outgoing data flows. The processes are numbered and named to indicate their
order of occurrence in the system flow.

External Entity

A rectangle indicates any source or destination of data. The entity can be a class of people, an
organization or even another system. The function of an external entity is to supply data to, or
receive data from, the system; it has no interest in how the data are transformed.

Data Store
A data store is denoted by an open rectangle. Programs and subsystems have complex
interdependencies, including the flow of data, the flow of control and interactions with data
stores. A data store is used to identify the holding points of data in the system.

Dataflow diagram
Chapter V

Literature Survey

1. Prediction of CT Substitutes from MR Images Based on Local Sparse Correspondence
Combination (2015)
Methodology: Local sparse correspondence combination (LSCC) is proposed for the prediction
of CT substitutes from MR images.
Disadvantage: High complexity.

2. Prediction of CT Substitutes from MR Images Based on Local Diffeomorphic Mapping for
Brain PET Attenuation Correction (2016)
Methodology: This study presents a patch-based method for CT prediction from MR images,
generating attenuation maps for PET reconstruction.
Disadvantage: Low speed.

3. Estimating CT Image from MRI Data Using Structured Random Forest and Auto-context
Model (2016)
Methodology: A learning-based method is proposed for the reliable estimation of a CT image
from the corresponding MR image of the same subject.
Disadvantage: Limited application.

4. Predict CT Image from MRI Data Using KNN-Regression with Learned Local Descriptors
(2016)
Methodology: A k-nearest neighbor (KNN) regression method is presented to predict the CT
image from MRI data.
Disadvantage: The method can preserve continuity and smoothness; however, the prediction of
edges remains challenging.

5. PET Attenuation Correction Using Synthetic CT from Ultrashort Echo-Time MR Imaging
(2014)
Methodology: A patch-based method is proposed to generate whole-head maps from ultrashort
echo-time (UTE) MR imaging sequences.
Disadvantage: Low accuracy.
Chapter VI

System Requirements

The hardware and software specification defines the minimum hardware and software required
to run the project. The hardware configuration specified below is not by any means the optimal
hardware requirement; the software specification is likewise only the minimum requirement, and
the performance of the system may be slow on such a configuration.

Hardware Requirements

 System : Pentium IV 2.4 GHz


 Hard Disk : 40 GB
 Floppy Drive : 1.44 MB
 Monitor : 15 VGA color
 Mouse : Logitech.
 Keyboard : 110 keys enhanced
 RAM : 256 MB

Software Requirements

 Operating System : Windows


 Language : MATLAB

Chapter VII

System Implementation

Implementation is the stage of the project in which the theoretical design is turned into a working
system. The implementation phase constructs, installs and operates the new system. The most
crucial aspect of achieving a successful new system is ensuring that it works efficiently and
effectively. Several activities are involved while implementing a new project:
• End-user training
• End-user education
• Training on the application software

Modules

 Dataset Collection
 Preprocessing
 Feature Extraction
 Feature Matching

Dataset Collection

The paired MR and CT input images used for training and evaluation are collected from the dataset.

Preprocessing

The N4 bias correction algorithm is utilized to remove the bias field artifacts in the magnetic
resonance (MR) images. Then, an intensity normalization technique is applied to reduce the
variance across the MR images of different patients: the intensities of the MR images are
stretched to the range [0, 100]. The brain volumes are separated from the computed tomography
(CT) scanning bed in the CT images by thresholding. Finally, spatial normalization is performed
by linear affine registration to align the paired MR and CT volumes of each patient. This linear
affine registration serves as the basis of the succeeding steps in the proposed method.
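
As a rough illustration of the normalization and alignment steps (the N4 bias correction itself is
usually performed with external tools such as ITK or ANTs and is omitted here), the following
MATLAB sketch rescales MR intensities to [0, 100], removes the CT couch by thresholding and
largest-component masking, and estimates an affine transform with Image Processing Toolbox
functions. Variable names, the HU threshold and other parameters are assumptions.

% mrVol, ctVol: 3-D MR and CT volumes of one patient (assumed variables).

% Intensity normalization: stretch MR intensities to the range [0, 100].
mr = double(mrVol);
mrNorm = 100 * (mr - min(mr(:))) / (max(mr(:)) - min(mr(:)));

% Remove the CT scanning bed by thresholding and keeping the largest component.
ct = double(ctVol);
headMask = ct > -400;                                 % assumed HU threshold
cc = bwconncomp(headMask);                            % 3-D connected components
[~, idx] = max(cellfun(@numel, cc.PixelIdxList));     % largest component = head
ctClean = -1000 * ones(size(ct));                     % background/couch set to air
ctClean(cc.PixelIdxList{idx}) = ct(cc.PixelIdxList{idx});

% Linear affine registration of the MR volume to the CT volume.
[optimizer, metric] = imregconfig('multimodal');
tform = imregtform(mrNorm, ctClean, 'affine', optimizer, metric);
mrReg = imwarp(mrNorm, tform, 'OutputView', imref3d(size(ctClean)));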

Feature Extraction

To accurately predict the pCT images using a KNN estimator, the similarity or distance between
the features should reflect or be related to the similarity or distance between the predicted targets.
Tissue types can be identified by the structural and contextual information of large spatial
supports in the MR images. It is desired that the local descriptors of an MR image can represent
the structural information and be used to define effective similarity for KNN regression.

The dense scale-invariant feature transform (SIFT) is used to capture the textural and structural
information of a relatively large spatial support, while the raw patch is used to capture the
original detailed information of the MR images. The dense SIFT descriptor and the raw patch are
rearranged into a matrix form as the primary descriptor. To learn a compact descriptor,
supervised descriptor learning (SDL) can be formulated as a generalized low-rank approximation
of matrices with the supervision of manifold regularization.
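
A minimal MATLAB sketch of building the primary descriptor for one MR slice is given below.
It assumes the third-party VLFeat toolbox (vl_dsift) is on the path, and for simplicity it shows the
dense SIFT and the normalized raw patch concatenated as vectors rather than in the matrix
arrangement used for SDL; the patch size, SIFT bin size and normalization scheme are
illustrative choices.

% mrSlice: a 2-D MR image (assumed variable).
mr = single(mrSlice);
binSize = 4;  step = 1;  half = 2;                      % (2*half+1)^2 = 5x5 raw patches

% Dense SIFT over the whole slice (VLFeat, third-party).
[frames, siftDesc] = vl_dsift(mr, 'Size', binSize, 'Step', step);

% Normalized raw patch at each dense SIFT location.
padded = padarray(mr, [half half], 'replicate');
numPts = size(frames, 2);
rawDesc = zeros((2*half + 1)^2, numPts, 'single');
for i = 1:numPts
    x = round(frames(1, i));  y = round(frames(2, i));  % column, row
    p = padded(y : y + 2*half, x : x + 2*half);         % indices shifted by the padding
    p = (p - mean(p(:))) / max(std(p(:)), eps);         % zero-mean, unit-variance patch
    rawDesc(:, i) = p(:);
end

% Primary descriptor: dense SIFT combined with the normalized raw patch.
primaryDesc = [single(siftDesc); rawDesc];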

If the supervised matrix S is defined on a global neighborhood over the entire training set, the
spatial information of the patch locations and the locality information in the feature space are
ignored. We therefore use a nearest neighborhood defined by both the spatial locality and the
locality in the feature space. Ideally, the nearest neighbors of a CT patch should be sampled from
the same anatomic location, so we search the nearest neighbors of each CT patch in the training
set within a constrained spatial range of the CT images to define the supervised matrix S.
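
To show how S enters the descriptor learning, a generic form of a manifold-regularized
generalized low-rank approximation objective is sketched below in LaTeX. The symbols
(projection matrices $L$ and $R$, primary descriptor matrices $D_i$, embeddings $F_i$, weight
$\lambda$) are introduced here for illustration and may differ from the exact formulation used in
the improved SDL method.

\begin{aligned}
\min_{L,\,R,\,\{F_i\}} \quad
  & \sum_{i} \bigl\lVert D_i - L\, F_i\, R^{\top} \bigr\rVert_F^{2}
    \;+\; \lambda \sum_{i,j} S_{ij}\,\bigl\lVert F_i - F_j \bigr\rVert_F^{2} \\
\text{s.t.} \quad
  & L^{\top} L = I, \qquad R^{\top} R = I,
\end{aligned}

where $D_i$ is the primary descriptor of the $i$-th training patch in matrix form, $F_i = L^{\top} D_i R$
is its compact embedding, and $S_{ij}$ is large only when the CT patches of samples $i$ and $j$
are similar and drawn from nearby anatomic locations within the constrained spatial range.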

The learned descriptor (the linear descriptor) is only a linear embedding of the primary
descriptor. To capture nonlinear structure, the primary descriptor is projected into a high-
dimensional space by a nonlinear mapping: the feature map of a positive definite (PD) kernel
maps the descriptor into a Hilbert space with a linear inner product, and the obtained projection
can then be approximated and handled by the linear improved SDL method. Explicit feature
maps are applicable only if the given non-Euclidean metric is additive and homogeneous.
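
For reference, the additivity and homogeneity conditions mentioned above can be stated as
follows; this is a standard, generic formulation rather than one taken from the present work.

K(\mathbf{x}, \mathbf{y}) \;=\; \sum_{b=1}^{d} k(x_b, y_b)
\qquad \text{(additive over the $d$ descriptor bins),}

k(c\,x,\; c\,y) \;=\; c\, k(x, y) \quad \text{for all } c \ge 0
\qquad \text{(homogeneous of degree one).}

For example, the chi-squared kernel $k(x, y) = 2xy/(x + y)$ satisfies both conditions.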

The explicit feature map technique approximates the projection by sampling the spectrum of the
kernel at a finite rate and evaluating it with a discrete Fourier transform (DFT). The primary
dense SIFT descriptors are projected nonlinearly into a high-dimensional space through this
explicit feature map. Each projected dense SIFT descriptor is then rearranged into a matrix and
combined with the raw patch to form the nonlinear descriptor.
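
A minimal sketch of this step is shown below, assuming the VLFeat toolbox is available
(vl_homkermap computes an approximate finite-dimensional feature map for additive
homogeneous kernels such as chi-squared) and reusing the siftDesc and rawDesc variables from
the earlier sketch; the approximation order N is an illustrative choice.

% Nonlinear projection of the dense SIFT part via an explicit feature map.
% Requires VLFeat (third-party); chi-squared kernel, order-N approximation.
N = 2;                                               % illustrative approximation order
siftMapped = vl_homkermap(single(siftDesc), N, 'kernel', 'kchi2');
% Each SIFT dimension expands to 2*N + 1 dimensions in the mapped descriptor.

% Combine with the normalized raw patch to form the nonlinear descriptor.
nonlinearDesc = [siftMapped; rawDesc];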

Feature Matching

For each location in the input MR images, a cost function is minimized to estimate the
corresponding CT patch centered at that location. KNN regression is utilized to estimate the
value of this function: for the learned linear or nonlinear descriptor, the k nearest neighbors are
searched and selected within a fixed spatial range around each location in the training MR
images.

A weighted coefficient vector is utilized to measure the degree of similarity between a descriptor
and its k nearest neighbors, and it can be calculated simply using a Gaussian kernel function.
After all the CT patches for an input MR image have been estimated, a weighted averaging
procedure is performed on the overlapping CT patches to obtain the final predicted pCT image.
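
The following MATLAB sketch illustrates the matching and estimation step for a single query
location, assuming the descriptors and CT patches lying within the allowed spatial range of the
training set have already been gathered. knnsearch is from the Statistics and Machine Learning
Toolbox; k, sigma and the variable names are illustrative assumptions.

% queryDesc    : 1 x d descriptor at the current MR location
% trainDesc    : M x d descriptors from the spatially constrained training range
% trainCTPatch : M x p corresponding vectorized CT patches (assumed variables)
k = 10;  sigma = 1.0;                               % illustrative parameters

[idx, dist] = knnsearch(trainDesc, queryDesc, 'K', k);

% Gaussian-kernel weights derived from the descriptor distances.
w = exp(-dist.^2 / (2 * sigma^2));
w = w / max(sum(w), eps);

% KNN regression: weighted average of the neighbors' CT patches.
ctPatchEst = w * trainCTPatch(idx, :);              % 1 x p estimated CT patch

% The final pCT image is obtained by averaging the overlapping patch
% estimates produced at all locations of the input MR image.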

Chapter VIII

Software Description

MATLAB

MATLAB is a programming language developed by MathWorks. It started out as a matrix
programming language in which linear algebra programming was simple. It can be run both in
interactive sessions and as a batch job. This chapter gives a gentle introduction to the MATLAB
programming language and is designed to give students fluency in MATLAB. Problem-based
MATLAB examples are given in a simple and easy way to make learning fast and effective.

MATLAB (matrix laboratory) is a fourth-generation, high-level programming language and
interactive environment for numerical computation, visualization and programming.

It allows matrix manipulation; plotting of functions and data; implementation of algorithms;
creation of user interfaces; and interfacing with programs written in other languages, including
C, C++, Java and FORTRAN. It can be used to analyze data, develop algorithms, and create
models and applications. It has numerous built-in commands and math functions that help with
mathematical calculations, plotting and numerical methods.

MATLAB is used in every facet of computational mathematics. Following are some of the
calculations for which it is most commonly used:

 Dealing with Matrices and Arrays
 2-D and 3-D Plotting and graphics
 Linear Algebra
 Algebraic Equations
 Non-linear Functions
 Statistics
 Data Analysis
 Calculus and Differential Equations
 Numerical Calculations
 Integration
 Transforms
 Curve Fitting
 Various other special functions
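
As a tiny, generic illustration of a few of these capabilities (linear algebra, curve fitting and
plotting), the snippet below is self-contained; the data are made up for the example and are not
related to the pCT experiments.

% Solve a small linear system (linear algebra).
A = [2 1; 1 3];
b = [3; 5];
x = A \ b;                      % backslash operator solves A*x = b

% Fit a quadratic polynomial to noisy data (curve fitting) and plot it.
t = linspace(0, 1, 50);
y = 2*t.^2 - t + 0.1*randn(size(t));
p = polyfit(t, y, 2);           % least-squares polynomial coefficients
plot(t, y, 'o', t, polyval(p, t), '-');
xlabel('t'); ylabel('y'); legend('data', 'quadratic fit');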

Features of MATLAB

Following are the basic features of MATLAB:

 It is a high-level language for numerical computation, visualization and application
development.
 It also provides an interactive environment for iterative exploration, design and problem
solving.
 It provides a vast library of mathematical functions for linear algebra, statistics, Fourier
analysis, filtering, optimization, numerical integration and solving ordinary differential
equations.
 It provides built-in graphics for visualizing data and tools for creating custom plots.
 MATLAB's programming interface gives development tools for improving code quality
and maintainability and for maximizing performance.
 It provides tools for building applications with custom graphical interfaces.
 It provides functions for integrating MATLAB-based algorithms with external
applications and languages such as C, Java, .NET and Microsoft Excel.

Uses of MATLAB

 Signal Processing and Communications


 Image and Video Processing
 Control Systems
 Test and Measurement
 Computational Finance
 Computational Biology

Chapter IX

System Testing

Software Testing

Software testing is an investigation conducted to provide stakeholders with information about
the quality of the product or service under test. Software testing can also provide an objective,
independent view of the software, allowing the business to appreciate and understand the risks of
software implementation. Test techniques include, but are not limited to, executing a program or
application with the intent of finding software bugs (errors or other defects). The purpose of
testing is to discover errors: testing is the process of trying to discover every conceivable fault or
weakness in a work product. It provides a way to check the functionality of components,
sub-assemblies, assemblies and the finished product. It is the process of exercising software with
the intent of ensuring that the software system meets its requirements and user expectations and
does not fail in an unacceptable manner. There are various types of tests, and each test type
addresses a specific testing requirement.

Software testing is the process of evaluating a software item to detect differences between given
input and expected output, and also of assessing the features of the software item. Testing
assesses the quality of the product and should be carried out during the development process; in
other words, software testing is a verification and validation process.

Types of testing

There are different levels in the process of testing. Levels of testing include the different
methodologies that can be used while conducting software testing. The following are the main
levels of software testing:

 Functional Testing.

 Non-Functional Testing.

The following steps are involved when testing an application for functionality:

Step I: Determine the functionality that the intended application is meant to perform.

Step II: Create test data based on the specifications of the application.

Step III: Determine the output based on the test data and the specifications of the application.

Step IV: Write the test scenarios and execute the test cases.

Step V: Compare the actual and expected results based on the executed test cases.

Functional Testing

Functional testing of the software is conducted on a complete, integrated system to evaluate the
system's compliance with its specified requirements. The five steps listed above are involved
when testing an application for functionality.

An effective testing practice will see the above steps applied to the testing policies of every
organization and hence it will make sure that the organization maintains the strictest of standards
when it comes to software quality.

Unit Testing

This type of testing is performed by the developers before the setup is handed over to the testing
team to formally execute the test cases. Unit testing is performed by the respective developers on
the individual units of source code in their assigned areas, using test data that is separate from
the test data of the quality assurance team. The goal of unit testing is to isolate each part of the
program and show that the individual parts are correct in terms of requirements and
functionality.

Limitations of Unit Testing

Testing cannot catch every bug in an application; it is impossible to evaluate every execution
path in any software application, and the same is true of unit testing.

There is a limit to the number of scenarios and test data that a developer can use to verify the
source code. After all these options have been exhausted, there is no choice but to stop unit
testing and merge the code segment with other units.

Integration Testing

Integration testing is the testing of combined parts of an application to determine whether they
function correctly together. There are two methods of integration testing: bottom-up integration
testing and top-down integration testing.

1. Bottom-up integration: this testing begins with unit testing, followed by tests of progressively
higher-level combinations of units called modules or builds.

2. Top-down integration: in this testing, the highest-level modules are tested first and
progressively lower-level modules are tested after that.

In a comprehensive software development environment, bottom-up testing is usually done first,


followed by top-down testing. The process concludes with multiple tests of the complete
application, preferably in scenarios designed to mimic those it will encounter in customers'
computers, systems and network.

System Testing

This is the next level of testing and tests the system as a whole. Once all the components are
integrated, the application as a whole is tested rigorously to verify that it meets the specified
quality standards. This type of testing is performed by a specialized testing team. System testing
is important for the following reasons:

 System testing is the first step in the software development life cycle at which the
application is tested as a whole.

 The application is tested thoroughly to verify that it meets the functional and technical
specifications.

 The application is tested in an environment which is very close to the production
environment where the application will be deployed.

 System testing enables us to test, verify and validate both the business requirements and
the application architecture.

Regression Testing

Whenever a change is made in a software application, it is quite possible that other areas within
the application have been affected by this change. Regression testing verifies that a fixed bug has
not resulted in a violation of other functionality or business rules. The intent of regression testing
is to ensure that a change, such as a bug fix, does not result in another fault being uncovered in
the application. Regression testing is important for the following reasons:

 It minimizes the gaps in testing when an application with changes has to be tested.

 It tests the new changes to verify that the changes made did not affect any other area of
the application.

 It mitigates risks when regression testing is performed on the application.

 Test coverage is increased without compromising timelines.

 It increases the speed to market of the product.

Acceptance Testing

This is arguably the most important type of testing, as it is conducted by the quality assurance
team, which gauges whether the application meets the intended specifications and satisfies the
client's requirements. The QA team has a set of pre-written scenarios and test cases that are used
to test the application.

More ideas will be shared about the application, and more tests can be performed on it to gauge
its accuracy and the reasons why the project was initiated. Acceptance tests are intended not only
to point out simple spelling mistakes, cosmetic errors or interface gaps, but also to point out any
bugs in the application that will result in system crashes or major errors in the application.

By performing acceptance tests on an application, the testing team can deduce how the
application will perform in production. There are also legal and contractual requirements for
acceptance of the system.

Alpha Testing

This test is the first stage of testing and is performed jointly by the developer and QA teams. Unit
testing, integration testing and system testing, when combined, are known as alpha testing.
During this phase, the following will be tested in the application:

 Spelling Mistakes

 Broken Links

 Unclear (cloudy) directions

 The Application will be tested on machines with the lowest specification to test loading
times and any latency problems.

Beta Testing

This test is performed after Alpha testing has been successfully performed. In beta testing a
sample of the intended audience tests the application. Beta testing is also known as pre-release
testing. Beta test versions of software are ideally distributed to a wide audience on the Web,
partly to give the program a "real-world" test and partly to provide a preview of the next release.
In this phase the audience will be testing the following:

 Users will install and run the application and send their feedback to the project team.

 Typographical errors, confusing application flow, and even crashes are reported.

 After collecting this feedback, the project team can fix the problems before releasing the
software to the actual users.

 The more issues you fix that solve real user problems, the higher the quality of your
application will be.

 Having a higher-quality application when you release to the general public will increase
customer satisfaction.

Chapter X

Conclusion

In this work, a feature matching method with learned local descriptors is proposed for predicting
CT images from MR image data. The primary descriptors of the MR image are first projected
into a high-dimensional space using an explicit feature map to obtain the nonlinear descriptors,
and these descriptors are then optimized by adopting an improved SDL algorithm. The
experimental results demonstrate that the learned nonlinear descriptors are effective for dense
matching and pCT prediction. Moreover, the proposed CT prediction method achieves
competitive performance compared with several state-of-the-art methods.

References

[1] H. Zaidi, M.-L. Montandon, and D. O. Slosman, “Magnetic resonance imaging-guided


attenuation and scatter corrections in three-dimensional brain positron emission tomography,”
Medical Physics, vol. 30, no. 5, pp. 937-948, 2003.

[2] S.-H. Hsu, Y. Cao, K. Huang, M. Feng, and J. M. Balter, “Investigation of a method for
generating synthetic CT models from MRI scans of the head and neck for radiation therapy,”
Physics in medicine and biology, vol. 58, no. 23, pp. 8419-8435, 2013.

[3] D. Izquierdo-Garcia, A. E. Hansen, S. Förster, D. Benoit, S. Schachoff, S. Fürst, K. T. Chen,


D. B. Chonde, and C. Catana, “An SPM8-based Approach for Attenuation Correction
Combining Segmentation and Non-rigid Template Formation: Application to Simultaneous
PET/MR Brain Imaging,” Journal of Nuclear Medicine, vol. 55, no. 11, pp. 1825-30, 2014.

[4] M. Hofmann, F. Steinke, V. Scheel, G. Charpiat, J. Farquhar, P. Aschoff, M. Brady, B.


Schölkopf, and B. J. Pichler, “MRI-Based Attenuation Correction for PET/MRI: A Novel
Approach Combining Pattern Recognition and Atlas Registration,” Journal of Nuclear Medicine,
vol. 49, no. 11, pp. 1875-1883, 2008.

[5] N. Burgos, M. J. Cardoso, K. Thielemans, M. Modat, S. Pedemonte, J. Dickson, A. Barnes,


R. Ahmed, C. J. Mahoney, J. M. Schott, J. S. Duncan, D. Atkinson, S. R. Arridge, B. F. Hutton,
and S. Ourselin, “Attenuation Correction Synthesis for Hybrid PET-MR Scanners: Application
to Brain Studies,” IEEE Transactions on Medical Imaging, vol. 33, no. 12, pp. 2332-2341, 2014.

[6] I. Mérida, N. Costes, R. A. Heckemann, A. Drzezga, S. Förster, and A. Hammers,


“Evaluation of several multi-atlas methods for PSEUDO-CT generation in brain MRI-PET
attenuation correction,” IEEE 12th International Symposium on Biomedical Imaging, pp. 1431-
1434, 2015.

[7] V. Keereman, Y. Fierens, T. Broux, Y. De Deene, M. Lonneux, and S. Vandenberghe, “MRI-


Based Attenuation Correction for PET/MRI Using Ultrashort Echo Time Sequences,” Journal of
Nuclear Medicine, vol. 51, no. 5, pp. 812-818, 2010.

[8] A. Johansson, M. Karlsson, and T. Nyholm, “CT substitute derived from MRI sequences with
ultrashort echo time,” Medical Physics, vol. 38, no. 5, pp. 2708-2714, 2011.

[9] J. M. Edmund, H. M. Kjer, K. Van Leemput, R. H. Hansen, J. A. L. Andersen, and D.
Andreasen, “A voxel-based investigation for MRI-only radiotherapy of the brain using ultra short
echo times,” Physics in Medicine and Biology, vol. 59, no. 23, pp. 7501-7519, 2014.

[10] S. Roy, W.-T. Wang, A. Carass, J. L. Prince, J. A. Butman, and D. L. Pham, “PET
Attenuation Correction Using Synthetic CT from Ultrashort Echo-Time MR Imaging,” Journal
of Nuclear Medicine, vol. 55, no. 12, pp. 2071-2077, 2014.

[11] M. R. Juttukonda, B. G. Mersereau, Y. Chen, Y. Su, B. G. Rubin, T. L. S. Benzinger, D. S.


Lalush, and H. An, “MR-based attenuation correction for PET/MRI neurological studies with
continuous-valued attenuation coefficients for bone through a conversion from R2* to CT-
Hounsfield units,” NeuroImage, vol. 112, pp. 160-168, 2015.

[12] G. Delso, F. Wiesinger, L. I. Sacolick, S. S. Kaushik, D. D. Shanbhag, M. Hüllner, and P.


Veit-Haibach, “Clinical Evaluation of Zero-Echo-Time MR Imaging for the Segmentation of the
Skull,” Journal of Nuclear Medicine, vol. 56, no. 3, pp. 417-422, 2015.

[13] Y. Wu, W. Yang, L. Lu, Z. Lu, L. Zhong, R. Yang, M. Huang, Y. Feng, W. Chen, and Q.
Feng, “Prediction of CT Substitutes from MR Images Based on Local Sparse Correspondence
Combination,” Medical Image Computing and Computer-Assisted Intervention -- MICCAI, pp.
93-100, 2015.

[14] Y. Wu, W. Yang, L. Lu, Z. Lu, L. Zhong, M. Huang, Y. Feng, Q. Feng, and W. Chen,
“Prediction of CT Substitutes from MR Images Based on Local Diffeomorphic Mapping for
Brain PET Attenuation Correction,” Journal of Nuclear Medicine, vol. 57, no. 10, pp. 1635-
1641, 2016.

[15] T. Huynh, Y. Gao, J. Kang, L. Wang, P. Zhang, J. Lian, and D. Shen, “Estimating CT Image
From MRI Data Using Structured Random Forest and Auto-Context Model,” IEEE Transactions
on Medical Imaging, vol. 35, no. 1, pp. 174-183, 2016.

