=== Image reconstruction ===

The raw data collected by a PET scanner are a list of 'coincidence events' representing near-simultaneous detection (typically, within a window of 6 to 12 nanoseconds of each other) of annihilation photons by a pair of detectors. Each coincidence event represents a line in space connecting the two detectors along which the positron emission occurred (i.e., the line of response (LOR)).

Analytical techniques, much like the reconstruction of [[computed tomography]] (CT) and [[single-photon emission computed tomography]] (SPECT) data, are commonly used, although the [[data set]] collected in PET is much poorer than that of CT, so reconstruction techniques are more difficult. Coincidence events can be grouped into projection images, called [[Radon transform|sinograms]]. The sinograms are sorted by the angle of each view and tilt (for 3D images). The sinogram images are analogous to the projections captured by CT scanners, and can be reconstructed in a similar way. However, the statistics of the data thereby obtained are much worse than those obtained through transmission tomography. A normal PET data set has millions of counts for the whole acquisition, while a CT scan can reach a few billion counts. This contributes to PET images appearing "noisier" than CT images. Two major sources of noise in PET are scatter (a detected pair of photons, at least one of which was deflected from its original path by interaction with matter in the field of view, leading to the pair being assigned to an incorrect LOR) and random events (photons originating from two different annihilation events but incorrectly recorded as a coincidence pair because their arrival at their respective detectors occurred within the coincidence timing window).

In practice, considerable pre-processing of the data is required – correction for random coincidences, estimation and subtraction of [[compton scatter|scattered]] photons, detector [[Dead time|dead-time]] correction (after the detection of a photon, the detector must "cool down" again) and detector-sensitivity correction (for both inherent detector sensitivity and changes in sensitivity due to angle of incidence).

[[Filtered back projection]] (FBP) has been frequently used to reconstruct images from the projections. This algorithm has the advantage of being simple while having a low requirement for computing resources. Its disadvantages are that [[shot noise]] in the raw data is prominent in the reconstructed images, and areas of high tracer uptake tend to form streaks across the image. Also, FBP treats the data deterministically – it does not account for the inherent randomness associated with PET data, thus requiring all the pre-reconstruction corrections described above.
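The pipeline from listed coincidence events to an FBP image can be sketched in a few lines of Python. The following is only an illustrative outline, not scanner software: the function names, the flat <code>(x1, y1, x2, y2)</code> event format and the use of scikit-image's <code>iradon</code> routine for the filtered back projection step are assumptions made for the example.

<syntaxhighlight lang="python">
# Illustrative sketch: histogram coincidence events (LOR endpoints) into a
# sinogram, then reconstruct it with filtered back projection (FBP).
import numpy as np
from skimage.transform import iradon  # reference FBP implementation

def bin_events_to_sinogram(events, n_s=128, n_theta=180, fov_radius=1.0):
    """events: (N, 4) array of LOR endpoints (x1, y1, x2, y2), origin at the scanner centre."""
    p1, p2 = events[:, :2], events[:, 2:]
    d = p2 - p1
    length = np.hypot(d[:, 0], d[:, 1])
    nx, ny = -d[:, 1] / length, d[:, 0] / length        # unit normal of each LOR
    s = p1[:, 0] * nx + p1[:, 1] * ny                   # signed radial offset of the LOR
    theta = np.arctan2(ny, nx)                          # angle of the normal, in (-pi, pi]
    flip = (theta < 0) | (theta >= np.pi)               # fold the angle into [0, pi) ...
    s = np.where(flip, -s, s)                           # ... flipping the offset sign to match
    theta = np.where(theta < 0, theta + np.pi, theta)
    theta = np.where(theta >= np.pi, theta - np.pi, theta)
    sino, _, _ = np.histogram2d(s, theta, bins=[n_s, n_theta],
                                range=[[-fov_radius, fov_radius], [0.0, np.pi]])
    return sino                                          # rows: radial offset, columns: angle

def fbp_reconstruct(sinogram):
    n_theta = sinogram.shape[1]
    angles_deg = (np.arange(n_theta) + 0.5) * 180.0 / n_theta   # angular bin centres
    return iradon(sinogram, theta=angles_deg)            # ramp-filtered back projection
</syntaxhighlight>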
'''Statistical, likelihood-based approaches''': Statistical, likelihood-based<ref>{{cite journal | vauthors = Lange K, Carson R | title = EM reconstruction algorithms for emission and transmission tomography | journal = Journal of Computer Assisted Tomography | volume = 8 | issue = 2 | pages = 306–16 | date = April 1984 | pmid = 6608535 }}</ref><ref>{{cite journal | vauthors = Vardi Y, Shepp LA, Kaufman L | title = A statistical model for positron emission tomography | journal = Journal of the American Statistical Association | date = 1985 | volume = 80 | issue = 389 | pages = 8–37 | doi = 10.1080/01621459.1985.10477119 | s2cid = 17836207 }}</ref> iterative [[expectation-maximization algorithm]]s such as the Shepp–Vardi algorithm<ref>{{cite journal | vauthors = Shepp LA, Vardi Y | title = Maximum likelihood reconstruction for emission tomography | journal = IEEE Transactions on Medical Imaging | volume = 1 | issue = 2 | pages = 113–122 | date = 1982 | pmid = 18238264 | doi = 10.1109/TMI.1982.4307558 }}</ref> are now the preferred method of reconstruction. These algorithms compute an estimate of the likely distribution of annihilation events that led to the measured data, based on statistical principles. The advantages are a better noise profile and resistance to the streak artifacts common with FBP; the disadvantage is greater computer resource requirements. A further advantage of statistical image reconstruction techniques is that the physical effects that would need to be pre-corrected for when using an analytical reconstruction algorithm, such as scattered photons, random coincidences, attenuation and detector dead-time, can be incorporated into the likelihood model used in the reconstruction, allowing for additional noise reduction. Iterative reconstruction has also been shown to improve the resolution of the reconstructed images, since more sophisticated models of the scanner physics can be incorporated into the likelihood model than those used by analytical reconstruction methods, allowing for improved quantification of the radioactivity distribution.<ref>{{cite journal | vauthors = Qi J, Leahy RM | title = Iterative reconstruction techniques in emission computed tomography | journal = Physics in Medicine and Biology | volume = 51 | issue = 15 | pages = R541–78 | date = August 2006 | pmid = 16861768 | doi = 10.1088/0031-9155/51/15/R01 | bibcode = 2006PMB....51R.541Q | s2cid = 40488776 }}</ref> Research has shown that [[Bayesian probability|Bayesian]] methods that involve a Poisson likelihood function and an appropriate [[prior probability]] (e.g., a smoothing prior leading to [[total variation regularization]] or a [[Laplacian distribution]] leading to <math>\ell_1</math>-based regularization in a [[wavelet]] or other domain), such as via [[Ulf Grenander]]'s [[Sieve estimator]]<ref>{{Cite journal | vauthors = Snyder DL, Miller M | title = On the Use of the Method of Sieves for Positron Emission Tomography | journal = IEEE Transactions on Nuclear Science | date = 1985 | volume = NS-32 | issue = 5 | pages = 3864–3872 | doi = 10.1109/TNS.1985.4334521 | s2cid = 2112617 | bibcode = 1985ITNS...32.3864S }}</ref><ref>{{Cite journal | vauthors = Geman S, McClure DE | title = Bayesian image analysis: An application to single photon emission tomography | journal = Proceedings of the American Statistical Association, Statistical Computing Section | date = 1985 | pages = 12–18 | url = http://www.dam.brown.edu/people/geman/Homepage/Image%20processing,%20image%20analysis,%20Markov%20random%20fields,%20and%20MCMC/1985GemanMcClureASA.pdf }}</ref> or via Bayes penalty methods<ref>{{cite journal | vauthors = Snyder DL, Miller MI, Thomas LJ, Politte DG | title = Noise and edge artifacts in maximum-likelihood reconstructions for emission tomography | journal = IEEE Transactions on Medical Imaging | volume = 6 | issue = 3 | pages = 228–38 | date = 1987 | pmid = 18244025 | doi = 10.1109/tmi.1987.4307831 | s2cid = 30033603 }}</ref><ref>{{cite journal | vauthors = Green PJ | title = Bayesian reconstructions from emission tomography data using a modified EM algorithm | journal = IEEE Transactions on Medical Imaging | volume = 9 | issue = 1 | pages = 84–93 | date = 1990 | pmid = 18222753 | doi = 10.1109/42.52985 | url = http://www.stats.bris.ac.uk/~peter/papers/spect-TMI90.pdf | citeseerx = 10.1.1.144.8671 }}</ref> or via [[I.J. Good]]'s roughness method<ref>{{Cite journal | vauthors = Miller MI, Snyder DL | title = The role of likelihood and entropy in incomplete data problems: Applications to estimating point-process intensities and Toeplitz constrained covariance estimates | journal = Proceedings of the IEEE | date = 1987 | volume = 75 | issue = 7 | pages = 892–907 | doi = 10.1109/PROC.1987.13825 | s2cid = 23733140 }}</ref><ref>{{cite journal | vauthors = Miller MI, Roysam B | title = Bayesian image reconstruction for emission tomography incorporating Good's roughness prior on massively parallel processors | journal = Proceedings of the National Academy of Sciences of the United States of America | volume = 88 | issue = 8 | pages = 3223–7 | date = April 1991 | pmid = 2014243 | pmc = 51418 | doi = 10.1073/pnas.88.8.3223 | bibcode = 1991PNAS...88.3223M | doi-access = free }}</ref> may yield superior performance to expectation-maximization-based methods which involve a Poisson likelihood function but do not involve such a prior.<ref>{{cite book | vauthors = Willett R, Harmany Z, Marcia R | chapter = Poisson image reconstruction with total variation regularization | title = 17th IEEE International Conference on Image Processing | date = 2010 | pages = 4177–4180 | doi = 10.1109/ICIP.2010.5649600 | isbn = 978-1-4244-7992-4 | s2cid = 246589 | citeseerx = 10.1.1.175.3149 }}</ref><ref>{{cite journal | vauthors = Harmany Z, Marcia R, Willett R | title = Sparsity-regularized Photon-limited Imaging | journal = International Symposium on Biomedical Imaging (ISBI) | date = 2010 }}</ref><ref>{{cite journal | vauthors = Willett R, Harmany Z, Marcia R | veditors = Bouman CA, Pollak I, Wolfe PJ | title = SPIRAL out of Convexity: Sparsity-regularized Algorithms for Photon-limited Imaging | journal = SPIE Electronic Imaging | series = Computational Imaging VIII | volume = 7533 | pages = 75330R | date = 2010 | bibcode = 2010SPIE.7533E..0RH | doi = 10.1117/12.850771 | s2cid = 7172003 | citeseerx = 10.1.1.175.3054 }}</ref>
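The core of the Shepp–Vardi maximum-likelihood expectation-maximization (MLEM) update is compact enough to show directly. The sketch below is illustrative only: the dense <code>system_matrix</code> (LORs × voxels) stands in for the projector a real scanner implements on the fly, and <code>y</code> denotes the measured coincidence counts.

<syntaxhighlight lang="python">
# Illustrative MLEM (Shepp-Vardi) sketch for Poisson-distributed PET data.
import numpy as np

def mlem(system_matrix, y, n_iter=20, eps=1e-12):
    """system_matrix: (n_lors, n_voxels) detection probabilities; y: measured counts per LOR."""
    sensitivity = system_matrix.sum(axis=0)          # sum_i a_ij for every voxel j
    x = np.ones(system_matrix.shape[1])              # uniform, strictly positive start
    for _ in range(n_iter):
        expected = system_matrix @ x                 # forward projection of current estimate
        ratio = y / np.maximum(expected, eps)        # measured / expected counts per LOR
        x *= (system_matrix.T @ ratio) / np.maximum(sensitivity, eps)  # multiplicative EM update
    return x                                         # estimated activity per voxel
</syntaxhighlight>

Corrections for randoms, scatter, attenuation and dead-time enter this framework by modifying the forward model (for example, adding expected randoms and scatter terms to <code>expected</code>), which is the sense in which such effects are "incorporated into the likelihood model" above.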
'''Attenuation correction''': Quantitative PET imaging requires attenuation correction.<ref>{{cite journal | vauthors = Huang SC, Hoffman EJ, Phelps ME, Kuhl DE | title = Quantitation in positron emission computed tomography: 2. Effects of inaccurate attenuation correction | journal = Journal of Computer Assisted Tomography | volume = 3 | issue = 6 | pages = 804–14 | date = December 1979 | pmid = 315970 | doi = 10.1097/00004728-197903060-00018 }}</ref> In these systems, attenuation correction is based on a transmission scan using a rotating <sup>68</sup>Ge rod source.<ref>{{cite journal | vauthors = Navalpakkam BK, Braun H, Kuwert T, Quick HH | title = Magnetic resonance-based attenuation correction for PET/MR hybrid imaging using continuous valued attenuation maps | journal = Investigative Radiology | volume = 48 | issue = 5 | pages = 323–332 | date = May 2013 | pmid = 23442772 | doi = 10.1097/rli.0b013e318283292f | s2cid = 21553206 }}</ref> Transmission scans directly measure attenuation values at 511 keV.<ref>{{cite journal | vauthors = Wagenknecht G, Kaiser HJ, Mottaghy FM, Herzog H | title = MRI for attenuation correction in PET: methods and challenges | journal = Magma | volume = 26 | issue = 1 | pages = 99–113 | date = February 2013 | pmid = 23179594 | pmc = 3572388 | doi = 10.1007/s10334-012-0353-4 }}</ref> Attenuation occurs when [[photon]]s emitted by the radiotracer inside the body are absorbed by the intervening tissue between the point of emission and the detector. As different LORs must traverse different thicknesses of tissue, the photons are attenuated differentially. The result is that structures deep in the body are reconstructed as having falsely low tracer uptake. Contemporary scanners can estimate attenuation using integrated x-ray CT equipment, in place of earlier equipment that offered a crude form of CT using a [[gamma ray]] ([[positron]]-emitting) source and the PET detectors. While attenuation-corrected images are generally more faithful representations, the correction process is itself susceptible to significant artifacts. As a result, both corrected and uncorrected images are always reconstructed and read together.
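In practice, the correction amounts to scaling each sinogram bin by an attenuation correction factor (ACF), the exponential of the line integral of the 511 keV linear attenuation coefficient along that LOR. The sketch below is a simplified illustration, not vendor software: it assumes a two-dimensional <code>mu_map</code> (for example, one derived from a co-registered CT scan) expressed per pixel, and uses scikit-image's <code>radon</code> to compute the line integrals.

<syntaxhighlight lang="python">
# Illustrative sketch: attenuation correction factors from a mu-map.
import numpy as np
from skimage.transform import radon

def attenuation_correction_factors(mu_map, angles_deg):
    """ACF per sinogram bin: exp( integral of mu along the full LOR )."""
    line_integrals = radon(mu_map, theta=angles_deg)   # sum of mu along each LOR
    return np.exp(line_integrals)

def correct_emission_sinogram(emission_sinogram, mu_map, angles_deg):
    """Scale the measured coincidences so deep structures are no longer under-estimated."""
    return emission_sinogram * attenuation_correction_factors(mu_map, angles_deg)
</syntaxhighlight>

Because both annihilation photons must escape the body, the attenuation along a LOR depends only on the total tissue path and not on where along the line the annihilation occurred, which is why a single factor per LOR suffices.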
'''2D/3D reconstruction''': Early PET scanners had only a single ring of detectors, hence the acquisition of data and subsequent reconstruction was restricted to a single transverse plane. More modern scanners now include multiple rings, essentially forming a cylinder of detectors. There are two approaches to reconstructing data from such a scanner:
# Treat each ring as a separate entity, so that only coincidences within a ring are detected; the image from each ring can then be reconstructed individually (2D reconstruction), or
# Allow coincidences to be detected between rings as well as within rings, then reconstruct the entire volume together (3D).
3D techniques have better sensitivity (because more coincidences are detected and used) and hence less noise, but are more sensitive to the effects of scatter and random coincidences, and require greater computer resources. The advent of sub-nanosecond timing resolution detectors affords better random coincidence rejection, thus favoring 3D image reconstruction.
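The distinction between the two acquisition modes reduces to which ring combinations are accepted. The following is a hypothetical sorting helper (the event fields and the <code>max_ring_difference</code> parameter are assumptions for the example, not an actual scanner interface):

<syntaxhighlight lang="python">
# Illustrative sketch: selecting coincidences for 2D (within-ring) or 3D (cross-ring) reconstruction.
import numpy as np

def select_coincidences(ring_a, ring_b, mode="3D", max_ring_difference=0):
    """Return a boolean mask over the list-mode events."""
    if mode == "2D":
        # Keep only events whose two detectors lie in the same ring
        # (or within a small ring difference if limited axial combining is allowed).
        return np.abs(ring_a - ring_b) <= max_ring_difference
    # 3D: accept coincidences between any pair of rings.
    return np.ones_like(ring_a, dtype=bool)
</syntaxhighlight>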
'''Time-of-flight (TOF) PET''': For modern systems with a higher time resolution (roughly 3 nanoseconds), a technique called "time-of-flight" is used to improve the overall performance. Time-of-flight PET makes use of very fast gamma-ray detectors and a data-processing system that can more precisely determine the difference in arrival time between the two photons. The measured time difference constrains where along the LOR the annihilation occurred, but it cannot localize the point of origin exactly (currently only to within about 10 cm), so image reconstruction is still needed. The TOF technique nevertheless gives a remarkable improvement in image quality, especially in signal-to-noise ratio.
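The geometry behind this is simple: each nanosecond of timing difference corresponds to about 15 cm of displacement along the LOR (half the distance light travels in that time, because one photon's path shortens as the other's lengthens). The sketch below is only illustrative; the function names and units are assumptions made for the example.

<syntaxhighlight lang="python">
# Illustrative sketch: locating an annihilation along its LOR from the photon arrival-time difference.
import numpy as np

SPEED_OF_LIGHT_CM_PER_NS = 29.9792458   # speed of light, in centimetres per nanosecond

def tof_offset_cm(t1_ns, t2_ns):
    """Displacement of the annihilation point from the LOR midpoint,
    towards the detector that recorded the earlier photon."""
    return 0.5 * SPEED_OF_LIGHT_CM_PER_NS * (t2_ns - t1_ns)

def tof_localization_fwhm_cm(timing_resolution_fwhm_ns):
    """Spatial FWHM of the TOF position estimate along the LOR."""
    return 0.5 * SPEED_OF_LIGHT_CM_PER_NS * timing_resolution_fwhm_ns

# Example: a coincidence timing resolution of about 0.67 ns corresponds to
# localization of roughly 10 cm along the LOR, the figure quoted above.
</syntaxhighlight>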