Noise in computed tomography is an unwanted variation in pixel values in an otherwise homogeneous image. Noise is often loosely defined as the grainy appearance of cross-sectional imaging; more often than not, this graininess is quantum mottle.
Noise in CT is measured via the signal-to-noise ratio (SNR), which compares the level of desired signal (photons reaching the detector) to the level of background noise (pixel values deviating from the true value). The higher the ratio, the less noise is present in the image.
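In practice, SNR can be estimated from a region of interest (ROI) drawn over homogeneous material, such as a water phantom, as the mean pixel value divided by the standard deviation. A minimal sketch, using simulated data (the ROI values and noise level below are illustrative assumptions, not from any particular scanner):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated homogeneous ROI: a "true" signal of 100 with additive noise
# of standard deviation 5 (illustrative values only).
roi = 100 + rng.normal(0, 5, size=(64, 64))

def snr(region):
    """SNR of a uniform region: mean signal over noise standard deviation."""
    return region.mean() / region.std()

print(f"SNR ~ {snr(roi):.1f}")  # roughly 100 / 5 = 20
```

A higher noise standard deviation in the ROI directly lowers this ratio, which is the grainy appearance described above.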
Noise on a cross-sectional image degrades picture quality and, in particular, hinders contrast resolution.
Factors affecting noise
The mAs, and hence the dose, of a CT scan has a direct relationship with the number of photons used in the examination. Because quantum noise scales with the square root of the photon count, SNR is proportional to the square root of the mAs. A useful statistic to keep on hand:

2 x mAs ≈ 40% increase in SNR
Increasing the dose of the scan will therefore decrease the amount of noise and improve the contrast resolution of the image. However, this comes at the cost of increased radiation to the patient, which must be considered when determining acceptable dose levels for examinations.
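The square-root relationship behind the "2 x mAs" rule can be sketched as follows (the function name is illustrative):

```python
import math

def snr_ratio(mas_new, mas_old):
    """Relative SNR change when mAs changes from mas_old to mas_new.

    Photon count scales linearly with mAs, and quantum (Poisson) noise
    grows as the square root of the count, so SNR scales as sqrt(mAs).
    """
    return math.sqrt(mas_new / mas_old)

# Doubling the mAs gives sqrt(2) ~ 1.41, i.e. the ~40% quoted above.
print(f"{(snr_ratio(200, 100) - 1) * 100:.0f}% SNR increase")
```

The same relation works in reverse: halving the mAs costs roughly 30% of the SNR, which is why dose reduction is always a trade against noise.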
Studies that rely on superior contrast resolutions will inescapably require a higher dose than examinations that can tolerate a higher amount of noise, for example, liver imaging vs. cardiac calcium scores.
The number of photons available to generate an image has a linear relationship with slice thickness: the thicker the slice, the more photons available, and the better the SNR. This is not without a trade-off, however; increasing the slice thickness decreases the spatial resolution in the z-axis.
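Since photons per voxel scale linearly with slice thickness, SNR again scales as the square root of the thickness while z-axis resolution degrades in proportion to the thickness itself. A small sketch with illustrative numbers:

```python
import math

def relative_snr(thickness_mm, reference_mm=1.0):
    """SNR of a slice relative to a reference thickness.

    Photon count per voxel is proportional to slice thickness, and SNR
    is proportional to the square root of the photon count.
    """
    return math.sqrt(thickness_mm / reference_mm)

for t in (1.0, 2.0, 5.0):
    # Thicker slice: higher SNR, but coarser z-axis resolution.
    print(f"{t} mm slice: SNR x{relative_snr(t):.2f}, z-resolution ~{t} mm")
```

Quadrupling the thickness only doubles the SNR, so thick slices buy noise reduction at a steep cost in z-axis detail.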
Larger patients absorb more radiation than smaller ones, meaning fewer photons reach the detector, thereby reducing the signal-to-noise ratio.
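The effect of patient size follows from the exponential (Beer-Lambert) attenuation of X-rays through tissue. A minimal sketch, assuming a rough linear attenuation coefficient for soft tissue at CT energies (the value and path lengths below are illustrative assumptions):

```python
import math

MU_SOFT_TISSUE = 0.02  # per mm; rough value for soft tissue at ~70 keV

def transmitted_fraction(path_mm, mu=MU_SOFT_TISSUE):
    """Fraction of incident photons reaching the detector (Beer-Lambert law)."""
    return math.exp(-mu * path_mm)

small = transmitted_fraction(200)  # ~20 cm path through a smaller patient
large = transmitted_fraction(350)  # ~35 cm path through a larger patient
print(f"small patient: {small:.4f}, large patient: {large:.4f}")
```

An extra 15 cm of tissue cuts the transmitted photon fraction by well over an order of magnitude, which is why larger patients need a higher dose to reach the same SNR.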