Signal-to-noise ratio (MRI)

Last revised by Ammar Haouimi on 14 May 2021

Signal-to-noise ratio (SNR) is a generic term which, in radiology, refers to the ratio of true signal (i.e. reflecting actual anatomy) to noise (e.g. random quantum mottle). On MRI, the signal-to-noise ratio is most frequently measured by comparing the signal intensity in the area of interest with that of the background (usually chosen from the air surrounding the object).

In air, any signal present should be pure noise. The difference between the mean signal in the area of interest and the mean background signal is divided by the standard deviation of the background signal, which indicates the variability of the background noise.
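The measurement described above can be sketched in a few lines of Python. The ROI values below are synthetic stand-ins (hypothetical tissue and air regions), not real image data; the function simply encodes the ratio as defined in this paragraph.

```python
import numpy as np

def snr(signal_roi, background_roi):
    """SNR = (mean signal - mean background) / std of background noise."""
    return (signal_roi.mean() - background_roi.mean()) / background_roi.std()

# Hypothetical ROIs: a high-intensity tissue region and a background (air)
# region containing only noise. Magnitude images are non-negative, so the
# background noise is taken as the absolute value of a zero-mean distribution.
rng = np.random.default_rng(0)
signal_roi = rng.normal(loc=1000.0, scale=20.0, size=500)
background_roi = np.abs(rng.normal(loc=0.0, scale=20.0, size=500))

print(f"SNR = {snr(signal_roi, background_roi):.1f}")
```

In practice the ROIs would be drawn on the image itself: one over homogeneous tissue and one over air, well away from ghosting artifacts.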

Signal-to-noise ratio is proportional to the voxel volume and to the square root of the number of averages and phase-encoding steps (assuming constant-sized voxels). Since averaging and increasing the number of phase steps both take time, SNR is closely related to acquisition time.
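These proportionalities can be illustrated with a small sketch. The function below returns a relative (unitless) SNR under the stated assumptions; the parameter values are illustrative only.

```python
def relative_snr(voxel_volume, averages, phase_steps):
    """Relative SNR: proportional to voxel volume and to the square root
    of the number of averages times the number of phase-encoding steps."""
    return voxel_volume * (averages * phase_steps) ** 0.5

# Doubling the number of averages doubles the acquisition time
# but raises SNR only by a factor of sqrt(2) (about 1.41).
base = relative_snr(voxel_volume=1.0, averages=1, phase_steps=256)
doubled = relative_snr(voxel_volume=1.0, averages=2, phase_steps=256)
print(f"SNR gain from doubling averages: {doubled / base:.3f}")
```

This diminishing return is why averaging is an expensive way to buy SNR: a 2x SNR improvement via averaging alone requires a 4x longer scan.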

On MRI, the signal-to-noise ratio can be improved by tweaking scan parameters. Assuming all other factors remain the same, SNR can be improved by increasing the voxel volume, the number of signal averages, or the number of phase-encoding steps, at the cost of longer acquisition time or lower spatial resolution.
