Signal-to-noise ratio (MRI)

Last revised by Candace Makeda Moore on 24 Sep 2024

Signal-to-noise ratio (SNR) is a generic term which, in radiology, refers to the ratio of true signal (i.e. reflecting actual anatomy) to noise (e.g. random quantum mottle). On MRI, the signal-to-noise ratio is frequently measured by comparing the signal intensity in the area of interest with that of the background (usually chosen from the air surrounding the object).

In air, any signal present represents noise. The difference between the mean signal in the area of interest and the mean background signal is divided by the standard deviation of the background signal - an indication of the variability of the background noise.
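This measurement can be sketched numerically. The following is a minimal illustration (not a clinical tool) using a synthetic magnitude image: the ROI coordinates, noise level, and signal intensity are all hypothetical values chosen for demonstration.

```python
import numpy as np

# Hypothetical 2D magnitude image: background air noise is Rayleigh-distributed
# (typical of magnitude MR images), with a bright square standing in for anatomy.
rng = np.random.default_rng(0)
image = rng.rayleigh(scale=5.0, size=(64, 64))  # simulated background noise
image[24:40, 24:40] += 100.0                    # simulated anatomical signal

roi = image[24:40, 24:40]     # area of interest
background = image[:16, :16]  # air region away from the object

# SNR: difference between mean ROI signal and mean background signal,
# divided by the standard deviation of the background noise
snr = (roi.mean() - background.mean()) / background.std()
print(round(snr, 1))
```

In practice the background region must be chosen free of artifacts (e.g. ghosting), since any structured signal in the "air" region will bias both the mean and the standard deviation of the noise estimate.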

Signal-to-noise ratio is proportional to the volume of the voxel and to the square root of the number of averages and phase steps (assuming constant-sized voxels). Since averaging and increasing the phase steps take time, SNR is closely related to the acquisition time.
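The proportionality above can be expressed as a simple relative-SNR calculation. This is a sketch under the stated assumptions (constant voxel size, all other parameters fixed); the function name and units are illustrative, and the result is meaningful only as a ratio between two parameter sets.

```python
import math

def relative_snr(voxel_volume_mm3, averages, phase_steps):
    """Relative SNR: proportional to voxel volume and to the
    square root of (averages x phase steps)."""
    return voxel_volume_mm3 * math.sqrt(averages * phase_steps)

base = relative_snr(1.0, averages=1, phase_steps=128)
doubled_nex = relative_snr(1.0, averages=2, phase_steps=128)

# Doubling the number of averages doubles the scan time but only
# improves SNR by a factor of sqrt(2) ~ 1.41
print(round(doubled_nex / base, 2))
```

This is why averaging is an expensive way to buy SNR: a twofold SNR gain via averaging alone requires a fourfold increase in acquisition time.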

On MRI, the signal-to-noise ratio can be improved by:

  • volume acquisition as compared to 2D imaging, but imaging time is increased

  • spin-echo sequences as compared to gradient echo

  • decreasing the noise by reducing the bandwidth, using surface coils, and increasing the number of excitations

  • increasing the signal by decreasing the TE (echo time) and increasing the TR (repetition time), slice thickness, or field of view

  • increasing oversampling

  • increasing number of signal acquisitions (averages)

Additionally, SNR can be improved by tweaking scan parameters, assuming all other factors remain the same.
