Signal-to-noise ratio (MRI)

Signal-to-noise ratio (SNR) is a generic term which, in radiology, refers to the ratio of true signal (i.e. reflecting actual anatomy) to noise (e.g. random quantum mottle). In MRI, the signal-to-noise ratio is frequently measured by calculating the difference in signal intensity between the area of interest and the background (usually chosen from the air surrounding the object).

In air, any signal present is noise. The difference between the mean signal and the mean background is divided by the standard deviation of the background signal, which reflects the variability of the background noise.
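This calculation can be sketched in a few lines. The regions of interest below are synthetic stand-ins (hypothetical values, not real image data): one drawn over the anatomy of interest, one over the surrounding air.

```python
import numpy as np

# Hypothetical regions of interest from a magnitude MR image:
# signal_roi is drawn over the anatomy of interest,
# background_roi over the air surrounding the object.
rng = np.random.default_rng(0)
signal_roi = rng.normal(loc=100.0, scale=5.0, size=(20, 20))
background_roi = np.abs(rng.normal(loc=0.0, scale=5.0, size=(20, 20)))

# SNR as described above: the difference between the mean signal and
# the mean background, divided by the standard deviation of the background.
snr = (signal_roi.mean() - background_roi.mean()) / background_roi.std()
print(round(snr, 1))
```

In practice the background ROI is taken from an artifact-free region of air, since ghosting or filtering can distort the noise estimate.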

Signal-to-noise ratio is proportional to the volume of the voxel and to the square root of the number of averages and phase-encoding steps (assuming constant-sized voxels). Since averaging and increasing the number of phase steps take time, SNR is closely related to acquisition time.
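The proportionality above can be illustrated with a small sketch (relative values only; absolute SNR depends on many other factors such as field strength and coil design):

```python
import math

def relative_snr(voxel_volume, n_averages, n_phase_steps):
    """Relative SNR under the proportionality stated above:
    SNR scales with voxel volume and with the square root of the
    number of averages and phase-encoding steps. Illustrative only."""
    return voxel_volume * math.sqrt(n_averages * n_phase_steps)

# Doubling the number of averages doubles the acquisition time
# but improves SNR only by a factor of sqrt(2).
base = relative_snr(1.0, 1, 256)
doubled = relative_snr(1.0, 2, 256)
print(round(doubled / base, 3))  # → 1.414
```

This square-root relationship is why chasing SNR through averaging is expensive: a twofold SNR gain from averaging alone requires a fourfold increase in scan time.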

In MRI, the signal-to-noise ratio can therefore be improved by increasing the voxel volume or by increasing the number of signal averages or phase-encoding steps, at the cost of longer acquisition time. Additionally, assuming all other factors remain the same, SNR can be improved by adjusting other scan parameters.

Article information

rID: 14045
Synonyms or Alternate Spellings:
  • Signal to noise ratio (SNR)
  • SNR
  • Signal-noise ratio
  • Signal/noise ratio
