At the time the article was created Candace Makeda Moore had no recorded disclosures.
Normalization is a statistical process that allows quantities or objects to be compared on appropriate scales. It gives differences between discrete values real meaning, rather than leaving them as merely numerical differences. A simple example of normalization familiar to any medical doctor is the assignment of percentiles to test scores.
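The percentile idea can be sketched in a few lines of Python. The scores and the `percentile_rank` helper below are illustrative inventions, not part of any standard library:

```python
import numpy as np

# Hypothetical set of test scores (toy data for illustration only)
scores = np.array([52, 61, 70, 70, 85, 91, 98])

def percentile_rank(scores, value):
    """Percentage of scores at or below the given value."""
    return 100.0 * np.sum(scores <= value) / scores.size

# A raw score of 85 only has meaning relative to the distribution:
print(percentile_rank(scores, 85))
```

Here the raw score 85 is mapped onto a 0-100 scale, making it comparable across different tests regardless of their raw score ranges.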
Data normalization is often a critical step in data processing before building any artificial intelligence algorithm; however, normalization is also used in many radiology algorithms beyond AI. The mathematical formula for normalization depends on its type. Min-max feature scaling is an easily understood type of normalization in which each value N in a set with minimum Nmin and maximum Nmax is rescaled to (N - Nmin)/(Nmax - Nmin). More complex types of normalization may involve machine learning, e.g. in the natural language processing (NLP) of radiology reports, steps such as fixing spelling mistakes 1 or standardizing terminology 2. Types of normalization often used when preparing machine learning for image classification include image normalization and batch normalization.
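The min-max feature scaling formula above can be sketched directly in code. The `min_max_normalize` function is a hypothetical helper written for this example:

```python
import numpy as np

def min_max_normalize(values):
    """Rescale values to [0, 1] via (N - Nmin) / (Nmax - Nmin)."""
    values = np.asarray(values, dtype=float)
    vmin, vmax = values.min(), values.max()
    if vmax == vmin:  # avoid division by zero for constant input
        return np.zeros_like(values)
    return (values - vmin) / (vmax - vmin)

# Toy example: the minimum maps to 0, the maximum to 1
print(min_max_normalize([10, 15, 20]))  # [0.  0.5 1. ]
```

After scaling, every value lies in [0, 1], so features measured on very different raw scales can be compared or fed into the same model on equal footing.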
Traditionally, normalization has been used in some signal processing algorithms for MRI to produce more interpretable final images. Note that several types of normalization algorithms are often run on MR images without radiologists being aware of them, as radiologists see only the results.
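One common form of image intensity normalization applied to MR images is z-scoring, which maps intensities to zero mean and unit standard deviation. The sketch below is a minimal illustration on a toy array, not a production MRI pipeline (which would typically operate on a brain mask rather than the whole volume):

```python
import numpy as np

def zscore_normalize(image):
    """Z-score intensity normalization: zero mean, unit standard deviation.

    Useful for MR images because raw MR intensities are not on a fixed
    physical scale and vary between scanners and acquisitions.
    """
    image = np.asarray(image, dtype=float)
    std = image.std()
    if std == 0:  # constant image: only center it
        return image - image.mean()
    return (image - image.mean()) / std

img = np.array([[100.0, 120.0], [140.0, 160.0]])  # toy 2x2 "image"
norm = zscore_normalize(img)
print(norm.mean(), norm.std())  # approximately 0.0 and 1.0
```

Two acquisitions of the same anatomy with different raw intensity ranges become directly comparable after this step, which is one reason such normalization can run invisibly inside reconstruction and post-processing pipelines.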