Articles

Articles are a collaborative effort to provide a single canonical page on all topics relevant to the practice of radiology. As such, articles are written and edited by countless contributing members over a period of time. A global group of dedicated editors oversees accuracy, consulting with expert advisers and constantly reviewing additions.

89 results found
Article

Hebbian learning

Hebbian learning describes a type of activity-dependent modification of the strength of synaptic transmission at pre-existing synapses, which plays a central role in the capacity of the brain to convert transient experiences into memory. According to Hebb et al 1, two cells or systems of cells th...
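
The core idea can be illustrated with a toy weight-update rule; the following is a minimal sketch in which the learning rate and activity values are invented purely for illustration:

import numpy as np

# Hebb's rule: a synaptic weight grows when pre- and postsynaptic
# activity coincide ("cells that fire together wire together").
learning_rate = 0.1
pre = np.array([1.0, 0.0, 1.0])   # presynaptic activity
post = 1.0                        # postsynaptic activity
weights = np.zeros(3)
weights += learning_rate * pre * post  # coincident activity strengthens w
print(weights)  # [0.1 0.  0.1] - only co-active synapses are strengthened
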
Article

Autoencoder

Autoencoders are an unsupervised learning technique in which artificial neural networks are used to learn to produce a compressed representation of the input data. Essentially, autoencoding is a data compression algorithm where the compression and decompression functions are learned automatical...
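
As a minimal sketch, an autoencoder can be built with Keras (mentioned elsewhere in this list); the layer sizes and placeholder data below are illustrative assumptions, not a prescription:

import numpy as np
from tensorflow import keras

# Encoder compresses 784-value inputs to a 32-dimensional code;
# the decoder reconstructs the input from that code.
inputs = keras.Input(shape=(784,))
code = keras.layers.Dense(32, activation="relu")(inputs)
outputs = keras.layers.Dense(784, activation="sigmoid")(code)
autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")

# The training target is the input itself, so no labels are required.
X = np.random.rand(256, 784)  # placeholder data for illustration
autoencoder.fit(X, X, epochs=5, batch_size=32, verbose=0)
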
Article

Learning curve (machine learning)

A learning curve is a plot of the learning performance of a machine learning model (usually measured as loss or accuracy) over time (usually measured in epochs). Learning curves are a widely used diagnostic tool in machine learning to get an overview of the learning and generalization behavi...
Article

Deep learning frameworks

Deep learning frameworks are instruments for training and validating deep neural networks, through high-level programming interfaces. Widely used deep learning frameworks include the libraries PyTorch, TensorFlow, and Keras. A programmer can use these libraries of higher functions to quickly de...
Article

Underfitting

Underfitting in statistical and machine learning modeling is the counterpart of overfitting. It happens when a model is not complex enough to accurately capture relationships between a dataset’s features and a target variable, i.e. the network struggles to learn the patterns in the dat...
Article

ImageNet dataset

ImageNet is an extensive image database which has been instrumental in advancing computer vision and deep learning research. It contains more than 14 million hand-annotated images classified into more than 20,000 categories. In at least one million of the images, bounding boxes are also pro...
Article

Deep learning

Deep learning is a subset of machine learning based on multi-layered (a.k.a. “deep”) artificial neural networks. Their highly flexible architectures can learn directly from data (such as images, video or text) without the need for hand-coded rules and can increase their predictive accuracy when p...
Article

Generalisability

Generalisability in machine learning models represents how well the models can be adapted to new example datasets. Evaluating the generalisability of machine learning applications is crucial, as this has profound implications for their clinical adaptability. Briefly, two main techniques are used fo...
Article

Information leakage

Information leakage is a common and important error in data handling that can affect all machine learning applications, including those in radiology. Briefly, it refers to incomplete separation of the training, validation, and testing datasets, which can significantly change the apparent perfor...
Article

Explainable artificial intelligence

Explainable artificial intelligence (XAI) usually refers to narrow artificial intelligence models made with methods that enable and enhance human understanding of how the models reached outputs in each case. Many older AI models, e.g. decision trees, were inherently understandable in terms of ho...
Article

Findable accessible interoperable reusable data principles (FAIR)

The FAIR (findable accessible interoperable reusable) data principles are a set of guidelines for enhancing the semantic machine interpretability of data, thereby improving its richness and quality. Since their inception, multiple international organizations have endorsed the application of FAIR principl...
Article

Federated learning

Federated learning, also known as distributed learning, is a technique that facilitates the creation of robust artificial intelligence models by training on data held on local devices (nodes), which then transfer weights to a central model. Models can potentially be trained using larger and/or more d...
Article

Ground truth

Ground truth is a term used in statistics and machine learning to refer to data assumed to be correct. Regarding the development of machine learning algorithms in radiology, the ground truth for image labeling is sometimes based on pathology or lab results while, in other cases, on the expert o...
Article

Hyperparameter (machine learning)

Hyperparameters are specific aspects of a machine learning algorithm that are chosen before the algorithm runs on data. These hyperparameters are model-specific, e.g. they would typically include the number of epochs for a deep learning model or the number of branches in a decision tree model. Th...
Article

Linear discriminant analysis

Linear discriminant analysis (LDA) is a type of algorithmic model employed in machine learning to classify data. Unlike some other now-popular models, linear discriminant analysis has been used for decades in both AI for radiology 1 and many other biomedical applications. Linear discri...
Article

Dice similarity coefficient

The Dice similarity coefficient, also known as the Sørensen–Dice index or simply Dice coefficient, is a statistical tool which measures the similarity between two sets of data. This index has become arguably the most broadly used tool in the validation of image segmentation algorithms created wi...
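
For two binary masks A and B, the coefficient is 2|A∩B| / (|A| + |B|); the following is a minimal NumPy sketch with invented masks for illustration:

import numpy as np

def dice_coefficient(a, b):
    # Dice similarity between two binary masks: 2|A∩B| / (|A| + |B|).
    a = a.astype(bool)
    b = b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

# Identical masks give 1.0; disjoint masks give 0.0.
pred = np.array([[0, 1, 1], [0, 1, 0]])   # e.g. an algorithm's segmentation
truth = np.array([[0, 1, 0], [0, 1, 0]])  # e.g. an expert's segmentation
print(dice_coefficient(pred, truth))      # 0.8
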
Article

Semi-supervised learning (machine learning)

Semi-supervised learning is an approach to machine learning which uses some labeled data and some data without labels to train models. This approach can be useful to overcome the problem of insufficient quantities of labeled data. Some consider it to be a variation of supervised learning, whilst...
Article

Computer vision

Computer vision is a field concerned with the creation of generalized automated computer insight into visual data, i.e. making computers see. Although often understood as a field within computer science, it actually involves work in informatics, various fields of engineering and neuroscien...
Article

Feature extraction

Feature extraction is a process utilized in both machine learning and image processing by which data is transformed into a smaller, more relevant set of data. Feature extraction is a type of dimensionality reduction. It can be performed on texts as part of NLP or on images for com...
Article

Imputation

Imputation refers to statistical methods for filling in values that are missing from a data set. Missing data is often not random (and can therefore lead to different forms of bias). Imputation theoretically improves research outcomes compared with simply discarding incomplete data subsets. Severa...
Article

Bayes' factor

A Bayes' factor is a number that quantifies the relative likelihood of two models or hypotheses expressed as a ratio, e.g. if two models are equally likely based on the prior evidence (or there is no prior evidence) then the Bayes factor would be one. Such factors have several use...
Article

Segmentation

Segmentation, in the context of informatics for radiology, refers to the delineation of areas of interest in imaging in terms of pixels or voxels. Segmentation is often accomplished by computerized algorithms that vary in complexity from simply selecting pixels of similar values in proximity to ...
Article

Cybersecurity

Cybersecurity is the protection of digital data, software and hardware from risks including attacks or other problems related to their integrity and/or data confidentiality. Cybersecurity may utilize many different types of tools and protocols including encryption, firewalls and other infrastruc...
Article

Noise reduction

Noise reduction, also known as noise suppression or denoising, commonly refers to the various algorithmic techniques to reduce noise in digital images once they are created, although a few sources use the term more broadly to imply anything that reduces noise. In digital image processing, various ...
Article

Class activation mapping (CAM)

Class activation mapping is a method to generate heatmaps of images that show which areas were of high importance to a neural network's image classification. There are several variations on the method, including Score-CAM and Grad-CAM (Gradient-weighted Class Activation Mapping). The ...
Article

Centering

Centering is a statistical operation on data. In the context of neural networks for image classification related tasks, it implies intensity normalization across images in training data sets. In the context of neural networks specifically for x-ray based images, it therefore implies correction fo...
Article

Kernel (computing)

A kernel, in general computing terminology, is the core part of a piece of software. The term, unless otherwise specified, refers to the main part of the operating system software, and some sources even use it interchangeably with operating system. This term can also describe certain mac...
Article

Convolution

Convolution is a mathematical operation that combines two functions to produce a third. In practical terms for radiology, convolution implies the application of a mathematical operation to a signal such that a different signal is produced. Convolutions are applied in image processing for CTs and MRIs. ...
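
As a minimal sketch of how a convolution turns one signal into another, the following applies a 3×3 smoothing kernel to a toy image using SciPy; the kernel choice is illustrative:

import numpy as np
from scipy.signal import convolve2d

image = np.random.rand(8, 8)      # toy image
kernel = np.ones((3, 3)) / 9.0    # 3x3 mean (smoothing) kernel

# Sliding the kernel over the image produces a new, smoothed signal.
smoothed = convolve2d(image, kernel, mode="same", boundary="symm")
print(smoothed.shape)  # (8, 8) - same size, different signal
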
Article

Scaling

Scaling is a linear transformation that changes the size of a mathematical object. The mathematical objects of interest to radiologists that can be scaled are usually image matrices. This simple type of spatial normalization is a common step in image normalization for creating an image data set ...
Article

Image normalization

Image normalization is a process, often used in the preparation of data sets for artificial intelligence (AI), in which multiple images are put into a common statistical distribution in terms of size and pixel values; however, a single image can also be normalized within itself. The process usua...
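
A minimal NumPy sketch of two common pixel-value normalization steps; the target range and toy data are illustrative assumptions:

import numpy as np

image = np.random.randint(0, 4096, (64, 64)).astype(float)  # toy 12-bit image

# Min-max normalization: rescale pixel values into [0, 1].
minmax = (image - image.min()) / (image.max() - image.min())

# Z-score normalization: zero mean, unit standard deviation.
zscore = (image - image.mean()) / image.std()

print(minmax.min(), minmax.max())  # 0.0 1.0
print(round(zscore.mean(), 6))     # ~0.0
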
Article

Fully connected neural network

Fully connected neural networks (FCNNs) are a type of artificial neural network where the architecture is such that all the nodes, or neurons, in one layer are connected to the neurons in the next layer. While this type of algorithm is commonly applied to some types of data, in practice this t...
Article

R (Programming Language)

R is a programming language and free open-source software environment for statistical computing and graphics supported by the R Foundation. It is freely available under the GNU General Public License. R is a highly popular language for programming in statistics in general and biostatistics in p...
Article

Python (programming language)

Python is a high-level, general-purpose computer programming language. Python was created by Dutch computer programmer Guido van Rossum and was first released in 1991. As of July 2019, the most recent stable release was version 3.7.4. The Python language has objects and associat...
Article

Quantitative imaging biomarker

Quantitative imaging biomarkers are validated, standardized characteristics based on quantifiable features of biomedical imaging that can be reliably and objectively measured on a ratio or interval scale. The utility of quantitative imaging biomarkers lies in providing information beyond what ca...
Article

Bayes' theorem

Bayes' theorem, also known as Bayes' rule or Bayes' law, is a theorem in statistics that describes the probability of one event or condition as it relates to another known event or condition. Mathematically, the theorem can be expressed as follows: P(A|B) = P(B|A) x P(A) / P(B), where given that...
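
A small worked example of the formula in the screening-test setting; the prevalence, sensitivity and specificity figures are invented purely for illustration:

# P(disease | positive test) = P(positive | disease) * P(disease) / P(positive)
p_disease = 0.01     # hypothetical prevalence, P(A)
sensitivity = 0.90   # P(B|A): probability of a positive test given disease
specificity = 0.95   # true-negative rate in the healthy

# Total probability of a positive test, P(B), by the law of total probability:
p_positive = sensitivity * p_disease + (1 - specificity) * (1 - p_disease)
posterior = sensitivity * p_disease / p_positive
print(round(posterior, 3))  # ~0.154 - most positives here are false positives
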
Article

Clustering

Clustering, also known as cluster analysis, is a machine learning technique designed to group similar data points together. Since the data points do not necessarily have to be labeled, clustering is an example of unsupervised learning. Clustering in machine learning should not be confused with d...
Article

Heat map

Heat maps are visual representations of data in matrices with colors. Two dimensions of the data are captured by the location of a point (i.e., a map) and a third dimension is represented by the color of the point (i.e., the value). Some nuclear medicine studies are technically examples of heat...
Article

Automation bias

Automation bias is a form of cognitive bias occurring when humans overvalue information produced by an automated, usually computerized, system. Users of automated systems can fail to notice, or may ignore, illogical or incorrect information produced by computer systems. Computer programs may crea...
Article

Boosting

Boosting is an ensemble technique that creates increasingly complex algorithms from building blocks of relatively simple decision rules for binary classification tasks. This is achieved by sequentially training new models (or 'weak' learners) which focus on examples that were classified incorre...
Article

Optimization algorithms

Optimization algorithms are widely utilized mathematical functions that solve problems via the maximization or minimization of a function. These algorithms are used for a variety of purposes, from patient scheduling to radiology. Machine learning: Optimization algorithms are used in machine lea...
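
A minimal sketch of gradient descent, one of the most common optimization algorithms in machine learning, minimizing a simple quadratic function; the function and learning rate are illustrative:

# Minimize f(x) = (x - 3)^2 by stepping against the gradient f'(x) = 2(x - 3).
x = 0.0
learning_rate = 0.1
for _ in range(100):
    gradient = 2 * (x - 3)
    x -= learning_rate * gradient  # step downhill
print(round(x, 4))  # ~3.0, the minimum of f
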
Article

Linear algebra

Linear algebra is a field of mathematics with extremely diverse applications. This type of mathematics extends arithmetical operations from numbers to complex objects like matrices and vectors. In terms of radiology, linear algebra applications include CT reconstruction algorithms, neural netwo...
Article

Transfer learning

The concept of transfer learning in artificial neural networks is to take knowledge acquired from training on one particular domain and apply it to learning a separate task. In recent years, a well-established paradigm has been to pre-train models using large-scale data (e.g., ImageNet) and t...
Article

Recurrent neural network

Recurrent neural networks (RNNs) are a form of neural network that recognizes patterns in sequential information via contextual memory. Recurrent neural networks have been applied to many types of sequential information including text, speech, videos, music, genetic sequences and even clinical...
Article

Curse of dimensionality

The curse of dimensionality can refer to a number of phenomena related to high-dimensional data in several fields. In terms of machine learning for radiology, it generally refers to the phenomenon that as the number of image features employed to train an algorithm increases, there is a geometric...
Article

Generative adversarial network

Generative adversarial networks (GANs) are an elegant deep learning approach to generating artificial data that is indistinguishable from real data. Two neural networks are paired off against one another (adversaries). The first network generates artificial data to reproduce real data. The secon...
Article

Activation function

In neural networks, activation functions perform a transformation on a weighted sum of inputs plus biases to a neuron in order to compute its output. Using a biological analogy, the activation function determines the “firing rate” of a neuron in response to an input or stimulus. These functions...
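
A minimal sketch of two widely used activation functions applied to a weighted sum; the weights, inputs and bias are invented for illustration:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))  # squashes output into (0, 1)

def relu(z):
    return np.maximum(0.0, z)        # passes positives, zeroes negatives

inputs = np.array([0.5, -1.0, 2.0])
weights = np.array([0.4, 0.3, 0.1])
bias = 0.05
z = np.dot(weights, inputs) + bias   # weighted sum of inputs plus bias
print(sigmoid(z), relu(z))           # the neuron's output under each function
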
Article

Mean squared error

Mean squared error is a specific type of loss function. It is calculated as the average, specifically the mean, of the squared errors between the data and a fitted function (often a regression line). The utility of mean squared error comes from the fact that squared nu...
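
A minimal NumPy sketch of the calculation, with values invented for illustration:

import numpy as np

actual = np.array([2.0, 4.0, 6.0])
predicted = np.array([2.5, 3.5, 6.0])

# Square each error, then take the mean; squaring keeps errors positive
# and penalizes large deviations disproportionately.
mse = np.mean((actual - predicted) ** 2)
print(mse)  # 0.1666... = (0.25 + 0.25 + 0.0) / 3
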
Article

Cross entropy

Cross entropy is a measure of the degree of difference between two probability distributions. In the context of supervised learning, one of these distributions represents the “true” label for a training example, where the correct responses are assigned a value of 100%. Machine learning: If p(x)...
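
A minimal sketch of cross entropy for a single classification example; the class probabilities are invented for illustration:

import numpy as np

# One-hot "true" distribution: the correct class has probability 100%.
p = np.array([0.0, 1.0, 0.0])
# Model's predicted distribution over the three classes.
q = np.array([0.1, 0.7, 0.2])

# H(p, q) = -sum(p * log q); only the true class term survives here.
cross_entropy = -np.sum(p * np.log(q))
print(round(cross_entropy, 4))  # 0.3567 = -log(0.7)
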
Article

Single linear regression

Single linear regression, also known as simple linear regression, in statistics, is a technique that maps a relationship between one independent and one dependent variable into a first-degree polynomial. Linear regression is the simplest example of curve fitting, a type of mathematical problem i...
Article

Artificial Intelligence (AI) TI-RADS

AI TI-RADS (Artificial Intelligence Thyroid Imaging Reporting and Data System) is a data-driven analysis and revision of the 2017 ACR TI-RADS 1. Published in May 2019 2, this had the intention of simplifying categorization and improving specificity while maintaining high sensitivity. This system...
Article

Neural network architectures

Artificial neural networks can be broadly divided into two architectures: feedforward and recurrent. Feedforward neural networks are more readily conceptualised in 'layers'. The first layer of the neural network is merely the inputs of each sample, and each neuron in e...
Article

Models (machine learning)

Machine learning models vary, determined in part by the type of problem being solved. Although much of the recent work in the field of image processing generally, and in radiology more specifically, has focussed on convolutional neural networks, a type of neural network, a numb...
Article

Machine learning processes

The specifics of how a machine learning algorithm is trained to recognize certain features and thereby become able to make accurate predictions on new examples varies depending on the type of data being used and the algorithm architecture. Four of the most commonly used learning processes are: ...
Article

DICOM to bitmap conversion

DICOM to bitmap conversion describes the process of converting medical images stored within DICOM file format to raw pixel data. Computer vision techniques for processing image data usually work on raw pixel values and therefore this conversion is required before further processing may take plac...
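
A minimal sketch using the pydicom and Pillow libraries; the file names are placeholders, and the simple min-max rescale below stands in for proper modality-specific windowing:

import numpy as np
import pydicom
from PIL import Image

ds = pydicom.dcmread("image.dcm")      # placeholder path
pixels = ds.pixel_array.astype(float)  # raw pixel data as a NumPy array

# Rescale to the 8-bit range; real pipelines usually apply window
# center/width from the DICOM header rather than this min-max rescale.
pixels = (pixels - pixels.min()) / (pixels.max() - pixels.min()) * 255.0
Image.fromarray(pixels.astype(np.uint8)).save("image.png")
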
Article

Confusion matrix

Confusion matrices are a statistical tool and a key means of evaluating machine learning algorithm performance in classification. Contingency tables, a type of confusion matrix, are used in the evaluation of many diagnostic exams for sensitivity, specificity, positive and negative predictive values....
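
A minimal sketch with scikit-learn; the labels are invented for illustration:

from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # ground-truth labels
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]  # model predictions

# Rows are true classes, columns predicted classes; for binary labels
# ordered 0, 1 the layout is [[TN, FP], [FN, TP]].
print(confusion_matrix(y_true, y_pred))
# [[3 1]
#  [1 3]]
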
Article

Selection bias

Selection bias is a type of bias created when the data sampled is not representative of the data of the population or group that a study or model aims to make a prediction about. Selection bias is the result of systematic errors in data selection and collection. Practically speaking, selection bi...
Article

Imaging data sets (artificial intelligence)

The aggregation of an imaging data set is a critical step in building artificial intelligence (AI) for radiology. Imaging data sets are used in various ways including training and/or testing algorithms. Many data sets for building convolutional neural networks for image identification involve at...
Article

Training, testing and validation datasets

The division of the input data into training, testing and validation sets is crucial in the creation of robust machine learning algorithms. Firstly, machine learning algorithms require a training set to be trained on. In each iteration, the algorithm calculates the difference between the predicted and actual ...
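
A minimal sketch of such a division with scikit-learn; a held-out test set is split off first, then a validation set is carved from the remaining training data (the proportions are illustrative):

import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(100).reshape(50, 2)  # toy features
y = np.arange(50)                  # toy labels

# Hold out 20% as a final test set, then 20% of the rest for validation.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=0.2, random_state=0)
print(len(X_train), len(X_val), len(X_test))  # 32 8 10
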
Article

Loss function

A loss function is a mathematical function commonly used in statistics. Loss functions are frequently used to create machine learning algorithms. The loss function computes the error for a single training example, in contrast to a cost function, which is the average of the loss functions from ea...
Article

Random forest (machine learning)

Random forests, also known as random decision forests, are a specific type of ensembling algorithm that utilizes a combination of decision trees based on subsets of a dataset. A random forest algorithm does not make a decision tree of smaller decision trees, but rather utilizes decision trees in pa...
Article

Bagging

Bagging is a term often used in the fields of machine learning, data science and computational statistics that refers to bootstrap aggregation. Bootstrapped aggregation of data can be employed in many different AI (artificial intelligence) algorithms, and is often a necessary step to making rand...
Article

Overfitting

Overfitting is a problem in machine learning that introduces errors based on noise and meaningless data into prediction or classification. Overfitting tends to happen in cases where training data sets are either of insufficient size or training data sets include parameters and/or unrelated featu...
Article

Synthetic and augmented data

In the context of radiological images, synthetic and augmented data are data that are not completely generated by direct measurement from patients. Machine learning models improve with increased data. However, there is a relative lack of open, freely available radiology data sets. Issues of patie...
Article

Natural language processing

Natural language processing (NLP) is an area of active research in artificial intelligence concerned with human languages. Natural language processing programs use human written text or human speech as data for analysis. The goals of natural language processing programs can vary from generating ...
Article

Data augmentation

Data augmentation is a technique that increases the amount of data by adding slightly modified copies of already existing data. This increases the diversity of the training set, which helps to reduce overfitting when training a machine learning model and can have a positive effect on the model's...
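
A minimal NumPy sketch of simple geometric augmentations; real pipelines often use library transforms (e.g. in PyTorch or Keras), and the modifications chosen here are illustrative:

import numpy as np

image = np.random.rand(64, 64)  # toy grayscale image

# Slightly modified copies of the same image enlarge the training set.
augmented = [
    np.fliplr(image),                                 # horizontal flip
    np.flipud(image),                                 # vertical flip
    np.rot90(image),                                  # 90-degree rotation
    image + np.random.normal(0, 0.01, image.shape),   # mild noise
]
print(len(augmented), "extra examples from one original")
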
Article

Regularisation (Regularization)

Regularisation is a process of reducing the complexity of a model through the inclusion of an additional parameter in order to reduce the overfitting of the model to the training data. In the context of radiology, a common model type used to interpret images is the convolutional neural network...
Article

Feature scaling

Feature scaling is a preprocessing technique used to standardize the range of values in data features, making sure that the features are on a similar scale. It is used when the range of values of a certain feature is too variable or contains extreme values, as most algorithms perform poorly...
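
A minimal scikit-learn sketch of two common scalers; the feature values are invented and deliberately on very different scales:

import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Two features on very different scales, e.g. age vs a volume in mm^3.
X = np.array([[25.0, 15000.0],
              [40.0, 22000.0],
              [65.0, 90000.0]])

print(MinMaxScaler().fit_transform(X))    # each column rescaled into [0, 1]
print(StandardScaler().fit_transform(X))  # each column to zero mean, unit variance
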
Article

Support vector machine (machine learning)

The support vector machine (SVM) is a supervised learning algorithm that separates groups of data with a margin or plane positioned as well as possible to ensure it is more likely to generalize well to examples it has never seen before. In the case of a two-feature data set, a margin or line...
Article

Computer aided diagnosis

Computer aided diagnosis (CAD) is the use of a computer generated output as an assisting tool for a clinician to make a diagnosis. It is different from automated computer diagnosis, in which the end diagnosis is based on a computer algorithm only. As an early form of artificial intelligence, co...
Article

Principal component analysis

Principal component analysis is a mathematical transformation that can be understood in two parts: the transformation maps multivariable data (N_old dimensions) into a new coordinate system (N_new dimensions) with minimal loss of information; data projected on the first dimension of the new coor...
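
A minimal scikit-learn sketch; the toy data and the number of retained components are illustrative:

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))  # toy data: 100 samples, 10 features

# Map 10-dimensional data onto the 2 directions of greatest variance.
pca = PCA(n_components=2)
X_new = pca.fit_transform(X)
print(X_new.shape)                    # (100, 2)
print(pca.explained_variance_ratio_)  # variance captured per component
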
Article

Logistic regression (machine learning)

Logistic regression in machine learning is a classification model which predicts the probabilities of binary outcomes, as opposed to linear regression, which predicts actual values. Logistic regression outputs are constrained between 0 and 1, making it a popular simple classification method ...
Article

Decision tree (machine learning)

The decision tree model in machine learning is an algorithm that offers choices based on characteristics of the data. It follows 'branch node theory' in which each branch will represent a variable alongside a decision. Often decision tree models will be expressed in the following rule format: ...
Article

Linear regression (machine learning)

Linear regression in machine learning is a form of supervised learning, derived from the linear regression models in statistics. It operates under the assumption that two variables have a linear relationship and can therefore calculate the value of an output variable based on the input variable. L...
Article

Radiomics

Radiomics (as applied to radiology) is a field of medical study that aims to extract a large number of quantitative features from medical images using data characterization algorithms. The data is assessed for improved decision support. It has the potential to uncover disease characteristics tha...
Article

Rule-based expert systems

A rule-based expert system is the simplest form of artificial intelligence and uses prescribed knowledge-based rules to solve a problem 1. The aim of the expert system is to take knowledge from a human expert and convert this into a number of hardcoded rules to apply to the input data. In their...
Article

Artificial intelligence

Artificial intelligence (AI) has been defined by some as the "branch of computer science dealing with the simulation of intelligent behavior in computers" 1, however, the precise definition is actually a matter of debate among experts. An alternative definition is the branch of computer science ...
Article

Neural network (overview)

Artificial neural networks are a powerful type of model capable of processing many types of data. Initially inspired by the connections between biological neural networks, modern artificial neural networks only bear slight resemblances at a high level to their biological counterparts. Nonetheles...
Article

Ensembling

Ensembling (sometimes ensemble learning) is a class of meta-algorithmic techniques where multiple models are trained and their results are aggregated to improve classification performance. It is effective in a wide variety of problems. Two commonly used methods are: boosting, a method of wei...
Article

Backpropagation (machine learning)

Backpropagation in supervised machine learning is the process used to calculate the gradient of the error function associated with each parameter weighting within a convolutional neural network (CNN). Essentially, the gradient estimates how the system parameters should change in order to optimize t...
Article

Cost function (machine learning)

A cost function is a mechanism utilized in supervised machine learning; the cost function returns the error between predicted and actual outcomes. The aim of supervised machine learning is to minimize the overall cost, thus optimizing the correlation of the model to the sy...
Article

Iteration (machine learning)

An iteration is a term used in machine learning and indicates the number of times the algorithm's parameters are updated. Exactly what this means will be context dependent. A typical example of a single iteration of training of a neural network would include the following steps: processing the ...
Article

Epoch (machine learning)

An epoch is a term used in machine learning and indicates the number of passes of the entire training dataset the machine learning algorithm has completed. Datasets are usually grouped into batches (especially when the amount of data is very large). Some people use the term iteration loosely and...
Article

Batch size (machine learning)

Batch size is a term used in machine learning and refers to the number of training examples utilized in one iteration. The batch size can be one of three options: batch mode, where the batch size is equal to the total dataset, thus making the iteration and epoch values equivalent; mini-batch mod...
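
The relationship between batch size, iterations and epochs is simple arithmetic; a minimal sketch with illustrative numbers:

import math

dataset_size = 1000
batch_size = 32  # mini-batch mode

# One epoch = one full pass over the dataset; each iteration consumes
# one batch, so iterations per epoch = ceil(dataset size / batch size).
iterations_per_epoch = math.ceil(dataset_size / batch_size)
print(iterations_per_epoch)  # 32 iterations (the last batch has 8 examples)
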
Article

Unsupervised learning (machine learning)

Unsupervised learning is one of the main types of algorithms used in machine learning. Unsupervised learning algorithms are used on datasets where output labels are not provided. Hence, instead of trying to predict a particular output for each input, these algorithms attempt to discover the un...
Article

Reinforcement learning (machine learning)

Reinforcement learning is one of the main algorithms used in machine learning in the context of an agent in an environment. At each timestep, this agent takes in information from its environment and performs an action. Certain actions reward the agent. Reinforcement learning maximizes these ...
Article

Evolutionary algorithms (machine learning)

Evolutionary algorithms are one of the main types of algorithms used in machine learning, emulating natural selection whereby pseudorandom variations in the algorithm are measured against selective pressures created by functions. The more successful algorithms are then used as the 'parents' of t...
Article

Convolutional neural network

A convolutional neural network (CNN) is a particular implementation of a neural network used in deep learning that exclusively processes array data such as images, and is thus frequently used in machine learning applications targeted at medical images 1. Architecture: A convolutional neural net...
Article

Supervised learning (machine learning)

Supervised learning is the most common type of machine learning algorithm used in medical imaging research. It involves training an algorithm from a set of images or data where the output labels are already known 1. Supervised learning is broken into two subcategories, classification and regres...
Article

Machine learning

Machine learning is a specific practical application of computer science and mathematics that allows computers to extrapolate information based on observed patterns without explicit programming. A defining characteristic of machine learning programs is the improved performance at tasks such as c...