
Articles

Articles are a collaborative effort to provide a single canonical page on all topics relevant to the practice of radiology. As such, articles are written and continuously improved upon by countless contributing members. Our dedicated editors oversee each edit for accuracy and style.

91 results found
Article

Noise reduction

Noise reduction, also known as noise suppression or denoising, commonly refers to the various algorithmic techniques used to reduce noise in digital images after they have been created, although a few sources use the term more broadly to mean anything that reduces noise. In digital image processing, various ...
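
A minimal sketch of the idea, assuming NumPy and SciPy and using a synthetic test image rather than real radiological data, showing one classical denoising technique (a median filter):

```python
# Classical denoising sketch: add Gaussian noise to a synthetic image,
# then suppress it with a 3x3 median filter.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0                        # simple bright square "phantom"
noisy = clean + rng.normal(0, 0.2, clean.shape)  # additive Gaussian noise

denoised = median_filter(noisy, size=3)

# The filter should reduce the error relative to the clean image.
print("noisy RMSE   :", np.sqrt(np.mean((noisy - clean) ** 2)))
print("denoised RMSE:", np.sqrt(np.mean((denoised - clean) ** 2)))
```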
Article

Learning curve (machine learning)

A learning curve is a plot of the learning performance of a machine learning model (usually measured as loss or accuracy) over time (usually measured in epochs). Learning curves are a widely used diagnostic tool in machine learning to get an overview of the learning and generalization behavior...
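
A minimal sketch, assuming matplotlib and using illustrative loss values rather than a real training run, of how such a curve is typically plotted:

```python
# Learning-curve sketch: plot (illustrative) training and validation
# loss against epoch number; a widening gap suggests overfitting.
import numpy as np
import matplotlib.pyplot as plt

epochs = np.arange(1, 51)
train_loss = 1.0 / np.sqrt(epochs)                        # toy decreasing curve
val_loss = 1.0 / np.sqrt(epochs) + 0.1 + 0.002 * epochs   # toy curve that plateaus and rises

plt.plot(epochs, train_loss, label="training loss")
plt.plot(epochs, val_loss, label="validation loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.title("Learning curve (illustrative)")
plt.show()
```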
Article

Activation function

In neural networks, activation functions perform a transformation on the weighted sum of a neuron's inputs plus a bias in order to compute its output. Using a biological analogy, the activation function determines the “firing rate” of a neuron in response to an input or stimulus. These functions...
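
A minimal sketch, assuming NumPy and illustrative weight values, of a single neuron's weighted sum plus bias passed through two common activation functions:

```python
# Single artificial neuron: weighted sum of inputs plus a bias,
# transformed by an activation function (ReLU or sigmoid).
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.2, 3.0])   # inputs to the neuron
w = np.array([0.8, 0.1, -0.4])   # weights (illustrative values)
b = 0.2                          # bias

z = np.dot(w, x) + b             # weighted sum plus bias
print("pre-activation :", z)
print("ReLU output    :", relu(z))
print("sigmoid output :", sigmoid(z))
```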
Article

Synthetic and augmented data

In the context of radiological images, synthetic and augmented data are data that are not completely generated by direct measurement from patients. Machine learning models improve with increased data. However, there is a relative lack of open, freely available radiology data sets. Issues of patient...
Article

Training, testing and validation datasets

The division of the input data into training, testing and validation sets is crucial in the creation of robust machine learning algorithms. Firstly, machine learning algorithms require a training set to be trained on. On each iteration, the algorithm calculates the difference between the predicted and actual ...
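
A minimal sketch, assuming scikit-learn and randomly generated stand-in data, of one common way to carve out the three sets (the 70/15/15 proportions are illustrative, not prescriptive):

```python
# Split a dataset into training, validation and test sets.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((1000, 20))    # 1000 samples, 20 features (stand-in data)
y = rng.integers(0, 2, 1000)  # binary labels

# Carve off the test set first, then split the remainder into train/validation.
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.15, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X_rest, y_rest, test_size=0.15 / 0.85, random_state=0
)

print(len(X_train), len(X_val), len(X_test))   # roughly 700 / 150 / 150
```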
Article

Federated learning

Federated learning, also known as distributed learning, is a technique that facilitates the creation of robust artificial intelligence models where models are trained on data held on local devices (nodes), which then transfer weights to a central model. Models can potentially be trained using larger and/or more d...
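
A toy sketch, assuming NumPy, of the central idea: each node fits a model on its own data and only the resulting weights (never the data) are averaged by a central server, in the spirit of federated averaging:

```python
# Federated-averaging toy: nodes train locally, the server averages weights.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])       # weights the nodes are trying to learn

def local_fit(n_samples):
    """Fit a least-squares linear model on locally generated data (one node)."""
    X = rng.normal(size=(n_samples, 2))
    y = X @ true_w + rng.normal(0, 0.1, n_samples)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Three nodes train locally; only their weights leave the node.
local_weights = [local_fit(200) for _ in range(3)]

# The central model is the average of the node weights (FedAvg-style).
global_w = np.mean(local_weights, axis=0)
print("aggregated weights:", global_w)   # close to [2.0, -1.0]
```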
Article

Ground truth

Ground truth is a term used in statistics and machine learning to refer to data assumed to be correct. Regarding the development of machine learning algorithms in radiology, the ground truth for image labeling is sometimes based on pathology or lab results while, in other cases, on the expert opinion...
Article

Machine learning processes

The specifics of how a machine learning algorithm is trained to recognize certain features and thereby become able to make accurate predictions on new examples vary depending on the type of data being used and the algorithm architecture. Four of the most commonly used learning processes are: ...
Article

Data augmentation

Data augmentation is a technique that increases the amount of data by adding slightly modified copies of already existing data. This increases the diversity of the training set, which helps to reduce overfitting when training a machine learning model and can have a positive effect on the model's...
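
A minimal sketch, assuming NumPy and a tiny stand-in image, of how flips and rotations generate slightly modified copies of existing data:

```python
# Simple augmentation: flips and a 90-degree rotation of one image.
import numpy as np

image = np.arange(16).reshape(4, 4)   # stand-in for a small image

augmented = [
    image,
    np.fliplr(image),   # horizontal flip
    np.flipud(image),   # vertical flip
    np.rot90(image),    # 90-degree rotation
]

print(f"{len(augmented)} variants generated from 1 original image")
```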
Article

Findable accessible interoperable reusable data principles (FAIR)

The FAIR (findable, accessible, interoperable, reusable) data principles are a set of guidelines for enhancing the semantic machine interpretability of data, thereby improving its richness and quality. Since their inception, multiple international organizations have endorsed the application of FAIR principl...
Article

Linear algebra

Linear algebra is a field of mathematics with extremely diverse applications. This type of mathematics extends arithmetical operations from numbers to more complex objects such as matrices and vectors. In radiology, applications of linear algebra include CT reconstruction algorithms, neural networks...
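
As a minimal sketch, assuming NumPy and a deliberately tiny system, solving a set of linear equations is the kind of operation that underlies applications such as image reconstruction:

```python
# Solve a small linear system A @ x = b.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)
print(x)                      # [0.8, 1.4]
print(np.allclose(A @ x, b))  # True
```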
Article

Feature extraction

Feature extraction is a process utilized in both machine learning and image processing by which data is transformed into a smaller, more relevant set of data. Feature extraction is a type of dimensionality reduction. Feature extraction can be performed on text as part of NLP or on images for com...
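
A minimal sketch, assuming scikit-learn and random stand-in data, of feature extraction by principal component analysis, one widely used dimensionality-reduction method:

```python
# Project 50 raw features down to 5 extracted components with PCA.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.random((200, 50))              # 200 samples, 50 raw features (stand-in data)

pca = PCA(n_components=5)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                 # (200, 5)
print(pca.explained_variance_ratio_)   # variance captured by each component
```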
Article

Deep learning frameworks

Deep learning frameworks are instruments for training and validating deep neural networks through high-level programming interfaces. Widely used deep learning frameworks include the libraries PyTorch, TensorFlow, and Keras. A programmer can use these libraries of higher-level functions to quickly de...
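
A minimal sketch, assuming PyTorch, of how a few high-level calls define and run a small network:

```python
# Define a tiny fully connected network with PyTorch's high-level API.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(64, 32),   # fully connected layer: 64 inputs -> 32 units
    nn.ReLU(),
    nn.Linear(32, 2),    # 2-class output
)

x = torch.randn(8, 64)   # a batch of 8 example inputs
logits = model(x)
print(logits.shape)      # torch.Size([8, 2])
```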
Article

Unsupervised learning (machine learning)

Unsupervised learning is one of the main types of algorithms used in machine learning. Unsupervised learning algorithms are used on datasets where output labels are not provided. Hence, instead of trying to predict a particular output for each input, these algorithms attempt to discover the underlying...
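
A minimal sketch, assuming scikit-learn and two synthetic clusters of unlabeled points, of one unsupervised algorithm (k-means) discovering structure without any output labels:

```python
# k-means clustering on unlabeled data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, (100, 2)),   # first unlabeled "blob"
               rng.normal(3.0, 0.5, (100, 2))])  # second unlabeled "blob"

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.cluster_centers_)   # discovered group centers, no labels used
```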
Article

Deep learning

Deep learning is a subset of machine learning based on multi-layered (a.k.a. “deep”) artificial neural networks. Their highly flexible architectures can learn directly from data (such as images, video or text) without the need for hand-coded rules and can increase their predictive accuracy when p...
Article

Autoencoder

Autoencoders are an unsupervised learning technique in which artificial neural networks are used to learn to produce a compressed representation of the input data. Essentially, autoencoding is a data compression algorithm where the compression and decompression functions are learned automatically...
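
A minimal sketch, assuming PyTorch and flattened 28 x 28 inputs, of an autoencoder whose encoder compresses the input to a small bottleneck and whose decoder reconstructs it:

```python
# Tiny autoencoder: learn compression (encoder) and decompression (decoder).
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(784, 32), nn.ReLU())     # compress to 32 values
        self.decoder = nn.Sequential(nn.Linear(32, 784), nn.Sigmoid())  # reconstruct the input

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = AutoEncoder()
x = torch.rand(16, 784)                     # batch of flattened images (stand-in data)
loss = nn.functional.mse_loss(model(x), x)  # reconstruction error to minimize in training
print(loss.item())
```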
Article

Cybersecurity

Cybersecurity is the protection of digital data, software and hardware from risks, including attacks or other problems, related to their integrity and/or confidentiality. Cybersecurity may utilize many different types of tools and protocols including encryption, firewalls and other infrastruc...
Article

Curse of dimensionality

The curse of dimensionality can refer to a number of phenomena related to high-dimensional data in several fields. In terms of machine learning for radiology, it generally refers to the phenomenon that as the number of image features employed to train an algorithm increases, there is a geometric...
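
A tiny illustration, with made-up numbers, of why the amount of data needed grows so quickly: covering each feature with just 10 bins requires exponentially more cells as features are added:

```python
# Number of cells needed to cover feature space at a fixed resolution.
bins_per_feature = 10
for n_features in (1, 2, 5, 10):
    print(n_features, "features ->", bins_per_feature ** n_features, "cells")
```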
Article

Boosting

Boosting is an ensemble technique that creates increasingly complex algorithms from building blocks of relatively simple decision rules for binary classification tasks. This is achieved by sequentially training new models (or 'weak' learners) which focus on examples that were classified incorre...
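
A minimal sketch, assuming scikit-learn and a synthetic binary classification task, of boosting with AdaBoost, whose default weak learner is a depth-1 decision tree (a "stump"):

```python
# AdaBoost: sequentially combine many weak learners into one classifier.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=500, random_state=0)   # toy binary task

clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```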
Article

Dice similarity coefficient

The Dice similarity coefficient, also known as the Sørensen–Dice index or simply Dice coefficient, is a statistical tool which measures the similarity between two sets of data. This index has become arguably the most broadly used tool in the validation of image segmentation algorithms created wi...
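
A minimal sketch, assuming NumPy and two illustrative binary masks, of the coefficient DSC = 2|A ∩ B| / (|A| + |B|):

```python
# Dice similarity coefficient between two binary segmentation masks.
import numpy as np

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

pred = np.zeros((8, 8), dtype=int); pred[2:6, 2:6] = 1   # predicted segmentation
true = np.zeros((8, 8), dtype=int); true[3:7, 3:7] = 1   # reference ("ground truth") mask

print("Dice coefficient:", dice(pred, true))   # 2*9 / (16 + 16) = 0.5625
```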
