Items tagged “machine learning”
23 results found
Convolutional neural network
A convolutional neural network (CNN) is a particular type of neural network used in machine learning that is specialized for processing array data such as images, and is thus frequently used in machine learning applications targeted at medical images. Architecture A convolutional neural ne...
Batch size (machine learning)
Batch size is a term used in machine learning that refers to the number of training examples used in one iteration. The batch size can be one of three options: batch mode: where the batch size is equal to the total dataset, thus making the iteration and epoch values equivalent; mini-batch mod...
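The batch modes above can be sketched numerically; the dataset of 100 examples and the batch size of 20 below are illustrative assumptions:

```python
import numpy as np

# Illustrative sketch: splitting a dataset of 100 examples into mini-batches.
# The array contents and the batch size of 20 are assumptions, not fixed values.
data = np.arange(100)
batch_size = 20
batches = [data[i:i + batch_size] for i in range(0, len(data), batch_size)]

print(len(batches))     # number of iterations per epoch -> 5
print(len(batches[0]))  # examples per batch -> 20
```

With batch mode, `batch_size` would equal `len(data)`, so one iteration would cover the whole dataset (one epoch).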
Epoch (machine learning)
An epoch is a term used in machine learning and indicates the number of passes of the entire training dataset the machine learning algorithm has completed. Datasets are usually grouped into batches (especially when the amount of data is very large). Some people use the term iteration loosely and...
Iteration (machine learning)
An iteration is a term used in machine learning and refers to a single update of the algorithm's parameters; the iteration count indicates how many such updates have occurred. Exactly what this means will be context dependent. A typical example of a single iteration of training of a neural network would include the following steps: processing the ...
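The relationship between batches, iterations, and epochs can be sketched with simple arithmetic (all numbers below are illustrative assumptions):

```python
# Sketch of the epoch/iteration relationship: one iteration is one parameter
# update (one batch), and one epoch is a full pass over the dataset.
dataset_size = 1000
batch_size = 50
epochs = 3

iterations_per_epoch = dataset_size // batch_size  # updates per full pass
total_iterations = iterations_per_epoch * epochs   # updates over all epochs

print(iterations_per_epoch)  # 20
print(total_iterations)      # 60
```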
Cost function (machine learning)
A cost function is a mechanism used in supervised machine learning that returns the error between predicted outcomes and the actual outcomes. The aim of supervised machine learning is to minimize the overall cost, thus optimizing the fit of the model to the sy...
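A minimal sketch of a cost function, here using mean squared error over a handful of assumed predicted/actual pairs:

```python
import numpy as np

# Sketch: a cost function as the average error between predictions and
# actual outcomes. The values below are illustrative assumptions.
predicted = np.array([0.9, 0.2, 0.8, 0.4])
actual    = np.array([1.0, 0.0, 1.0, 0.0])

cost = np.mean((predicted - actual) ** 2)  # mean squared error as the cost
print(round(cost, 4))  # 0.0625
```

Training aims to adjust the model's parameters so this number decreases.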
Backpropagation (machine learning)
Backpropagation in supervised machine learning is the process used to calculate the gradient of the error function with respect to each parameter (weight) within a neural network, such as a convolutional neural network (CNN). Essentially, the gradient estimates how the system parameters should change in order to optimize t...
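A minimal sketch of the idea: computing the gradient of a squared-error loss with respect to a single weight via the chain rule, then taking one gradient descent step (all values below are illustrative assumptions):

```python
# One-weight sketch of the backpropagation idea, with loss L(w) = (w*x - y)**2.
x, y = 2.0, 4.0  # assumed input and target
w = 1.0          # assumed initial weight
lr = 0.1         # assumed learning rate

pred = w * x
grad = 2 * (pred - y) * x  # dL/dw via the chain rule
w = w - lr * grad          # gradient descent update

print(grad)  # -8.0
print(w)     # 1.8
```

In a full network, the same chain-rule computation is applied layer by layer, from the output back to the input.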
Ensembling
Ensembling (sometimes ensemble learning) is a class of meta-algorithmic techniques in which multiple models are trained and their results aggregated to improve classification performance. It is effective in a wide variety of problems. Two commonly used methods are: boosting: a method of wei...
Neural network (overview)
Artificial neural networks are a powerful type of model capable of processing many types of data. Although initially inspired by the connections within biological neural networks, modern artificial neural networks bear only a slight high-level resemblance to their biological counterparts. Nonetheles...
Radiomics
Radiomics (as applied to radiology) is a field of medical study that aims to extract a large number of quantitative features from medical images using data characterization algorithms. These features are then assessed for improved decision support. It has the potential to uncover disease characteristics tha...
Logistic regression (machine learning)
Logistic regression in machine learning is a classification model that predicts the probabilities of binary outcomes, as opposed to linear regression, which predicts continuous values. Logistic regression outputs are constrained between 0 and 1, and hence it is a popular simple classification method ...
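At the core of logistic regression is the logistic (sigmoid) function, which constrains outputs to the interval (0, 1); a minimal sketch:

```python
import math

# The logistic (sigmoid) function maps any real input into (0, 1),
# which is what lets logistic regression output probabilities.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0))  # 0.5 (the decision boundary)
print(0 < sigmoid(5) < 1 and 0 < sigmoid(-5) < 1)  # True
```

In a fitted model, `z` would be a weighted sum of the input features; the weights shown here are omitted for brevity.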
Principal component analysis
Principal component analysis is a mathematical transformation that can be understood in two parts: the transformation maps multivariable data (N_old dimensions) into a new coordinate system (N_new dimensions) with minimal loss of information. data projected on the first dimension of the new coor...
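A minimal sketch of the transformation, assuming NumPy: eigen-decomposition of the covariance matrix, then projection onto the first principal component (the synthetic 2-D data are an assumption):

```python
import numpy as np

# Sketch of PCA via eigen-decomposition of the covariance matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])  # assumed data
Xc = X - X.mean(axis=0)  # center the data

cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]       # sort descending by explained variance
components = eigvecs[:, order]

projected = Xc @ components[:, :1]  # project onto the first principal component
print(projected.shape)  # (100, 1)
```

Keeping only the leading components reduces N_old dimensions to N_new while retaining most of the variance.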
Augmentation
Augmentation is a process of artificial data generation that produces a greater volume of training data, thus increasing the likelihood of obtaining higher predictive accuracy from a predictive model. Usually, a higher volume of data is likely to yield better and more accurate predictive models from...
Overfitting
Overfitting is a problem in machine learning in which noise and meaningless data introduce errors into prediction or classification. Overfitting tends to happen when training datasets are of insufficient size or include parameters and/or unrelated featu...
Bagging
Bagging is a term often used in the fields of machine learning, data science and computational statistics that refers to bootstrap aggregation. Bootstrap aggregation of data can be employed in many different AI (artificial intelligence) algorithms, and is often a necessary step in making rand...
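A minimal sketch of bootstrap aggregation, using the sample mean as a stand-in for a fitted model (the data values are illustrative assumptions):

```python
import numpy as np

# Sketch of bagging: resample the data with replacement (bootstrap), compute
# an estimate on each resample, then aggregate (here, average) the estimates.
rng = np.random.default_rng(42)
data = np.array([2.0, 4.0, 6.0, 8.0, 10.0])  # assumed dataset

estimates = []
for _ in range(100):
    sample = rng.choice(data, size=len(data), replace=True)  # bootstrap sample
    estimates.append(sample.mean())  # per-sample "model" (the mean)

bagged = np.mean(estimates)  # aggregated estimate, close to the true mean of 6
print(4.0 < bagged < 8.0)    # True
```

In practice the "model" fitted on each bootstrap sample is usually something like a decision tree rather than a simple mean.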
Random forest (machine learning)
Random forest, also known as random decision forests, is a specific type of ensembling algorithm that combines decision trees built on subsets of a dataset. A random forest algorithm does not build a single decision tree out of smaller decision trees, but rather uses decision trees in pa...
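The final aggregation step of a random forest, majority voting across trees, can be sketched as follows (the per-tree votes are illustrative assumptions):

```python
from collections import Counter

# Sketch of a random forest's aggregation step: each tree casts one vote,
# and the majority class becomes the forest's prediction.
tree_votes = ["cat", "dog", "cat", "cat", "dog"]  # assumed per-tree predictions
prediction = Counter(tree_votes).most_common(1)[0][0]
print(prediction)  # cat
```

The individual trees would each be trained on a different bootstrap sample and feature subset; only the voting is shown here.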
Loss function
A loss function is a mathematical function commonly used in statistics. Loss functions are frequently used to create machine learning algorithms. The loss function computes the error for a single training example, in contrast to a cost function, which is the average of the loss functions from ea...
Mean squared error
Mean squared error is a specific type of loss function. It is calculated as the mean of the squared errors between data points and a function (often a regression line). The utility of mean squared error comes from the fact that squared nu...
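A minimal sketch of the calculation, with assumed true and predicted values:

```python
import numpy as np

# Mean squared error: average of the squared differences between
# observed values and the values a fitted function predicts.
y_true = np.array([1.0, 2.0, 3.0])  # assumed observed values
y_pred = np.array([1.5, 2.0, 2.5])  # assumed predictions (e.g. a regression line)

mse = np.mean((y_true - y_pred) ** 2)
print(round(mse, 4))  # 0.1667
```

Squaring makes all errors positive and penalizes large deviations more heavily than small ones.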