Autoregressive neural networks (ARNN) are neural networks designed for the analysis of time series and other sequential data. They operate on the principle that the current value in a sequence can be predicted from its previous values, making them highly relevant in medical applications such as signal processing, patient monitoring, and predictive analytics 1,2.
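In its general form (a standard formulation given here for orientation, not drawn from the cited references), an autoregressive model of order p expresses the current value as a function of the p preceding values:

$$x_t = f(x_{t-1}, x_{t-2}, \ldots, x_{t-p}) + \varepsilon_t$$

where f is the mapping learned by the network and ε_t is a noise term.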
Clinical applications
ARNN have several applications in radiology, such as 3:
predictive analytics: predicting disease progression from sequential imaging data
signal processing: enhancing image sequences by predicting missing or corrupted data
dynamic imaging: improving the quality and resolution of dynamic imaging modalities like functional MRI by predicting intermediate frames
Architecture
Autoregressive neural networks typically consist of the following components 4:
input: the primary input in radiology is sequential medical data, such as imaging series or dynamic imaging studies (e.g., functional MRI). Preprocessing steps may include normalizing values, handling missing data, and segmenting the series into appropriate time windows so the network can process it efficiently.
feature extraction: within an ARNN, feature extraction captures temporal dependencies through the autoregressive components: the network uses past values of the series to predict future values, with lagged versions of the input series serving as features. Activation functions (e.g., ReLU, tanh) introduce non-linearity, enabling the network to model complex patterns.
prediction and output: the final layers of an ARNN output predictions based on the learned temporal features; fully connected (dense) layers integrate information from previous layers and produce the final predictions. The number of neurons in the output layer corresponds to the number of prediction steps, allowing the network to forecast future values from the sequential data provided (see the sketch after this list).
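These components can be made concrete with a minimal sketch, assuming a univariate signal (e.g., the mean intensity of a region of interest across an fMRI series) and using PyTorch; the window length, layer sizes, and forecast horizon are illustrative placeholders rather than values from the cited references.

```python
import torch
import torch.nn as nn

P = 8  # number of lagged inputs (autoregressive order); illustrative
H = 2  # number of prediction steps in the output layer; illustrative

def make_windows(series: torch.Tensor, p: int = P, h: int = H):
    """Normalize a 1-D series and segment it into lagged inputs and future targets."""
    series = (series - series.mean()) / series.std()  # simple normalization
    xs, ys = [], []
    for t in range(p, len(series) - h + 1):
        xs.append(series[t - p:t])  # p past values as lagged features
        ys.append(series[t:t + h])  # the next h values to forecast
    return torch.stack(xs), torch.stack(ys)

# Dense (fully connected) layers map lagged features to the forecast;
# ReLU introduces the non-linearity mentioned above.
model = nn.Sequential(
    nn.Linear(P, 32),
    nn.ReLU(),
    nn.Linear(32, H),  # output neurons = number of prediction steps
)

signal = torch.sin(torch.linspace(0, 12, 200))  # stand-in for a real clinical series
X, Y = make_windows(signal)
forecast = model(X)  # shape: (num_windows, H)
```

The sliding window turns the series into pairs of lagged inputs and future targets, which is what lets a feedforward network behave autoregressively.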
Training
Training an ARNN involves supervised learning, minimizing the difference between its predictions and actual values through 2:
backpropagation through time (BPTT): adjusts the weights of the layers based on prediction errors
loss functions: common choices include mean squared error (MSE) and mean absolute error (MAE); a minimal training loop is sketched below
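A minimal training loop, assuming the feedforward formulation sketched above and synthetic placeholder data; for a purely feedforward ARNN, ordinary backpropagation suffices, with BPTT coming into play when the network also contains recurrent connections unrolled over time.

```python
import torch
import torch.nn as nn

# Synthetic placeholder data: 100 windows of 8 lagged values, 2-step forecasts.
X = torch.randn(100, 8)
Y = torch.randn(100, 2)

model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
loss_fn = nn.MSELoss()  # MAE would be nn.L1Loss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(X), Y)  # difference between predictions and actual values
    loss.backward()              # backpropagation computes the weight gradients
    optimizer.step()             # gradient step adjusts the weights
```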
Differences from recurrent neural networks
An autoregressive model (AR) and a recurrent neural network (RNN) have similar structural foundations but differ in how they process information for predictions. Both models leverage data from past time steps, but they do so in distinct ways.
In a recurrent neural network, the output at a given time step depends primarily on the current input and the hidden state, which carries information forward from previous time steps. This hidden state is updated at each time step by combining the current input with the information accumulated so far 1,2.
In contrast, an autoregressive model predicts the next output by directly incorporating information from the current time step and preceding time steps. Unlike an RNN, where past information is maintained within a hidden state, an autoregressive model receives past inputs directly as additional features. Each prediction is therefore informed not only by the immediate input but also by an explicit sequence of previous inputs, offering a more direct way to integrate historical data into the prediction process 1,2.
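The contrast can be made concrete with two schematic update rules (a sketch in PyTorch; the sequence length and layer sizes are arbitrary):

```python
import torch
import torch.nn as nn

x = torch.randn(20, 1)  # a short univariate sequence

# RNN: a hidden state h carries a learned summary of all past inputs forward.
rnn_cell = nn.RNNCell(input_size=1, hidden_size=4)
h = torch.zeros(1, 4)
for t in range(len(x)):
    h = rnn_cell(x[t].unsqueeze(0), h)  # depends on current input + hidden state

# Autoregressive: the p previous inputs are passed in directly as features.
p = 3
ar = nn.Linear(p, 1)
for t in range(p, len(x)):
    window = x[t - p:t].reshape(1, p)  # explicit lagged inputs, no hidden state
    y_t = ar(window)
```

In the RNN, all history is compressed into h, whereas the autoregressive model reads its p previous inputs explicitly at every step.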