CASHformer: Cognition Aware SHape Transformer for Longitudinal Analysis

by Ignacio Sarasua, et al.

Modeling temporal changes in subcortical structures is crucial for a better understanding of the progression of Alzheimer's disease (AD). Given their flexibility to adapt to heterogeneous sequence lengths, mesh-based transformer architectures have been proposed in the past for predicting hippocampus deformations across time. However, one of the main limitations of transformers is their large number of trainable parameters, which makes their application to small datasets very challenging. In addition, current methods do not include relevant non-image information that can help to identify AD-related patterns in the progression. To this end, we introduce CASHformer, a transformer-based framework to model longitudinal shape trajectories in AD. CASHformer incorporates the idea of pre-trained transformers as universal compute engines that generalize across a wide range of tasks by freezing most layers during fine-tuning. This reduces the number of parameters by over 90% with respect to the original model and therefore enables the application of large models on small datasets without overfitting. In addition, CASHformer models cognitive decline to reveal AD atrophy patterns in the temporal sequence. Our results show that CASHformer reduces the reconstruction error by 73% with respect to previously proposed methods. Moreover, the accuracy of detecting patients progressing to AD increases by 3% when imputing missing longitudinal shape data.
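The freezing scheme the abstract alludes to (the "pre-trained transformers as universal compute engines" line of work) typically fine-tunes only the normalization layers and a small task-specific head while keeping the attention and feed-forward weights frozen. The following is a minimal stand-alone sketch of that bookkeeping; the layer names and parameter counts are illustrative assumptions, not CASHformer's actual architecture:

```python
# Sketch of the frozen pre-trained transformer idea: fine-tune only
# layer norms and the task head; freeze everything else.
# NOTE: layer names and sizes below are made up for illustration.

def mark_trainable(params, trainable_keys=("layer_norm", "head")):
    """Return a name -> bool map; True means the parameter group is
    fine-tuned, False means it stays frozen at its pre-trained values."""
    return {name: any(key in name for key in trainable_keys)
            for name in params}

# Hypothetical parameter counts for a 12-block pre-trained transformer.
params = {}
for i in range(12):
    params[f"block{i}.attention"] = 2_400_000   # frozen
    params[f"block{i}.mlp"] = 4_700_000         # frozen
    params[f"block{i}.layer_norm"] = 3_000      # fine-tuned
params["head"] = 50_000                          # task head, fine-tuned

trainable = mark_trainable(params)
n_total = sum(params.values())
n_train = sum(size for name, size in params.items() if trainable[name])
print(f"trainable fraction: {n_train / n_total:.2%}")
```

With these (made-up) sizes, well under 10% of the parameters remain trainable, which is the regime that makes fine-tuning a large model on a small dataset feasible without overfitting.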



TransforMesh: A Transformer Network for Longitudinal Modeling of Anatomical Meshes

The longitudinal modeling of neuroanatomical changes related to Alzheime...

Characterizing Alzheimer's Disease Biomarker Cascade Through Non-linear Mixed Effect Models

Alzheimer's Disease (AD) research has shifted to focus on biomarker traj...

Beyond the Snapshot: Brain Tokenized Graph Transformer for Longitudinal Brain Functional Connectome Embedding

Under the framework of network-based neurodegeneration, brain functional...

Explainable Identification of Dementia from Transcripts using Transformer Networks

Alzheimer's disease (AD) is the main cause of dementia which is accompan...

Optimizing Deeper Transformers on Small Datasets: An Application on Text-to-SQL Semantic Parsing

Due to the common belief that training deep transformers from scratch re...

Cure the headache of Transformers via Collinear Constrained Attention

As the rapid progression of practical applications based on Large Langua...

Reinforcement Learning based Disease Progression Model for Alzheimer's Disease

We model Alzheimer's disease (AD) progression by combining differential ...
