Vision-based Approach for Autism Diagnosis using Transfer Learning and Eye-tracking
Abstract
The potential of Transfer Learning (TL) has been well researched in areas such as Computer Vision and Natural Language Processing. This study explores a novel application of TL to the detection of Autism Spectrum Disorder (ASD). We develop an approach that combines TL with eye-tracking, a modality commonly used for analyzing autistic traits. The key idea is to transform eye-tracking scanpaths into a visual representation, which makes it possible to reuse pretrained vision models. Our experiments employed a set of widely used models, including VGG-16, ResNet, and DenseNet. The results show that the TL approach achieves promising classification performance (ROC-AUC up to 0.78). The proposed approach is not claimed to outperform earlier work; rather, the study primarily aims to highlight the use of (synthetic) visual representations of eye-tracking output as a means of transferring representations from models pretrained on large-scale datasets such as ImageNet.
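To make the pipeline concrete, the sketch below illustrates the general idea under stated assumptions; it is not the authors' exact implementation. A scanpath (a sequence of fixations) is rendered as a synthetic image, and an ImageNet-pretrained backbone (here torchvision's ResNet-18, chosen for brevity, whereas the paper evaluates VGG-16, ResNet, and DenseNet) is reused with a new binary classification head. The fixation format, drawing style, and array shapes are illustrative assumptions.

```python
# Minimal sketch: render an eye-tracking scanpath as an image and feed it to a
# pretrained vision backbone for binary (ASD vs. typically developing) classification.
# Assumed fixation format: (x, y, duration) with x, y normalized to [0, 1].
import numpy as np
from PIL import Image, ImageDraw
import torch.nn as nn
from torchvision import models, transforms

def scanpath_to_image(fixations, size=224):
    """Draw a scanpath as an RGB image: lines for saccades, circles for fixations
    (radius scaled by fixation duration)."""
    img = Image.new("RGB", (size, size), "white")
    draw = ImageDraw.Draw(img)
    pts = [(x * size, y * size) for x, y, _ in fixations]
    if len(pts) > 1:
        draw.line(pts, fill="blue", width=2)                # saccade trajectory
    for (_, _, dur), (px, py) in zip(fixations, pts):
        r = 3 + 20 * dur                                    # radius encodes duration
        draw.ellipse([px - r, py - r, px + r, py + r], outline="red", width=2)
    return img

# ImageNet-pretrained backbone with a new two-class head (to be fine-tuned).
backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
backbone.fc = nn.Linear(backbone.fc.in_features, 2)

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],        # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

# Toy example: a random scanpath of 10 fixations.
fixations = [(np.random.rand(), np.random.rand(), np.random.rand()) for _ in range(10)]
x = preprocess(scanpath_to_image(fixations)).unsqueeze(0)   # shape (1, 3, 224, 224)
logits = backbone(x)                                        # train with standard cross-entropy
print(logits.shape)                                         # torch.Size([1, 2])
```

In practice, each participant's recorded scanpaths would be rendered this way and the backbone either fine-tuned end to end or used as a frozen feature extractor; both are standard TL strategies compatible with the approach described above.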