Computer vision for movie assessment of social cognition using eye-tracking data
Team: Julia Tang, Dr. Melissa Black, Prof. Torbjorn Falkmer. Curtin Autism Research Group (CARG).
CIC specialists: Dr. Sawitchaya (Nancy) Tippaya
Defining Areas of Interest (AOIs) is an essential step in the analysis of eye-tracking data: AOIs link eye movements to particular parts of a stimulus so that eye-movement measures (e.g., fixation count, fixation duration, and dwell time) can be computed quantitatively. However, AOI development remains a significant challenge for researchers, as methods for defining AOIs are not well explored. Moreover, AOIs for video stimuli are typically annotated manually frame by frame, which is time-consuming. We are therefore applying tools and techniques from computer vision research to help researchers extract AOIs efficiently.
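As a minimal illustration of how such measures follow from AOIs, the sketch below computes dwell time and AOI visit count for gaze samples against a rectangular AOI. The `AOI` class, coordinates, and fixed sampling interval are illustrative assumptions, not the project's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class AOI:
    # Hypothetical axis-aligned rectangle in stimulus pixel coordinates.
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def dwell_time(samples, aoi, sample_interval_ms):
    """Total time gaze samples fall inside the AOI, assuming a fixed
    sampling interval (e.g., 10 ms for a 100 Hz eye tracker)."""
    hits = sum(1 for (x, y) in samples if aoi.contains(x, y))
    return hits * sample_interval_ms

def visit_count(samples, aoi):
    """Number of times gaze enters the AOI (contiguous runs of in-AOI samples)."""
    visits, inside = 0, False
    for x, y in samples:
        hit = aoi.contains(x, y)
        if hit and not inside:
            visits += 1
        inside = hit
    return visits

# Usage: a face AOI and four gaze samples from a 100 Hz tracker.
face = AOI("face", 400, 120, 200, 260)
samples = [(450, 200), (470, 210), (900, 500), (455, 205)]
print(dwell_time(samples, face, 10))  # → 30
print(visit_count(samples, face))     # → 2
```

A fixation-detection step (e.g., dispersion- or velocity-based) would normally precede this, so that fixation count and duration are computed over fixations rather than raw samples.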
The video dataset used in this research was designed specifically to assess social cognition in individuals with autism spectrum disorder (ASD). The project explored computer vision techniques, using machine learning algorithms, to define AOIs in video stimuli. Single or multiple instances of objects (e.g., humans, human bodies, faces, eyes, eyebrows, noses, mouths, and jawlines) were detected automatically using recent object detection algorithms, employing open-source deep learning models and graphics processing units (GPUs) to reduce computation time. Ultimately, the project aimed to increase the number and quality of research outputs in areas of neurodevelopmental disability. The AOIs identified by the AI were integrated into the post-processing of eye-tracking data, producing additional experimental results.
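To show how detector output can be integrated with eye-tracking data, the sketch below maps timestamped gaze samples onto per-frame AOI bounding boxes of the kind an object detector would emit. The frame rate, box format, and hard-coded detections are assumptions for illustration only.

```python
def gaze_to_aoi_hits(gaze, frame_aois, fps=25):
    """Count gaze samples landing in detector-produced AOIs.

    gaze: list of (t_ms, x, y) samples.
    frame_aois: dict mapping frame index -> list of (label, x, y, w, h)
        boxes, as a face/body detector would emit per video frame.
    Returns per-label hit counts. Illustrative sketch only.
    """
    hits = {}
    frame_ms = 1000 / fps
    for t, gx, gy in gaze:
        frame = int(t // frame_ms)  # frame shown at time t
        for label, x, y, w, h in frame_aois.get(frame, []):
            if x <= gx < x + w and y <= gy < y + h:
                hits[label] = hits.get(label, 0) + 1
    return hits

# Usage: mocked face detections for two frames of a 25 fps video.
frame_aois = {
    0: [("face", 400, 120, 200, 260)],
    1: [("face", 402, 121, 200, 260)],  # box tracks slight motion
}
gaze = [(10, 450, 200), (50, 470, 210), (70, 900, 500)]
print(gaze_to_aoi_hits(gaze, frame_aois))  # → {'face': 2}
```

In practice the per-frame boxes would come from the deep learning detectors mentioned above rather than being hard-coded, but the integration step is the same: align gaze timestamps to video frames, then test each sample against that frame's AOIs.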