Published Papers Search Service
Title | Transfer Learning for Cross-Domain Eye Gaze Classification: Quantifying Domain Gap and Fine-Tuning Effectiveness
Author | Rumana Ferdushi
Citation | Vol. 26 No. 2, pp. 39-46
Abstract | Eye tracking systems are essential for human-computer interaction, assistive technology, and driver monitoring; however, domain shift makes it difficult to deploy models across different cameras and locations. This study investigates transfer learning as a practical solution for cross-domain eye gaze classification. We first quantify the domain gap: a CNN trained on a source dataset achieves 99.76% in-domain accuracy, but accuracy drops to 84.9% when the model is tested on a target dataset with a synthetic domain shift, a gap of 14.84%. We then apply transfer learning by fine-tuning on the target domain, recovering 58.8% of the lost performance and reaching 93.7% accuracy. Our results demonstrate that transfer learning effectively overcomes domain shift in eye gaze classification, enabling reliable deployment across varied visual conditions. Importantly, practitioners can achieve near-in-domain performance with minimal target-domain annotation, significantly reducing deployment costs.
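A quick check of the recovery figure, assuming it is defined as the share of the source-to-target accuracy gap closed by fine-tuning (the paper's exact definition is not stated in the abstract):

\[
\text{recovery} \;=\; \frac{\text{acc}_{\text{fine-tuned}} - \text{acc}_{\text{target}}}{\text{acc}_{\text{source}} - \text{acc}_{\text{target}}} \;=\; \frac{93.7 - 84.9}{99.76 - 84.9} \;\approx\; 0.59,
\]

which is consistent with the reported 58.8% up to rounding of the quoted accuracies.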
Keywords | Transfer learning, domain adaptation, eye tracking, gaze classification, domain generalization, convolutional neural networks, biomedical signal processing.
URL | http://paper.ijcsns.org/07_book/202602/20260204.pdf