Abstract
Human-Centered Artificial Intelligence (HCAI) emphasizes aligning intelligent system behavior with human goals, cognitive states, and contextual needs. Although prior research has explored adaptive and affect-aware systems, most existing approaches remain reactive and rely on isolated interaction signals. This paper proposes a framework for intent-aware personalization in human-centered AI, grounded in multimodal cognitive interaction signals such as gaze, affect, physiological responses, and paralinguistic audio cues. The framework theorizes how integrating multimodal cognitive signals enables accurate intent inference, which in turn drives adaptive personalization mechanisms that enhance engagement, reduce cognitive load, and improve trust. A set of research propositions is presented to guide future empirical validation. The proposed framework provides a theoretical foundation for designing intent-aware, adaptive human-centered AI systems.