Modeling User Behaviors in Machine Operation Tasks for Adaptive Guidance

Long-fei Chen, Yuichi Nakamura, Kazuaki Kondo

An adaptive guidance system that supports equipment operators requires a comprehensive model of the task and of user behavior, one that accounts for different skill and knowledge levels as well as diverse situations. In this paper, we introduce a novel method of machine operation modeling aimed at integrating visual operation records provided by users with different skills, knowledge levels, and interpersonal behavior patterns. To this end, we investigated the relationships between visually observable user behavior patterns and skill levels under machine operation conditions. We collected sixty samples of two sewing tasks performed by five operators, recorded with a head-mounted RGB-D camera and a static gaze tracker. We examined behavioral features such as operator gaze, head movements, and hand interactions with hotspots, and observed significant behavioral changes as a result of continuous skill improvement. We automatically modeled the variety of operation behaviors with a two-step approach: prototype selection followed by experience integration. The experimental results indicated that features such as task execution duration and user head movements can serve as appropriate indices for skill level evaluation, and can provide useful information for integrating records corresponding to different skill levels and behavioral characteristics. Integrating operation records together with operating habits allowed us to develop a rich, inclusive task model that can flexibly adapt to various user-specific behavior patterns.
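To make the two ideas named above concrete, the following is a minimal illustrative sketch, not the paper's actual method: a skill index computed from task duration and head-movement features, and a prototype selected as the medoid record in feature space. All record names, feature values, weights, and normalization constants are invented placeholders.

```python
import math

# Hypothetical operation records: (task duration in seconds, total head
# movement in degrees). Values and names are illustrative assumptions,
# not data from the paper.
records = {
    "novice_1":  (310.0, 540.0),
    "novice_2":  (295.0, 500.0),
    "skilled_1": (140.0, 210.0),
    "skilled_2": (150.0, 230.0),
    "expert_1":  (95.0, 120.0),
}

def skill_index(duration, head_motion, w_dur=0.5, w_head=0.5):
    """Shorter duration and less head movement -> score closer to 1.
    Weights and normalization constants are arbitrary placeholders."""
    d = min(duration / 400.0, 1.0)     # normalize against an assumed maximum
    h = min(head_motion / 600.0, 1.0)
    return 1.0 - (w_dur * d + w_head * h)

def select_prototype(recs):
    """Medoid selection: return the record that minimizes the summed
    Euclidean distance to all other records in feature space."""
    names = list(recs)
    def total_dist(a):
        return sum(math.dist(recs[a], recs[b]) for b in names if b != a)
    return min(names, key=total_dist)

scores = {name: skill_index(*feats) for name, feats in records.items()}
proto = select_prototype(records)
```

With these placeholder values, the expert record receives the highest skill index and a mid-range record is chosen as the prototype around which other experiences could be integrated.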
