
SigmaCollab: An Application-Driven Dataset for Physically Situated Collaboration

Abstract

We introduce SigmaCollab, a dataset enabling research on physically situated human-AI collaboration. The dataset consists of 85 sessions in which untrained participants were guided by a mixed-reality assistive AI agent in performing procedural tasks in the physical world. SigmaCollab includes a set of rich, multimodal data streams, such as the participant and system audio, egocentric camera views from the head-mounted device, depth maps, head, hand, and gaze tracking information, as well as additional annotations performed post-hoc. While the dataset is relatively small in size (~14 hours), its application-driven and interactive nature brings to the fore novel research challenges for human-AI collaboration, and provides more realistic testing grounds for various AI models operating in this space. In future work, we plan to use the dataset to construct a set of benchmarks for physically situated collaboration in mixed-reality task assistive scenarios. SigmaCollab is available at https://github.com/microsoft/SigmaCollab.
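To give a concrete sense of how a session-based multimodal dataset like this might be consumed, the sketch below iterates over per-session directories and pairs gaze samples with spoken utterances by timestamp. The directory layout, file names (`session_*/gaze.csv`, `transcript.json`), and field names (`timestamp`, `start`, `end`, `speaker`) are illustrative assumptions, not SigmaCollab's actual schema; consult the GitHub repository for the real format.

```python
# Hypothetical walkthrough of a session-based multimodal dataset.
# File names and fields below are illustrative assumptions, NOT
# SigmaCollab's actual schema -- see the repository for the real layout.
import csv
import json
from pathlib import Path

DATASET_ROOT = Path("SigmaCollab")  # assumed local download location


def load_gaze(session_dir: Path) -> list[dict]:
    """Read gaze samples, assuming a CSV with a 'timestamp' column (seconds)."""
    with open(session_dir / "gaze.csv", newline="") as f:
        return list(csv.DictReader(f))


def load_transcript(session_dir: Path) -> list[dict]:
    """Read participant/system utterances, assuming JSON with start/end times."""
    with open(session_dir / "transcript.json") as f:
        return json.load(f)


def gaze_during_utterance(gaze: list[dict], utt: dict) -> list[dict]:
    """Select gaze samples falling inside one utterance's time window."""
    start, end = float(utt["start"]), float(utt["end"])
    return [g for g in gaze if start <= float(g["timestamp"]) <= end]


for session_dir in sorted(DATASET_ROOT.glob("session_*")):
    gaze = load_gaze(session_dir)
    for utt in load_transcript(session_dir):
        window = gaze_during_utterance(gaze, utt)
        print(f"{session_dir.name}: {utt['speaker']} "
              f"({utt['start']}-{utt['end']}s) -> {len(window)} gaze samples")
```

This kind of timestamp-based alignment across streams (audio, gaze, depth, hand tracking) is the basic operation most analyses of interactive, multimodal sessions would build on, whatever the concrete storage format turns out to be.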

Key Contributions

Introduces SigmaCollab, a dataset designed for research on physically situated human-AI collaboration. It comprises multimodal data from 85 sessions in which untrained participants performed procedural tasks guided by a mixed-reality (MR) assistive AI agent, providing realistic testing grounds for studying human-AI interaction.

Business Value

Facilitates the development of more intuitive and effective AI assistants for real-world tasks, improving productivity and user experience in collaborative settings.