Abstract
We present PiMForce, a novel framework that enhances hand pressure estimation by leveraging 3D hand posture information to augment forearm surface electromyography (sEMG) signals. Our approach combines detailed spatial information from 3D hand poses with dynamic muscle activity from sEMG to enable accurate and robust whole-hand pressure measurement under diverse hand-object interactions. We also developed a multimodal data collection system that integrates a pressure glove, an sEMG armband, and a markerless finger-tracking module. Using this system, we created a comprehensive dataset from 21 participants, capturing synchronized hand posture, sEMG signals, and exerted hand pressure across various hand postures and hand-object interaction scenarios. Our framework enables precise hand pressure estimation in complex and natural interaction scenarios, and by integrating 3D hand posture information with sEMG signals it substantially mitigates the limitations of traditional sEMG-based or vision-based methods. Video demos, data, and code are available online.
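To make the pose-plus-sEMG fusion idea concrete, below is a minimal sketch of a multimodal regressor that encodes 3D hand keypoints and a windowed multi-channel sEMG signal separately, concatenates the two embeddings, and regresses per-region hand pressure. This is not the published PiMForce architecture; the class name, layer sizes, and dimensions (21 keypoints, 8 sEMG channels, a 256-sample window, 19 pressure regions) are illustrative assumptions only.

```python
# Illustrative sketch only: fuse 3D hand-pose features with sEMG features
# to regress whole-hand pressure. All names and dimensions are assumptions.
import torch
import torch.nn as nn


class PoseEMGPressureNet(nn.Module):
    def __init__(self, num_joints=21, emg_channels=8, emg_window=256,
                 num_pressure_regions=19):
        super().__init__()
        # Encode the 3D hand pose (num_joints x 3 keypoints) with a small MLP.
        self.pose_encoder = nn.Sequential(
            nn.Linear(num_joints * 3, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        # Encode the multi-channel sEMG window with a compact 1D CNN.
        self.emg_encoder = nn.Sequential(
            nn.Conv1d(emg_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one feature vector
        )
        # Fuse both embeddings and predict one pressure value per hand region.
        self.regressor = nn.Sequential(
            nn.Linear(64 + 64, 128), nn.ReLU(),
            nn.Linear(128, num_pressure_regions),
        )

    def forward(self, pose, emg):
        # pose: (batch, num_joints, 3); emg: (batch, emg_channels, emg_window)
        pose_feat = self.pose_encoder(pose.flatten(1))
        emg_feat = self.emg_encoder(emg).squeeze(-1)
        return self.regressor(torch.cat([pose_feat, emg_feat], dim=1))


if __name__ == "__main__":
    model = PoseEMGPressureNet()
    pose = torch.randn(4, 21, 3)   # synthetic 3D hand keypoints
    emg = torch.randn(4, 8, 256)   # synthetic 8-channel sEMG window
    print(model(pose, emg).shape)  # torch.Size([4, 19])
```

In a sketch like this, the pose branch supplies the spatial context (which grasp is being made) while the sEMG branch supplies the muscle-activation dynamics; the released dataset of synchronized posture, sEMG, and pressure would serve as supervision for such a model.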
URL
https://arxiv.org/abs/2410.23629