Abstract
Studies of human clothing for digital avatars have predominantly relied on synthetic datasets. While easy to collect, synthetic data often fall short in realism and fail to capture authentic clothing dynamics. To address this gap, we introduce 4D-DRESS, the first real-world 4D dataset advancing human clothing research with high-quality 4D textured scans and garment meshes. 4D-DRESS captures 64 outfits in 520 human motion sequences, amounting to 78k textured scans. Creating a real-world clothing dataset is challenging, particularly when annotating and segmenting extensive, complex 4D human scans. To this end, we develop a semi-automatic 4D human parsing pipeline that efficiently combines a human-in-the-loop process with automation to accurately label 4D scans across diverse garments and body movements. Leveraging these precise annotations and high-quality garment meshes, we establish several benchmarks for clothing simulation and reconstruction. 4D-DRESS offers realistic and challenging data that complements synthetic sources, paving the way for advances in research on lifelike human clothing. Website: this https URL.
URL
https://arxiv.org/abs/2404.18630