Abstract
A typical assumption in state-of-the-art self-localization models is that an annotated training dataset is available for the target workspace. However, this does not always hold when a robot travels in a general open world. This study introduces a novel training scheme for open-world distributed robot systems. In our scheme, a robot ("student") can ask the other robots it meets at unfamiliar places ("teachers") for guidance. Specifically, a pseudo-training dataset is reconstructed from the teacher model and thereafter used for continual learning of the student model. Unlike typical knowledge transfer schemes, our scheme introduces only minimal assumptions about the teacher model, so that it can handle various types of open-set teachers, including uncooperative teachers, untrainable teachers (e.g., image retrieval engines), and black-box teachers (e.g., for data privacy). Rather than relying on the availability of teachers' private data as in existing methods, we propose to exploit an assumption that holds universally in self-localization tasks: "the teacher model is a self-localization system," and to reuse the teacher's self-localization system as the sole accessible communication channel. We particularly focus on designing an effective student/questioner whose interactions with teachers yield question-and-answer sequences that can serve as pseudo-training datasets for the student self-localization model. When applied to a generic recursive knowledge distillation scenario, our approach exhibited stable and consistent performance improvement.
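The question-and-answer scheme described above can be sketched minimally as follows: the student treats a teacher's self-localization system as a black box, queries it with its own observations, and collects the answers as (observation, pose) pairs for a pseudo-training dataset. This is an illustrative sketch, not the paper's implementation; all names (`build_pseudo_dataset`, `teacher_localize`, the toy lookup-table teacher) are assumptions for clarity.

```python
# Hedged sketch of the student/questioner loop: the teacher's
# self-localization interface is the sole accessible channel.

def build_pseudo_dataset(student_images, teacher_localize):
    """Query the black-box teacher once per image; keep (image, pose) pairs."""
    dataset = []
    for img in student_images:
        pose = teacher_localize(img)   # question -> answer
        if pose is not None:           # teacher may fail or decline to answer
            dataset.append((img, pose))
    return dataset

# Toy teacher: a lookup table standing in for any self-localization engine
# (trainable or not, cooperative or not).
teacher_map = {"img_a": (1.0, 2.0), "img_b": (3.5, 0.5)}
pseudo = build_pseudo_dataset(["img_a", "img_b", "img_c"], teacher_map.get)
print(pseudo)  # [('img_a', (1.0, 2.0)), ('img_b', (3.5, 0.5))]
```

The resulting `pseudo` list plays the role of the reconstructed pseudo-training dataset that the student would then use for continual learning.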
URL
https://arxiv.org/abs/2403.10552