AUPAIR: AutonomoUs Personal AssIstant Robot using Integrated Deep Learning Methods
This paper describes the technical approach and software algorithms implemented in the AutonomoUs Personal AssIstant Robot (AUPAIR) framework, which has been optimized for the hardware specifications of the SoftBank Pepper robot. AUPAIR was used by Team AUPAIR in the Social Standard Platform league of the 2017 RoboCup@Home challenge. As the league's name implies, no modifications to the Pepper robot are allowed. Therefore, to achieve full capability within the standardized hardware specifications, we focused on enhancing the robot by adopting deep learning in each modular component, enabling it to perceive the environment robustly and accurately and to act precisely on the perceived results.
The proposed framework is divided into three major modular components: Perception, Action, and Learning. Each component is further divided into finer submodules. For example, in Perception, the lower-level skills are basic object recognition and detection, human identification, and body-skeleton estimation; the higher-level skills are activity recognition, object distinction, and scene description.
Using our AUPAIR framework, we solved five specific and two general scenarios that model dynamic learning environments. These challenging scenarios were designed by the RoboCup@Home committees, and in every scenario the AUPAIR framework achieved state-of-the-art performance, winning overall first place. The AUPAIR implementation will be made publicly available to the research community at https://bi.snu.ac.kr/Robocup/index.html