National Robotics Initiative (NRI), funded by the National Science Foundation
*The realization of co-robots acting in direct support of individuals and groups*
The aim of this project is to give co-robots the capability to track and localize human partners outside the robot's Field of View (FOV) for human-robot interaction (HRI), or, in short, to estimate a partner's location outside the FOV. This is a capability that existing robots lack but humans possess; blind people, for instance, are known to have an astonishing echolocation ability. The principal investigator and the assembled team have previously worked on relevant topics and have developed various techniques for estimating the location of a sound target outside the FOV, i.e., inside the Non-Field-of-View (NFOV) region. These techniques are, however, inapplicable to HRI, and other researchers have not yet tackled a robot's auditory capabilities for NFOV target estimation in HRI. Building on the team's accumulated knowledge and experience, this project aims to:
- Develop an approach that auditorily estimates the location of an NFOV target by learning from humans and utilizing the physics of sound wave propagation associated with the NFOV target;
- Develop an extended approach that estimates the location of an NFOV target in unknown indoor environments by using both visual and auditory sensors;
- Implement these approaches as a new capability of the robot and quantitatively evaluate that capability for its future application to HRI problems.
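To illustrate the auditory localization that the first aim builds on, the sketch below estimates a sound source's direction of arrival from the time difference of arrival (TDOA) between two microphones. This is a textbook cross-correlation method, not the project's algorithm, and the sample rate, microphone spacing, and signals are hypothetical:

```python
import numpy as np

def estimate_doa(mic1, mic2, fs, spacing, c=343.0):
    """Estimate a source's direction of arrival (degrees) from two mic signals.

    The peak of the cross-correlation gives the time difference of arrival
    (TDOA); under a far-field assumption, sin(theta) = c * tdoa / spacing.
    """
    corr = np.correlate(mic2, mic1, mode="full")
    delay_samples = np.argmax(corr) - (len(mic1) - 1)
    tdoa = delay_samples / fs
    sin_theta = np.clip(c * tdoa / spacing, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))

# Hypothetical setup: white-noise source, 16 kHz sampling, 20 cm mic spacing.
fs, spacing = 16000, 0.2
rng = np.random.default_rng(0)
sig = rng.standard_normal(4096)
delay = 5  # mic2 hears the source 5 samples later than mic1
mic1 = sig
mic2 = np.concatenate([np.zeros(delay), sig[:-delay]])
angle = estimate_doa(mic1, mic2, fs, spacing)  # roughly 32 degrees off-axis
```

In the NFOV setting there is no direct line of sight, so this direct-path geometry no longer holds; the project instead exploits reflected and diffracted sound paths, with correlation-based timing estimates remaining a basic building block.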
Relevant publications:
- Takami, K., Furukawa, T., Kumon, M., Kimoto, D. and Dissanayake, G., 2015. Estimation of a nonvisible field-of-view mobile target incorporating optical and acoustic sensors. Autonomous Robots, pp. 1-17.
- Takami, K., Furukawa, T., Kumon, M. and Mak, L.C., 2015, September. Non-field-of-view indoor sound source localization based on reflection and diffraction. In 2015 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI) (pp. 59-64). IEEE.