Indoor place classification using robot behavior and vision data

Chuho Yi, Young Ceol Oh, Il Hong Suh, Byung Uk Choi

Research output: Contribution to journal › Article

2 Scopus citations


Autonomous navigation of intelligent robots across a variety of settings requires the analysis and classification of places, as well as the ability to actively collect information. In this paper, visual data are organized into an orientation histogram that roughly characterizes an input image by extracting straight lines and accumulating them according to direction angle. In addition, behavioral data are organized into a behavioral histogram by accumulating the motions performed to avoid obstacles encountered while the robot executes specified behavioral patterns. These visual and behavioral data serve as input, and the probability that a place belongs to a specific class is computed by treating the places already learned by the robot as categories. The naïve Bayes classification method is employed: the probability that the input data belong to each category is calculated, and the category with the highest probability is selected. The robot's location is classified by merging the probabilities obtained from the visual and behavioral data. The experimental results are as follows. First, a comparison of the behavioral patterns used by the robot to collect data about a place indicates that a rotational behavior pattern provides the best performance. Second, classification is more accurate with both types of input data than with either type alone.
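The merged naïve Bayes decision described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class names, histogram bin counts, and multinomial parameters are invented assumptions, and the fusion simply sums the per-modality log-likelihoods with the class prior.

```python
import math

# Hypothetical per-class multinomial parameters "learned" from training
# places (illustrative values, not from the paper). Each class has a
# distribution over orientation-histogram bins (vision) and over
# obstacle-avoidance motion bins (behavior).
VISION_PARAMS = {
    "corridor": [0.70, 0.10, 0.10, 0.10],   # dominated by one line direction
    "office":   [0.25, 0.25, 0.25, 0.25],   # mixed line orientations
}
BEHAVIOR_PARAMS = {
    "corridor": [0.80, 0.20],               # mostly unobstructed motion
    "office":   [0.40, 0.60],               # frequent avoidance motions
}
PRIORS = {"corridor": 0.5, "office": 0.5}

def log_likelihood(hist, params):
    """Multinomial log-likelihood of a count histogram under class params."""
    return sum(n * math.log(p) for n, p in zip(hist, params) if n > 0)

def classify(vision_hist, behavior_hist):
    """Pick the class maximizing the merged (summed) log-posterior."""
    scores = {}
    for cls in PRIORS:
        scores[cls] = (math.log(PRIORS[cls])
                       + log_likelihood(vision_hist, VISION_PARAMS[cls])
                       + log_likelihood(behavior_hist, BEHAVIOR_PARAMS[cls]))
    return max(scores, key=scores.get)
```

Working in log space keeps the product of many small per-bin probabilities numerically stable, and merging the two modalities reduces to adding their log-likelihoods, which mirrors the paper's finding that combining vision and behavior data improves classification over either alone.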

Original language: English
Pages (from-to): 49-60
Number of pages: 12
Journal: International Journal of Advanced Robotic Systems
Issue number: 5
State: Published - 2011 Nov 17


  • Behavior data
  • Naïve Bayes
  • Place classification
  • Vision data

