
Robust local learning in high dimensional spaces

1998

Conference Paper



Incremental learning of sensorimotor transformations in high dimensional spaces is one of the basic prerequisites for the success of autonomous robot devices as well as biological movement systems. So far, due to sparsity of data in high dimensional spaces, learning in such settings requires a significant amount of prior knowledge about the learning task, usually provided by a human expert. In this paper, we suggest a partial revision of this view. Based on empirical studies, we observed that, despite being globally high dimensional and sparse, data distributions from physical movement systems are locally low dimensional and dense. Under this assumption, we derive a learning algorithm, Locally Adaptive Subspace Regression, that exploits this property by combining a dynamically growing local dimensionality reduction technique as a preprocessing step with a nonparametric learning technique, locally weighted regression, that also learns the region of validity of the regression. The usefulness of the algorithm and the validity of its assumptions are illustrated for a synthetic data set, and for data of the inverse dynamics of human arm movements and an actual 7 degree-of-freedom anthropomorphic robot arm.
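The two-stage idea described in the abstract (a local dimensionality reduction step followed by locally weighted regression) can be illustrated with a minimal sketch. This is not the paper's Locally Adaptive Subspace Regression algorithm, only an assumed toy version: Gaussian kernel weights around a query point, a weighted local PCA projection, and weighted least squares in the reduced space. The function name and parameters are hypothetical.

```python
import numpy as np

def lwr_with_local_pca(X, y, query, width=1.0, n_components=2):
    """Toy sketch (not the paper's algorithm): locally weighted
    regression after a local PCA projection around the query point."""
    # Gaussian kernel weights centered on the query
    d2 = np.sum((X - query) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / width ** 2)

    # Weighted centering, then local PCA for dimensionality reduction
    mu = np.average(X, axis=0, weights=w)
    Xc = X - mu
    cov = (Xc * w[:, None]).T @ Xc / w.sum()
    eigvals, eigvecs = np.linalg.eigh(cov)
    U = eigvecs[:, -n_components:]      # top local principal directions
    Z = Xc @ U                          # project into the local subspace

    # Weighted least squares in the reduced space (with bias term)
    Zb = np.hstack([Z, np.ones((len(Z), 1))])
    W = np.diag(w)
    beta = np.linalg.pinv(Zb.T @ W @ Zb) @ (Zb.T @ W @ y)

    # Predict at the query's projected coordinates
    zq = np.append((query - mu) @ U, 1.0)
    return zq @ beta
```

If the data are globally high dimensional but locally confined to a low-dimensional subspace, as the abstract argues for physical movement systems, the local PCA step recovers that subspace and the regression only has to fit a few coordinates, which is what makes learning tractable despite global sparsity.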

Author(s): Vijayakumar, S. and Schaal, S.
Book Title: 5th Joint Symposium on Neural Computation
Pages: 186-193
Year: 1998
Publisher: Institute for Neural Computation, University of California, San Diego

Department(s): Autonomous Motion
Bibtex Type: Conference Paper (inproceedings)

Address: San Diego, CA
Cross Ref: p1161
Note: clmc

BibTeX

@inproceedings{Vijayakumar_JSNC_1998,
  title = {Robust local learning in high dimensional spaces},
  author = {Vijayakumar, S. and Schaal, S.},
  booktitle = {5th Joint Symposium on Neural Computation},
  pages = {186--193},
  publisher = {Institute for Neural Computation, University of California, San Diego},
  address = {San Diego, CA},
  year = {1998},
  note = {clmc},
  crossref = {p1161}
}