# Manifold Learning

### Amir Massoud Farahmand

*Dept. of Computing Science, University of Alberta*

In many real-world problems, such as visually-guided robots, we need
to deal with high-dimensional data. Unfortunately, the so-called
"curse of dimensionality" implies that dealing with a problem
with a high-dimensional input can be very difficult in the worst case,
unless there is some regularity in the problem that we can exploit.
Manifold learning is an umbrella term for research directions and
methods that try to benefit from the possibility that the data come
from a lower-dimensional submanifold embedded in a higher-dimensional
space. The hope is that by exploiting this kind of regularity, using
methods from the mathematics of differential manifolds, we can obtain
data analysis methods that work efficiently on problems with
high-dimensional input spaces.

In the first part of my talk, I introduce some prominent manifold
learning methods, such as Isomap, Locally Linear Embedding, and
Laplacian Eigenmaps. These are essentially nonlinear dimensionality
reduction methods.
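As a rough illustration of what these methods do (not part of the talk itself), the sketch below applies all three to a synthetic "Swiss roll" data set using scikit-learn, whose `SpectralEmbedding` implements Laplacian Eigenmaps. The parameter values (`n_neighbors`, `n_components`) are illustrative choices, not recommendations from the speaker.

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap, LocallyLinearEmbedding, SpectralEmbedding

# 1000 points sampled from a 2-D manifold embedded in R^3.
X, color = make_swiss_roll(n_samples=1000, random_state=0)

# Each method builds a neighborhood graph (10 nearest neighbors here)
# and maps the 3-D points to a 2-D embedding of the underlying manifold.
methods = {
    "Isomap": Isomap(n_neighbors=10, n_components=2),
    "LLE": LocallyLinearEmbedding(n_neighbors=10, n_components=2),
    "Laplacian Eigenmaps": SpectralEmbedding(n_neighbors=10, n_components=2),
}

for name, method in methods.items():
    Y = method.fit_transform(X)
    print(name, Y.shape)  # each embedding has shape (1000, 2)
```

Despite sharing the neighborhood-graph starting point, the three methods differ in what they preserve: Isomap preserves geodesic distances along the manifold, LLE preserves local linear reconstruction weights, and Laplacian Eigenmaps preserve locality via the graph Laplacian's eigenvectors.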

In the second part of the talk, I present my own work and show that
certain machine learning methods can provably benefit
from the fact that the data lie on a lower-dimensional
submanifold.

### Biography

Amir Massoud Farahmand holds a B.S. and an M.S. in electrical engineering,
and is a Ph.D. student in the Department of Computing Science.
His current research interest is reinforcement learning
methods that can benefit from regularities in the data, such as smoothness
and low-dimensional submanifold structure.