Prof. Emo Welzl and Prof. Bernd Gärtner
Mittagsseminar Talk Information
Date and Time: Friday, February 10, 2017, 12:15 pm
Duration: 30 minutes
Location: CAB G11
Speaker: Ioannis Emiris (University of Athens)
Approximation algorithms have been trying to address the curse of dimensionality in geometric search. This talk focuses on recent and current work in approximate nearest neighbor search. The main paradigm for obtaining complexity bounds polynomial in the space dimension has been Locality-Sensitive Hashing (LSH). We take a different tack based on random projections, which also achieves complexity polynomial in the dimension while optimizing space consumption, which can be important in applications.

Our first method is a randomized dimensionality reduction technique that generalizes the seminal Johnson-Lindenstrauss lemma: it projects points to a space of significantly lower dimension while maintaining sufficient information on proximity relations. We prove that the approximate neighbors in the projection contain the image of an approximate neighbor in the original dataset. This yields a search time that is sublinear in the number of data points, exponential in the inverse of the error bound, and linear in the dimension.

Our second method is a simple data structure requiring linear space and sublinear query time for any approximation factor. For any metric space admitting an LSH family of functions, such functions are used to randomly project the input points to vertices of the 0/1 cube. The query point is projected to the cube in the same way, and the algorithm then examines the points assigned to the same or nearby vertices.

Experiments with both methods, in up to 1000 dimensions and with up to one million points, show that query time is somewhat slower than with LSH, but space usage and preprocessing are more efficient.
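The project-then-verify idea behind the first method can be sketched in a few lines: embed the dataset in a much lower dimension with a scaled Gaussian random projection, collect the points closest to the query in the projected space as candidates, and verify those candidates at full dimension. This is only an illustrative sketch, not the speakers' actual algorithm; all function names, the target dimension `k`, and the candidate count are assumptions chosen for the example.

```python
import math
import random

def random_projection_matrix(d, k, rng):
    """Gaussian d-by-k matrix, scaled so projected distances roughly
    match original distances (Johnson-Lindenstrauss-style)."""
    return [[rng.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(k)]
            for _ in range(d)]

def project(p, mat):
    """Map a d-dimensional point to k dimensions via the matrix."""
    d, k = len(mat), len(mat[0])
    return [sum(p[i] * mat[i][j] for i in range(d)) for j in range(k)]

def dist(p, q):
    """Euclidean distance."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def ann_query(points, query, k=20, n_candidates=10, rng=None):
    """Approximate nearest neighbor: rank points by distance in the
    projected space, then verify the top candidates in the original space."""
    rng = rng or random.Random(0)
    mat = random_projection_matrix(len(query), k, rng)
    low = [project(p, mat) for p in points]
    q_low = project(query, mat)
    cand = sorted(range(len(points)),
                  key=lambda i: dist(low[i], q_low))[:n_candidates]
    return min(cand, key=lambda i: dist(points[i], query))
```

The key property exploited here is the one stated in the abstract: with high probability the near neighbors in the projection include the image of an approximate near neighbor, so verifying a small candidate set at full dimension suffices.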