CS Events

Seminar

Machine Learning in a Setting of Ordinal Distance Information

 


Wednesday, September 20, 2017, 11:00am

 

In a typical machine learning scenario we are given numerical dissimilarity values between objects (or feature representations of objects, from which such dissimilarity values can readily be computed). In the relaxed setting of ordinal distance information, we are instead provided only with binary answers to comparisons of dissimilarity values, such as whether d(A,B) < d(A,C). This setting has attracted interest in recent years, mainly because such data can easily be collected by means of crowdsourcing.
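
To make the setting concrete (this example is not part of the talk), a small Python sketch can enumerate the binary triplet answers implied by a numerical dissimilarity matrix; the function name triplet_answers and the toy data are made-up illustrations.

    # Toy sketch of the relaxed setting: given a dissimilarity matrix D,
    # list the binary answers to all triplet comparisons d(i, j) < d(i, k).
    import numpy as np

    def triplet_answers(D):
        """Return ((i, j, k), answer) for all triplets with distinct i, j, k
        and j < k, where answer is True iff d(i, j) < d(i, k)."""
        n = D.shape[0]
        return [((i, j, k), bool(D[i, j] < D[i, k]))
                for i in range(n)
                for j in range(n)
                for k in range(j + 1, n)
                if i != j and i != k]

    # Hypothetical toy data: four points on a line and their pairwise distances.
    X = np.array([0.0, 1.0, 3.0, 7.0])
    D = np.abs(X[:, None] - X[None, :])
    print(triplet_answers(D)[:5])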
In my talk, I want to present two of my contributions to this field. First, I will talk about a result establishing the asymptotic uniqueness of ordinal embeddings. Ordinal embedding has so far been the standard approach to machine learning in a setting of ordinal distance information: the idea is to map the objects of a data set to points in a Euclidean space such that the points preserve the given ordinal distance information as well as possible (with respect to the Euclidean interpoint distances). Second, I will introduce data-dependent kernel functions that can be evaluated given only ordinal distance information about a data set. They provide a generic alternative to the ordinal embedding approach and avoid some of its drawbacks. For both works, I will also address a number of open questions.
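
As a rough, hypothetical sketch of the ordinal embedding idea (not the speaker's actual algorithm), one could place points in a Euclidean space by gradient descent on a hinge loss over violated triplet comparisons; the function ordinal_embedding and all parameter values below are illustrative assumptions.

    # Hypothetical ordinal embedding sketch: fit points in R^dim so that the
    # given triplet answers are respected as well as possible.
    import numpy as np

    def ordinal_embedding(triplets, n, dim=2, lr=0.05, margin=0.1, epochs=500, seed=0):
        """triplets: list of (i, j, k) meaning object i is closer to j than to k,
        i.e. d(i, j) < d(i, k). Returns an n x dim point configuration."""
        rng = np.random.default_rng(seed)
        X = rng.normal(scale=0.1, size=(n, dim))
        for _ in range(epochs):
            for i, j, k in triplets:
                d_ij = X[i] - X[j]
                d_ik = X[i] - X[k]
                # Hinge condition: want ||x_i - x_j||^2 + margin <= ||x_i - x_k||^2.
                if d_ij @ d_ij + margin > d_ik @ d_ik:
                    # Gradient step on the violated triplet only.
                    X[i] -= lr * 2 * (d_ij - d_ik)
                    X[j] += lr * 2 * d_ij
                    X[k] -= lr * 2 * d_ik
        return X

    # Example: embed four hypothetical objects from a handful of triplet answers.
    triplets = [(0, 1, 2), (0, 1, 3), (1, 0, 3), (2, 1, 3), (3, 2, 0)]
    print(ordinal_embedding(triplets, n=4))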

Speaker: Matthäus Kleindessner


Location: CoRE A 301

Committee: Rutgers/DIMACS Theory of Computing Seminar

Event Type: Seminar

Organization: University of Tübingen, Germany