A fundamental problem in statistics is the estimation of dependence between random variables. While information theory provides standard measures of dependence (e.g. Shannon, Rényi, and Tsallis mutual information), it is still unknown how to estimate these quantities from i.i.d. samples in the most efficient way. In this presentation we review some of our recent results on copula-based nonparametric dependence estimators and demonstrate their robustness to outliers, both theoretically in terms of finite-sample breakdown points and by numerical experiments in independent subspace analysis and image registration.
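Copula-based dependence estimators work by first mapping each marginal to its normalized ranks (the empirical copula transform), which makes the resulting statistic invariant to monotone marginal transformations and bounds the influence of any single outlier. The following is a minimal illustrative sketch of this idea using Spearman's rho as the dependence measure; it is not the specific estimator discussed in the talk, and all function names are my own.

```python
import numpy as np

def empirical_copula_transform(x):
    """Map a 1-D sample to its normalized ranks in (0, 1)."""
    n = len(x)
    ranks = np.argsort(np.argsort(x)) + 1  # ranks 1..n
    return ranks / (n + 1.0)

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank-transformed samples."""
    u = empirical_copula_transform(x)
    v = empirical_copula_transform(y)
    return np.corrcoef(u, v)[0, 1]

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = x + 0.5 * rng.normal(size=1000)   # dependent pair
x2 = rng.normal(size=1000)            # independent of x

rho_dep = spearman_rho(x, y)   # large positive value
rho_ind = spearman_rho(x, x2)  # close to zero
```

Because the statistic depends on the data only through ranks, replacing one observation by an arbitrarily large outlier can shift each rank by at most one position, which is the intuition behind the favorable finite-sample breakdown points mentioned in the abstract.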