Generalized Talagrand Inequality for Sinkhorn Distance using Entropy Power Inequality

This is a Master's thesis from KTH, School of Electrical Engineering and Computer Science (EECS)

Abstract: Measures of distance between two probability distributions play a fundamental role in statistics and machine learning, and Optimal Transport (OT) theory provides such a distance. A recent advance in OT theory is a generalization of classical OT with an entropic regularization term, called entropic OT. Despite its computational convenience, entropic OT still lacks theoretical support. In this thesis, we study the connection between entropic OT and the Entropy Power Inequality (EPI). First, we prove an HWI-type inequality, making use of the infinitesimal displacement convexity of the OT map. Second, we derive two Talagrand-type inequalities using the saturation of the EPI, which gives rise to a numerical term in our expression. We evaluate this term for a wide variety of distributions; for Gaussian and i.i.d. Cauchy distributions it is found in explicit form. We show that our results extend previous Gaussian Talagrand inequalities for the Sinkhorn distance to the strongly log-concave case. Furthermore, we observe a dimensional measure-concentration phenomenon via the new Talagrand-type inequality.
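The entropic OT problem underlying the Sinkhorn distance is usually solved with Sinkhorn's matrix-scaling iterations. The sketch below is only an illustration of that standard algorithm, not the thesis's method: the toy cost matrix, the regularization strength `eps`, and the iteration count are arbitrary choices made here for demonstration.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=200):
    """Entropic OT between discrete marginals a, b with cost matrix C.

    Solves min_P <P, C> + eps * KL(P || a b^T) over couplings P
    via alternating Sinkhorn scaling updates.
    """
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)             # scale columns to match marginal b
        u = a / (K @ v)               # scale rows to match marginal a
    P = u[:, None] * K * v[None, :]   # entropic transport plan
    return P, np.sum(P * C)           # plan and transport cost

# Toy example: uniform marginals on 3 points with |i - j| ground cost.
a = np.ones(3) / 3
b = np.ones(3) / 3
C = np.abs(np.arange(3)[:, None] - np.arange(3)[None, :]).astype(float)
P, cost = sinkhorn(a, b, C)
```

Since the two marginals coincide here, the converged plan concentrates near the diagonal and the transport cost is close to zero.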
