Kernel Stein Discrepancy on Lie Groups: Theory and Applications

Cited by: 0
Authors
Qu, Xiaoda [1 ]
Fan, Xiran [2 ]
Vemuri, Baba C. [3 ]
Affiliations
[1] University of Florida, Department of Statistics, Gainesville, FL 32611, United States
[2] Visa, San Francisco, CA 94128, United States
[3] University of Florida, Department of CISE, Gainesville, FL 32611, United States
DOI
10.1109/TIT.2024.3468212
Abstract
Distributional approximation is a fundamental problem in machine learning, with numerous applications across science and engineering and beyond. The key challenge in most approximation methods is the intractable normalization constant present in the candidate distributions used to model the data. This intractability is especially common in distributions of manifold-valued random variables, such as rotation matrices and orthogonal matrices. In this paper, we focus on the distributional approximation problem on Lie groups, since they are frequently encountered in many applications including, but not limited to, computer vision, robotics, and medical imaging. We present a novel Stein's operator on Lie groups leading to a kernel Stein discrepancy (KSD), which is a normalization-free loss function. We present several theoretical results characterizing the properties of this new KSD on Lie groups and of its minimizer, namely the minimum KSD estimator (MKSDE). Properties of the MKSDE are stated and proved, including strong consistency, a central limit theorem (CLT), and a closed form of the MKSDE for the von Mises-Fisher distribution and, more generally, the exponential family on SO(N). Finally, we present several experimental results depicting advantages of the MKSDE over maximum likelihood estimation. © 2024 IEEE.
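The paper's Stein operator is defined intrinsically on the Lie group, which is beyond the scope of this record. As a simplified illustration of why a KSD is "normalization-free", the sketch below implements the standard Euclidean (Langevin-Stein) KSD V-statistic in R^d with an RBF kernel: it depends on the target only through its score function ∇ log p, from which any normalization constant cancels. The function name, bandwidth choice, and kernel are our own assumptions for illustration, not the operator constructed in the paper.

```python
import numpy as np

def ksd_vstat(samples, score, h=1.0):
    """V-statistic estimate of the squared kernel Stein discrepancy
    between the empirical distribution of `samples` (shape (n, d)) and
    a target density p, accessed only through its score function
    score(x) = grad log p(x), with an RBF kernel of bandwidth h.
    Note: the normalization constant of p never appears."""
    x = np.atleast_2d(samples)                      # (n, d)
    n, d = x.shape
    s = np.apply_along_axis(score, 1, x)            # (n, d) scores
    diff = x[:, None, :] - x[None, :, :]            # (n, n, d) x_i - x_j
    sq = np.sum(diff**2, axis=-1)                   # (n, n) squared dists
    k = np.exp(-sq / (2 * h**2))                    # RBF kernel matrix
    # Stein kernel u_p(x, y) = s(x)'s(y) k + s(x)'∇_y k
    #                          + s(y)'∇_x k + tr(∇_x ∇_y k)
    term1 = (s @ s.T) * k
    term2 = np.einsum('id,ijd->ij', s, diff) * (k / h**2)    # s(x)'∇_y k
    term3 = np.einsum('jd,ijd->ij', s, diff) * (-k / h**2)   # s(y)'∇_x k
    term4 = (d / h**2 - sq / h**4) * k                       # tr(∇_x ∇_y k)
    return np.mean(term1 + term2 + term3 + term4)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    score = lambda v: -v  # score of a standard normal target
    well_fit = rng.normal(0.0, 1.0, size=(200, 1))
    mismatched = rng.normal(3.0, 1.0, size=(200, 1))
    print(ksd_vstat(well_fit, score), ksd_vstat(mismatched, score))
```

Minimizing such a discrepancy over a parametric family yields a minimum-KSD estimator; the paper develops the Lie-group analogue of this construction, for which closed forms exist in the exponential family on SO(N).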
Pages: 8961-8974