Multimedia coding & delivery protocol & system (since 1996)

Recent works

Here are some recent works.

Riemannian optimization: overview

Let $f: \mathcal{M} \to \mathbb{R}$ be a smooth real-valued function on a Riemannian manifold $\mathcal{M}$.
We consider

$$\min_{w \in \mathcal{M}} f(w),$$

where $w$ is the model variable. This problem has many applications: principal component analysis (PCA) and subspace tracking are defined on the Grassmann manifold; low-rank matrix and tensor completion is a promising application on the manifold of fixed-rank matrices/tensors; linear regression is also defined on the manifold of fixed-rank matrices; and independent component analysis (ICA) is defined on the oblique manifold.

In many applications, $f$ takes the finite-sum form

$$f(w) = \frac{1}{N}\sum_{n=1}^{N} f_n(w),$$

where $N$ is the total number of elements, which is generally extremely large.
A popular choice is the Riemannian stochastic gradient descent algorithm (R-SGD). Because R-SGD computes only the gradient of $f_n$ for a single randomly chosen $n$-th sample at each iteration, the complexity per iteration is independent of the size of $N$.
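As a concrete illustration, here is a minimal R-SGD sketch on a toy PCA problem on the unit sphere $S^{d-1}$ (not the authors' implementation): the cost is a finite sum $f(w) = -\frac{1}{N}\sum_n (x_n^\top w)^2$, the Riemannian gradient is the tangent-space projection of the Euclidean gradient, and the retraction is renormalization. All problem sizes, step sizes, and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum problem: f(w) = -(1/N) * sum_n (x_n^T w)^2 on the unit
# sphere S^{d-1}; the minimizer is the leading eigenvector of X^T X / N.
N, d = 1000, 5
X = rng.standard_normal((N, d)) @ np.diag([3.0, 1.0, 1.0, 0.5, 0.5])

def riem_grad_n(w, n):
    """Riemannian gradient of f_n at w: project the Euclidean gradient
    onto the tangent space T_w S^{d-1} = {v : v^T w = 0}."""
    egrad = -2.0 * (X[n] @ w) * X[n]      # Euclidean gradient of f_n
    return egrad - (egrad @ w) * w        # tangent-space projection

def retract(w, v):
    """Retraction on the sphere: move along the tangent vector v,
    then renormalize back onto the sphere."""
    y = w + v
    return y / np.linalg.norm(y)

# R-SGD: one random sample per iteration with a decaying step size,
# so the per-iteration cost does not depend on N.
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
for t in range(20000):
    n = rng.integers(N)
    step = 0.01 / (1.0 + 0.001 * t)
    w = retract(w, -step * riem_grad_n(w, n))
```

The iterate stays on the sphere by construction, and for this well-separated spectrum it aligns with the leading eigenvector.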
However, R-SGD suffers from a slow convergence rate. To address this, we propose the Riemannian stochastic variance reduced gradient algorithm (R-SVRG), which reduces the variance of the noisy stochastic gradient. R-SQN-VR has also been proposed recently, achieving practical improvements on ill-conditioned problems. We further propose the Riemannian stochastic recursive gradient algorithm (R-SRG), which does not require the inverse of the retraction between two distant iterates on the manifold. Convergence analyses of these algorithms are given for both retraction-convex and non-convex functions under computationally efficient retraction and vector transport operations.
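The variance-reduction idea can be sketched on the same toy sphere problem. This is an illustrative sketch, not the paper's algorithm verbatim: each outer epoch stores a snapshot $\tilde{w}$ and its full Riemannian gradient, and each inner step uses the direction $\mathrm{grad} f_n(w) - \mathcal{T}(\mathrm{grad} f_n(\tilde{w})) + \mathcal{T}(\mathrm{grad} f(\tilde{w}))$, where the vector transport $\mathcal{T}$ is approximated here by tangent-space projection (an assumption for this sketch).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy PCA-on-the-sphere problem (all names and sizes are illustrative).
N, d = 500, 4
X = rng.standard_normal((N, d)) @ np.diag([3.0, 1.0, 0.5, 0.5])

def egrad_n(w, n):
    """Euclidean gradient of f_n(w) = -(x_n^T w)^2."""
    return -2.0 * (X[n] @ w) * X[n]

def proj(w, v):
    """Projection onto the tangent space T_w S^{d-1}; also used here
    as a cheap approximation of vector transport onto T_w."""
    return v - (v @ w) * w

def retract(w, v):
    y = w + v
    return y / np.linalg.norm(y)

# R-SVRG sketch: snapshot w_tilde per epoch, variance-reduced inner steps.
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
step = 0.005
for epoch in range(30):
    w_tilde = w.copy()
    full = np.mean([egrad_n(w_tilde, n) for n in range(N)], axis=0)
    full = proj(w_tilde, full)            # full Riemannian gradient at snapshot
    for _ in range(N):
        n = rng.integers(N)
        v = (proj(w, egrad_n(w, n))
             - proj(w, egrad_n(w_tilde, n))   # transported snapshot gradient
             + proj(w, full))                 # transported full gradient
        w = retract(w, -step * v)
```

Near the snapshot the correction terms nearly cancel the stochastic noise, which is what allows a constant step size instead of R-SGD's decaying one.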

Riemannian inexact trust-region algorithm and theoretical analysis

The Riemannian trust-region algorithm (RTR) enjoys a global convergence guarantee and a superlinear local convergence rate to a second-order optimal point. However, handling full Hessian matrices is computationally prohibitive in large-scale settings. The proposed algorithm therefore approximates the gradient and the Hessian, in addition to solving the TR sub-problem inexactly. Addressing large-scale finite-sum problems, we specifically propose sub-sampled algorithms (Sub-RTR) in which the gradient and the Hessian are estimated by random sampling with fixed bounds on the sub-sample sizes.