Stochastic Gradient Descent for Spectral Embedding with Implicit Orthogonality Constraint
El Gheche, Mireille; Chierchia, Giovanni; Frossard, Pascal (LTS4, EPFL)
Conference paper, Proceedings of IEEE ICASSP, Brighton, UK, 12-17 May 2019
DOI: 10.1109/ICASSP.2019.8683286
ISBN: 978-1-4799-8131-1
View paper in IEEExplore: https://ieeexplore.ieee.org/document/8683286
Contact: pascal.frossard@epfl.ch

Abstract: In this paper, we propose a scalable algorithm for spectral embedding. The latter is a standard tool for graph clustering. However, its computational bottleneck is the eigendecomposition of the graph Laplacian matrix, which prevents its application to large-scale graphs. Our contribution consists of reformulating spectral embedding so that it can be solved via stochastic optimization. The idea is to replace the orthogonality constraint with an orthogonalization matrix injected directly into the criterion. As the gradient can be computed through a Cholesky factorization, our reformulation allows us to develop an efficient algorithm based on mini-batch gradient descent. Experimental results, on both synthetic and real data, confirm the efficiency of the proposed method in terms of execution speed with respect to similar existing techniques.
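To illustrate the idea the abstract describes, the following is a minimal sketch, not the paper's algorithm: it minimizes tr(YᵀLY) by plain full-batch gradient descent, using a Cholesky factorization of the Gram matrix YᵀY to enforce orthogonality (rather than an eigendecomposition), whereas the paper injects the orthogonalization matrix into the criterion itself and uses mini-batches. The function name and all parameters are illustrative.

```python
import numpy as np

def spectral_embedding_gd(L, k, lr=0.05, iters=1000, seed=0):
    """Minimize tr(Y^T L Y) over n-by-k matrices with orthonormal columns.

    After each gradient step, Y is re-orthonormalized through a Cholesky
    factor of its Gram matrix, avoiding any eigendecomposition of L.
    Illustrative sketch only; not the paper's mini-batch formulation.
    """
    rng = np.random.default_rng(seed)
    Y = rng.standard_normal((L.shape[0], k))

    def orthonormalize(Y):
        # Y^T Y = R^T R with R upper-triangular, so Y R^{-1} has
        # orthonormal columns (np.linalg.cholesky returns the lower factor).
        R = np.linalg.cholesky(Y.T @ Y).T
        return np.linalg.solve(R.T, Y.T).T  # computes Y @ inv(R)

    Y = orthonormalize(Y)
    for _ in range(iters):
        # Gradient of tr(Y^T L Y), projected onto the tangent space at Y.
        G = 2.0 * (L @ Y - Y @ (Y.T @ L @ Y))
        Y = orthonormalize(Y - lr * G)
    return Y
```

At convergence, the columns of Y span the same subspace as the k eigenvectors of L with smallest eigenvalues, so tr(YᵀLY) approaches the sum of the k smallest eigenvalues; the rows of Y then serve as the spectral embedding for clustering.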