FBIMATRIX (Full Bayesian Inference in Matrix and Tensor Factorization Models)
2016-2021
ANR-16-CE23-0014
FBIMATRIX was an international collaborative research project jointly supported by the Agence Nationale de la Recherche (ANR) and the Scientific and Technological Research Council of Turkey (TÜBİTAK).
Scientific Scope:
Matrix and tensor factorization methods provide a unifying view of a broad spectrum of techniques in machine learning and signal processing, offering both sensible statistical models for datasets and efficient computational procedures framed as decomposition algorithms. So far, algebraic or optimization-based approaches have prevailed for computing such factorizations. In contrast, the FBIMATRIX project aims to develop state-of-the-art Markov chain Monte Carlo (MCMC) methods for full Bayesian inference in matrix and tensor factorization models. The randomization inherent in Monte Carlo methods is useful in both Bayesian and non-Bayesian analyses, for tasks such as model selection, model averaging, privacy preservation, or simply computing more accurate approximate solutions.
MCMC methods are generally perceived as computationally demanding and impractical; by exploiting parallel and distributed computation, we aim to push the state of the art in terms of scalability, statistical efficiency, and computational and communication complexity. Indeed, we view MCMC as a natural general-purpose computational tool of the future for inference and model selection on distributed data, one that will eventually complement optimization for certain big-data problems thanks to its inherently randomized nature.
The project addresses Bayesian model selection and model averaging for factorization models, using parallel and distributed computation and recent advances in hybrid Monte Carlo methods that simulate an augmented stochastic dynamics. In doing so, we aim to develop faster algorithms for hard computational problems such as marginal likelihood estimation, and to improve convergence rates. We will illustrate the practical utility of the developed parallel and distributed MCMC methods on two challenging applications: audio source separation and missing link prediction.
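To make the setting more concrete, the sketch below illustrates one representative ingredient of this line of work: a stochastic gradient Langevin dynamics (SGLD) sampler applied to a toy Bayesian nonnegative matrix factorization model. All modeling choices (Gaussian likelihood, exponential priors, log-space reparameterization, step size) are assumptions made only for this example; it is a minimal, self-contained illustration of stochastic gradient MCMC on a factorization model, not the project's actual implementation.
```python
# Minimal, illustrative SGLD sketch for a toy Bayesian NMF model (not the
# project's actual code). Assumed model: V ~ N(WH, sigma2), entries of W and H
# have Exponential(lam) priors, and we sample in log-space (W = exp(A),
# H = exp(B)) so the factors stay nonnegative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: V is approximately a product of nonnegative factors.
I, J, K = 50, 40, 5
W_true = rng.gamma(2.0, 1.0, size=(I, K))
H_true = rng.gamma(2.0, 1.0, size=(K, J))
V = W_true @ H_true + rng.standard_normal((I, J))

sigma2 = 1.0        # assumed observation noise variance
lam = 1.0           # assumed rate of the exponential priors
step = 1e-5         # SGLD step size (assumed, untuned)
n_iters = 5000
batch_cols = 10     # minibatch size over the columns of V

# Unconstrained log-parameters.
A = np.log(rng.gamma(1.0, 1.0, size=(I, K)))
B = np.log(rng.gamma(1.0, 1.0, size=(K, J)))

for it in range(n_iters):
    cols = rng.choice(J, size=batch_cols, replace=False)
    W, H = np.exp(A), np.exp(B)
    Vb, Hb = V[:, cols], H[:, cols]
    R = Vb - W @ Hb            # residual on the minibatch
    scale = J / batch_cols     # unbiased rescaling of the data term for W

    # Gradient of the log-posterior w.r.t. A (chain rule through exp; the
    # final +1.0 is the log-Jacobian of the exp reparameterization).
    grad_A = scale * (R @ Hb.T) / sigma2 * W - lam * W + 1.0
    A += 0.5 * step * grad_A + np.sqrt(step) * rng.standard_normal(A.shape)

    # Update only the minibatch columns of B: each column of H is informed
    # by its own column of V, so no rescaling is needed here.
    grad_Bb = (W.T @ R) / sigma2 * Hb - lam * Hb + 1.0
    B[:, cols] += 0.5 * step * grad_Bb + np.sqrt(step) * rng.standard_normal(grad_Bb.shape)

print("reconstruction RMSE:", np.sqrt(np.mean((V - np.exp(A) @ np.exp(B)) ** 2)))
```
The project's publications listed below go well beyond this basic recipe, for instance with parallelized and asynchronous stochastic gradient MCMC, quasi-Newton and fractional Langevin dynamics, and marginal likelihood estimation for model selection.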
Partners:
French Side:
- Télécom ParisTech (scientific coordinator: Assoc. Prof. Umut Şimşekli)
Turkish Side:
- Boğaziçi University (scientific coordinator: Prof. A. Taylan Cemgil)
- Sabancı University
Publications and Preprints:
Journals
- S. Yildirim, M. B. Kurutmaz, M. Barsbey, U. Simsekli, A. T. Cemgil, « Bayesian Allocation Model: Marginal Likelihood-based Model Selection for Count Tensors », IEEE Journal of Selected Topics in Signal Processing (JSTSP), 2020
- T. H. Nguyen, U. Şimşekli, G. Richard, A. T. Cemgil, « Efficient Bayesian Model Selection in PARAFAC via Stochastic Thermodynamic Integration », IEEE Signal Processing Letters (SPL), 2018
International Conferences
- U. Şimşekli, O. Sener, G. Deligiannidis, M. A. Erdogdu, « Hausdorff Dimension, Heavy Tails, and Generalization in Neural Networks », Advances in Neural Information Processing Systems Conference (NeurIPS), 2020
- K. Nadjahi, A. Durmus, L. Chizat, S. Kolouri, S. Shahrampour, U. Şimşekli, « Statistical and Topological Properties of Sliced Probability Divergences », Advances in Neural Information Processing Systems Conference (NeurIPS), 2020
- V. De Bortoli, A. Durmus, X. Fontaine, U. Şimşekli, « Quantitative Propagation of Chaos for SGD in Wide Neural Networks », Advances in Neural Information Processing Systems Conference (NeurIPS), 2020
- A. Camuto, M. Willetts, U. Şimşekli, S. Roberts, C. Holmes, « Explicit Regularisation in Gaussian Noise Injections », Advances in Neural Information Processing Systems Conference (NeurIPS), 2020
- U. Şimşekli, L. Zhu, Y. W. Teh, M. Gürbüzbalaban, « Fractional Underdamped Langevin Dynamics: Retargeting SGD with Momentum under Heavy-Tailed Gradient Noise », International Conference on Machine Learning (ICML), 2020
- K. Nadjahi, V. De Bortoli, A. Durmus, R. Badeau, U. Şimşekli, « Approximate Bayesian Computation with the Sliced-Wasserstein Distance », IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Barcelona, Spain, 2020
- T. H. Nguyen, U. Şimşekli, M. Gürbüzbalaban, G. Richard, « First Exit Time Analysis of Stochastic Gradient Descent Under Heavy-Tailed Gradient Noise », Advances in Neural Information Processing Systems Conference (NeurIPS), Vancouver, British Columbia, Canada, 2019
- K. Nadjahi, A. Durmus, U. Şimşekli, R. Badeau, « Asymptotic Guarantees for Learning Generative Models with the Sliced-Wasserstein Distance », Advances in Neural Information Processing Systems Conference (NeurIPS), Vancouver, British Columbia, Canada, 2019 (Spotlight Presentation)
- S. Kolouri, K. Nadjahi, U. Şimşekli, R. Badeau, G. K. Rohde, « Generalized Sliced Wasserstein Distances », Advances in Neural Information Processing Systems Conference (NeurIPS), Vancouver, British Columbia, Canada, 2019
- U. Şimşekli, L. Sagun, M. Gürbüzbalaban, « A Tail-Index Analysis of Stochastic Gradient Noise in Deep Neural Networks », International Conference on Machine Learning (ICML), Long Beach, CA, USA, 2019
- T. H. Nguyen, U. Şimşekli, G. Richard, « Non-Asymptotic Analysis of Fractional Langevin Monte Carlo for Non-Convex Optimization », International Conference on Machine Learning (ICML), Long Beach, CA, USA, 2019
- A. Liutkus, U. Şimşekli, S. Majewski, A. Durmus, F. R. Stöter, « Sliced-Wasserstein Flows: Nonparametric Generative Modeling via Optimal Transport and Diffusions », International Conference on Machine Learning (ICML), Long Beach, CA, USA, 2019
- T. Birdal, U. Şimşekli, « Probabilistic Permutation Synchronization using the Riemannian Structure of the Birkhoff Polytope », IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 2019
- S. Leglaive, U. Şimşekli, A. Liutkus, L. Girin, R. Horaud, « Speech Enhancement with Variational Autoencoders and Alpha-Stable Distributions », IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Brighton, UK, 2019
- T. Birdal, U. Şimşekli, M. O. Eken, S. Ilic, « Bayesian Pose Graph Optimization via Bingham Distributions and Tempered Geodesic MCMC », Advances in Neural Information Processing Systems Conference (NeurIPS), Montréal, Quebec, Canada, 2018
- U. Şimşekli, Ç. Yildiz, T. H. Nguyen, G. Richard, A. T. Cemgil, « Asynchronous Stochastic Quasi-Newton MCMC for Non-Convex Optimization », International Conference on Machine Learning (ICML), Stockholm, Sweden, 2018
- M. Fontaine, F. R. Stöter, A. Liutkus, U. Şimşekli, R. Serizel, R. Badeau, « Multichannel Audio Modeling with Elliptically Stable Tensor Decomposition », International Conference on Latent Variable Analysis and Signal Separation (LVA-ICA), Guildford, UK, 2018
- U. Şimşekli, H. Tinky-Winky, S. Leglaive, A. Liutkus, R. Badeau, G. Richard, « Alpha-stable Low-rank Plus Residual Decomposition For Speech Enhancement », IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Calgary, AB, Canada, 2018
- M. Jas, T. Dupré La Tour, U. Şimşekli, A. Gramfort, « Learning the Morphology of Brain Signals Using Alpha-Stable Convolutional Sparse Coding », Advances in Neural Information Processing Systems (NIPS), Long Beach, CA, USA, 2017
- U. Şimşekli, « Fractional Langevin Monte Carlo: Exploring Lévy Driven Stochastic Differential Equations for Markov Chain Monte Carlo », International Conference on Machine Learning (ICML), Sydney, Australia, 2017
- U. Şimşekli, A. Durmus, R. Badeau, G. Richard, E. Moulines, A. T. Cemgil, « Parallelized Stochastic Gradient Markov Chain Monte Carlo Algorithms For Non-negative Matrix Factorization », IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), New Orleans, LA, USA, 2017
International Workshops
- M. B. Kurutmaz, M. Barsbey, A. T. Cemgil, S. Yildirim, U. Şimşekli, « Bayesian Model Selection for Identifying Markov Equivalent Causal Graphs », Advances in Approximate Bayesian Inference Workshop (AABI), Vancouver, British Columbia, Canada, 2019
- M. B. Kurutmaz, A. T. Cemgil, U. Şimşekli, M. Barsbey, S. Yildirim, « Bayesian Learning of Non-Negative Matrix/Tensor Factorizations by Simulating Polya Urns », Advances in Approximate Bayesian Inference Workshop (AABI), Montréal, Quebec, Canada, 2018
- M. B. Kurutmaz, A. T. Cemgil, U. Şimşekli, S. Yildirim, « Bayesian Nonnegative Matrix Factorization as an Allocation Model », Advances in Approximate Bayesian Inference Workshop (AABI) at the Neural Information Processing Systems Conference (NIPS), Long Beach, California, USA, 2017
Preprints
- U. Şimşekli, M. Gürbüzbalaban, L. Sagun, T. H. Nguyen, G. Richard, « On the Heavy-Tailed Theory of Stochastic Gradient Descent for Deep Neural Networks », arXiv, 2019
- A. T. Cemgil, M. B. Kurutmaz, S. Yildirim, M. Barsbey, U. Şimşekli, « Bayesian Allocation Model: Inference by Sequential Monte Carlo for Nonnegative Tensor Factorizations and Topic Models using Polya Urns », arXiv, 2019
Events:
- Kickoff meeting, Istanbul, 30 March 2017
- Scientific meeting, Paris, 26-30 June 2017
- Scientific meeting, Paris, 10-26 August 2018
- Scientific meeting, Istanbul, 09-15 October 2018