Zhang, Kaiqi and Hawkins, Cole and Zhang, Zheng (2022) General-Purpose Bayesian Tensor Learning With Automatic Rank Determination and Uncertainty Quantification. Frontiers in Artificial Intelligence, 4. ISSN 2624-8212
pubmed-zip/versions/1/package-entries/frai-04-668353/frai-04-668353.pdf - Published Version (559 kB)
Abstract
A major challenge in many machine learning tasks is that a model's expressive power depends on its size. Low-rank tensor methods are an efficient tool for handling the curse of dimensionality in many large-scale machine learning models. The major challenges in training a tensor learning model include how to process high-volume data, how to determine the tensor rank automatically, and how to estimate the uncertainty of the results. While most existing tensor learning methods focus on a specific task, this paper proposes a generic Bayesian framework that can be employed to solve a broad class of tensor learning problems, such as tensor completion, tensor regression, and tensorized neural networks. We develop a low-rank tensor prior for automatic rank determination in nonlinear problems. Our method is implemented with both stochastic gradient Hamiltonian Monte Carlo (SGHMC) and Stein Variational Gradient Descent (SVGD), and we compare the automatic rank determination and uncertainty quantification of these two solvers. We demonstrate that the proposed method can determine the tensor rank automatically and can quantify the uncertainty of the obtained results. We validate our framework on tensor completion tasks and tensorized neural network training tasks.
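To make the abstract's ingredients concrete, the sketch below illustrates the general idea of Bayesian low-rank tensor completion with an ARD-style (automatic rank determination) prior over CP factor columns, sampled with a simplified SGHMC update. This is not the paper's implementation: all names, hyperparameter values, the fixed noise variance, and the posterior-mean update of the per-component precisions are illustrative assumptions.

```python
# Hedged sketch: Bayesian CP tensor completion with an ARD-style low-rank prior
# and a simplified SGHMC sampler. Illustrative only, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

# --- synthetic rank-3 tensor with 30% of entries observed ------------------
I, J, K, true_rank = 15, 15, 15, 3
A_true = [rng.normal(size=(d, true_rank)) for d in (I, J, K)]
full = np.einsum('ir,jr,kr->ijk', *A_true)
mask = rng.random(full.shape) < 0.3
obs_idx = np.argwhere(mask)                     # (N, 3) observed coordinates
y = full[mask] + 0.1 * rng.normal(size=mask.sum())

# --- model: CP factors with shared per-component precisions (ARD prior) ----
R = 8                                           # deliberately over-parameterized rank
factors = [rng.normal(size=(d, R)) for d in (I, J, K)]
velocity = [np.zeros_like(f) for f in factors]
lam = np.ones(R)                                # per-component precisions
sigma2, a0, b0 = 0.25, 1e-3, 1e-3               # fixed noise variance, Gamma hyperprior

def grad_log_post(factors, lam, idx, yb, scale):
    """Stochastic gradient of the log posterior w.r.t. each factor matrix."""
    i, j, k = idx.T
    rows = [factors[0][i], factors[1][j], factors[2][k]]    # (B, R) each
    pred = (rows[0] * rows[1] * rows[2]).sum(axis=1)
    resid = (yb - pred) / sigma2
    grads = []
    for m, (f, ind) in enumerate(zip(factors, (i, j, k))):
        others = rows[(m + 1) % 3] * rows[(m + 2) % 3]       # product over the other modes
        g_lik = np.zeros_like(f)
        np.add.at(g_lik, ind, resid[:, None] * others)       # scatter-add per observed entry
        grads.append(scale * g_lik - lam * f)                # ARD prior pulls unused columns to 0
    return grads

# --- simplified SGHMC loop --------------------------------------------------
eta, alpha, batch = 5e-5, 0.1, 256
for step in range(3000):
    b = rng.choice(len(y), size=batch, replace=False)
    grads = grad_log_post(factors, lam, obs_idx[b], y[b], len(y) / batch)
    for f, v, g in zip(factors, velocity, grads):
        noise = np.sqrt(2 * alpha * eta) * rng.normal(size=f.shape)
        v[:] = (1 - alpha) * v + eta * g + noise             # friction + injected noise
        f += v
    # closed-form refresh of ARD precisions (posterior-mean simplification)
    col_sq = sum((f ** 2).sum(axis=0) for f in factors)
    lam = (a0 + 0.5 * (I + J + K)) / (b0 + 0.5 * col_sq)

# Large precisions mark pruned components; a crude effective-rank estimate
# counts components whose precision stays close to the smallest one.
print("effective rank estimate:", int((lam < 10 * lam.min()).sum()))
```

Running repeated SGHMC chains (or an SVGD particle ensemble in place of the single chain above) over the retained components is what would provide the uncertainty estimates the abstract refers to; the pruning behaviour of the precisions `lam` is what realizes automatic rank determination in this toy setting.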
| Item Type: | Article |
| --- | --- |
| Subjects: | EP Archives > Multidisciplinary |
| Depositing User: | Managing Editor |
| Date Deposited: | 23 Mar 2023 05:39 |
| Last Modified: | 02 Jun 2024 06:49 |
| URI: | http://research.send4journal.com/id/eprint/891 |