Uncertainty in Artificial Intelligence
Iterative Splits of Quadratic Bounds for Scalable Binary Tensor Factorization
Beyza Ermis, Guillaume Bouchard
Abstract:
Binary matrices and tensors are popular data structures that need to be efficiently approximated by low-rank representations. A standard approach is to minimize the logistic loss, which is well suited to binary data. In many cases, the number m of non-zero elements in the tensor is much smaller than the total number n of possible entries. This is problematic for large tensors because computing the logistic loss takes time linear in n. In this work, we show that an alternative approach is to minimize the quadratic loss (root mean square error), which leads to algorithms whose training time complexity is reduced from O(n) to O(m), as proposed earlier in the restricted case of alternating least-squares algorithms. In addition, we propose and study a greedy algorithm that partitions the tensor into smaller tensors, each approximated by a quadratic upper bound. This technique provides a time-accuracy trade-off between a fast but approximate algorithm and an accurate but slow algorithm. We show that it leads to a considerable speedup when learning factorizations of real-world tensors.
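The quadratic upper bounds mentioned in the abstract can be illustrated with the classical Jaakkola-Jordan bound on the logistic loss. The NumPy sketch below is purely illustrative (it is not the authors' code, and not necessarily the exact bound used in the paper); it checks numerically that the quadratic surrogate upper-bounds log(1 + exp(-x)) for an arbitrary choice of the variational parameter xi, which is the kind of surrogate that makes O(m) sparse updates possible.

    # Illustrative sketch: Jaakkola-Jordan quadratic upper bound on the logistic loss.
    # Assumption: this is one classical quadratic bound of the type the abstract refers to.
    import numpy as np

    def logistic_loss(x):
        # log(1 + exp(-x)), computed in a numerically stable way
        return np.logaddexp(0.0, -x)

    def quadratic_bound(x, xi):
        # Bound: log(1 + exp(-x)) <= lam*(x^2 - xi^2) - (x - xi)/2 + log(1 + exp(-xi)),
        # with lam = tanh(xi/2) / (4*xi); the bound is tight at x = +/- xi.
        lam = np.tanh(xi / 2.0) / (4.0 * xi)
        return lam * (x**2 - xi**2) - (x - xi) / 2.0 + np.logaddexp(0.0, -xi)

    x = np.linspace(-5.0, 5.0, 11)
    xi = 2.0
    # The quadratic surrogate dominates the logistic loss everywhere.
    assert np.all(quadratic_bound(x, xi) >= logistic_loss(x) - 1e-12)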
Pages: 192-199
PDF Link: /papers/14/p192-ermis.pdf
BibTex:
@INPROCEEDINGS{Ermis14,
AUTHOR = "Beyza Ermis and Guillaume Bouchard",
TITLE = "Iterative Splits of Quadratic Bounds for Scalable Binary Tensor Factorization",
BOOKTITLE = "Proceedings of the Thirtieth Conference Annual Conference on Uncertainty in Artificial Intelligence (UAI-14)",
PUBLISHER = "AUAI Press",
ADDRESS = "Corvallis, Oregon",
YEAR = "2014",
PAGES = "192--199"
}

