Computing Square Root Factorization for Recursively Low-Rank Compressed Matrices

We present an algorithm for computing a factorization A = GG* of an n × n Hermitian positive definite matrix A, where both A and G have the same recursively low-rank compressed structure. The factorization is a Cholesky factorization in a generalized sense, because the factor G is not (block) triangular. Both the time and the storage costs are O(n). The factorization can be used to sample a multivariate normal distribution or a Gaussian process whose (compressed) covariance matrix is A: a Gaussian sample is the mean vector plus the matrix-vector product of G with a standard Gaussian random vector, and this product can be formed with the O(n) algorithm discussed in [6].
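To illustrate the sampling step, the sketch below draws one sample from N(mean, A) using a square-root factor of the covariance matrix. It is a minimal sketch only, under stated assumptions: a dense NumPy Cholesky factor stands in for the recursively low-rank compressed factor G of the report (so the factorization and product cost O(n^3) and O(n^2) rather than the O(n) claimed for the compressed structure), and the names sample_gaussian, mean, and cov are illustrative, not taken from the report.

    import numpy as np

    def sample_gaussian(mean, cov, rng=None):
        # Draw one sample from N(mean, cov) via a square-root factor.
        # Here G is a dense lower-triangular Cholesky factor; in the report,
        # G would instead be a recursively low-rank compressed factor whose
        # matrix-vector product costs O(n).
        rng = np.random.default_rng() if rng is None else rng
        G = np.linalg.cholesky(cov)          # cov = G @ G.conj().T
        z = rng.standard_normal(len(mean))   # z ~ N(0, I)
        return mean + G @ z                  # sample = mean + G z

    # Example usage with a small symmetric positive definite covariance.
    n = 4
    A = np.eye(n) + 0.5 * np.ones((n, n))
    mu = np.zeros(n)
    x = sample_gaussian(mu, A)

With the compressed representation, the same two steps (factor A once, then apply G to a standard Gaussian vector) are what the report performs in O(n) time and storage.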

By: Jie Chen

Published as: IBM Research Report RC25499, 2014

LIMITED DISTRIBUTION NOTICE:

This report has been submitted for publication outside of IBM and will probably be copyrighted if accepted for publication. It has been issued as a Research Report for early dissemination of its contents. In view of the transfer of copyright to the outside publisher, its distribution outside of IBM prior to publication should be limited to peer communications and specific requests. After outside publication, requests should be filled only by reprints or legally obtained copies of the article (e.g., payment of royalties).