torch.qr

torch.qr(input, some=True, *, out=None) -> (Tensor, Tensor)

Computes the QR decomposition of a matrix or a batch of matrices input, and returns a namedtuple (Q, R) of tensors such that input = QR, with Q being an orthogonal matrix or batch of orthogonal matrices and R being an upper triangular matrix or batch of upper triangular matrices.

If some is True, then this function returns the thin (reduced) QR factorization. Otherwise, if some is False, this function returns the complete QR factorization.

Warning
torch.qr is deprecated. Please use torch.linalg.qr() instead.

Differences with torch.linalg.qr:

- torch.linalg.qr takes a string parameter mode instead of some:
  - some=True is equivalent to mode='reduced': both are the default
  - some=False is equivalent to mode='complete'.
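As a migration sketch (assuming a PyTorch version in which torch.linalg.qr is available; the matrix a and the tolerance are arbitrary illustration choices), the some flag maps onto mode as follows:

import torch

a = torch.randn(4, 3)

# Deprecated call: thin (reduced) factorization
q_old, r_old = torch.qr(a, some=True)

# Replacement call: mode='reduced' is the default, matching some=True
q_new, r_new = torch.linalg.qr(a, mode='reduced')

# some=False corresponds to the complete factorization
q_full, r_full = torch.qr(a, some=False)
q_full2, r_full2 = torch.linalg.qr(a, mode='complete')

# Both APIs reconstruct the input from their factors
assert torch.allclose(q_old @ r_old, a, atol=1e-6)
assert torch.allclose(q_new @ r_new, a, atol=1e-6)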
Warning
If you plan to backpropagate through QR, note that the current backward implementation is only well-defined when the first min(input.size(-1), input.size(-2)) columns of input are linearly independent. This behavior will probably change once QR supports pivoting.

Note
This function uses LAPACK for CPU inputs and MAGMA for CUDA inputs, and may produce different (valid) decompositions on different device types or different platforms.
- Parameters
  - input (Tensor) – the input tensor of size (*, m, n) where * is zero or more batch dimensions consisting of matrices of dimension m × n.
  - some (bool, optional) – Set to True for reduced QR decomposition and False for complete QR decomposition. If k = min(m, n), then (see the shape sketch after the example below):
    - some=True: returns (Q, R) with dimensions (m, k), (k, n) (default)
    - some=False: returns (Q, R) with dimensions (m, m), (m, n)
- Keyword Arguments
  - out (tuple, optional) – tuple of Q and R tensors. The dimensions of Q and R are detailed in the description of some above.
Example:
>>> a = torch.tensor([[12., -51, 4], [6, 167, -68], [-4, 24, -41]])
>>> q, r = torch.qr(a)
>>> q
tensor([[-0.8571,  0.3943,  0.3314],
        [-0.4286, -0.9029, -0.0343],
        [ 0.2857, -0.1714,  0.9429]])
>>> r
tensor([[ -14.0000,  -21.0000,   14.0000],
        [   0.0000, -175.0000,   70.0000],
        [   0.0000,    0.0000,  -35.0000]])
>>> torch.mm(q, r).round()
tensor([[  12.,  -51.,    4.],
        [   6.,  167.,  -68.],
        [  -4.,   24.,  -41.]])
>>> torch.mm(q.t(), q).round()
tensor([[ 1.,  0.,  0.],
        [ 0.,  1., -0.],
        [ 0., -0.,  1.]])
>>> a = torch.randn(3, 4, 5)
>>> q, r = torch.qr(a, some=False)
>>> torch.allclose(torch.matmul(q, r), a)
True
>>> torch.allclose(torch.matmul(q.transpose(-2, -1), q), torch.eye(4))
True
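To make the shapes implied by some concrete, here is a minimal sketch (not part of the original documentation; the 5×3 input is an arbitrary choice):

import torch

a = torch.randn(5, 3)                  # m = 5, n = 3, so k = min(m, n) = 3

# Thin (reduced) factorization: Q is (m, k), R is (k, n)
q_thin, r_thin = torch.qr(a, some=True)
print(q_thin.shape, r_thin.shape)      # torch.Size([5, 3]) torch.Size([3, 3])

# Complete factorization: Q is (m, m), R is (m, n)
q_full, r_full = torch.qr(a, some=False)
print(q_full.shape, r_full.shape)      # torch.Size([5, 5]) torch.Size([5, 3])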