QR Decomposition using Python

Python3

import numpy as np
 
# Create a numpy array
arr = np.array([[1, 2, 4], [0, 0, 5],
                [0, 3, 6]])
 
print(arr)
 
# Find the QR factor of array
q, r = np.linalg.qr(arr)
print('\nQ:\n', q)
print('\nR:\n', r)
print(np.allclose(arr, np.dot(q, r)))  # check that the result is correct

                    

Output:

[[1 2 4]
 [0 0 5]
 [0 3 6]]

Q:
 [[ 1.  0.  0.]
 [ 0.  0. -1.]
 [ 0. -1.  0.]]

R:
 [[ 1.  2.  4.]
 [ 0. -3. -6.]
 [ 0.  0. -5.]]
True
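Beyond reconstructing A, one can also verify the two defining properties of the factors: Q is orthogonal and R is upper triangular. A small sketch using the same array as above:

```python
import numpy as np

arr = np.array([[1, 2, 4], [0, 0, 5], [0, 3, 6]])
q, r = np.linalg.qr(arr)

# Q is orthogonal: its transpose is its inverse, so Q^T Q = I
print(np.allclose(q.T @ q, np.eye(3)))   # True

# R is upper triangular: it equals its own upper-triangular part
print(np.allclose(r, np.triu(r)))        # True
```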

Mathematical explanation

Let's understand the QR Decomposition process by working through the same matrix used in the code above.

Suppose we are provided with the matrix A:

A = \begin{bmatrix} 1 & 2 & 4 \\ 0 & 0 & 5 \\ 0 & 3 & 6 \end{bmatrix}

As mentioned in the steps before, we will be using Gram-Schmidt Orthogonalization. We will be finding the orthonormal vectors q1, q2 and q3 from the columns a1, a2 and a3 of A.

First, perform normalization on the first column. Its norm is \|a_1\| = 1, so we get the first normalized vector:

q_1 = \frac{a_1}{\|a_1\|} = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}

Next, the inner product between a2 and q1 is \langle a_2, q_1 \rangle = 2, and the projection of the second column on q1 is this inner product multiplied by q1. Subtracting it gives the residual of the projection:

a_2 - \langle a_2, q_1 \rangle \, q_1 = \begin{bmatrix} 2 \\ 0 \\ 3 \end{bmatrix} - 2 \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 3 \end{bmatrix}

Now, we will normalize the residual (its norm is 3):

q_2 = \frac{1}{3} \begin{bmatrix} 0 \\ 0 \\ 3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}

Now, we will project a3 on q1 and q2, with \langle a_3, q_1 \rangle = 4 and \langle a_3, q_2 \rangle = 6, and take the residual:

a_3 - 4 q_1 - 6 q_2 = \begin{bmatrix} 4 \\ 5 \\ 6 \end{bmatrix} - 4 \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} - 6 \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 5 \\ 0 \end{bmatrix}

Now, we will normalize this residual (its norm is 5):

q_3 = \frac{1}{5} \begin{bmatrix} 0 \\ 5 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}

We got the Q matrix:

Q = \begin{bmatrix} q_1 & q_2 & q_3 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix}

The inner products and norms computed along the way (norms on the diagonal, inner products above it) give R, which is an upper triangular matrix:

R = \begin{bmatrix} 1 & 2 & 4 \\ 0 & 3 & 6 \\ 0 & 0 & 5 \end{bmatrix}
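The steps above can be sketched as a classical Gram-Schmidt routine. This is a minimal illustration of the hand calculation, not the algorithm NumPy uses internally (NumPy calls LAPACK, which uses Householder reflections):

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt QR of a matrix with linearly independent columns."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # inner product <a_j, q_i>
            v -= R[i, j] * Q[:, i]        # subtract the projection on q_i
        R[j, j] = np.linalg.norm(v)       # norm of the residual
        Q[:, j] = v / R[j, j]             # normalize to get q_j
    return Q, R

Q, R = gram_schmidt_qr([[1, 2, 4], [0, 0, 5], [0, 3, 6]])
print(Q)  # columns are q1, q2, q3 from the derivation above
print(R)
```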

The hand-computed factors (Q = [q1 q2 q3], with A = QR) differ from those returned by the NumPy package. The reason is described below.

Reason for the difference between the NumPy results and our step-by-step calculation:

The QR decomposition is unique only up to signs: one can flip the sign of a column of Q as long as the corresponding row of R is flipped as well. Some implementations enforce a positive diagonal in R, but this is just a convention. Since NumPy defers to LAPACK for these linear algebra operations, it follows LAPACK's conventions, which do not enforce such a requirement.
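As a sketch of that sign convention, one can post-process the NumPy factors so the diagonal of R becomes non-negative. The product QR is unchanged, because each column flip in Q cancels against the matching row flip in R:

```python
import numpy as np

arr = np.array([[1, 2, 4], [0, 0, 5], [0, 3, 6]])
q, r = np.linalg.qr(arr)

signs = np.sign(np.diag(r))
signs[signs == 0] = 1      # leave any zero diagonal entries alone
q = q * signs              # flip the corresponding columns of Q
r = signs[:, None] * r     # flip the corresponding rows of R

print(np.all(np.diag(r) >= 0))   # True: diagonal of R is now non-negative
print(np.allclose(arr, q @ r))   # True: A = QR still holds
```

After this normalization the factors match the Q and R obtained by hand above.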

