One of the most widely used matrix decompositions is the eigendecomposition, which is closely related to the concept of diagonalization and factors a matrix in terms of its eigenvalues and eigenvectors. Eigendecomposition plays a key role in computer vision and in machine learning in general. Well-known examples are PCA (Principal Component Analysis) for dimensionality reduction and Eigenfaces for face recognition. As another important example, Google's PageRank algorithm relies upon eigenvalues and eigenvectors to rank pages by relevance.
Although this problem of the week doesn't ask you to find the eigendecomposition of a matrix, it is related to the concept of eigenvalues, and to solve it you will need to understand the foundations of the procedure used to compute them.
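As a quick illustration of the underlying machinery, here is a minimal sketch using NumPy (the matrix A is a hypothetical example) of how the eigenvalues and eigenvectors of a diagonalizable matrix reconstruct it:

```python
import numpy as np

# A small symmetric matrix (hypothetical example data)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition: w holds the eigenvalues,
# the columns of V are the corresponding eigenvectors
w, V = np.linalg.eig(A)

# For a diagonalizable matrix, A = V @ diag(w) @ V^{-1}
reconstructed = V @ np.diag(w) @ np.linalg.inv(V)
print(w)                              # e.g. [3. 1.]
print(np.allclose(A, reconstructed))  # True
```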
Orthogonal bases have some practical advantages and are very useful when dealing with projections onto subspaces. These bases are defined in spaces equipped with an inner product (in R^n, the familiar dot product). By definition, a basis is called orthogonal if every pair of basis vectors is orthogonal, that is, their inner product is 0. When, in addition, each vector has length 1 (the vectors are normalized), the basis is called an orthonormal basis.
In an inner product space, it is always possible to obtain an orthonormal basis starting from any basis by applying the Gram-Schmidt algorithm. To solve this problem of the week you will need to show that you have mastered the Gram-Schmidt process, and you will also need to compute a change-of-basis matrix.
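For concreteness, here is a minimal sketch of the classical Gram-Schmidt process in Python with NumPy (the input basis is a hypothetical example); it orthonormalizes the columns of B and verifies the result:

```python
import numpy as np

def gram_schmidt(B):
    """Orthonormalize the columns of B via classical Gram-Schmidt."""
    Q = np.zeros_like(B, dtype=float)
    for j in range(B.shape[1]):
        v = B[:, j].astype(float)
        # Subtract the projections onto the previously built vectors
        for i in range(j):
            v -= (Q[:, i] @ B[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)  # normalize to length 1
    return Q

# Hypothetical basis of R^3 (columns are the basis vectors)
B = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(B)
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: columns are orthonormal
```

Note that the matrix B itself is already a change-of-basis matrix: its columns express the basis vectors in standard coordinates.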
In Linear Algebra, the most important subspaces are tied to matrices. One of these subspaces is the Column Space, which consists of all linear combinations of the columns of a matrix. This subspace, spanned by the columns, connects four of the most important subjects: Matrices, Systems of Linear Equations, Vector Spaces, and Linear Transformations.
This problem of the week is about this subspace, but to solve it you will also need to apply the concept of a symmetric matrix.
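As a sketch of how one might compute with this subspace, SymPy's exact arithmetic makes it easy to extract a basis for the column space (the matrix below is a hypothetical example, chosen to be symmetric with dependent columns):

```python
from sympy import Matrix

# Hypothetical symmetric matrix whose columns are all multiples of (1, 2, 3)
A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [3, 6, 9]])

print(A.is_symmetric())  # True
print(A.columnspace())   # basis for the column space: [Matrix([[1], [2], [3]])]
print(A.rank())          # 1: the dimension of the column space
```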
Linear transformations are one of the key concepts of Linear Algebra, and they are often considered among the most useful parts of this branch of mathematics. A linear transformation is a mapping between two vector spaces that preserves vector addition and scalar multiplication.
There are some important concepts students must master to solve linear transformation problems, such as the kernel, image, nullity, and rank of a linear transformation. This problem of the week deals with the kernel (the set of vectors in the domain that are mapped to the zero vector) and the nullity (the dimension of the kernel) of a linear transformation, and its solution only requires knowing how to work with matrices and perform elementary row operations.
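Since the solution boils down to row reduction, here is a minimal sketch using SymPy (the matrix is a hypothetical example) of computing the kernel and nullity of the transformation x ↦ Ax:

```python
from sympy import Matrix

# Hypothetical matrix representing a linear transformation x -> A*x
A = Matrix([[1, 2, 1],
            [2, 4, 2]])

# The kernel is the null space of A, found by row reduction
kernel_basis = A.nullspace()
print(kernel_basis)       # basis vectors of the kernel
print(len(kernel_basis))  # nullity = 2
print(A.rank() + len(kernel_basis) == A.cols)  # rank-nullity theorem: True
```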
Many concepts of Linear Algebra emerged from geometric problems and were then generalized to non-visual, higher-dimensional spaces. Some of the most widely used geometric concepts are length, distance, and perpendicularity, which provide powerful geometric tools for solving many applied problems, including least-squares problems.
All three of these notions are defined in terms of the inner product of two vectors, which is also the key concept for working with orthogonal bases, the subject of this week's problem. Orthogonal bases, and particularly orthonormal bases, are very useful when dealing with projections onto subspaces, among other problems.
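To make the connection concrete, here is a minimal NumPy sketch (the vectors and subspace are hypothetical examples) showing how length, distance, and perpendicularity all reduce to the inner product, and how an orthonormal basis simplifies projection onto a subspace:

```python
import numpy as np

u = np.array([3.0, 4.0, 0.0])
v = np.array([-4.0, 3.0, 0.0])

length = np.sqrt(u @ u)                 # ||u|| = sqrt(<u, u>) = 5.0
distance = np.sqrt((u - v) @ (u - v))   # dist(u, v) = ||u - v||
perpendicular = np.isclose(u @ v, 0.0)  # u ⊥ v iff <u, v> = 0: True

# With an orthonormal basis {q1, q2} of a subspace W, the projection of x
# onto W is simply <x, q1> q1 + <x, q2> q2 (no linear system to solve).
q1 = u / np.linalg.norm(u)
q2 = v / np.linalg.norm(v)
x = np.array([1.0, 2.0, 3.0])
proj = (x @ q1) * q1 + (x @ q2) * q2
print(length, distance, perpendicular)
print(proj)  # [1. 2. 0.]: the projection of x onto the plane spanned by q1, q2
```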