Avnish · Part 24: Diagonalization and Similarity of Matrices (Aug 23, 2019)
Diagonalization is the process of converting an n × n square matrix into a diagonal matrix whose diagonal entries are the eigenvalues of the original matrix…
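A minimal NumPy sketch of the idea (not from the article; the matrix A is my own example, chosen to be diagonalizable): the eigenvector matrix S and the diagonal eigenvalue matrix D satisfy A = S D S⁻¹.

```python
import numpy as np

# A diagonalizable 2x2 matrix (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of S are eigenvectors of A; D has the eigenvalues on its diagonal.
eigenvalues, S = np.linalg.eig(A)
D = np.diag(eigenvalues)

# A = S D S^-1, so multiplying the factors back recovers A.
print(np.allclose(S @ D @ np.linalg.inv(S), A))  # True
```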
Part 23: Orthonormal Vectors, Orthogonal Matrices and Hadamard Matrix (Aug 18, 2019)
Two vectors x and y are said to be orthogonal if they are perpendicular to each other, i.e. their dot product is 0.
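A quick NumPy illustration (the vectors x and y are my own example): orthogonal vectors have dot product 0, and normalizing them and stacking them as columns yields an orthogonal matrix Q with QᵀQ = I.

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([-2.0, 1.0])   # perpendicular to x

# Orthogonal vectors have dot product 0.
print(np.dot(x, y))          # 0.0

# Normalizing them gives orthonormal vectors; stacking those as
# columns gives an orthogonal matrix Q, for which Q^T Q = I.
Q = np.column_stack([x / np.linalg.norm(x), y / np.linalg.norm(y)])
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
```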
Part 22: Eigenvalues and Eigenvectors (Aug 2, 2019)
When we multiply a vector x by A, we get another vector as the product. If that product vector points in the same direction as x, then it will be…
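The definition can be checked directly in NumPy (the matrix and vectors here are my own example): for an eigenvector x, the product Ax is just a scalar multiple of x.

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

x = np.array([1.0, 0.0])
# Ax points in the same direction as x, so x is an eigenvector;
# the stretch factor 2 is its eigenvalue.
print(A @ x)   # [2. 0.]  (= 2 * x)

y = np.array([1.0, 1.0])
# y is not an eigenvector: A @ y = [2, 3] is not a multiple of y.
print(A @ y)   # [2. 3.]
```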
Part 21: Properties of Determinants (Jul 12, 2019)
We have solved determinants using Laplace expansion, but by leveraging the properties of determinants, we can solve them much…
Part 20: Determinants (Jun 17, 2019)
We learned about minors and cofactors in Part 19. Now we have the prerequisite knowledge for calculating the determinant of any (square) matrix.
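A small sketch of a cofactor (Laplace) expansion along the first row, with the result checked against NumPy's built-in determinant (the helper `det3` and the matrix M are my own, for illustration):

```python
import numpy as np

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    total = 0.0
    for j in range(3):
        # Minor: the 2x2 matrix left after deleting row 0 and column j.
        minor = np.delete(np.delete(M, 0, axis=0), j, axis=1)
        minor_det = minor[0, 0] * minor[1, 1] - minor[0, 1] * minor[1, 0]
        total += M[0, j] * (-1) ** j * minor_det  # cofactor = sign * minor det
    return total

M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
print(det3(M))                                 # -3.0
print(np.allclose(det3(M), np.linalg.det(M)))  # True
```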
Part 19: Minors and Cofactors (Jun 7, 2019)
A determinant is a scalar quantity that was introduced to solve linear equations. To compute the determinant of any matrix we have to…
Part 18: Norms (Apr 19, 2019)
A norm is a function that returns the length/size of a vector (it is positive for every vector except the zero vector).
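A quick NumPy example of two common norms (the vector v is my own): the Euclidean (L2) norm and the L1 norm, plus the zero vector as the only vector of norm 0.

```python
import numpy as np

v = np.array([3.0, 4.0])

# Euclidean (L2) norm: sqrt(3^2 + 4^2) = 5.
print(np.linalg.norm(v))       # 5.0
# L1 norm: |3| + |4| = 7.
print(np.linalg.norm(v, 1))    # 7.0
# The zero vector is the only vector with norm 0.
print(np.linalg.norm(np.zeros(2)))  # 0.0
```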
Part 17: Projections (Apr 15, 2019)
We covered projections in Dot Product. Now we will take a deep dive into projections and the projection matrix.
Part 16: Dimension and Basis (Mar 20, 2019)
The maximum number of linearly independent vectors in V is called the dimension of V, represented as dim(V).
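This count can be computed as the rank of a matrix whose rows are the vectors (the vectors below are my own example; one of them is a sum of the other two, so it adds nothing to the dimension):

```python
import numpy as np

# Three vectors in R^3; the third row is the sum of the first two,
# so only two of them are linearly independent.
V = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 1.0, 0.0]])

# dim(span of the rows) = rank = number of linearly independent vectors.
print(np.linalg.matrix_rank(V))  # 2
```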
Part 15: Orthogonality and Four Fundamental Subspaces (Mar 15, 2019)
Two vectors are said to be orthogonal if they meet at a right angle.