
Math/Linear Algebra

(15)
Diagonalizable Matrix Example 1.1 (Diagonalization with n independent eigenvectors) Suppose we have n linearly independent eigenvectors of $A$. Put them in the columns of $S$: $AS = A[v_1,\dots,v_n]=[Av_1,\dots,Av_n]=[\lambda_1 v_1,\dots,\lambda_n v_n]=[v_1,\dots,v_n]\begin{pmatrix}\lambda_1 & & \\ & \ddots & \\ & & \lambda_n\end{pmatrix}=S\Lambda$, so $S^{-1}AS = \Lambda$. Example 1.2 ($A^2$) Under the same setting as Example 1...
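The factorization $S^{-1}AS=\Lambda$ above can be checked numerically. A minimal numpy sketch, using an illustrative 2 by 2 matrix (not from the post) whose eigenvalues are distinct, so its eigenvectors are independent:

```python
import numpy as np

# A small matrix with distinct eigenvalues (5 and 2), chosen for illustration.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, S = np.linalg.eig(A)    # columns of S are the eigenvectors v_i
Lam = np.linalg.inv(S) @ A @ S   # S^{-1} A S should be diag(lambda_1, lambda_2)
```

Here `Lam` recovers the diagonal matrix of eigenvalues, confirming $S^{-1}AS=\Lambda$.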
Eigenvalues and Eigenvectors Eigenvectors: $Ax$ is parallel to $x$. What does it mean to be parallel? ⇒ $Ax=\lambda x$ Example 1.1 $\lambda=0$ If $\lambda=0$, the eigenvectors form the null space of $A$. In other words, if $A$ is singular, it has the eigenvalue $\lambda=0$. Example 1.2 Projection For any $x$ in the plane, $Px=x$: $\lambda=1$, and the whole plane consists of eigenvectors. For any $x\perp$ plane, $Px=0$: the eigenvalue is 0. ⇒ Therefore, eigen valu..
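The claim that a singular matrix has eigenvalue $\lambda=0$, with the corresponding eigenvector lying in the null space, can be sketched in numpy. The rank-1 matrix below is an arbitrary illustration:

```python
import numpy as np

# A rank-1 (hence singular) matrix: its eigenvalues are 0 and 5.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigvals, eigvecs = np.linalg.eig(A)
zero_idx = np.argmin(np.abs(eigvals))  # index of the (numerically) zero eigenvalue
v = eigvecs[:, zero_idx]               # eigenvector for lambda = 0
# A v = 0, so v lies in the null space of A
```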
Cramer's Rule, Inverse Matrix, Volume $A\,C^T=(\det A)\,I$ The determinant of a matrix can be expressed with cofactors. For the 2 by 2 case, consider the simple example below. Recalling the cofactor formula $\sum\limits_i a_{1i}C_{1i}$, we see that the cofactor expression holds for the diagonal terms. The off-diagonal terms are 0 because computing $\sum\limits_i a_{1i}C_{2i}$ amounts to computing the determinant of the matrix obtained from $A$ by replacing row 2 with the row-1 entries $a_{1*}$. Having two identical rows, that matrix is singular, so the off-diagona..
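The identity $A\,C^T=(\det A)\,I$ can be verified by building the cofactor matrix entry by entry. A sketch on an arbitrary invertible 3 by 3 matrix (chosen for illustration, not from the post):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

# Cofactor C[i, j] = (-1)^{i+j} times the determinant of the minor
# obtained by deleting row i and column j.
C = np.zeros_like(A)
for i in range(n):
    for j in range(n):
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)

lhs = A @ C.T  # should equal det(A) * I
```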
Cofactors and Determinants Computing an n by n determinant: $\begin{vmatrix}a&b\\c&d\end{vmatrix}=\begin{vmatrix}a&0\\c&d\end{vmatrix}+\begin{vmatrix}0&b\\c&d\end{vmatrix}$ (by Property 3: linearity of the determinant) $=\begin{vmatrix}a&0\\c&0\end{vmatrix}+\begin{vmatrix}a&0\\0&d\end{vmatrix}+\begin{vmatrix}0&b\\c&0\end{vmatrix}+\begin{vmatrix}0&b\\0&d\end{vmatrix}=ad-bc$ Now we can do 3 by 3, 4 by 4, .... In the 3 by 3 case too, the determinant can easily be computed by using linearity and the permutation properties, as below. Example 1 Generalizing, for an n by n matrix, $\det A=\sum_{n!\text{ terms}}\pm\, a_{1\alpha}a_{2\beta}\cdots a_{n\omega}$. Example 2 Using this, we can also see that $A^T$ and $A$ have the same determinant. Example 3...
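The "big formula" $\det A=\sum_{n!\text{ terms}}\pm\, a_{1\alpha}a_{2\beta}\cdots a_{n\omega}$ can be implemented directly as a sum over all permutations and checked against numpy's determinant. A sketch with an arbitrary 3 by 3 example:

```python
import numpy as np
from itertools import permutations

def perm_sign(p):
    """Sign of a permutation: +1 for an even number of inversions, -1 for odd."""
    inversions = sum(1 for i in range(len(p)) for j in range(i + 1, len(p))
                     if p[i] > p[j])
    return -1 if inversions % 2 else 1

def big_formula_det(A):
    """det A as a sum of n! signed products, one per permutation."""
    n = A.shape[0]
    return sum(perm_sign(p) * np.prod([A[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 4.0, 5.0],
              [1.0, 0.0, 6.0]])
```

This is $O(n\cdot n!)$ and only practical for tiny matrices, but it makes the structure of the formula explicit.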
Determinant Properties of the Determinant 1) $\det I=1$ 2) Exchanging rows reverses the sign of the determinant: a permutation matrix has determinant 1 or -1 (even or odd number of exchanges) $\begin{vmatrix}0&1\\1&0\end{vmatrix}=-1$ A matrix reached by 7 row exchanges cannot be reached by 10 row exchanges! 3) Linear in each row a) $\begin{vmatrix}ta&tb\\c&d\end{vmatrix}=t\begin{vmatrix}a&b\\c&d\end{vmatrix}$ b) $\begin{vmatrix}a+a'&b+b'\\c&d\end{vmatrix}=\begin{vmatrix}a&b\\c&d\end{vmatrix}+\begin{vmatrix}a'&b'\\c&d\end{vmatrix}$ 4) Two equal rows ⇒ det = 0. Why for n by n? Exchange the rows ⇒ get th..
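Properties 2 and 3a above are easy to confirm numerically. A minimal sketch using an arbitrary 2 by 2 matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Property 2: exchanging two rows reverses the sign of the determinant.
swapped = A[[1, 0], :]

# Property 3a: multiplying one row by t multiplies the determinant by t.
t = 5.0
scaled = A.copy()
scaled[0, :] *= t
```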
Orthonormal Basis Definition 1.1 (orthonormal sets) A set of $k$ vectors $q_1,q_2,\dots,q_k\in Q$ is said to be an orthonormal set if and only if $q^T_i q_j =0$ if $i\neq j$ and $1$ if $i=j$. Therefore, all vectors are orthogonal to each other and have unit norm. Definition 1.2 (orthonormal basis) If an orthonormal set is a basis for its space, then it is called an orthonormal basis. Definition 1.3 (orthogonal matrix) If $Q$...
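A matrix whose columns form an orthonormal set satisfies $Q^TQ=I$. One way to obtain such columns is QR factorization; a sketch with an arbitrary full-rank input matrix:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

Q, R = np.linalg.qr(A)  # columns of Q are orthonormal
gram = Q.T @ Q          # q_i^T q_j: should be the 2x2 identity
```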
Least Squares and Straight Line Projection $P=A(A^TA)^{-1}A^T$ Extreme cases: If $b$ is in the column space, $Pb=b$, since $b$ is a column combination of $A$: $b=Ax$ ⇒ $Pb= A(A^TA)^{-1}A^TAx=Ax=b$. If $b\perp$ column space, $Pb=0$: $b$ is in the left null space ⇒ $A^Tb=0$. Projection onto the left null space: $I-P$ is the projection matrix onto the left null space. Find the best straight line. Line: $b=C+Dt$, data points: (1,1), (2,2), (3,2) $C+D=1$ $C+2D=2$ $C+3D..
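The best-line problem can be sketched by solving the normal equations $A^TA\hat{x}=A^Tb$, assuming the data points (1,1), (2,2), (3,2) that match the equations $C+D=1$, $C+2D=2$ shown above:

```python
import numpy as np

t = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0])
A = np.column_stack([np.ones_like(t), t])  # columns: [1, t] for b = C + D t

x_hat = np.linalg.solve(A.T @ A, A.T @ b)  # [C, D]
p = A @ x_hat                              # projection of b onto col(A)
```

The error $b-p$ is orthogonal to the column space, i.e. $A^T(b-p)=0$, which is exactly the normal-equation condition.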
Projections onto Subspaces Projection of a vector: $p = \frac{aa^T}{a^Ta}b$ If $b$ is doubled, the projection of $b$ is doubled. If $a$ is doubled, the projection of $b$ is the same. $\mathrm{proj}\;p=Pb$, $P=\frac{aa^T}{a^Ta}$, $\mathrm{col}(P)=$ line through $a$, $\mathrm{rank}(P)=1$, $P^T=P$, $P^2= P$ Projection onto a matrix's column space: because $Ax=b$ may have no solution ⇒ find the closest vector in the column space ⇒ solve $A\hat{x}=p$, $p= a_1\hat{x_1} +a_2\hat{x_2}$, $p=A\hat{x}$...
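The rank-1 projection $P=\frac{aa^T}{a^Ta}$ and its listed properties ($P^T=P$, $P^2=P$, doubling $b$ doubles $Pb$) can be checked directly. A sketch with an arbitrary vector $a$:

```python
import numpy as np

a = np.array([[1.0], [2.0], [2.0]])  # column vector; a^T a = 9
P = (a @ a.T) / (a.T @ a)            # rank-1 projection onto the line through a

b = np.array([[1.0], [1.0], [1.0]])
p = P @ b                            # projection of b onto that line
```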