Consider a linear transformation represented by a matrix $~$A$~$ and some nonzero vector $~$v$~$. If $~$Av = \lambda v$~$, we say that $~$v$~$ is an *eigenvector* of $~$A$~$ with corresponding *eigenvalue* $~$\lambda$~$. Intuitively, this means that $~$A$~$ doesn't change the direction of $~$v$~$: it can only stretch it ($~$|\lambda| > 1$~$), squash it ($~$|\lambda| < 1$~$), and maybe flip it ($~$\lambda < 0$~$). While this notion may initially seem obscure, it turns out to have many useful applications, and many fundamental properties of a linear transformation can be characterized by its eigenvalues and eigenvectors.
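To make the definition concrete, here is a minimal sketch using NumPy's `np.linalg.eig`; the matrix $~$A$~$ below is just an illustrative example, not anything from the text above:

```python
import numpy as np

# An illustrative symmetric 2x2 matrix. Its eigenvectors point along
# the diagonals [1, 1] and [1, -1], with eigenvalues 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues and a matrix whose COLUMNS are
# the corresponding (unit-length) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Check the defining equation A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(sorted(eigenvalues))  # the two eigenvalues, smallest first
```

Note that `A @ v` leaves the direction of each eigenvector unchanged and only scales its length by the eigenvalue, which is exactly the geometric picture described above.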