# Conservation of Total Angular Momentum Proof

In this post, I want to prove that in the absence of external forces, the total angular momentum of an N-particle system is conserved.

I start with $\vec{P} = \displaystyle \sum_{\alpha} \vec{p}_{\alpha}$, the total momentum of an N-particle system. Taking the cross product of each particle's position vector $\vec{r}_{\alpha}$, measured from the same origin $O$ for every particle, with its momentum gives the total angular momentum of the system, which can be written as $\vec{L} = \displaystyle \sum_{\alpha = 1}^{N} \vec{\ell}_{\alpha} = \displaystyle\sum_{\alpha = 1}^{N} \vec{r}_{\alpha} \times \vec{p}_{\alpha}$. After differentiating with respect to $t$, I obtain $\dot{\vec{L}} = \displaystyle \sum_{\alpha} \dot{\vec{\ell}}_{\alpha} = \displaystyle \sum_{\alpha} \frac{d}{dt} (\vec{r}_{\alpha} \times \vec{p}_{\alpha}) = \displaystyle \sum_{\alpha} \left[ (\dot{\vec{r}}_{\alpha} \times \vec{p}_{\alpha}) + (\vec{r}_{\alpha} \times \dot{\vec{p}}_{\alpha}) \right]$. In the first cross product, I can substitute $m_{\alpha}\dot{\vec{r}}_{\alpha}$ for $\vec{p}_{\alpha}$, and since the cross product of any two parallel vectors is zero, the first term vanishes. In the second term, Newton's second law gives $\dot{\vec{p}}_{\alpha} = \vec{F}_{\alpha}$, so the sum reduces to $\dot{\vec{L}} = \displaystyle\sum_{\alpha}\vec{r}_{\alpha} \times \vec{F}_{\alpha}$. Since there are no external forces, I can write the net force on particle $\alpha$ as $\vec{F}_{\alpha} = \displaystyle \sum_{\beta \neq \alpha} \vec{F}_{\alpha \beta}$, where $\vec{F}_{\alpha \beta}$ represents the force exerted on particle $\alpha$ by particle $\beta$. Making this substitution for $\vec{F}_{\alpha}$ gives $\dot{\vec{L}} = \displaystyle \sum_{\alpha} \displaystyle \sum_{\beta \neq \alpha} \vec{r}_{\alpha} \times \vec{F}_{\alpha \beta}$.

…I will finish the rest of this at some point. I seem to have misplaced the book.
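Even with the algebra unfinished, the claim itself can be checked numerically. Below is a minimal sketch of my own (not from the book): two particles interacting only through an equal-and-opposite central force, stepped with a leapfrog integrator. Every mass, position, and velocity is an arbitrary value chosen for the demo. Because each force kick exerts zero net torque about $O$, the total angular momentum $\vec{L} = \sum_{\alpha} \vec{r}_{\alpha} \times \vec{p}_{\alpha}$ should stay constant to floating-point accuracy.

```python
import numpy as np

# Arbitrary demo values: masses, positions, velocities of two particles.
m = np.array([1.0, 2.0])
r = np.array([[1.0, 0.0, 0.0], [-0.5, 0.3, 0.0]])
v = np.array([[0.0, 0.4, 0.0], [0.1, -0.2, 0.0]])

def internal_forces(r):
    """Equal-and-opposite inverse-square attraction between the pair."""
    d = r[1] - r[0]
    f = d / np.linalg.norm(d) ** 3   # force on particle 0, toward particle 1
    return np.array([f, -f])         # Newton's third law: F_10 = -F_01

def total_L(r, v):
    """Total angular momentum: sum over particles of r x (m v)."""
    return np.sum(np.cross(r, m[:, None] * v), axis=0)

L0 = total_L(r, v)
dt = 1e-4
for _ in range(10_000):              # leapfrog: kick, drift, kick
    v += 0.5 * dt * internal_forces(r) / m[:, None]
    r += dt * v
    v += 0.5 * dt * internal_forces(r) / m[:, None]

assert np.allclose(total_L(r, v), L0, atol=1e-8)
```

The conservation here is exact up to rounding, not just small: each kick changes $\vec{L}$ by $\vec{r}_{1} \times \vec{F} + \vec{r}_{2} \times (-\vec{F}) = (\vec{r}_{1} - \vec{r}_{2}) \times \vec{F} = \vec{0}$ for a central force, and each drift changes it by $\sum_{\alpha} \vec{v}_{\alpha} \times m_{\alpha}\vec{v}_{\alpha} \, dt = \vec{0}$.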

# Matrix Exponential Proof

I want to show that $e^{At} = Se^{\Lambda t} S^{-1}$, where $A$ is a diagonalizable $2 \times 2$ matrix, $S = (v_{1} \; v_{2})$ is the matrix whose columns are the eigenvectors of $A$, and $\Lambda = \begin{pmatrix} \lambda_{1} & 0 \\0 & \lambda_{2} \end{pmatrix}$.

I start by writing the middle factor as a power series, which gives me $Se^{\Lambda t}S^{-1} = S \left( \displaystyle\sum_{k = 0}^{\infty} \frac{1}{k!}(\Lambda t)^{k} \right) S^{-1}$. Now I can use the identity $\Lambda = S^{-1}AS$, which gives $S \left( \displaystyle\sum_{k = 0}^{\infty} \frac{1}{k!}(S^{-1}AS)^{k}t^{k} \right) S^{-1}$. Since $(S^{-1}AS)^{k} = S^{-1}AS \, S^{-1}AS \cdots S^{-1}AS = S^{-1}A^{k}S$ (every interior $SS^{-1}$ pair cancels), I can pull the outer $S^{-1}$ and $S$ out of the sum to get $SS^{-1} \left( \displaystyle \sum_{k = 0}^{\infty} \frac{t^{k}A^{k}}{k!} \right) SS^{-1}$. The $SS^{-1}$ products are identity matrices, and the remaining sum is exactly the power series for $e^{At}$, which completes the proof.
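As a quick numerical sanity check (my own sketch, with an arbitrarily chosen diagonalizable $A$ and time $t$), I can compare a truncated version of the power series for $e^{At}$ against $S e^{\Lambda t} S^{-1}$ built from `numpy`'s eigendecomposition:

```python
import numpy as np

def expm_series(M, terms=30):
    """Truncated matrix power series: sum over k of M^k / k!."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k          # builds M^k / k! incrementally
        out = out + term
    return out

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])         # arbitrary diagonalizable 2x2 matrix
t = 0.7

eigvals, S = np.linalg.eig(A)        # columns of S are the eigenvectors
# S e^{Lambda t} S^{-1}: the diagonal exponential is just exp of each eigenvalue
rhs = S @ np.diag(np.exp(eigvals * t)) @ np.linalg.inv(S)
lhs = expm_series(A * t)             # e^{At} from the series directly

assert np.allclose(lhs, rhs)
```

Note that `np.linalg.eig` returns the eigenvectors as the *columns* of its second output, which is exactly the convention $S = (v_{1} \; v_{2})$ used above.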

# Matrix Differential Equation Proof

Given the matrix equation $\dot{u} = Au$, where $u =$ $\begin{pmatrix} x \\ y \end{pmatrix}$ and $A$ is a $2 \times 2$ matrix, I want to show that $u = e^{At}u_{0}$ is the solution, where $e^{At} = I + At + \frac{1}{2!}A^{2}t^{2} + \frac{1}{3!}A^{3}t^{3} + \cdots$ and $I$ is the identity matrix.

I start out by writing the above series in summation notation, which gives me $e^{At} = \displaystyle\sum_{k = 0}^{\infty} \frac{t^{k}A^{k}}{k!}$. I can now take a time derivative of the sum term by term to get $\frac{d}{dt}e^{At} = \displaystyle\sum_{k = 0}^{\infty}\frac{kt^{k - 1}A^{k}}{k!}$. Since the $k = 0$ term vanishes (it carries a factor of $k$), I can start the sum at $k = 1$ and cancel the $k$ into the factorial, giving $\displaystyle\sum_{k = 1}^{\infty}\frac{t^{k - 1}A^{k}}{(k - 1)!}$. Now, I can factor out a single power of $A$ to get $A \displaystyle\sum_{k = 1}^{\infty}\frac{t^{k - 1}A^{k - 1}}{(k - 1)!}$. Reindexing the sum with $k \to k + 1$ gives $A \displaystyle\sum_{k = 0}^{\infty} \frac{t^{k}A^{k}}{k!}$, which is exactly $Ae^{At}$. Assuming that $u = e^{At}u_{0}$ is the solution, I can differentiate it with respect to time to get $\dot{u} = \frac{d}{dt}[e^{At}]u_{0}$. I just showed that $\frac{d}{dt} e^{At} = Ae^{At}$, so this becomes $\dot{u} = Ae^{At}u_{0}$. Since $e^{At}u_{0} = u$, I am left with my original matrix equation $\dot{u} = Au$. Finally, at $t = 0$ every term of the series except $I$ vanishes, so $u(0) = Iu_{0} = u_{0}$ and the initial condition is satisfied as well.
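The derivative identity can also be checked numerically. Here is a small sketch of my own (arbitrary $A$, $u_{0}$, and $t$): a central finite difference of $u(t) = e^{At}u_{0}$ should match $Au(t)$, and $u(0)$ should return $u_{0}$.

```python
import numpy as np

def expm_series(M, terms=30):
    """Truncated matrix power series for e^M."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])           # arbitrary 2x2 matrix
u0 = np.array([1.0, 0.5])              # arbitrary initial condition
t, h = 0.5, 1e-5

u = lambda s: expm_series(A * s) @ u0  # candidate solution u(s) = e^{As} u0

# Central-difference estimate of du/dt at time t.
udot = (u(t + h) - u(t - h)) / (2 * h)

assert np.allclose(udot, A @ u(t), atol=1e-7)  # du/dt = A u, as derived
assert np.allclose(u(0.0), u0)                 # e^{A*0} = I, so u(0) = u0
```

The central difference has $O(h^{2})$ error, so with $h = 10^{-5}$ the agreement with $Au(t)$ is far tighter than the tolerance used here.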