Tuesday 5 May 2009

exp(K)

It is time for some nice extra math, with some proofs and applications. I have two theorems that don't look obvious at first sight, and initially I kind of tried to avoid them.

The first one (I) is that the matrix exponential of a real antisymmetric matrix K (i.e. $K^{\mathsf T} = -K$) is an orthogonal matrix with determinant +1.

The second one (II) is that for any matrix K (for example the antisymmetric one from I) and any matrix A the following expansion of the transformed matrix holds:

$$e^{-K} A\, e^{K} \;=\; \sum_{n=0}^{\infty} \frac{1}{n!}\,\underbrace{\big[\cdots\big[[A,K],K\big]\cdots,K\big]}_{n\ \text{nested commutators}}$$
Or explicitly, with the commutator $[A,K] = AK - KA$:

$$e^{-K} A\, e^{K} \;=\; A + [A,K] + \tfrac{1}{2!}\big[[A,K],K\big] + \tfrac{1}{3!}\big[\big[[A,K],K\big],K\big] + \ldots$$
This is an expansion in powers of K, and if K is small you can truncate it at some point. The reason why this is useful is that it gives you a direct way to manipulate orthogonal matrices: you don't have to worry about keeping the matrix orthogonal, you just choose any parameters for K and exp(K) will be orthogonal. II then makes it possible to carry out the basis transformation in an efficient way.
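To make this concrete, here is a minimal numerical sketch (my own illustration, not from the post), assuming the transformation in II is $e^{-K} A\, e^{K}$ and using numpy/scipy; all variable names are mine.

```python
# Minimal sketch: parametrize an orthogonal matrix by an arbitrary antisymmetric K
# and compare the exact transformation exp(-K) A exp(K) with the truncated
# commutator expansion from II.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 4

G = 0.05 * rng.standard_normal((n, n))    # small, arbitrary parameters
K = G - G.T                               # automatically antisymmetric
A = rng.standard_normal((n, n))

U = expm(K)
print(np.allclose(U.T @ U, np.eye(n)))    # exp(K) is orthogonal
print(np.isclose(np.linalg.det(U), 1.0))  # ... with determinant +1

exact = expm(-K) @ A @ expm(K)

# Truncated expansion: A + [A,K] + 1/2! [[A,K],K] + ...
term, approx = A.copy(), A.copy()
for m in range(1, 5):
    term = (term @ K - K @ term) / m      # next nested commutator, divided by m!
    approx = approx + term
print(np.max(np.abs(exact - approx)))     # small, because K is small
```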


The first part of I (orthogonality) is actually very straightforward if you write it down like this, provided that the limit of the transposed sequence is the transpose of the original limit and that the inverse can be formed as shown:

$$\big(e^{K}\big)^{\mathsf T} \;=\; \Big(\lim_{N\to\infty}\sum_{n=0}^{N}\frac{K^{n}}{n!}\Big)^{\mathsf T} \;=\; \lim_{N\to\infty}\sum_{n=0}^{N}\frac{\big(K^{\mathsf T}\big)^{n}}{n!} \;=\; e^{K^{\mathsf T}} \;=\; e^{-K} \;=\; \big(e^{K}\big)^{-1}$$

(The last step uses that K and -K commute, so $e^{-K} e^{K} = e^{0} = \mathbf{1}$.)
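A quick numerical check of this identity (again my own sketch, with made-up variable names):

```python
# For a real antisymmetric K, exp(K)^T should equal exp(-K) = exp(K)^{-1}.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
G = rng.standard_normal((5, 5))
K = G - G.T                                          # real antisymmetric

print(np.allclose(expm(K).T, expm(-K)))              # (e^K)^T = e^(K^T) = e^(-K)
print(np.allclose(expm(K).T @ expm(K), np.eye(5)))   # hence exp(K) is orthogonal
```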
I will prove the second part (the determinant) with the spectral theorem (maybe you can also do this in a more direct way). If we move into complex space, K is, properly speaking, anti-Hermitian, and we know that a unitary matrix U exists which diagonalizes it. The eigenvalues of K are purely imaginary, which will be important below:

$$K \;=\; U \Lambda U^{H}, \qquad \Lambda \;=\; \mathrm{diag}\big(i\lambda_{1},\ldots,i\lambda_{n}\big), \quad \lambda_{j} \in \mathbb{R}$$
The exponential is then given by (using $K^{m} = U \Lambda^{m} U^{H}$ in the series):

$$e^{K} \;=\; U e^{\Lambda} U^{H} \;=\; U\,\mathrm{diag}\big(e^{i\lambda_{1}},\ldots,e^{i\lambda_{n}}\big)\,U^{H}$$
At this point you can also notice that exp(K) is unitary, because it is a product of three unitary matrices (the diagonal factor is unitary since its entries $e^{i\lambda_{j}}$ all have modulus one).
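The same construction can be carried out numerically; the sketch below (my own code, assuming numpy/scipy) diagonalizes K via the Hermitian matrix iK and rebuilds exp(K) from the purely imaginary eigenvalues.

```python
# Spectral-theorem construction of exp(K) for an anti-Hermitian K.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
G = rng.standard_normal((4, 4))
K = G - G.T                                    # real antisymmetric, hence anti-Hermitian

mu, U = np.linalg.eigh(1j * K)                 # iK is Hermitian, so mu is real
lam = -1j * mu                                 # eigenvalues of K are purely imaginary
expK = U @ np.diag(np.exp(lam)) @ U.conj().T   # U e^Lambda U^H

print(np.allclose(expK, expm(K)))                     # agrees with the series definition
print(np.allclose(expK.conj().T @ expK, np.eye(4)))   # product of unitaries is unitary
```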

Can we say anything about the determinant? Yes, of course:

$$\det\big(e^{K}\big) \;=\; \det(U)\,\det\big(e^{\Lambda}\big)\,\det\big(U^{H}\big) \;=\; \det\big(e^{\Lambda}\big)$$
The determinants of the mutually inverse matrices $U^{H}$ and $U$ cancel out, and the determinant of a diagonal matrix is just the product of the diagonal terms:

$$\det\big(e^{\Lambda}\big) \;=\; \prod_{j=1}^{n} e^{i\lambda_{j}} \;=\; e^{\,i\sum_{j}\lambda_{j}}$$
The sum of the eigenvalues, $i\sum_{j}\lambda_{j}$, is the trace of K. In a real antisymmetric matrix all diagonal elements have to be zero (and therefore also the trace), so

$$\det\big(e^{K}\big) \;=\; e^{\,\mathrm{tr}(K)} \;=\; e^{0} \;=\; 1.$$
And that is what we wanted to see.
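A short numerical check of the determinant argument (my own sketch):

```python
# det(exp(K)) = exp(tr K), and tr K = 0 for antisymmetric K, so det(exp(K)) = +1.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
G = rng.standard_normal((6, 6))
K = G - G.T

print(np.trace(K))                                               # exactly zero
print(np.isclose(np.linalg.det(expm(K)), np.exp(np.trace(K))))   # det = e^(tr K)
print(np.isclose(np.linalg.det(expm(K)), 1.0))                   # ... which is +1
```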


The proof shows that exp(.) maps the set of real antisymmetric matrices into the set of real orthogonal matrices with determinant +1 (and, in odd dimensions, -exp(.) then gives the matrices with det = -1). In fact it would also be nice to have a 1:1 correspondence. The mapping is at least surjective, i.e. every rotation matrix with det = +1 can be written as exp(K), but I don't think the proof of that is so simple. It is, however, not injective, and this is easiest to see for complex anti-Hermitian matrices, where there are n extra purely imaginary values in the diagonal. For example:

$$\exp(\mathbf{0}) \;=\; \exp\!\big(\mathrm{diag}(2\pi i,\ldots,2\pi i)\big) \;=\; \mathbf{1}$$

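This is easy to verify numerically (my own example, hypothetical values):

```python
# Two different anti-Hermitian matrices with the same matrix exponential.
import numpy as np
from scipy.linalg import expm

K1 = np.zeros((2, 2), dtype=complex)
K2 = K1 + 2j * np.pi * np.eye(2)        # diagonal shifted by 2*pi*i, still anti-Hermitian

print(np.allclose(expm(K1), expm(K2)))  # both equal the identity matrix
```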
Point I was the elegant part. Point II is more of the index-juggling variety, but a nice piece of proof by induction. I will probably put that up soon, too.
