e^A

From: Squeak <squeak_at_xirr.com>
Date: Wed, 15 Sep 1999 03:31:21 -0400 (EDT)

Herein are the ramblings of a deranged mind; see M. Artin, Algebra, for
details on derangement, or algebra.

Defining cos(A) = (e^(iA) + e^(-iA))/2 etc., we see that
sin, cos, tan, cot, csc, sec, cosh, sinh, tanh, coth, sech, csch
all exist on matrices.
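
As a quick numerical sanity check (a sketch not in the original; it assumes numpy), one can truncate the power series for e^A and build sin and cos from the definitions above. The identity sin^2 + cos^2 = I carries over, since it only involves products of the commuting matrices e^(iA) and e^(-iA):

```python
# Sketch (assumes numpy): truncated power series for e^A, then the
# trig functions via the exponential definitions above.
import numpy as np

def expm_series(A, terms=40):
    """e^A = I + A + A^2/2! + ...  (truncated; fine for small-norm A)."""
    result = np.eye(A.shape[0], dtype=complex)
    term = np.eye(A.shape[0], dtype=complex)
    for k in range(1, terms):
        term = term @ A / k          # term is now A^k / k!
        result = result + term
    return result

def cosm(A):
    return (expm_series(1j * A) + expm_series(-1j * A)) / 2

def sinm(A):
    return (expm_series(1j * A) - expm_series(-1j * A)) / 2j

A = np.array([[0.1, 0.3], [0.2, -0.4]])
# sin^2(A) + cos^2(A) = I holds for matrices too.
print(np.allclose(sinm(A) @ sinm(A) + cosm(A) @ cosm(A), np.eye(2)))  # True
```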

Their derivatives remain the same. Some integral formulas
don't work (like the fundamental theorem of calculus, in its
derivative-of-an-integral form).

Inverses, I guess, still depend on this whole logarithm question:
log doesn't exist on all of the complex plane minus {0}, only on
sections (branches) of it, or on a more general domain. I would suspect
log of matrices to be somewhat similar, given that the complex plane
sits inside the 2x2 matrices as [ a,b : -b,a ].
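
That embedding can be checked numerically (a sketch assuming numpy, not in the original): matrix multiplication of the embedded images matches complex multiplication.

```python
# Sketch (assumes numpy): a+bi -> [[a, b], [-b, a]] embeds the complex
# numbers into the 2x2 real matrices; multiplication carries over.
import numpy as np

def embed(z):
    return np.array([[z.real, z.imag], [-z.imag, z.real]])

z, w = 1 + 2j, 3 - 1j
print(np.allclose(embed(z) @ embed(w), embed(z * w)))  # True
```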

Not having found a good reference on power series of matrices yet, and
given that we are covering all sorts of linear operator stuff and
convergent series of functions blah blah in my classes, I think I'm gonna
sit tight and wait for tuition to pay for itself.

In the meantime here are a few formulae:

Inverses are like normal:
        (e^A)^-1 = e^(-A)

Conjugates are nice:
        P*e^A * P^-1 = e^(P*A*P^-1)

Diagonals are nice, let D be diagonal with entries d_1, ..., d_n

        e^D is a diagonal matrix with entries e^(d_1), ..., e^(d_n)

and if A*B = B*A then:
         e^(A+B) = e^A*e^B.
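
A quick check of the commuting case, and of how the identity fails without commutativity (a numpy sketch, not part of the original post):

```python
# Sketch (assumes numpy): e^(A+B) = e^A * e^B requires A*B = B*A.
import numpy as np

def expm_series(A, terms=40):
    """Truncated power series for e^A; fine for small-norm A."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

A = np.array([[0.2, 0.0], [0.0, 0.5]])
B = np.array([[0.3, 0.0], [0.0, -0.1]])   # diagonal matrices commute
print(np.allclose(expm_series(A + B), expm_series(A) @ expm_series(B)))  # True

C = np.array([[0.0, 0.4], [0.0, 0.0]])
D = np.array([[0.0, 0.0], [0.4, 0.0]])    # C @ D != D @ C
print(np.allclose(expm_series(C + D), expm_series(C) @ expm_series(D)))  # False
```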

If A is 2x2 with distinct eigenvalues a and b, then:

        e^A = (a*e^b - b*e^a)/(a-b)*I + (e^a - e^b)/(a-b)*A
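
The 2x2 formula checks out numerically (a sketch assuming numpy; the test matrix is my own, chosen with distinct eigenvalues so the a-b denominators are safe):

```python
# Sketch (assumes numpy): verify the 2x2 eigenvalue formula against the
# power-series definition of e^A.
import numpy as np

def expm_series(A, terms=60):
    """Truncated power series for e^A."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

A = np.array([[1.0, 2.0], [0.0, 3.0]])     # eigenvalues a=1, b=3 (distinct)
a, b = np.linalg.eigvals(A)
I = np.eye(2)
formula = (a*np.exp(b) - b*np.exp(a))/(a - b) * I \
        + (np.exp(a) - np.exp(b))/(a - b) * A
print(np.allclose(formula, expm_series(A)))  # True
```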

If A is nxn, a similar formula holds in powers A^i, i = 0 to n-1. I don't
know the coefficients, but they would basically be built from the roots of
an n'th degree polynomial (the eigenvalues), and since there is no general
formula in radicals for such roots, the formula
may not _really_ exist for n > 4. I'll let you know if I find anything
out.

Also, I think most people have seen this, but this is a much faster way
of calculating Fibonacci numbers (faster than the matrix method too):

        f_n = (1/a)( ((1+a)/2)^n - ((1-a)/2)^n ), where a = sqrt(5)
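
In code (a sketch; the round() is my addition, to repair floating-point error, and it is only reliable for moderate n):

```python
# Sketch: the closed form above (Binet's formula) in floating point.
from math import sqrt

def fib_binet(n):
    a = sqrt(5)
    # round() absorbs floating-point error; exact for moderate n.
    return round((((1 + a) / 2) ** n - ((1 - a) / 2) ** n) / a)

print([fib_binet(n) for n in range(1, 11)])  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```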

Math stuff, maybe not useful:

Let G be a group, and g,h elements thereof.

 (g*h*g^-1)^n = g*h*g^-1 * g*h*g^-1 * ... * g*h*g^-1 = g*h^n*g^-1

        ( i.e. (h')^n = (h^n)', writing h' for the conjugate of h )

Now let A and P be n*n matrices, P invertible.

Define e^A by I + A + 1/2! A^2 + ... = sum 1/n! * A^n

e^(P*A*P^-1) = sum 1/n! * (P*A*P^-1) ^n = sum 1/n! * P*A^n*P^-1 =
        P* (sum 1/n! A^n)* P^-1 = P*e^A*P^-1
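
The derivation above can be sanity-checked numerically (a sketch assuming numpy; the particular A and P are mine):

```python
# Sketch (assumes numpy): P * e^A * P^-1 = e^(P*A*P^-1), checked against
# the power-series definition of e^A.
import numpy as np

def expm_series(A, terms=40):
    """Truncated power series for e^A; fine for small-norm A."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

A = np.array([[0.1, 0.5], [0.2, -0.3]])
P = np.array([[2.0, 1.0], [1.0, 1.0]])   # invertible
Pinv = np.linalg.inv(P)
print(np.allclose(P @ expm_series(A) @ Pinv, expm_series(P @ A @ Pinv)))  # True
```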

Let D be a diagonal n*n matrix with entries d_1, ..., d_n

e^D is a diagonal matrix with entries e^(d_1), ..., e^(d_n)

At least the following is true:
Every symmetric matrix can be written as L*D*L' where D is diagonal
and L' is the transpose of L.

Every symmetric matrix can thus be written Q*D*Q^-1 where Q^-1 is
Q transpose, the columns of Q being an ortho-normal set of (rotated)
eigenvectors.

Maybe this isn't terribly useful; finding eigenvectors is hard, and
we need n distinct eigenvalues. To show that it is theoretically useful:
note that for every e > 0 there exists a B_e such that |B_e - A| < e
(in the infinity/max/sup norm, not the determinant) and B_e has n
distinct eigenvalues. Thus I can get a diagonalizable B_e
as close to A as I want.

Noting also that P*B_k*P^-1 -> P*B*P^-1 if B_k -> B, we could
make a sequence of diagonalizable matrices that converges to the matrix
we are given, and then calculate the exponential on those. If anyone
knows a relatively easy way to construct the sequence and its Q's,
it would reduce e^A to the sequence creation, 2 multiplies, a transpose,
and n floating-point exponentials.
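
Here is a sketch of that approach on a non-diagonalizable matrix (numpy assumed; the particular perturbation is my own choice, and I use a general eigendecomposition rather than the orthogonal Q of the symmetric case): perturb a Jordan block so its eigenvalues split, exponentiate via the eigendecomposition, and compare with the series.

```python
# Sketch (assumes numpy): exponentiate a non-diagonalizable matrix by
# exponentiating a nearby diagonalizable perturbation.
import numpy as np

def expm_series(A, terms=60):
    """Truncated power series for e^A."""
    result = np.eye(A.shape[0], dtype=complex)
    term = np.eye(A.shape[0], dtype=complex)
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

# A Jordan block: repeated eigenvalue 0, only one eigenvector,
# so it is not diagonalizable.
A = np.array([[0.0, 1.0], [0.0, 0.0]])

# Perturb so the eigenvalues split, diagonalize B = P*D*P^-1, and use
# e^B = P * e^D * P^-1; as eps -> 0 this converges to e^A.
eps = 1e-6
B = A + np.array([[eps, 0.0], [0.0, -eps]])
vals, P = np.linalg.eig(B)
expB = P @ np.diag(np.exp(vals)) @ np.linalg.inv(P)
print(np.allclose(expB, expm_series(A), atol=1e-4))  # True
```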

A theoretically convergent method of discovering eigenvalues is available
on request; Gaussian elimination will of course discover them if there
are n distinct ones (since L*D*L^t is Q*D*Q^-1 in that case).

Hope this is slightly useful at least.
Received on Wed Sep 15 1999 - 00:06:22 CDT
