From: kovarik@mcmail.cis.McMaster.CA (Zdislav V. Kovarik)
Subject: Re: Matrix for rotating one vector into another
Date: 20 Dec 1999 02:06:46 -0500
Newsgroups: sci.math
Keywords: direct rotation
In article <83kh0a$bad$1@nnrp1.deja.com>,
Andrei Zmievski wrote:
:Hi,
:
:I have a little problem here. I need to derive a matrix to rotate one
:vector into another.
:
:Basically, I have in 3 dimensional space the source vector which is
:always z-axis [0, 0, 1]. Then I have an arbitrary vector that is not
:z-axis. I need to figure out how to construct a matrix which, when
:applied to z-axis, transforms it into this vector. So I figured out that
:row 3 of the matrix is the vector itself. But what about the other two
:rows?
:
:Example:
:source = z-axis = [0, 0, 1]
:target = [1, 0, 0]
:
:Matrix:
:[?, ?, ?]
:[?, ?, ?]
:[1, 0, 0]
:
:What is the most efficient way to do it?
:Thanks for your help.
:
:-Andrei
There is a "direct rotation", which is the product of two reflections.
I will use the column convention for vectors, and a matrix acts on such a
vector from the left. You can translate everything by transposition, if
your favorite vectors are row vectors.
The dash (') denotes transpose.
A reflection between two unit vectors p, q (not equal) is a matrix
H(p,q) = I - 2 * b * b'
where b is the normalized bisector of p and -q, that is,
b = (p-q)/norm(p-q).
Also, b is the normal to the mirror that interchanges p with q.
It does this: H(p,q)*q = p , H(p,q)*p = q, and H(p,q)*x = x if x is
perpendicular to both p and q.
In particular, H(p,-p) = I - 2 * p * p' .
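Here is a minimal numerical sketch of such a reflection in Python/NumPy
(the function name `reflection` is mine):

```python
import numpy as np

def reflection(p, q):
    """Householder reflection H(p,q) = I - 2*b*b' with
    b = (p - q)/norm(p - q); it swaps the unit vectors p and q."""
    b = (p - q) / np.linalg.norm(p - q)
    return np.eye(len(p)) - 2.0 * np.outer(b, b)

p = np.array([0.0, 0.0, 1.0])
q = np.array([1.0, 0.0, 0.0])
H = reflection(p, q)

print(np.allclose(H @ q, p))    # True: H(p,q)*q = p
print(np.allclose(H @ p, q))    # True: H(p,q)*p = q
x = np.array([0.0, 1.0, 0.0])   # perpendicular to both p and q
print(np.allclose(H @ x, x))    # True: x stays put
```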
A direct rotation from p to q is then
R(p,q) = H(q, -p) * H(p, -p)
(You can make a sketch in the plane spanned by p and q since everything
perpendicular to p and q stays put.)
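The two-reflection construction can be checked numerically; a sketch
(the names `reflection` and `rotation` are mine):

```python
import numpy as np

def reflection(p, q):
    # H(p,q) = I - 2*b*b' with b = (p - q)/||p - q||
    b = (p - q) / np.linalg.norm(p - q)
    return np.eye(len(p)) - 2.0 * np.outer(b, b)

def rotation(p, q):
    # R(p,q) = H(q,-p) * H(p,-p): first send p to -p, then -p to q
    return reflection(q, -p) @ reflection(p, -p)

p = np.array([0.0, 0.0, 1.0])
q = np.array([0.6, 0.8, 0.0])             # any unit vector (not -p) works
R = rotation(p, q)

print(np.allclose(R @ p, q))              # True: R rotates p into q
print(np.allclose(R.T @ R, np.eye(3)))    # True: R is orthogonal
print(np.isclose(np.linalg.det(R), 1.0))  # True: a proper rotation
```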
In your example, p = [0, 0, 1]' and q = [1, 0, 0]'
H(q, -p) =
[ 0 0 -1 ]
[ 0 1 0 ]
[-1 0 0 ]
H(p,-p) =
[ 1 0 0 ]
[ 0 1 0 ]
[ 0 0 -1 ]
and R(p,q) =
[ 0 0 1 ]
[ 0 1 0 ]
[ -1 0 0 ]
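Multiplying the two reflection matrices and applying the result to p
can be verified directly in NumPy:

```python
import numpy as np

H1 = np.array([[ 1, 0,  0],   # H(p,-p)
               [ 0, 1,  0],
               [ 0, 0, -1]])
H2 = np.array([[ 0, 0, -1],   # H(q,-p)
               [ 0, 1,  0],
               [-1, 0,  0]])
R = H2 @ H1
print(R)
# [[ 0  0  1]
#  [ 0  1  0]
#  [-1  0  0]]
print(np.array_equal(R @ np.array([0, 0, 1]), np.array([1, 0, 0])))  # True
```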
Of course, working with row vectors and right multiplication by matrices
requires transposing the result.
This direct rotation is "most efficient" in a way you might not have
expected: the distance from R(p,q) to the identity matrix is minimal among
all orthogonal matrices that transform p into q.
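A numerical illustration of that minimality (a sketch: I take the Frobenius
norm as the distance and build competing orthogonal maps as R composed with
rotations that fix p; these particular choices are mine):

```python
import numpy as np

def reflection(p, q):
    b = (p - q) / np.linalg.norm(p - q)
    return np.eye(len(p)) - 2.0 * np.outer(b, b)

def rotation(p, q):
    return reflection(q, -p) @ reflection(p, -p)

p = np.array([0.0, 0.0, 1.0])
q = np.array([1.0, 0.0, 0.0])
R = rotation(p, q)
d_R = np.linalg.norm(R - np.eye(3))   # Frobenius distance from R to I

# Competitors: R*S, where S is any rotation about the z-axis,
# so that (R*S)*p = R*p = q still holds.
for t in np.linspace(0.1, 2 * np.pi - 0.1, 7):
    S = np.array([[np.cos(t), -np.sin(t), 0.0],
                  [np.sin(t),  np.cos(t), 0.0],
                  [0.0,        0.0,       1.0]])
    assert np.linalg.norm(R @ S - np.eye(3)) >= d_R

print("direct rotation is closest to I among the sampled competitors")
```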
This approach is valid in every dimension from 2 up, and it extends to
rotating one subspace into another (sufficiently close) subspace of the
same dimension, for dimensions greater than 1.
Reference for the higher dimensional material: Matrix Computations by
G.H. Golub and C.F. Van Loan ("angles between subspaces"), and look for a
reference by C. Davis and W. Kahan: Rotation of Eigenvectors.
Cheers, ZVK(Slavek).