
Prove that there is exactly one vector satisfying given conditions

Prove that there is one and only one vector B satisfying the equations

    \[ A \times B = C, \qquad A \cdot B = 1 \]

where A \neq O and C is orthogonal to A in \mathbb{R}^3.
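
A preliminary remark (not part of the problem statement, but worth recording before the proof): the orthogonality hypothesis on C is exactly what makes existence possible, since for any vectors A and B,

    \[ A \cdot (A \times B) = 0, \]

so the equation A \times B = C already forces A \cdot C = 0.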


Proof. First, we show uniqueness. Suppose B and B' are both vectors satisfying A \times B = C, \ A \cdot B = 1 and A \times B' = C, \ A \cdot B' = 1. Then we have

    \[ A \times B = A \times B', \qquad A \cdot B = A \cdot B'. \]

But, by a previous exercise (Section 13.11, Exercise #8) we know these conditions imply B = B'. Hence, there is at most one such B.
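
For completeness, here is a brief sketch of how that exercise goes, using only A \neq O. Subtracting the two sets of conditions gives

    \[ A \times (B - B') = O, \qquad A \cdot (B - B') = 0. \]

The first equation says B - B' is parallel to A, so B - B' = tA for some scalar t; the second then gives t (A \cdot A) = 0, and since A \neq O this forces t = 0, i.e., B = B'.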

Now, for existence. Let A = (a_1, a_2, a_3), \ B = (b_1, b_2, b_3), \ C = (c_1, c_2, c_3). Since A and C are orthogonal, we know A \cdot C = 0; that is,

    \[ a_1 c_1 + a_2 c_2 + a_3 c_3 = 0. \]

From A \times B = C we then have

    \begin{align*}  a_2 b_3 - a_3 b_2 &= c_1 \\  a_3 b_1 - a_1 b_3 &= c_2 \\  a_1 b_2 - a_2 b_1 &= c_3. \end{align*}

Since A \neq O, we know at least one of the a_i \neq 0. Without loss of generality, assume a_1 \neq 0. From the second and third equations we have

    \[ b_3 = \frac{a_3 b_1 - c_2}{a_1}, \qquad b_2 = \frac{c_3 + a_2 b_1}{a_1}. \]

Substituting these into the left-hand side of the first equation, we find

    \begin{align*}  a_2 b_3 - a_3 b_2 &= \frac{a_2 (a_3 b_1 - c_2)}{a_1} - \frac{a_3 (c_3 + a_2 b_1)}{a_1} \\[9pt]  &= \frac{a_2 a_3 b_1 - a_2 c_2 - a_3 c_3 - a_2 a_3 b_1}{a_1} \\[9pt]  &= \frac{-(a_2 c_2 + a_3 c_3)}{a_1} \\[9pt]  &= \frac{a_1 c_1}{a_1} = c_1, \end{align*}

using a_1 c_1 + a_2 c_2 + a_3 c_3 = 0 in the last step. Hence, the first equation is satisfied automatically, and b_1 can take any value. The vectors B such that A \times B = C are then of the form

    \[ B = \left( b_1, \frac{c_3 + a_2 b_1}{a_1}, \frac{a_3 b_1 - c_2}{a_1} \right). \]
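
It may help to note why a free parameter appears here (a remark not in the original argument): replacing b_1 by b_1 + t a_1 changes the vector above by

    \[ \left( t a_1, \ \frac{a_2 (t a_1)}{a_1}, \ \frac{a_3 (t a_1)}{a_1} \right) = tA, \]

so the solutions of A \times B = C form a line parallel to A, and the additional condition A \cdot B = 1 singles out one point on that line.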

Then,

    \begin{align*}  A \cdot B = 1 && \implies && a_1 b_1 + a_2 \left( \frac{c_3 + a_2 b_1}{a_1} \right) + a_3 \left( \frac{a_3 b_1 - c_2}{a_1} \right) &= 1 \\[9pt]  && \implies && a_1^2 b_1 + a_2 c_3 + a_2^2 b_1 + a_3^2 b_1 - a_3 c_2 &= a_1 \\[9pt]  && \implies && b_1 &= \frac{a_1 + a_3 c_2 - a_2 c_3}{a_1^2 + a_2^2 + a_3^2}. \end{align*}

Since a_1^2 + a_2^2 + a_3^2 \neq 0 (because at least one of the a_i \neq 0), this value of b_1 is well defined; hence such a vector B always exists. \qquad \blacksquare
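
As a check on the coordinate computation (this closed form is not stated in the exercise, but it is easy to verify): the unique solution can be written as

    \[ B = \frac{A + C \times A}{\|A\|^2}, \]

since A \cdot (C \times A) = 0 gives A \cdot B = 1, while the identity A \times (C \times A) = (A \cdot A) C - (A \cdot C) A = \|A\|^2 C gives A \times B = C. The first component of this expression is (a_1 + a_3 c_2 - a_2 c_3)/(a_1^2 + a_2^2 + a_3^2), which agrees with the value of b_1 found above.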

3 comments

  1. William C says:

    A x B = A x C implies A x (B-C) = 0, which implies B-C = cA for some scalar c

    Then A dot B = A dot C gives A dot (B-C) = c (A dot A) = 0, and since A is nonzero this forces c = 0, thus C = B
