Vector Stacking Operation and Poltors (Polygonal Vectors)

Abstract

We investigate the results of a vector stacking operator, defined in analogy to matrix multiplication and the inner and outer vector products.

Introduction

We have the vector dot product \[\begin{bmatrix}a&b\end{bmatrix}\cdot\begin{bmatrix}c\\d\end{bmatrix} =ac+bd\]

and the outer, or 'dyadic', product, obtained by reversing the positions of the row and column vectors:

\[\begin{bmatrix} a \\ b \end{bmatrix} \cdot \begin{bmatrix} c & d \end{bmatrix} = \begin{bmatrix} ac & ad \\ bc & bd \end{bmatrix}\]
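As a quick numerical illustration (a minimal sketch using NumPy; the values are arbitrary):

```python
import numpy as np

u = np.array([1, 2])   # plays the role of (a, b)
v = np.array([3, 4])   # plays the role of (c, d)

# Inner (dot) product: ac + bd
print(np.inner(u, v))   # 11

# Outer ('dyadic') product: [[ac, ad], [bc, bd]]
print(np.outer(u, v))   # [[3 4] [6 8]]
```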

We therefore investigate a coupling of two column vectors that stacks the products: \[\begin{bmatrix} a \\ b \end{bmatrix} \cdot \begin{bmatrix} c \\ d \end{bmatrix} = \begin{bmatrix} ac \\ ad \\ bc \\ bd \end{bmatrix} \quad\text{or}\quad \begin{bmatrix} ac \\ bc \\ ad \\ bd \end{bmatrix}\]
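For plain 1-D arrays this stacking coincides with the Kronecker product of the two vectors, so both orderings can be sketched in NumPy (illustrative values again):

```python
import numpy as np

u = np.array([1, 2])   # (a, b)
v = np.array([3, 4])   # (c, d)

print(np.kron(u, v))   # [3 4 6 8]  -> (ac, ad, bc, bd)
print(np.kron(v, u))   # [3 6 4 8]  -> (ac, bc, ad, bd)
```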

Larger matrices could be made with a transform such as \[M=(v\cdot u)\otimes(v^T\cdot u^T)\]
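Under the stacking reading above, one possible sketch of this transform for two 2-vectors (assuming the stacked column is outer-multiplied with the stacked row, so that \(M\) comes out \(4\times4\)) is:

```python
import numpy as np

v = np.array([1, 2])
u = np.array([3, 4])

stacked = np.kron(v, u)          # the stacked column (v . u)
M = np.outer(stacked, stacked)   # outer with the stacked row (v^T . u^T)
print(M)                         # a 4x4 matrix built from two 2-vectors
```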

With this concept in mind, let us examine the matrix multiplication operation \[\begin{bmatrix} a & b \\ c & d \end{bmatrix} \cdot \begin{bmatrix} e & f \\ g & h \end{bmatrix} = \begin{bmatrix} ae+bg & af+bh \\ ce+dg & cf+dh \end{bmatrix} = \begin{bmatrix} \begin{bmatrix}a&b\end{bmatrix}\cdot\begin{bmatrix}e\\g\end{bmatrix} & \begin{bmatrix}a&b\end{bmatrix}\cdot\begin{bmatrix}f\\h\end{bmatrix} \\ \begin{bmatrix}c&d\end{bmatrix}\cdot\begin{bmatrix}e\\g\end{bmatrix} & \begin{bmatrix}c&d\end{bmatrix}\cdot\begin{bmatrix}f\\h\end{bmatrix} \end{bmatrix}\]

Switching the row and column vectors, let us define a similar operation, written here as \(\boxtimes\) to distinguish it from the Kronecker product discussed below: \[\begin{bmatrix}a&b\\c&d\end{bmatrix}\boxtimes \begin{bmatrix}e&f\\g&h\end{bmatrix}=\begin{bmatrix} \begin{bmatrix}a\\b\end{bmatrix}\cdot\begin{bmatrix}e&g\end{bmatrix}& \begin{bmatrix}a\\b\end{bmatrix}\cdot\begin{bmatrix}f&h\end{bmatrix}\\ \begin{bmatrix}c\\d\end{bmatrix}\cdot\begin{bmatrix}e&g\end{bmatrix}& \begin{bmatrix}c\\d\end{bmatrix}\cdot\begin{bmatrix}f&h\end{bmatrix} \end{bmatrix}= \begin{bmatrix} ae & ag & af & ah \\ be & bg & bf & bh \\ ce & cg & cf & ch \\ de & dg & df & dh \end{bmatrix}\]
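A direct sketch of this operation in NumPy: block \((i,j)\) is the outer product of row \(i\) of the first matrix with column \(j\) of the second (the name stack_product is just illustrative):

```python
import numpy as np

def stack_product(A, B):
    # Block (i, j) is outer(row i of A, column j of B)
    n = A.shape[0]
    blocks = [[np.outer(A[i, :], B[:, j]) for j in range(n)]
              for i in range(n)]
    return np.block(blocks)

A = np.array([[1, 2], [3, 4]])   # [[a, b], [c, d]]
B = np.array([[5, 6], [7, 8]])   # [[e, f], [g, h]]
print(stack_product(A, B))
```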

This appears to be the outer product of two vectors: \[\begin{bmatrix} a \\ b \\ c \\ d \end{bmatrix} \cdot \begin{bmatrix} e & g & f & h \end{bmatrix}\]

It can be seen that the trace of each \(2\times2\) sub-matrix is the corresponding entry of the regular matrix-matrix product!
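Both observations can be checked numerically; a self-contained sketch using the outer-product form directly:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# The stacked product in its outer-product form:
# rows of A flattened against columns of B flattened
S = np.outer(A.flatten(), B.T.flatten())

# Trace of each 2x2 block reproduces the ordinary matrix product A @ B
traces = np.array([[np.trace(S[2*i:2*i+2, 2*j:2*j+2]) for j in range(2)]
                   for i in range(2)])
print(traces)   # [[19 22] [43 50]]
print(A @ B)    # the same
```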

However, this differs from the Kronecker product, which would yield \[\begin{bmatrix}a&b\\c&d\end{bmatrix}\otimes \begin{bmatrix}e&f\\g&h\end{bmatrix}=\begin{bmatrix} a\begin{bmatrix}e&f\\g&h\end{bmatrix}& b\begin{bmatrix}e&f\\g&h\end{bmatrix}\\ c\begin{bmatrix}e&f\\g&h\end{bmatrix}& d\begin{bmatrix}e&f\\g&h\end{bmatrix} \end{bmatrix}= \begin{bmatrix} ae & af & be & bf \\ ag & ah & bg & bh \\ ce & cf & de & df \\ cg & ch & dg & dh \end{bmatrix}\]
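NumPy's np.kron makes the distinction concrete (same illustrative matrices as above):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

K = np.kron(A, B)                          # Kronecker product
S = np.outer(A.flatten(), B.T.flatten())   # the stacked product from above
print(np.array_equal(K, S))                # False: the two operations differ
```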

Poltors

A new idea, using old concepts. Statement: "In 3 dimensions, x is as much to y as it is to z", and likewise for the other permutations of x, y, z. So why do we separate them in the three-vector? x has to jump across y to get to z; they seem distant, but they are actually neighbors.

Humans like to make squares and lines; let's try a triangle-like structure and see where it goes. This isn't easy to do in vanilla LaTeX, though. Define a three-vector \(a\) with elements \((a_1, a_2, a_3)\) as \[\begin{array}{c} /\,a_1\,\backslash \\ /\,a_2 \quad a_3\,\backslash \end{array}\]

For symmetry reasons we can address the elements in a cyclic manner, fitting to permutations. Then, for example, one can take the dot product as normal: \[\begin{array}{c} /\,a_1\,\backslash \\ /\,a_2 \quad a_3\,\backslash \end{array} \cdot \begin{array}{c} /\,b_1\,\backslash \\ /\,b_2 \quad b_3\,\backslash \end{array} = a_1b_1+a_2b_2+a_3b_3\]

The cross product takes on a different character in this format: \[\begin{array}{c} /\,a_1\,\backslash \\ /\,a_2 \quad a_3\,\backslash \end{array} \times \begin{array}{c} /\,b_1\,\backslash \\ /\,b_2 \quad b_3\,\backslash \end{array} = \begin{array}{c} (a_2b_3-a_3b_2) \\ (a_3b_1-a_1b_3) \quad (a_1b_2-a_2b_1) \end{array}\]

Here the triangle-like frame is beginning to be omitted. However, the result on the right-hand side, as a new three-vector, can now be explained by the sentence: "the element at a position is the anti-clockwise element of the left-hand operand times the clockwise element of the right-hand operand, minus the clockwise element of the left-hand operand times the anti-clockwise element of the right-hand operand". A bit of a mouthful, but there is order here. In fact, if we count indices modulo 3, so that index 4 is equivalent to index 1 (and index 0 to index 3), then we can state the cross product as \[(a \times b)_i = a_{i+1}b_{i-1}-a_{i-1}b_{i+1}\]

This is an alternative definition, and a much more concise one: it uses only a single index.
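A minimal sketch of this single-index definition, checked against NumPy's np.cross (the name cyclic_cross is illustrative; the code uses 0-based indices, so the modulo arithmetic does the cycling):

```python
import numpy as np

def cyclic_cross(a, b):
    # (a x b)_i = a_{i+1} b_{i-1} - a_{i-1} b_{i+1}, indices modulo 3
    n = len(a)
    return np.array([a[(i + 1) % n] * b[(i - 1) % n]
                     - a[(i - 1) % n] * b[(i + 1) % n]
                     for i in range(n)])

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])
print(cyclic_cross(a, b))   # [-3  6 -3]
print(np.cross(a, b))       # agrees
```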

Perhaps these triangles are a stepping stone to 4-vectors. If there is a corner for each element in the construct, then that would naturally make the 4-vector equivalent a 2x2-matrix lookalike. But we still keep our rotational definition of the cross product. \[\begin{bmatrix}a_1 & a_4 \\a_2 & a_3\end{bmatrix} \times \begin{bmatrix}b_1 & b_4 \\b_2 & b_3\end{bmatrix} = \begin{bmatrix}a_2b_4 - a_4b_2 & a_1b_3-a_3b_1\\a_3b_1-a_1b_3 & a_4b_2 - a_2b_4\end{bmatrix}\]
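The same single-index rule with indices taken modulo 4 reproduces the matrix above; a sketch, assuming the corners \(a_1, a_2, a_3, a_4\) run anticlockwise from the top-left as in the layout shown:

```python
import numpy as np

def cyclic_cross(a, b):
    # (a x b)_i = a_{i+1} b_{i-1} - a_{i-1} b_{i+1}, indices modulo len(a)
    n = len(a)
    return np.array([a[(i + 1) % n] * b[(i - 1) % n]
                     - a[(i - 1) % n] * b[(i + 1) % n]
                     for i in range(n)])

a = np.array([1, 2, 3, 4])   # a_1 .. a_4
b = np.array([5, 6, 7, 8])   # b_1 .. b_4
c = cyclic_cross(a, b)

# Place the cyclic order 1 -> 2 -> 3 -> 4 anticlockwise:
# top-left, bottom-left, bottom-right, top-right
print(np.array([[c[0], c[3]],
                [c[1], c[2]]]))   # [[-8 -8] [ 8  8]]
```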