Monday, October 9, 2017

Representation Theory

Groups on Linear Spaces

In physical mathematics we frequently speak of group operations on linear spaces.  Consider the elements of a group $\{G_a\}$ and linear operators $T(G_a)$ acting on a vector space, so that $T(G_a)v = v'$, where $v$ and $v'$ are vectors.

\begin{align}
T(G_a) e_i &= \sum_j T_{ji}(G_a) e_j \\
(e_j, T(G_a) e_i) &= T_{ji}(G_a) \\
T(G_a) T(G_b) &= T(G_c) \text{, if $ G_a G_b = G_c$}
\end{align}

The operators $T$ are an example of a representation of the group $\{G_a\}$ with respect to operator multiplication.
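
A minimal concrete example (not part of the original notes): the cyclic group $C_3 = \{E, G_1, G_2\}$ is represented on the plane by rotations through multiples of $2\pi/3$,

$$
T(G_a) = \begin{pmatrix} \cos\frac{2\pi a}{3} & -\sin\frac{2\pi a}{3} \\ \sin\frac{2\pi a}{3} & \cos\frac{2\pi a}{3} \end{pmatrix}, \qquad a = 0, 1, 2,
$$

and matrix multiplication reproduces the group law: $T(G_a) T(G_b) = T(G_{a+b \bmod 3})$.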

What is a group?

We restrict attention to finite groups, which are sets with a binary operation that is 
1.  closed, and
2.  associative

Also,
3.  $\exists$ an inverse for every element, and
4.  $\exists$ an identity element

Reducibility

This is an important topic in rep theory. Suppose $L$ is a vector space invariant under the transformations $T(G_a)$. If $L=L_1 \oplus L_2$, the direct sum of two orthogonal subspaces each invariant under the $T(G_a)$, then $T$ is reducible. Otherwise, $T$ is irreducible. Amazing fact: every finite group has only a finite number of (inequivalent) irreducible representations. We will show this.
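
To illustrate (an example of mine, not from the notes): over the complex numbers the two-dimensional rotation representation of $C_3$ above is reducible. In the basis $e_\pm = (e_1 \mp i e_2)/\sqrt{2}$ every $T(G_a)$ is diagonal,

$$
T(G_a)\, e_\pm = e^{\pm 2\pi i a / 3}\, e_\pm ,
$$

so it splits into two invariant one-dimensional subspaces, each carrying an irreducible representation.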

Schur's Lemma

Before going further, we need a preliminary result, Schur's lemma: if $T^{\alpha}$ and $T^{\beta}$ are irreducible representations, then

$$
T^{\alpha}(G_a)\, A - A\, T^{\beta}(G_a) = 0 \;\; \forall \, G_a \implies A = \lambda\, \delta_{\alpha \beta}\, I
$$

But we can construct just such an $A$:

$$
A = \sum_b T^{\alpha}(G_b)\, X\, T^{\beta}(G_b^{-1})
$$
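
To check that this $A$ really satisfies the hypothesis of Schur's lemma, multiply from the left by $T^{\alpha}(G_c)$ and relabel the sum ($G_{b'} = G_c G_b$ runs over the whole group as $G_b$ does):

$$
T^{\alpha}(G_c)\, A = \sum_b T^{\alpha}(G_c G_b)\, X\, T^{\beta}(G_b^{-1}) = \sum_{b'} T^{\alpha}(G_{b'})\, X\, T^{\beta}(G_{b'}^{-1} G_c) = A\, T^{\beta}(G_c)
$$
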
$X$ is arbitrary, but it determines $\lambda$. By Schur's lemma, in components,

$$
\sum_{bmn} T_{im}^{\alpha}(G_b) X_{mn} T_{nj}^{\beta}(G_b^{-1}) = \lambda \delta_{\alpha \beta} \delta_{ij}
$$

Choose $X_{mn} = \delta_{mq}\delta_{np}$

\begin{equation}
\sum_{b} T_{iq}^{\alpha}(G_b)T_{pj}^{\beta}(G_b^{-1}) = \lambda \delta_{\alpha \beta} \delta_{ij} \label{eq:1}
\end{equation}

Let $\alpha = \beta$ and $i=j$

$$
\sum_{b} T_{iq}^{\alpha}(G_b)T_{pi}^{\alpha}(G_b^{-1}) = \lambda
$$

Sum both sides over $i$; on the left, $\sum_i T_{pi}^{\alpha}(G_b^{-1}) T_{iq}^{\alpha}(G_b) = T_{pq}^{\alpha}(G_b^{-1} G_b) = T_{pq}^{\alpha}(E)$, while the right side just picks up a factor of $S_\alpha$:

$$
\sum_{b} T_{pq}^{\alpha}(E) = \lambda S_{\alpha}
$$

$ S_{\alpha}$ is the dimension of the $\alpha$ representation.

\begin{equation}
g \delta_{pq} = \lambda S_{\alpha}  \label{eq:2}
\end{equation}


$g$ is the number of elements in the group.

Using $\eqref{eq:2}$ in $\eqref{eq:1}$:

\begin{equation}
\sum_{b} T_{iq}^{\alpha}(G_b)T_{pj}^{\beta}(G_b^{-1}) = \delta_{pq} \delta_{\alpha \beta} \delta_{ij} \frac{g}{S_{\alpha}} \label{eq:3}
\end{equation}
Let $T$ be unitary, $i=j$, $\alpha = \beta$,  and $p = q$.

\[
\sum_{b} T_{iq}^{\alpha}(G_b)T_{iq}^{* \alpha}(G_b) = \sum_{b}{|T_{iq}^{\alpha}(G_b)|}^2 = \frac{g}{S_{\alpha}}
\]
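
As a sanity check (a numerical sketch of my own, not part of the lecture), the relation above can be verified directly for $S_3$, using its trivial, sign, and two-dimensional irreps realized as the rotations and reflections of an equilateral triangle:

```python
import numpy as np

# Numerical check of the orthogonality relation (eq. 3) for S3, realized as the
# symmetry group of an equilateral triangle: 3 rotations and 3 reflections.

def rot(t):
    # 2x2 rotation matrix through angle t
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

def refl(t):
    # 2x2 reflection matrix (across the line at angle t/2)
    return np.array([[np.cos(t),  np.sin(t)],
                     [np.sin(t), -np.cos(t)]])

angles = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
irreps = {
    'trivial':  [np.eye(1)] * 6,                                       # S_alpha = 1
    'sign':     [np.eye(1)] * 3 + [-np.eye(1)] * 3,                    # S_alpha = 1
    'standard': [rot(t) for t in angles] + [refl(t) for t in angles],  # S_alpha = 2
}
g = 6

for na, Ta in irreps.items():
    for nb, Tb in irreps.items():
        Sa = Ta[0].shape[0]
        # Unitarity gives T(G^-1) = T(G)^dagger, so eq. (3) becomes
        # sum_b T^alpha_iq(G_b) conj(T^beta_jp(G_b)) = (g/S_alpha) d_ab d_ij d_qp
        lhs = sum(np.einsum('iq,jp->iqjp', A, B.conj()) for A, B in zip(Ta, Tb))
        rhs = np.zeros_like(lhs)
        if na == nb:
            for i in range(Sa):
                for q in range(Sa):
                    rhs[i, q, i, q] = g / Sa
        assert np.allclose(lhs, rhs)
print("orthogonality relation holds for all pairs of S3 irreps")
```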

Characters

In $\eqref{eq:3}$, let $q=i$ and $p=j$ and sum over $i$ and $j$

\begin{align*}
\sum_{ijb} T_{ii}^{\alpha}(G_b)  T_{jj}^{* \beta}(G_b) & = \sum_{ij} \delta_{ij} \frac{g}{S_{\alpha}} \delta_{\alpha \beta} \\
\sum_{b} \chi^{\alpha}(G_b)  \chi^{* \beta}(G_b)  &= g \delta_{\alpha \beta}
\end{align*}

where the character $\chi^{\alpha}(G_a)$ is the trace of the operator $T^{\alpha}(G_a)$.

If we let $C_p$ be the number of elements in class $p$ (the elements of a class share the same character, since the trace is invariant under conjugation), we can write this as

\begin{align}
\sum_p C_p \chi_p^{\alpha} \chi_p^{* \beta} &= g \delta_{\alpha \beta}  \label{eq:4} \\
\sum_p C_p |\chi_p^{\alpha}|^2  &= g
\end{align}

For any representation, $\chi_p = \sum_i m_i \chi_p^i$, since any rep can be resolved into its irreducible components; the integer $m_i$ counts how many times irrep $i$ appears.  We can find the $m_i$, given the characters.

\begin{align*}
\sum_p C_p \chi_p^{* \alpha}  \chi_p &= \sum_i \sum_p C_p \chi_p^{* \alpha} m_i \chi_p^i \\
&= \sum_i g \delta_{\alpha i} m_i \\
&= m_\alpha g
\end{align*}

For any rep, $\sum_p C_p |\chi_p|^2 \ge g$, with equality only if the rep is irreducible.

\begin{align}
\sum_p C_p |\chi_p|^2 &= \sum_p C_p \left( \sum_{\alpha} m_\alpha \chi_p^\alpha \right) \left( \sum_{\beta} m_\beta \chi_p^{* \beta} \right) \\
&= \sum_{\alpha \beta} m_\alpha m_\beta \sum_p C_p  \chi_p^\alpha \chi_p^{* \beta} \\
&= \sum_{\alpha \beta} g \delta_{\alpha \beta} m_\alpha  m_\beta \\
&= g \sum_\alpha m_\alpha^2
\end{align}

This equals $g$ only if all the $m_\alpha$ vanish except for a single one equal to $1$, i.e., only if the rep is irreducible.
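
As a worked example (mine, not from the lecture): the three-dimensional permutation representation of $S_3$ has characters $\chi = (3, 0, 1)$ on the classes $\{E\}$, $\{\text{3-cycles}\}$, $\{\text{transpositions}\}$, whose sizes are $C_p = (1, 2, 3)$. Then

$$
\sum_p C_p |\chi_p|^2 = 1 \cdot 9 + 2 \cdot 0 + 3 \cdot 1 = 12 = 2g ,
$$

so the representation is reducible. Using the irreducible characters of $S_3$ (trivial: $1,1,1$; sign: $1,1,-1$; standard: $2,-1,0$), the multiplicity formula gives $m_{\text{trivial}} = 1$, $m_{\text{sign}} = 0$, $m_{\text{standard}} = 1$: the permutation rep is trivial $\oplus$ standard.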

The Regular Representation


Define the $g$-dimensional representation $T^\text{Regular}$ by

$$
G_a G_b = \sum_c T_{cb}^\text{Regular}(G_a) G_c
$$

The characters $\chi^R(G)$ (writing $R$ for Regular) are all zero except $\chi^R(E)$, which equals $g$: only for $G_a = E$ does $G_a G_b = G_b$, so only $T^R(E)$ has nonzero diagonal entries.  Consider the reduction of the regular representation:

\begin{align}
\sum_p C_p \chi_p^{* \alpha} \chi_p^R &= g m_\alpha \\
C_E\, \chi_E^{* \alpha}\, \chi_E^R &= g m_\alpha \\
S_\alpha\, g  &= g m_\alpha
\end{align}

So $m_\alpha = S_\alpha$.  Since the regular representation is $g$-dimensional, $g=\sum_\alpha m_\alpha S_\alpha = \sum_\alpha S_\alpha^2$
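
For example, $S_3$ has $g = 6$ and irreps of dimensions $1$, $1$, and $2$, and indeed $1^2 + 1^2 + 2^2 = 6$.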

Regarded as $g$-component vectors (one component for each group element), the matrix elements $T_{ij}^\alpha$ are mutually orthogonal by $\eqref{eq:3}$, and there are $\sum_\alpha S_\alpha^2$ of them, which we now know is $g$. That is, these vectors span the space of functions on the group. We could expand any vector in this space as

$$
v = \sum_{ij\alpha}  Z(ij\alpha) T_{ij}^\alpha
$$

Or in component form
$$
v_a = \sum_{ij\alpha}  Z(ij\alpha) T_{ij}^\alpha(G_a)
$$

Character Relations

Now, consider a vector $v$ that has the same component along all directions in a class, like so:
$$
v_a = \frac {1}{g} \sum_{b=1}^g v_c  \,\,\, \text{ where $G_c = G_b^{-1} G_a G_b$}
$$

and decompose $v_a$ and $v_c$ along $T$ like so:

\begin{align}
v_c &= \sum_{ij\alpha}  Z(ij\alpha) T_{ij}^\alpha(G_c) \\
v_a &= \sum_{b}  \sum_{ij \alpha} \sum_{kl} \frac{Z(ij\alpha)}{g} T_{ik}^\alpha(G_b^{-1}) T_{kl}^\alpha(G_a) T_{lj}^\alpha(G_b)
\end{align}

Using $\eqref{eq:3}$ to sum out $b$

\begin{align}
v_a &=    \sum_{ij \alpha}  \sum_{kl} \frac {Z(ij\alpha)}{g} \frac{g}{S_\alpha} \delta_{kl}  \delta_{ij}  T_{kl}^{\alpha}(G_a) \\
&= \sum_{i\alpha} \sum_k \frac{Z(ii\alpha)}{S_\alpha} T_{kk}^{\alpha}(G_a) \\
&= \sum_{i\alpha} \frac{Z(ii\alpha)}{S_\alpha} \chi^{\alpha}(G_a) = \sum_{\alpha} \chi^{\alpha}(G_a)  \left[ \sum_i \frac{Z(ii\alpha)} {S_\alpha} \right] = \sum_\alpha z(\alpha) \chi^\alpha(G_a)
\end{align}

We see that the $\chi^\alpha$ span this space of class functions; since by $\eqref{eq:4}$ they are also orthogonal, hence linearly independent, there must be exactly $n$ $\alpha$'s if there are $n$ classes.

\[
\bbox[5px,border:2px solid red]
{
\text{# of irreps = # of classes}
}
\]
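
For instance, $S_3$ has three classes and hence three irreps; its character table (a standard result, reproduced here for illustration) is

$$
\begin{array}{c|ccc}
 & E \;(C_p = 1) & \text{3-cycles} \;(C_p = 2) & \text{transpositions} \;(C_p = 3) \\
\hline
\chi^{\text{trivial}} & 1 & 1 & 1 \\
\chi^{\text{sign}} & 1 & 1 & -1 \\
\chi^{\text{standard}} & 2 & -1 & 0
\end{array}
$$

Row orthogonality $\eqref{eq:4}$ checks out directly, e.g. $\sum_p C_p \chi_p^{\text{sign}} \chi_p^{*\,\text{standard}} = 1 \cdot 2 + 2 \cdot (-1) + 3 \cdot 0 = 0$.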

Since the character table is square, we can re-write $\eqref{eq:4}$, which tells us the rows of the table are orthogonal, in the following form

$$
B_{\alpha p} = \sqrt{\frac{C_p}{g}} \chi_p^\alpha
$$

$$
\sum_p  B_{\alpha p} B_{\beta p}^{*} = \delta_{\alpha \beta}
$$


So $|\det(B)|^2 = \det(B)\det(B^{\dagger}) = 1$; in particular $B$ has an inverse, $B^{-1} = B^{\dagger}$, so $B^{\dagger} B = I$ as well, which implies the columns are also orthogonal:

$$
\sum_{\alpha} B_{\alpha p}^{*} B_{\alpha q} = \delta_{pq}
$$

$$
\sum_{\alpha} \frac{\sqrt{C_p C_q}}{g} \chi_p^{\alpha *} \chi_q^{\alpha} = \delta_{pq}
$$

Since the right-hand side vanishes unless $p = q$, we can set $C_q = C_p$:

\begin{equation}
\sum_\alpha \chi_p^{\alpha *} \chi_q^\alpha = \frac{g}{C_p} \delta_{pq}
\end{equation}

Woo!
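
As a quick check of the column relation against the $S_3$ table above: for the class of transpositions, $\sum_\alpha |\chi_p^\alpha|^2 = 1 + 1 + 0 = 2 = g/C_p = 6/3$, while two different columns, say $E$ and the transpositions, give $1 \cdot 1 + 1 \cdot (-1) + 2 \cdot 0 = 0$.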

Projection Operators

$$
G_a e_i^{\alpha} = \sum_j T_{ji}^\alpha(G_a) e_j^{\alpha} \, \text {  (defines a representation) }
$$

\begin{align}
\sum_a  T_{mn}^{\beta}(G_a^{-1}) G_a e_i^\alpha  &= \sum_{aj} T_{mn}^{\beta}(G_a^{-1}) T_{ji}^\alpha(G_a) e_j^\alpha \\
&= \sum_j \frac{g}{S_\beta} \delta_{mi} \delta_{nj} \delta_{\alpha \beta} e_j^{\alpha} \\
&= \frac{g}{S_\beta} \delta_{mi}  \delta_{\alpha \beta} e_n^{\beta}
\end{align}

Let $m=n$
$$
\sum_a T_{nn}^{\beta}(G_a^{-1}) G_a e_i^\alpha = \frac{g}{S_\beta} \delta_{ni}  \delta_{\alpha \beta} e_n^{\beta}
$$

Define the projection operator $P_n^\beta \equiv \frac{S_\beta}{g} \sum_a T_{nn}^{\beta}(G_a^{-1})\, G_a$, so that

$$
\left[ \sum_a \frac{S_\beta}{g} T_{nn}^{\beta}(G_a^{-1}) G_a  \right] e_i^\alpha= \delta_{ni}  \delta_{\alpha \beta} e_n^{\beta}
$$

$$
P_n^\beta e_i^\alpha = \delta_{ni} \delta_{\alpha \beta} e_n^{\beta}
$$
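
As a small numerical illustration (my own sketch, not part of the lecture), consider the cyclic group $C_3$ acting on its own regular representation. Every irrep of an abelian group is one-dimensional, $T^k(G_a) = e^{2\pi i k a/3}$, so the projector reduces to $P^k = \frac{1}{g} \sum_a T^k(G_a^{-1}) D(G_a)$, where $D(G_a)$ is the regular-representation (cyclic shift) matrix:

```python
import numpy as np

# Projection operators for C3 acting on its own regular representation.
g = 3
omega = np.exp(2j * np.pi / 3)

def D(a):
    """Regular representation of C3: cyclically shift the basis {e_0, e_1, e_2} by a."""
    m = np.zeros((g, g))
    for b in range(g):
        m[(a + b) % g, b] = 1.0
    return m

# P^k = (1/g) sum_a T^k(G_a^{-1}) D(G_a), with T^k(G_a) = omega^{k a}
P = [sum(omega ** (-k * a) * D(a) for a in range(g)) / g for k in range(g)]

for k in range(g):
    for l in range(g):
        # orthogonality and idempotence: P^k P^l = delta_kl P^k
        assert np.allclose(P[k] @ P[l], P[k] if k == l else np.zeros((g, g)))
# completeness: the projectors resolve the identity
assert np.allclose(sum(P), np.eye(g))
for k in range(g):
    for a in range(g):
        # the image of P^k transforms in the irrep T^k: D(G_a) P^k = omega^{k a} P^k
        assert np.allclose(D(a) @ P[k], omega ** (k * a) * P[k])
print("C3 projection operators: orthogonal, complete, and irrep-selecting")
```

Each projector here has rank one, so the regular representation of $C_3$ splits into its three one-dimensional irreps, one copy each, consistent with $m_\alpha = S_\alpha$ above.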