Definition. matrices [br99]
Suppose $\{ \mathbf{x}_{1}, \dots, \mathbf{x}_{n} \}$ and $\{ \mathbf{y}_{1}, \dots, \mathbf{y}_{m} \}$ are bases of vector spaces $X$ and $Y$, respectively. Then every $A \in L(X, Y)$ determines a set of numbers $a_{ij}$ such that $$ A \mathbf{x}_{j} = \sum_{i=1}^{m} a_{ij} \mathbf{y}_{i} \qquad (1 \leq j \leq n). $$ We represent $A$ by an $m$ by $n$ matrix: $$ [A] = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m 1} & a_{m 2} & \cdots & a_{mn} \end{bmatrix} $$
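As a concrete sketch of this definition (the map $M$ and both bases below are hypothetical choices, not from the text): column $j$ of $[A]$ holds the coordinates of $A\mathbf{x}_{j}$ in the basis $\{\mathbf{y}_{i}\}$, so each column can be found by solving a linear system against the matrix whose columns are the $\mathbf{y}_{i}$.

```python
import numpy as np

# Hypothetical linear map A : R^2 -> R^3 given by A(x) = M x.
M = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, -1.0]])

# Hypothetical (non-standard) bases: columns are x_1, x_2 and y_1, y_2, y_3.
X = np.column_stack([[1.0, 0.0], [1.0, 1.0]])
Y = np.column_stack([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 1.0]])

# Column j of [A] solves  sum_i a_ij y_i = A x_j,  i.e.  Y a_j = M x_j.
A_mat = np.linalg.solve(Y, M @ X)   # the 3-by-2 matrix [A] in these bases
```

Changing either basis changes `A_mat`, while the map $A$ itself stays the same, which is exactly the dependence noted below.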
$[A]$ depends not only on $A$ but also on the choice of bases in $X$ and $Y$.
Suppose now that $\{ \mathbf{x}_{1}, \dots, \mathbf{x}_{n} \}$ and $\{ \mathbf{y}_{1}, \dots, \mathbf{y}_{m} \}$ are the standard bases of $\mathbb{R}^{n}$ and $\mathbb{R}^{m}$. If $\mathbf{x} = \sum_{j} c_{j} \mathbf{x}_{j}$, then the Schwarz inequality shows that $$ \lvert A \mathbf{x} \rvert^{2} = \sum_{i} \left( \sum_{j} a_{ij} c_{j} \right)^{2} \leq \sum_{i} \left( \sum_{j} a_{ij}^{2} \cdot \sum_{j} c_{j}^{2} \right) = \sum_{i, j} a_{ij}^{2} \, \lvert \mathbf{x} \rvert ^{2}. $$ Thus $$ \lVert A \rVert \leq \sqrt{ \sum_{i, j} a_{ij}^{2} }. $$ Moreover, if we replace $A$ by $B - A$, where $A, B \in L(\mathbb{R}^{n}, \mathbb{R}^{m})$, this inequality shows that small changes in the entries produce small changes in $\lVert B - A \rVert$; hence if each $a_{ij}$ is a continuous function of a parameter, so is the transformation it determines. More precisely:
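The bound $\lVert A \rVert \leq \sqrt{\sum_{i,j} a_{ij}^{2}}$ can be checked numerically; the matrix below is a hypothetical example, and we use the fact that the operator norm equals the largest singular value.

```python
import numpy as np

# A hypothetical 3x4 matrix with the standard bases of R^4 and R^3.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))

op_norm = np.linalg.norm(A, 2)        # ||A|| = sup_{|x|=1} |Ax| (largest singular value)
frobenius = np.sqrt((A ** 2).sum())   # sqrt of the sum of all a_ij^2
```

Here `op_norm <= frobenius` always holds, with equality exactly when $A$ has rank one.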
If $S$ is a metric space, if $a_{11}, \dots, a_{mn}$ are real continuous functions on $S$, and if, for each $p \in S$, $A_{p}$ is the linear transformation of $\mathbb{R}^{n}$ into $\mathbb{R}^{m}$ whose matrix has entries $a_{ij}(p)$, then the mapping $p \mapsto A_{p}$ is a continuous mapping of $S$ into $L(\mathbb{R}^{n}, \mathbb{R}^{m})$.
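A quick numerical illustration of this continuity statement (the entry functions below are a hypothetical choice with $S = \mathbb{R}$): the estimate $\lVert A_{p} - A_{q} \rVert \leq \sqrt{\sum_{i,j} (a_{ij}(p) - a_{ij}(q))^{2}}$ forces $p \mapsto A_{p}$ to be continuous whenever every entry is.

```python
import numpy as np

def A(p):
    # Hypothetical continuous entries a_ij(p) = sin((i + j + 1) p) on S = R.
    i, j = np.meshgrid(np.arange(3), np.arange(4), indexing="ij")
    return np.sin((i + j + 1) * p)

p, eps = 0.7, 1e-6
diff = A(p + eps) - A(p)
gap = np.linalg.norm(diff, 2)          # operator-norm distance ||A_{p+eps} - A_p||
bound = np.sqrt((diff ** 2).sum())     # sqrt(sum (a_ij(p+eps) - a_ij(p))^2)
```

A small perturbation of `p` yields a small `gap`, dominated by `bound`, as the theorem requires.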