# Dumb Questions (and Answers!)

## Daniel McLaury

I constantly find myself (and my friends) asking stupid questions, and/or tricked into believing things that aren't true. Even figuring them out for myself isn't always enough to keep me from making the same mistake again. So I've decided I'm going to keep a list here with the hopes that someone out there (or, failing that, I myself) will find it useful one day.

## Basics

- If f is a function, then $f(f^{-1}(Y)) \subseteq Y$ for any subset Y of its codomain, and $X \subseteq f^{-1}(f(X))$ for any subset X of its domain. (Here $f^{-1}$ denotes the preimage.) As an exercise, what conditions would you need to put on f to make each inclusion an equality?
- On a similar note, a function can have at most *one* (two-sided) inverse, since if g and h are both inverses of f then h = (gf)h = g(fh) = g, and of course the same argument works for any associative operation. However, a function which does not have a left inverse can have arbitrarily many right inverses, and vice-versa.
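As a concrete illustration of the two inclusions above, here's a small Python sketch; the function and subsets are made up purely for illustration:

```python
# A toy function f : {1, 2, 3} -> {'a', 'b', 'c'} that is neither
# injective (f(1) = f(2)) nor surjective ('c' is never hit).
f = {1: 'a', 2: 'a', 3: 'b'}

def image(S):
    """f(S) for a subset S of the domain."""
    return {f[x] for x in S}

def preimage(T):
    """The preimage f^{-1}(T) for a subset T of the codomain."""
    return {x for x in f if f[x] in T}

# f(f^{-1}(T)) can be strictly smaller than T when f is not surjective:
T = {'a', 'c'}
print(image(preimage(T)))    # {'a'}: nothing maps to 'c', so it is lost

# S can be strictly smaller than f^{-1}(f(S)) when f is not injective:
S = {1}
print(preimage(image(S)))    # {1, 2}: 2 sneaks in because f(2) = f(1)
```

Which failure mode kills which equality is exactly the exercise above.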
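On the right-inverse point, a quick sketch with a made-up surjection f(n) = n // 2 from {0, ..., 5} onto {0, 1, 2}:

```python
# f is surjective but not injective, so it has right inverses
# (several, in fact) but no left inverse.
def f(n):
    return n // 2

# Two different right inverses, each choosing one preimage per output:
g = {0: 0, 1: 2, 2: 4}   # always picks the even preimage
h = {0: 1, 1: 3, 2: 5}   # always picks the odd preimage

print(all(f(g[y]) == y for y in (0, 1, 2)))   # True: f(g(y)) = y
print(all(f(h[y]) == y for y in (0, 1, 2)))   # True: f(h(y)) = y, yet g != h
```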

## Calculus

- A lot of results in elementary calculus are "proved" by pulling an infinite sum inside of an integral, or a derivative outside of an integral, or something similar. This sort of thing is not safe to do in general, and the reasons generally aren't made clear until some sort of analysis class — in fact, the field of mathematical analysis was motivated largely by questions of when it was and wasn't safe to do this sort of thing.
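A standard counterexample (my go-to, not tied to any particular textbook): on $[0,1]$, let $f_n(x) = n$ for $0 < x \leq 1/n$ and $0$ otherwise. Then $\int_0^1 f_n = 1$ for every $n$, but $f_n(x) \to 0$ pointwise, so the integral of the limit is $0$. A rough numerical sketch in Python (the step count is arbitrary):

```python
def f(n, x):
    """f_n(x) = n on (0, 1/n], and 0 elsewhere."""
    return n if 0 < x <= 1 / n else 0

def integral(n, steps=100_000):
    """Midpoint Riemann sum of f_n over [0, 1]."""
    dx = 1 / steps
    return sum(f(n, (k + 0.5) * dx) for k in range(steps)) * dx

for n in (1, 10, 100):
    print(integral(n))                       # about 1.0 for every n

print([f(n, 0.25) for n in (1, 10, 100)])    # [1, 0, 0]: pointwise limit is 0
```

So the limit of the integrals is 1 while the integral of the limit is 0; swapping them silently would be off by 1.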

## Group Theory

- Suppose we have a group G, a normal subgroup N, and the quotient group H = G / N. It is certainly *not* necessarily the case that G is isomorphic to $H \times N$ (this mistake I made in the first term of my freshman algebra class), nor is it true that G is isomorphic to a semidirect product $N \rtimes H$ (*this* mistake I made in the second term!). In fact, see section 3.4 of Dummit and Foote: "[T]he number of groups of order $2^n$ grows (exponentially) as a function of $2^n$, so the number of ways of putting groups of 2-power order together is not bounded."
- For that matter, there is in general no such thing as "the" semidirect product $N \rtimes H$ of two groups. There is always at least one semidirect product — namely the direct product itself — but there are many more possibilities.
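The smallest example of the first mistake is worth keeping handy: take G = Z/4 and N = {0, 2}, so both N and H = G/N are isomorphic to Z/2. If G were H × N, every element of G would have order dividing 2. A tiny Python check of the element orders (just a sketch):

```python
def order(g, n):
    """Order of g in the additive group Z/n."""
    k, x = 1, g % n
    while x != 0:
        x = (x + g) % n
        k += 1
    return k

print(sorted(order(g, 4) for g in range(4)))   # [1, 2, 4, 4] in Z/4

# In Z/2 x Z/2 every non-identity element has order 2:
klein = [(a, b) for a in range(2) for b in range(2)]
def order2(p):
    return 1 if p == (0, 0) else 2
print(sorted(order2(p) for p in klein))        # [1, 2, 2, 2]
```

And since Aut(Z/2) is trivial, the only semidirect product of Z/2 by Z/2 is the direct product, so Z/4 is also a counterexample to the semidirect-product claim.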

## Linear Algebra

- While it's true that Tr(AB) = Tr(BA) in general (given that A and B can be multiplied), this does *not* mean that Tr(AB) = Tr(A)Tr(B), or even that there's some general "commutative property" of matrices inside traces: the trace is invariant only under *cyclic* permutations of a product. For instance, it's not true that Tr(ABC) = Tr(ACB) in general. I figured this out the hard way, and — if you're to take his word for it — so did at least one of my professors.
- Diagonalizable (square) matrices commute if and only if they can be diagonalized simultaneously, i.e. if there's a single basis in which both are diagonal. This isn't hard to prove — just show that each eigenspace of one matrix is invariant under the other, and vice versa — but it's surprisingly little-known among certain groups of people who work with matrices. I probably learned this in school, but I'd forgotten it by the time I needed it.
- Infinite-dimensional vector spaces don't behave the way you might expect. For instance, take the real vector space V consisting of all infinite sequences of real numbers under componentwise addition and scalar multiplication. The "elementary" vectors $e_1 = (1, 0, 0, \ldots)$, $e_2 = (0, 1, 0, \ldots)$, and so on do *not* form a basis, because a linear combination is a *finite* sum: an element of V which has infinitely many nonzero terms is not a linear combination of the $e_i$'s above. This makes the result which says that *every* vector space has a basis much more incredible, and may serve as your first introduction to some of the uncomfortable philosophical questions raised by the axiom of choice.
- The product of two symmetric matrices is not necessarily symmetric. It's not even particularly hard to find examples of this — two-by-two matrices will suffice.
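A concrete check of the trace claim, with matrices picked purely for illustration (the standard matrix units $E_{11}$, $E_{12}$, $E_{21}$):

```python
def matmul(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(X):
    return X[0][0] + X[1][1]

A = [[1, 0], [0, 0]]   # E11
B = [[0, 1], [0, 0]]   # E12
C = [[0, 0], [1, 0]]   # E21

print(trace(matmul(matmul(A, B), C)))   # 1
print(trace(matmul(matmul(A, C), B)))   # 0: swapping B and C changes the trace
```

Only the cyclic shifts are safe: Tr(ABC) = Tr(BCA) = Tr(CAB).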
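The symmetric-matrix bullet above also has a tiny two-by-two counterexample; the matrices here are just one easy choice:

```python
def matmul(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

S = [[0, 1], [1, 0]]   # symmetric
T = [[1, 0], [0, 2]]   # symmetric (diagonal)

ST = matmul(S, T)
print(ST)              # [[0, 2], [1, 0]], and ST[0][1] != ST[1][0]
```

In fact, for symmetric S and T we have $(ST)^T = T^T S^T = TS$, so the product is symmetric exactly when S and T commute.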

## Representation Theory

- A representation of a Lie algebra L is a homomorphism $\varphi : L \rightarrow \mathfrak{gl}(V)$; as such, the elements of L act on V as vector space homomorphisms (that is, linear maps) — *not* necessarily as L-module homomorphisms. Of course there are analogs in other categories.