I could use some clarification. This isn't a difficult concept, but I feel it's easy to get wrong in practice.

Suppose you have a structure with a binary operation. The first thing to do is check whether associativity holds; this one is usually not an issue.

Now, for the identity.

You assume there exists an identity, then show it exists and that it is the identity.

Then, for inverses, you assume an inverse exists because the identity exists, and then you show it exists.

Is this the correct way to think about it and to do it? I run into issues with showing why an identity or an inverse must exist in problems involving structures whose binary operations are not very transparent.

I find this issue sometimes becomes worse when trying to prove that something is a normal subgroup, or even a subgroup.

    Assuming a thing exists (e.g., an identity) and then from that assumption proving it exists, is logically questionable. Don't assume it exists. Show it exists. – paw88789 Feb 18 '15 at 19:11
    I'm not really sure what you mean with *you assume there exists an identity, then show it exists*. From my poor knowledge of algebra I can tell you that most of the time, to show that a set has the structure of a group, you define an element to be the identity of that set and then show that it has the properties of the identity in a group. – Bman72 Feb 18 '15 at 19:11
    It makes no sense to say assume the identity exists, then show it exists. You simply show that an element satisfies the definition of the identity element. – Tim Raczkowski Feb 18 '15 at 19:11
    You should not write off associativity -- it is certainly quite tricky to establish for elliptic curve groups. It is easy if you are looking at a subset of a group, but otherwise it is often the most laborious property to check. See my example [here](http://math.stackexchange.com/questions/959280/existence-of-finite-non-associative-group-like-structures/1120879#1120879) of a non-associative structure that satisfies all the other conditions for a group (and it's even commutative) to get an idea why associativity is not necessarily obvious. – Robin Balean Feb 18 '15 at 21:13

2 Answers


Here is an "illustrative example":

Show that the set $\Bbb Q-\{-1\}$ (all the rationals except $-1$) forms a group under the operation $\ast$, where $a\ast b = a + b + ab$.

First, we need to know $\ast$ is actually a binary operation on said set. It is clear that the domain of $(\Bbb Q-\{-1\})\times (\Bbb Q-\{-1\})$ is perfectly acceptable, but it is not immediately evident that the range of $\ast$ is wholly contained in $\Bbb Q-\{-1\}$. This is an important condition, called closure, which often needs to be verified.

So let's see if this can happen: suppose $a\ast b = -1$. Since we are dealing with rational numbers, we can freely use facts about rational numbers we already know. Thus:

$a + b + ab = -1 \implies a(b + 1) + b + 1 = 0 \implies (a + 1)(b + 1) = 0$

Since this can only happen if either $a$ or $b$ (or both) is $-1$, and neither is, we are assured of closure. Verifying associativity is, in this case, straightforward (but a little tedious):

$(a\ast b)\ast c = (a + b + ab)\ast c = a + b + ab + c + (a + b + ab)c$

$= a + b + c + ab + ac + bc + abc$

$= a + (b + c + bc) + ab + ac + abc = a + (b + c + bc) + a(b + c + bc)$

$= a \ast (b + c + bc) = a \ast(b \ast c)$
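As a quick sanity check (not a substitute for the algebraic proof above), closure and associativity of $\ast$ can be spot-checked with exact rational arithmetic; the sample set below is my own choice:

```python
from fractions import Fraction
from itertools import product

def star(a, b):
    """The operation a*b = a + b + ab on Q - {-1}."""
    return a + b + a * b

# A small sample of rationals, none equal to -1.
sample = [Fraction(p, q) for p, q in [(0, 1), (1, 2), (-1, 3), (2, 1), (-3, 4)]]

# Closure: a*b is never -1 when a, b != -1.
for a, b in product(sample, repeat=2):
    assert star(a, b) != Fraction(-1)

# Associativity: (a*b)*c == a*(b*c) for every sampled triple.
for a, b, c in product(sample, repeat=3):
    assert star(star(a, b), c) == star(a, star(b, c))
```

Using `Fraction` rather than floats matters here: the equalities are exact, so a passing check is not an artifact of rounding.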

Now rather than assuming an identity exists (because we have no way of knowing IF it does), we instead look for which rational numbers might be possible candidates. Suppose that $e$ is one such candidate; then for it actually to be an identity we need, for any given $a$:

$a\ast e = a$, that is:

$a + e + ae = a \implies e + ae = 0 \implies e(1 + a) = 0$.

Since $1 + a \neq 0$ (because $a \neq -1$), the only viable candidate is $e = 0$. Now it is straightforward to show $0$ is indeed a two-sided identity for $\ast$ (alternatively, we could show that $\ast$ is commutative, and show $0$ is a one-sided identity).
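The two-sided claim for the candidate $e = 0$ can be spot-checked the same way (again a sanity check on sample values of my choosing, not a proof):

```python
from fractions import Fraction

def star(a, b):
    """The operation a*b = a + b + ab on Q - {-1}."""
    return a + b + a * b

# e = 0 acts as a two-sided identity for every sampled a != -1.
for a in [Fraction(p, q) for p, q in [(0, 1), (1, 2), (-2, 3), (5, 1)]]:
    assert star(a, Fraction(0)) == a
    assert star(Fraction(0), a) == a
```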

Similarly, we have no idea if inverses exist under $\ast$. If $a$ has an inverse, say $b$ (bearing in mind such an $a$ may not exist, or that only SOME $a$'s may have such an inverse $b$), we have that:

$a \ast b = 0 \implies a + b + ab = 0 \implies b + ab = -a \implies b = \dfrac{-a}{1 + a}$.

Note that the only rational number $a$ for which $b$ is undefined is $a = -1$, which is not an element of our set. One more caveat: we have to show $b$ is always in our set as well, which in this example boils down to showing $b$ cannot be $-1$ (it is clearly rational). But that leads to:

$-1 = \dfrac{-a}{1 + a} \implies \dfrac{a}{1 + a} = 1 \implies a = 1 + a \implies 0 = 1$, a contradiction. So it is clear this does not happen, and now we can readily verify that, given $a \in \Bbb Q - \{-1\}$, we have the unique inverse:

$a^{-1} = \dfrac{-a}{1+a}$, and we indeed have a group.
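The derived inverse formula can be exercised on a few sample rationals (my own picks), checking both that the candidate inverse stays in the set and that it inverts on both sides:

```python
from fractions import Fraction

def star(a, b):
    """The operation a*b = a + b + ab on Q - {-1}."""
    return a + b + a * b

def inv(a):
    """The candidate inverse derived above: a^{-1} = -a / (1 + a)."""
    return -a / (1 + a)

for a in [Fraction(p, q) for p, q in [(1, 2), (-2, 3), (3, 1), (-5, 7)]]:
    b = inv(a)
    assert b != Fraction(-1)                      # the inverse stays in Q - {-1}
    assert star(a, b) == 0 and star(b, a) == 0    # and it inverts on both sides
```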

This procedure "doesn't always work": for example, when trying to determine whether all $2\times 2$ real matrices have inverses under matrix multiplication, one has to solve a system of two simultaneous linear equations:

$\begin{bmatrix}a&b\\c&d\end{bmatrix}$ has an inverse if the system:

$ax + by = 0$

$cx + dy = 0$

has only the unique solution $x = 0, y = 0$. When one tries to use elimination and substitution on this system of equations, to ensure uniqueness one must assume that the quantity $ad - bc \neq 0$. However, this cannot be guaranteed, and it turns out it is precisely the matrices with $ad - bc = 0$ that fail to have inverses.
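A small sketch makes the dichotomy concrete: when $ad - bc \neq 0$ the familiar formula produces an inverse, and when $ad - bc = 0$ there is none. (The helper `inverse_2x2` below is my own illustration, not part of the answer.)

```python
from fractions import Fraction

def inverse_2x2(a, b, c, d):
    """Return the inverse of [[a, b], [c, d]] as a 4-tuple, or None."""
    det = Fraction(a * d - b * c)
    if det == 0:
        # The homogeneous system ax+by=0, cx+dy=0 then has nonzero
        # solutions, and the matrix has no inverse.
        return None
    # The standard adjugate formula: (1/det) * [[d, -b], [-c, a]].
    return (d / det, -b / det, -c / det, a / det)

# An invertible matrix (det = -2) and a singular one (det = 0).
assert inverse_2x2(1, 2, 3, 4) == (Fraction(-2), Fraction(1), Fraction(3, 2), Fraction(-1, 2))
assert inverse_2x2(1, 2, 2, 4) is None
```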

David Wheeler

It sounds like you're confusing two kinds of assumptions. There are assumptions you make because they're reasonable, and assumptions you make to explore their logical consequences, or to make an object completely understood when similar objects would be harder to pin down without those assumptions. When you say you assume there's an identity and then show it exists, that doesn't fit either kind: assuming something is true does not show it's true, and does not show it follows from independent assumptions. Also, when you say "then show it exists and that it is the identity," the word "it" still refers to an identity element, so you carry that assumption forward into the phrase "and that it is the identity"; that actually means "show that the identity is the identity." What you'd do instead is this: there might be some special element of the algebra, like 12 o'clock for the 12-hour clock under addition, that you suspect is the identity. You then show that it satisfies the definition of the identity for that addition, assuming the values of the 12-by-12 addition table for the clock are what they are. You never assume an algebra has an identity if you need to prove it has one.

When you then say you assume an inverse exists because the identity exists, you seem to be saying it is reasonable for an element to have an inverse under an operation whenever there is an identity element for that operation. But you can never take an element of an algebra, assume an inverse exists, and rightfully conclude that it exists. There's a famous debacle over the fact that a cancellative monoid need not embed in a group. Cancellativity looks like a reasonable assumption to make, but if you assume the monoid you're working with is cancellative and can be embedded in a group, you've just chosen a special case; whereas if you assume that any cancellative monoid must be embeddable in a group, you've assumed something false, since there are counterexamples.

Unless your algebra has been found "in the wild," you often have to assume an operation has an identity element to begin with. This is because if you take any magma $(S,!)$ with an identity $i$ and any $x \not\in S$, you can construct a new algebra $(S\cup\{x\},!')$ where $!'$ restricts on $S \times S$ to $!$, but where $x!'s=s!'x=x, \forall s \in S.$ However, suppose $(A,+),(B,*)$ are magmas, $(A,+)$ has an identity element $0 \in A,$ and $f:A \rightarrow B$ is a surjective homomorphism. Then for every $a \in A$ we have $f(a)*f(0)=f(a+0)=f(a)$, and since every $b \in B$ equals $f(a)$ for some $a \in A$, it follows that $b*f(0)=b$ for all $b \in B$; symmetrically, $f(0)*b = f(0+a) * \dots = b$ by the same argument on the other side. So you can prove that the image of any magma with identity under a surjective homomorphism also has an identity element.
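The argument can be watched in action with a concrete surjection, say $f : (\Bbb Z, +) \to (\Bbb Z_6, +)$, $f(n) = n \bmod 6$ (my choice of example, not from the answer): the image of the identity $0$ of $(\Bbb Z, +)$ acts as an identity on every element of the image.

```python
def f(n):
    """A surjective homomorphism (Z, +) -> (Z_6, +)."""
    return n % 6

e_image = f(0)  # the image of the identity 0 of (Z, +)

# Every b in Z_6 is f(a) for some integer a, and f(0) acts as an
# identity on each of them, on both sides.
for b in range(6):
    assert (b + e_image) % 6 == b
    assert (e_image + b) % 6 == b
```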

In a similar way, you can prove other identities of an algebra are preserved by homomorphisms. This leads to constructing free algebras over a set, using universal algebra, and proving that every algebra of a certain signature is an epimorphic image of the free algebra, of the same signature, over some set. If an isomorphism is just a relabelling, an epimorphism is a simulation. Ideally, every algebra with a given signature can be simulated by a free algebra of the same signature, which makes a universal strategy out of proving identities by showing they are preserved under homomorphisms from an algebra you already know to have them. A presentation of a group is in fact notation for a surjective homomorphism from a free group, and once you know the basics of normal subgroups and free groups it'll become clear how to extract the surjection from the presentation.

Here's an example of how you derive a new identity from ones that hold by axiom. You could have gone your whole life without knowing that in every loop $(L,*,/,\backslash,1)$, $\forall x, y \in L, x/(y \backslash x) = y.$

But proof: Let $x, y \in L$. Then $x/(y \backslash x) = c \iff x = c*(y \backslash x) \iff y*(y \backslash x) = c*(y \backslash x) \iff c = y,$ using the loop identity $y*(y \backslash x) = x$ and cancellation on the right.

More elaborately, "Assume $(L,*,/,\backslash,1)$ is a loop. Then (recite loop identities). Then let $x, y \in L$ hold...."
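As a concrete check of the derived identity (my own example, not from the answer): any group is in particular a loop, and in $(\Bbb Z_5, +)$ both loop divisions reduce to subtraction mod $5$, so $x/(y \backslash x) = y$ can be verified exhaustively.

```python
N = 5  # the loop (Z_5, +): a group, hence in particular a loop

def star(x, y):
    return (x + y) % N

def ldiv(y, x):
    r"""y \ x : the unique z with y * z = x."""
    return (x - y) % N

def rdiv(x, y):
    r"""x / y : the unique z with z * y = x."""
    return (x - y) % N

# The derived identity x/(y\x) = y, checked over all pairs.
for x in range(N):
    for y in range(N):
        assert rdiv(x, ldiv(y, x)) == y
```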

Loki Clock
  • Notation: $\forall$ read "for all" is the universal quantifier, $\exists$ read "there exist(s)", $(:)$ after quantifiers means "such that", and if $!$ is a binary operation on a set $S$ then since closure is that it takes any two elements $a, b$ of $S$ and defines a unique element $a!b$ of $S$, then it is equivalent to a function to $S$ from the set of pairs of elements of $S$, $!(a,b) \equiv a!b$. For example $+:\mathbb{N} \times \mathbb{N} \rightarrow \mathbb{N}, (a,b) \mapsto a+b, (2,3) \mapsto 2+3=5.$ – Loki Clock Feb 18 '15 at 23:30