85

This is the ONE thing about my undergraduate studies in computer science that I haven't been able to 'link' to my real life (academic and professional). Almost everything else I studied I've seen applied (directly or indirectly), or it has given me Aha! moments when I understood the principles behind the applications.

Groups, Rings and Fields have always eluded me. I always thought they were useful (instinctively) but failed to see where/how. Are they just theoretical concepts without practical applications? I hope not. So what are their applications, especially in the field of computer science? No matter how arcane/remote their use, I still want to know.

Pete L. Clark
  • 93,404
  • 10
  • 203
  • 348
PhD
  • 2,543
  • 5
  • 25
  • 31
  • 40
    Cryptography does not ring a bell? – sxd Jul 21 '13 at 22:12
  • 3
    Would it be so terrible to have learned something beautiful that lacked practical applications? I hope not. – Jesse Madnick Jul 21 '13 at 22:21
  • 10
    This paper [The Equivalence Problem of Multitape Finite Automata](http://users.utu.fi/harju/articles/eilenb.pdf) is a particularly nice example of algebra usage in computer science, featuring monoids, groups, skew fields and others ;-) Another nice example might be this blog post [An Approach to Algorithm Parallelisation](http://blog.sigfpe.com/2008/11/approach-to-algorithm-parallelisation.html) which uses ring abstraction to fit a seemingly non-parallelizable function into the subset-sum scheme. – dtldarek Jul 21 '13 at 22:24
  • 13
    @JesseMadnick advocating pure mathematics as useless is, for us pure mathematicians, a dangerous thing to do. How can you justify being paid for doing pure maths if all you can say is "it's beautiful, but only a select few geniuses like us can see that beauty". Mathematics, even pure, is practical and important to science. That is why it pays well. If it were an art, nobody would pay for it. – Ittay Weiss Jul 21 '13 at 22:28
  • Take a look at Frédérique Oggier's [monographs.](http://www1.spms.ntu.edu.sg/~frederique/Publi.html) One of her monographs is about applications of central simple algebras to wireless communication. – Prism Jul 21 '13 at 22:45
  • 1
    http://math.stackexchange.com/questions/324253/are-there-real-world-applications-of-finite-group-theory/324435#324435 – Amzoti Jul 21 '13 at 23:18
  • 11
    I'm in the habit of linking everything I've learnt to other things I've learnt. I just want to see it. It's a turn-on to see the linkage. Satisfies my curiosity. I am not dismissing them as useless. I just want to know their use/applicability since I wasn't able to discern it. – PhD Jul 22 '13 at 01:03
  • @Ngua - Honestly, I was never taught/learnt cryptography with an iota of groups/rings/fields in it. I'd love to see that made obvious. – PhD Jul 22 '13 at 01:05
  • Another little answer, that I don't know enough about to expand on, is that finite fields come up in (at least the basic part of) the theory of error-correcting codes. – Eric Stucky Jul 22 '13 at 01:10
  • 2
    @IttayWeiss: I'm well aware that pure math is very applicable, and I'm not advocating its uselessness. My point was that _even if_, _hypothetically_, it were the case that pure math lacked applications, it would still be worth learning for its beauty alone. To reiterate: I hope the OP would not consider it a waste of time to learn something beautiful (like art or literature) that lacked practical applications. – Jesse Madnick Jul 22 '13 at 01:26
  • 8
    @JesseMadnick - Absolutely NOT! Math is beautiful! I wanted to know if it's only the beauty I need to appreciate or whether I'm being superficial and not seeing the 'hidden' beauty of its application(s). Hence the question. PS: I study math topics just 'cause they are beautiful and don't care so much about their practical applications. But engineering schools rarely teach you something for beauty, and thus I was curious. – PhD Jul 22 '13 at 01:31
  • 25
    can we not do the whole "I don't need *real life* (sneer) applications" routine? Good for you if you don't *need* them, but the applications to other academic fields have always been a huge source of inspiration for mathematics and play a large part in making it so wonderfully rich. Acting like "I don't need them" $\implies$ "we should all ignore them" is just as ignorant as the reverse view. – Robert Mastragostino Jul 22 '13 at 11:47
  • 4
    If you ever take a course on advanced 3D rendering topics, they'll use topology *(which uses group-theory)* extensively. – BlueRaja - Danny Pflughoeft Jul 22 '13 at 17:14
  • What is a real life application of Fermat's last theorem? Personally I don't care about it. However, that does not mean that we should all ignore the question. – Makoto Kato Aug 07 '13 at 18:41

12 Answers

94

If every second or so your computer’s memory were wiped completely clean, except for the input data; the clock; a static, unchanging program; and a counter that could only be set to 1, 2, 3, 4, or 5, it would still be possible (given enough time) to carry out an arbitrarily long computation — just as if the memory weren’t being wiped clean each second. This is almost certainly not true if the counter could only be set to 1, 2, 3, or 4. The reason 5 is special here is pretty much the same reason it’s special in Galois’ proof of the unsolvability of the quintic equation.

-Scott Aaronson

PhD
  • 2,543
  • 5
  • 25
  • 31
43

Groups and fields, primarily finite ones, are used extensively in coding theory. Many of the results in number theory that give rise to important encryption systems (e.g., RSA) can actually be seen as results in group theory. If you include applications outside of computer science, it would be hard to exaggerate the importance of group theory. Groups are literally everywhere. The theory of group representations, for instance, is useful in chemistry (particularly in crystallography).

The reason groups are so important is that they model symmetry; the reason fields matter, at least for coding theory and cryptography, is that they codify very intricate combinatorics.

So, in computer science, whenever you watch a video online, make a phone call, purchase something over the internet, compress a file, send an email, or communicate with the Mars Rover, lots of groups and fields are being used behind the scenes.
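To make the RSA remark above concrete, here is a minimal Python sketch with tiny textbook numbers (wholly insecure, purely illustrative, and not part of the original answer). The group-theoretic fact doing the work is that the units modulo $n$ form a group of order $\varphi(n)=(p-1)(q-1)$, so raising to an exponent that is $\equiv 1 \pmod{\varphi(n)}$ is the identity map on that group.

```python
# Toy RSA with textbook-sized numbers: insecure, for illustration only.
# Group theory at work: in the unit group (Z/nZ)*, of order phi(n) = (p-1)(q-1),
# m**phi(n) = 1 (for m coprime to n), so m**(e*d) = m whenever e*d = 1 (mod phi(n)).
p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # order of the unit group (Z/nZ)*
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e (Python 3.8+)

m = 65                       # a "message"
c = pow(m, e, n)             # encrypt: exponentiation in (Z/nZ)*
assert pow(c, d, n) == m     # decrypt: the group structure guarantees we recover m
```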

Ittay Weiss
  • 76,165
  • 7
  • 131
  • 224
31

Saying my bit, since it's too long to fit into a comment. I apologize in advance for being a bit chatty. Well, this is a soft question, so the answer is gonna be soft as well.

Several posters have emphasized some technological applications of abstract algebra. From the point of view of algebraists these are fine answers (surprisingly generously upvoted, actually - maybe the hearts of practitioners of abstract algebra warm up to these). You will get different answers if you ask a different group of people. Some might point to how algebraic structures provide the playground for problems central to the study of complexity classes of algorithms. I dunno?

I describe two discussions I have had that I think are relevant to this question.

I once chatted with a professor in computer science. I suggested that maybe I should supplement our linear algebra lecture notes with a chapter on how orthogonal coordinate transformations (rotations and such) are applied in 3D-graphics. I had found that homework problems related to this theme motivated some of my students. His reply was that it is not clear cut. Most of the programmers that the university spews out will end up working in teams. His point was that it is unnecessary for ALL the programmers to know about groups of rotations, because only a small subset of the team would work on the 3D-graphics engine - if any at all. In the DOS era, with homegrown code, it was enough for SOME of the programmers in the team to know this inside out. But nowadays a lot of the hard work has been "outsourced" via a standard interface to the manufacturer of the graphics card (so I was told). True, SOME OTHER programmers still need to know enough to use that interface to good effect.

I am thinking that the same principles apply elsewhere. It is not at all necessary for most programmers to know about finite fields even if they are working in a team building apps on top of a layer of error-correcting codes and/or cryptographic modules/protocols. Yes, these are fascinating applications of algebra, but even for those guys an in-depth understanding of the algebra involved is not a high priority.

However, when you are creating something new, it is clearly good to have people in your team who have a deeper understanding of the underlying principles. And "deep" means something different from what it means to the readers of this question. This is a pons asinorum to the other discussion I recall. During my short stint in the telecom industry I met this guy from Nokia-Siemens Networks. I was told that he is the top patent inventor of the team from the Siemens side. I also recognized him as one of the guys who competed at the IMO for West Germany at the same time I represented Finland. A modest dude who explained the recipe of his success as "Oh, most of my inventions are trivial applications of modular arithmetic - programmers and engineers don't understand it." I will testify that his last sentence is true. They learn about the binary mod and various rules around it, but they don't learn the bidirectional power of congruences, and don't learn to think in terms of the residue class rings - it's all remainders of divisions of integers to them. So, by being in the right place at the right time, you can dine out simply because you can work swiftly with periodically repeating discrete structures and patterns.
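As a small illustration of the difference (my own toy example, not the Nokia-Siemens one): thinking in residue classes lets you reason about enormous numbers without ever computing them, and the congruence can be used in both directions.

```python
# "Remainder of a division" thinking would first need the astronomically large
# integer 7**2024. Residue-class thinking stays inside Z/100Z the whole time.
print(pow(7, 2024, 100))     # 1, i.e. the last two digits of 7**2024 are 01

# The congruence also runs the other way: since 7**4 = 2401 = 1 (mod 100),
# 7**2024 = (7**4)**506 = 1 (mod 100), with no exponentiation needed at all.
assert pow(7, 4, 100) == 1
```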

Again, something that not all CS majors need to know, but "in the land of the blind, the one-eyed man is king". <- Sorry, I just couldn't resist. The upshot here is that we don't know what kind of an eye will become useful.

So the reason to learn a bit about abstract algebra is not its usefulness in any given set of applications. I guess (I truly don't think I am the right guy to say this) it is more a question about having more tools to bring order to the chaos surrounding the (currently unknown) programming tasks that lie in your future.

Jyrki Lahtonen
  • 123,594
  • 24
  • 249
  • 599
  • 2
    This is delightfully cynical, although I am not sure I agree with the sentiment (which I interpret as being "you do not need to know about Rings and things because someone else will do that bit for you"). Although this sentiment may be true, knowing about Rings and things means that you can understand how these black boxes work. For example, I know about CRC because my wife's uncle was trying to understand how hardware communicated for a program he was writing. He didn't know about Rings, so he was struggling (he had the source code, nothing more, and it didn't mention rings...). – user1729 Jul 22 '13 at 12:26
  • 9
    Oh, my message was apparently lost. The message was meant to be that A) you don't need to know about rotation groups because they are useful in 3D-graphics; B) you don't need to know about finite fields because they are useful in coding theory; C) et cetera. You may not need to know about any of that and be a good programmer. But knowing about these gives you more tools to model your tasks with. So IF you know them, it will be easier for you to create something new. – Jyrki Lahtonen Jul 22 '13 at 12:54
  • 4
    The cynical part is that it is not cost-efficient to teach all the CS majors abstract algebra (when I was in a committee drafting a math curriculum for engineers this saddened me no end). But the aspiring ones should learn it and other tools - just in case (and also because that enables them to learn more tools). We simply cannot foresee what kind of math they will be needing during their careers, because the field is moving fast and different people will face such a variety of problems. – Jyrki Lahtonen Jul 22 '13 at 12:59
  • 1
    I like your second comment, and I understand your answer much better for it. Thanks. – user1729 Jul 22 '13 at 13:47
  • 2
    Programming is all math under the hood. Therefore the more you know about math, the more it can inform your programming. There is nothing more frustrating than when you can visualize what you want to program, but in trying to implement it, you hit a math wall. – CommaToast Jun 19 '17 at 20:18
  • "They learn about the binary mod and various rules around it, but they don't learn the bidirectional power of congruences, and don't learn to think in terms of the residue class rings - it's all remainders of divisions of integers to them" Can you elaborate on this? I would love to know more. I am the kind of programmer you're referring to. – Otavio Macedo Dec 12 '21 at 10:04
17

Group theory has been used in theoretical physics for about 80 years.

Dusan Nesic
  • 121
  • 4
  • 5
    This was once explained to me by using [Noether's theorem](http://en.wikipedia.org/wiki/Noether%27s_theorem), which, so the explanation went, says that a conserved quantity (such as energy) corresponds to a symmetry, so, to a group! – user1729 Jul 22 '13 at 09:46
17

When a piece of computer hardware communicates with another piece it sends a string of 1s and 0s of arbitrary length, and clearly there is the possibility of an error. To check for errors, the hardware does some nifty ring theory. Formally, this is called a Cyclic Redundancy Check (CRC), and it is a faithful generalisation of the "check digit" we all learned about in school. However, it can spot $n$ errors as opposed to just the one (or rather, not-equal-to-zero-mod-($n+1$)-errors as opposed to an-odd-number-of-errors).

Right, so, what the hardware does is this: There is a fixed string, say $1011$. Every piece of hardware knows this. Then the hardware wants to send a string, which has arbitrary length. For example, $11111001$. The hardware then repeatedly XORs the message with the fixed string, which is something hardware can do really quickly (one of the benefits of CRC). $$\begin{align*} &11111001\\ &1011\\ &-----\\ \rightarrow & 01001001 \end{align*}$$ We now have a new string, $1001001$. So XOR again, repeatedly, until we are forced to stop. $$\begin{align*} &1001001\\ &1011\\ &-----\\ \rightarrow & 0010001\\ &-----\\ &10001\\ &1011\\ &-----\\ \rightarrow &111 \end{align*}$$ and we have to stop as $1011$ is longer than $111$, so we cannot XOR. If we had ended up with a string of length less than three, for example $11$, then we store it as $011$. The hardware then sends the concatenation of the original string with this new string of length three, $11111001111$. The receiving piece of hardware cuts off the last three digits to get the original string and performs the identical operation (remember: it knows the string $1011$ too). If it ends up with the transmitted check string ($111$ here), things are hunky-dory. Otherwise, there is an error, so it asks for the string to be sent again.
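Here is a minimal Python sketch of that XOR "long division" (my transcription of the procedure above, not how real hardware implements it; hardware uses a shift register):

```python
# XOR "long division" of a bit string by the fixed string, as in the worked example.
def crc_remainder(message: str, divisor: str) -> str:
    bits = list(message)
    n = len(divisor)
    for i in range(len(bits) - n + 1):
        if bits[i] == '1':                       # only XOR where the leading bit is 1
            for j in range(n):
                bits[i + j] = str(int(bits[i + j]) ^ int(divisor[j]))
    return ''.join(bits[-(n - 1):])              # the last n-1 bits are the remainder

remainder = crc_remainder("11111001", "1011")
print(remainder)                                 # '111', as in the worked example
print("11111001" + remainder)                    # '11111001111', the transmitted string
```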

This system has a number of advantages. For example, XOR is performed quickly by hardware and strings can be of unbounded length. So, this is genuinely used by hardware.

Right, now for the mathematics! What is going on is that we are viewing strings as elements of the ring $\mathbb{F}_2[x]$, so our example string $11111001$ is the polynomial $x^7+x^6+x^5+x^4+x^3+1$. The hardware is simply finding the coset representative of this polynomial in the quotient ring $\mathbb{F}_2[x]/(x^3+x+1)$ (the fixed string $1011$ corresponds to $x^3+x+1$), which here is $x^2+x+1$. Simples!

For the extra mathematical twist: This works best if the fixed string corresponds to an irreducible polynomial, so then the quotient ring is actually a field. Otherwise, there are zero-divisors, and zero-divisors mean that errors will not be picked up.

user1729
  • 28,057
  • 7
  • 60
  • 128
11

Two important applications which are involved in every phone call on your mobile phone: coding theory (adding redundancy to the information, so that errors that occur can be compensated for) and cryptography.

In both cases, basic mathematical structures are needed in which the computer can do the arithmetic. In contrast to numerical simulations of "real life", we are completely free in our choice. A human would probably go with the integers, rationals or reals, where he feels "at home". But for a computer, they are not the best choice, since their elements can grow arbitrarily big. Much better suited are the finite fields, having only a finite number of elements. The theory behind them involves a lot of abstract algebra. And the coding-theoretic / cryptographic methods built upon them involve even more abstract algebra...
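A minimal sketch of what a finite field buys the computer, using the small prime field $\mathbb{F}_7$ purely for illustration (real codes and ciphers typically use larger fields such as $\mathbb{F}_{2^8}$):

```python
# Arithmetic in GF(7) = Z/7Z: elements never grow beyond a machine word, yet
# every nonzero element has a multiplicative inverse, so exact "division" works.
# That combination is what coding theory and cryptography lean on.
p = 7
def add(a, b): return (a + b) % p
def mul(a, b): return (a * b) % p
def inv(a):    return pow(a, p - 2, p)   # Fermat's little theorem: a**(p-1) = 1 (mod p)

assert all(mul(a, inv(a)) == 1 for a in range(1, p))   # every nonzero a is invertible
print(add(5, 4), mul(5, 4), inv(5))                    # 2 6 3   (since 5 * 3 = 15 = 1 mod 7)
```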

azimut
  • 20,315
  • 10
  • 65
  • 122
8

Semi-rings are used in computational linguistics for weighted automata algorithms. An automaton is a kind of directed graph with annotated edges (or, sometimes, annotated nodes) which is traversed during processing of some input data, and serves to transform the input data (e.g. translate one language into another), annotate the input data (e.g. tag the words in the input with their corresponding part of speech tags), or do some calculation over the input.

Weights can be added to the edges of automata, where the weights are elements of the carrier set of a semi-ring, to allow a wider variety of such calculations; for example, they facilitate working with probabilities. Weights can be real numbers, probabilities, distances, strings, feature structures, sets, matrices, or other elements, depending on the chosen semi-ring.

The semi-ring properties, such as closure properties and algebraic possibilities on the semi-ring, are very important for many language-related automata algorithms: finding shortest paths, longest matches, doing transformations on the automaton (such as determinization and minimization) and many others.

Here is a list of some semi-rings that might be used for automata weights, where each semi-ring consists of a carrier set together with two operations, abstract plus and abstract times, with their respective identity elements. When an automaton is traversed, the weights along each path are multiplied, and then the results of all paths are added to obtain a result (a small code sketch of this pattern follows the list).

  • Boolean semiring: $ \left< \{t,f\}, \vee, \wedge, f, t \right> $
    "A path is $ true $ if all its edges are $ true $; the result is $true$ if there is at least one $true$ path."
  • real semiring: $ \left< \mathbb{R}, +, \cdot, 0, 1 \right> $
  • probabilistic semiring: $ \left< [0,1], +, \cdot, 0, 1 \right> $
    "Probabilities along each path are multiplied; then the sum of all path probabilities is the result"
  • tropical semiring: $ \left< \mathbb{R}_\infty^+, min, +, \infty, 0 \right> $
    "If the weights represent edge lengths, the result is the shortest possible path length from start to end node"
  • Viterbi semiring: $ \left< [0,1], max, \cdot, 0, 1 \right> $
    "maximum probability of any path from start to end node"
  • log semiring: $ \left< \mathbb{R}_\infty, \oplus_{log}, +, \infty, 0 \right> $
    with $ x \oplus_{log} y = -\ln\left( e^{-x} + e^{-y} \right) $
    "like the probabilistic semiring, but $-\log$-transformed to avoid numerical underflow and increase performance; see Intuitive use of logarithms"
  • arctic semiring: $ \left< \mathbb{R}_{-\infty}, max, +, -\infty, 0 \right> $
    "opposite of tropical; longest paths"
  • fuzzy semiring: $ \left< [0,1], max, min, 0, 1 \right> $
    "If edges are bridges, weights are maximum load capacities, this will obtain the maximum weight that can be hauled from start node to end node: get strongest path, where each path is only as strong as its weakest link"
  • concatenation semiring: $ \left< 2^{\Sigma^*}, \cup, \cdot, \emptyset, \{\epsilon\} \right> $
    with $ 2^{\Sigma^*} = $ power set of $\Sigma^*$ (the set of all strings) $ = $ set of all formal languages,
    $ \epsilon = $ empty string
  • string semiring: $ \left< \Sigma^* \cup \{s_\infty\}, \wedge, \cdot, s_\infty, \epsilon \right> $
    with $s_\infty = $ a formal string which contains all other strings as prefixes,
    $ \wedge = $ longest common prefix
  • set semiring: $ \left< 2^M, \cup, \cap, \emptyset, M \right> $
    where $ M $ is an arbitrary set
  • unification semiring: $\left< 2^\mathcal{F}, \cup, \sqcup, \emptyset, \{\bot\} \right> $
    where $\mathcal{F} = $ set of feature structures
    $\sqcup = $ unification
    $\bot = $ bottom, invalid feature structure
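Here is a minimal sketch of the "multiply along paths, add across paths" pattern described before the list (my own toy graph and weights, not taken from any particular toolkit): the same path-summing routine computes different things depending only on which semiring is plugged in.

```python
# Combine edge weights along each path with the semiring's "times", then fold
# the per-path results together with its "plus". Swapping the semiring swaps
# what is being computed, without touching the traversal code.
def path_sum(paths, plus, times, one):
    per_path = []
    for path in paths:
        acc = one
        for w in path:
            acc = times(acc, w)     # abstract multiplication along a path
        per_path.append(acc)
    total = per_path[0]
    for v in per_path[1:]:
        total = plus(total, v)      # abstract addition across paths
    return total

# Two paths from the start node to the end node, each given as a list of edge weights.
paths = [[0.5, 0.4], [0.3]]

# Probabilistic semiring <[0,1], +, *, 0, 1>: total probability of reaching the end node.
print(path_sum(paths, plus=lambda a, b: a + b, times=lambda a, b: a * b, one=1.0))  # 0.5

# Tropical semiring <R ∪ {∞}, min, +, ∞, 0>: weights read as lengths, result is the shortest path.
print(path_sum(paths, plus=min, times=lambda a, b: a + b, one=0.0))                 # 0.3
```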
Felix Dombek
  • 325
  • 2
  • 9
7

One major application of fields in the CS arena is to coding theory - methods for encoding information for transmission over a lossy channel, so that any damage can either be detected or repaired.

Outside of that, one of the primary driving forces in modern algebra has always been number theory. We can learn a good deal about things that, at their core, seem to be entirely about extensions of $\mathbb{Q}$ if we go to a slightly more abstract setting.

Nick Peterson
  • 31,127
  • 2
  • 52
  • 72
4

I always thought they were useful (instinctively) but failed to see where/how. Are they just theoretical concepts without practical applications?

I'm 100% sure you've written programs $p_1,p_2,p_3$ before which take input data $\mathrm{in}$ and, after the code has done what it should, return computed data $\mathrm{out}$ - as in $\mathrm{out}=p_1(p_2(p_3(\mathrm{in})))$. And then you realized you can define a new lumped program $p_{12}=p_1\circ p_2$ which takes $p_3(\mathrm{in})$ as input. At that point you've already used associativity of function composition $$p_1\circ (p_2\circ p_3)=(p_1\circ p_2)\circ p_3,$$ which is one of the starting points for investigating the structures you listed.

numbers "$3+(-3)=0$,"
vectors "$\vec v\oplus(-\vec v)=\vec 0$,"
invertible functions "$f\circ f^{-1}=\mathrm{id}$,"...

are all groups, and you've applied their properties before. The study of mathematical fields like group, ring, or field theory is an investigation of such common structures. Take out a book on the subject and see how the theorems translate for those examples.
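A throwaway sketch of the same point in code (the three little "programs" are made up for illustration, any functions would do): composing either way gives the same program.

```python
# Associativity of composition: p1 o (p2 o p3) behaves exactly like (p1 o p2) o p3.
def compose(f, g):
    return lambda x: f(g(x))

p1 = lambda x: x + 1          # three stand-in "programs"
p2 = lambda x: 2 * x
p3 = lambda x: x - 3

lhs = compose(p1, compose(p2, p3))   # p1 o (p2 o p3)
rhs = compose(compose(p1, p2), p3)   # (p1 o p2) o p3
assert all(lhs(x) == rhs(x) for x in range(-10, 10))
print(lhs(7))                        # 9 either way, i.e. p1(p2(p3(7)))
```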

Nikolaj-K
  • 11,497
  • 2
  • 35
  • 82
4

A couple of specific applications:

user3490
  • 125
  • 4
3

Most of the examples above are about groups. Here are two books about using rings and ideals in computing along with other theoretical algebraic geometry concepts.

Michaela Light
  • 1,236
  • 9
  • 16
2

I think the most succinct answer might be that groups are very handy at encoding information about symmetry, which is ubiquitous in mathematics and the sciences (I can't speak for computer science). More importantly, groups encode the structure of symmetries without our having to look at the explicit symmetries themselves. That's just to say that most of the time it's not important to know what the symmetries actually are, but rather how they interact with each other.

For example, if you look at the usual group $D_4$, represented as the symmetries of a square, it's not important that such-and-such element of the group represents such-and-such action on the square; instead, the group captures exactly how the different actions on the square interact with each other. The same concept applies very similarly in more advanced contexts, where groups can model the symmetries of a crystal, of quantum mechanical systems, or of more complicated mathematical structures.
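A small sketch of that point (my own encoding, with the square's corners labelled $0,1,2,3$): once the symmetries are written down as permutations, everything below works purely with how they compose, never with the geometry itself.

```python
# D4 as permutations of the square's corners. A permutation p sends position i to p[i].
def compose(a, b):
    """Apply b first, then a."""
    return tuple(a[b[i]] for i in range(4))

e = (0, 1, 2, 3)    # identity
r = (1, 2, 3, 0)    # rotation by 90 degrees
s = (1, 0, 3, 2)    # a reflection

# Close {e, r, s} under composition: the whole group appears on its own.
group = {e, r, s}
while True:
    new = {compose(a, b) for a in group for b in group} - group
    if not new:
        break
    group |= new

print(len(group))                                  # 8, the order of D4
r3 = compose(r, compose(r, r))
print(compose(s, compose(r, s)) == r3)             # True: s r s = r^(-1), a defining relation
```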

youler
  • 2,300
  • 13
  • 20