I am a final-year Bachelor of Mathematics student and next semester I will write my Bachelor's thesis.
My interests are in Machine Learning (ML) and I will do a master's in ML next year. More specific sub-fields I like are

  • Deep Learning
  • Computer Vision
  • Natural Language Processing
  • Reinforcement Learning

And my interests outside of ML and mathematics include

  • Self-driving cars (e.g. Tesla)
  • Rocket and space exploration

More broadly, I find tech interesting as a whole.


I am looking for a thesis that would bring me as close as possible to the field of ML. Do you have topic recommendations?

BUT my Bachelor's is in Mathematics, so I cannot write a thesis in Computer Science, as it would not be accepted by my study director.

Some thoughts

I know some people who were in my situation. One of them, for instance, discovered and proved some convergence results in the context of Gradient Descent. Maybe this will inspire your answers.
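(For context, a classic result of the kind mentioned above — this is the textbook statement, not the result from that particular thesis: for a convex function $f$ with $L$-Lipschitz gradient, gradient descent with step size $1/L$ converges sublinearly.)

```latex
% Gradient descent: x_{k+1} = x_k - \tfrac{1}{L}\nabla f(x_k),
% with f convex and \nabla f L-Lipschitz, satisfies
f(x_k) - f(x^\ast) \;\le\; \frac{L\,\lVert x_0 - x^\ast \rVert^2}{2k}.
```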

Thanks in advance!

  • That seems like a job for your advisor. –  Dec 30 '20 at 10:38
  • I unfortunately do not have an advisor – Joris Limonier Dec 30 '20 at 10:39
  • I don't know how accessible this is for a bachelor thesis. But I would check https://arxiv.org/abs/1906.00193 and its references – Kernel Dec 30 '20 at 10:41
  • Cross-posted: https://cs.stackexchange.com/q/133808/755, https://math.stackexchange.com/q/3966709/14578, https://datascience.stackexchange.com/q/87335/8560. Please [do not post the same question on multiple sites](https://meta.stackexchange.com/q/64068). – D.W. Dec 31 '20 at 07:53

2 Answers


There is a lot to do in machine learning when it comes to optimization. You can take a closer look at gradient descent and other competing algorithms that may play the same (or an even better) role in ML.
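To make the starting point concrete, here is a minimal gradient-descent sketch on a convex least-squares problem; the problem size, seed, and iteration count are arbitrary illustrative choices, and the step size is the standard $1/L$ choice for this objective:

```python
import numpy as np

# Minimise f(x) = ||Ax - b||^2, a convex quadratic, by gradient descent.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad(x):
    # gradient of ||Ax - b||^2 is 2 A^T (Ax - b)
    return 2 * A.T @ (A @ x - b)

L = 2 * np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
step = 1.0 / L

x = np.zeros(5)
for _ in range(2000):
    x = x - step * grad(x)

# compare against the closed-form least-squares solution
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x - x_star))
```

A thesis in this direction would, e.g., prove at what rate the iterates above converge and how that rate depends on the conditioning of `A`.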

I don't know about your analysis and probability skills, but it is certainly very popular to work with MCMC, time series, etc.
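For a feel of what MCMC involves, here is a minimal random-walk Metropolis sampler targeting a standard normal; the proposal scale, chain length, and burn-in are illustrative choices, not tuned values:

```python
import numpy as np

# Random-walk Metropolis targeting N(0, 1) via its unnormalized log-density.
rng = np.random.default_rng(1)

def log_target(x):
    return -0.5 * x ** 2  # log-density of N(0, 1), up to a constant

samples = []
x = 0.0
for _ in range(50_000):
    proposal = x + rng.normal(scale=1.0)
    # accept with probability min(1, target(proposal) / target(x))
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal
    samples.append(x)

samples = np.asarray(samples[5_000:])  # discard burn-in
print(samples.mean(), samples.std())   # should be near 0 and 1
```

Convergence rates, mixing times, and optimal proposal scaling for samplers like this are exactly the kind of questions a mathematics thesis can address.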

Another direction would be more towards statistics: Bayesian networks etc.

I do not know RL well, but if you are looking for inspiration in NLP you should look at Stanford or Edinburgh theses. There is a lot of good work happening there, and I am sure some papers are more theoretical. When it comes to computer vision, you can look at optical flow: it has become very popular recently and is related to self-driving cars.

I assume that even if you are a UG student passionate about ML, you are still missing a great deal of experience in the topic. I believe it is OK not to start ML projects immediately, without solid foundations or relevant experience. I studied Mathematics and Computer Science. Even though I only took a couple of advanced, pure maths courses, they are extremely helpful later on in other fields, such as ML. Don't worry about doing a dissertation that is not precisely ML-focused. It may even be beneficial later on to be able to look at problems from different angles.

  • Thank you for your answer! I have a 'strong' background on the theoretical mathematical side, at least compared to my more modest background in computer science. A thesis in ML also shows the universities where I apply for master's programmes that I am actually interested in ML (they advised me to do so), as we have to provide the topic of our thesis during application. – Joris Limonier Dec 30 '20 at 10:46

Great question. I can think of a couple of topics. (1) Humans exploit symmetry in a pretty big way to make learning vastly more efficient. The canonical example is that objects "are the same" whatever their size and rotation. Can you somehow tie symmetry into the learning in a fundamental way? I'm sure that people have already worked on that, so you'll need to see what's been done already and see how you can take it further.
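One elementary way to "build symmetry in", as a starting point rather than the state of the art, is group averaging: averaging any function over a finite symmetry group makes it exactly invariant. The sketch below uses an arbitrary stand-in function `f` (it is not a real feature extractor) and the group of 90-degree rotations:

```python
import numpy as np

# Group averaging: averaging f over {rot^0, rot^1, rot^2, rot^3} yields a
# function that is exactly invariant to 90-degree rotations of its input.
def f(img):
    # arbitrary non-invariant stand-in for a learned feature extractor
    return float((img * np.arange(img.size).reshape(img.shape)).sum())

def f_invariant(img):
    return np.mean([f(np.rot90(img, k)) for k in range(4)])

img = np.arange(9.0).reshape(3, 3)
# evaluate the averaged function on all four rotations of the same image
vals = {round(f_invariant(np.rot90(img, k)), 6) for k in range(4)}
print(len(vals))  # a single value: all rotations agree
```

The mathematical question is then how to do this efficiently for larger (or continuous) groups, which is where representation theory enters.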

(2) Bayesian inference is a pretty general framework for reasoning under uncertainty, but its application is hampered by the need for approximations of multidimensional integrals, which are typically very expensive to compute. Can you expand the class of models for which exact results are known? For example, can you derive some exact results for the posterior distribution of the parameters of a system which is governed by differential equations?
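For contrast, here is the textbook case where the posterior is exact and no integrals are needed: a Beta prior on a Bernoulli success probability is conjugate. The prior parameters and data below are illustrative:

```python
import numpy as np

# Beta-Bernoulli conjugacy: with a Beta(a, b) prior on the success
# probability and h successes / t failures observed, the posterior is
# exactly Beta(a + h, b + t) -- no numerical integration required.
a, b = 2.0, 2.0                      # Beta prior parameters
data = np.array([1, 0, 1, 1, 0, 1])  # Bernoulli observations
heads = data.sum()
tails = len(data) - heads

a_post, b_post = a + heads, b + tails
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # Beta(6, 4), mean 0.6
```

The research question in (2) is precisely how far beyond conjugate families like this one exact posteriors can be pushed.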

Good luck and have fun!

Robert Dodier
  • Thank you for your ideas! Could you clarify how idea (1) relates to ML? If you meant that symmetry increases learning capabilities, isn't that a human-focused approach? In that case, wouldn't that bring me closer to psychology than mathematics? – Joris Limonier Dec 30 '20 at 18:41
  • ML models are just big complicated pattern matchers, and they are notoriously susceptible to learning (i.e. adjusting parameters to accommodate) features which we already know are irrelevant, and incapable of generalizing features we know are relevant. E.g. we know a letter A is "A" even when it's upside down, and whether the background is red or green doesn't matter. How can you build into a model the notion that those are the same thing? Symmetry is just the mathematical expression of that. – Robert Dodier Dec 30 '20 at 18:50
  • Oh, that's clearer, thanks for the clarification! – Joris Limonier Dec 31 '20 at 16:14