Benny Avelin talks about his research on neural networks
Benny Avelin was born in 1984 and earned his doctorate in 2013 with the thesis Boundary behavior for p-Laplace equations. In short, the dissertation deals with so-called boundary behavior for these types of equations, that is, how the values of solutions to the equation behave near the boundary of the domain.
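For context, the p-Laplace equation mentioned in the thesis title is commonly written as follows (a standard formulation, not quoted from the dissertation itself):

```latex
\[
\Delta_p u \;=\; \nabla \cdot \bigl( |\nabla u|^{p-2} \, \nabla u \bigr) \;=\; 0,
\qquad 1 < p < \infty,
\]
```

The familiar Laplace equation is recovered as the special case \(p = 2\); for other values of \(p\) the equation is nonlinear, which is part of what makes the boundary behavior of its solutions delicate to analyze.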
Benny left academia for a year to work as a data scientist at Combient AB. Now that he has returned, he spends most of his time on research in data science and artificial intelligence (AI). The emphasis is on neural networks and how they learn what we want them to learn.
“As far as we can see, large neural networks can learn very complicated patterns. One example is when search engines learn to recognize images or distinguish certain motifs in a large number of images,” Benny says.
Research on the machine learning side of AI has its foundation in computer science. The area caught the eye of mathematicians after a research group built an artificial neural network to classify images in an image recognition contest in 2012 and won with an error rate of only 15.3 per cent; the runner-up was wrong 26.1 per cent of the time.
“Today, neural networks are better than people at recognizing images,” says Benny.
Because the area is so new, there is much that remains unexplored.
“Research in AI is incredibly exciting because basic research has a direct impact on applications. From a purely mathematical point of view, it is extremely interesting because there are very strong links to a large part of modern mathematics.”
In addition to mathematics, Benny is interested in physics and could have chosen that path, but he is attracted by the greater rigor of mathematics.
“In mathematics, you go from simple assumptions to a proof. Without mathematical proof, you can try to establish something empirically, but the best you can hope for is a reasonable theory.”
In 2019 Benny was awarded a research grant from the Swedish Research Council, which means he can fund computational resources, doctoral students, and postdocs.
Networks are made up of neurons, which receive and send signals. A network can be described as a machine of connected units that finds patterns separating one kind of object from another. The algorithms underlying the networks are self-learning, meaning the networks improve as they are exposed to more examples.
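The idea of a neuron that receives signals, weights them, and passes a signal on, and of a network that adjusts itself from examples, can be sketched with a single artificial neuron. This is an illustrative toy sketch, not code from Benny's research; the function names, the sigmoid activation, and the learning rate are all choices made here for clarity.

```python
import math

def neuron(inputs, weights, bias):
    # A neuron receives signals (inputs), weights them, sums them,
    # and sends a transformed signal onward.
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    # A sigmoid squashes the summed signal into the interval (0, 1).
    return 1.0 / (1.0 + math.exp(-s))

def learn_step(inputs, weights, bias, target, lr=0.5):
    # "Self-learning" in miniature: nudge the weights so the neuron's
    # output moves closer to the target (one gradient-descent step on
    # the squared error for a single example).
    out = neuron(inputs, weights, bias)
    err = out - target
    grad = err * out * (1.0 - out)  # chain rule through the sigmoid
    new_weights = [w - lr * grad * x for w, x in zip(weights, inputs)]
    new_bias = bias - lr * grad
    return new_weights, new_bias

# Repeating learn_step over many examples is, in spirit, how a large
# network "learns more and more" - real networks just have millions of
# such units and weights, trained together.
```

Each call to `learn_step` moves the output a little closer to the target; training a real image classifier applies the same principle across many layers and a large dataset.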