
In biology class we learned that neurons are connected. If two or more neurons interact with each other often, the connection between them gets stronger and stronger, and new connections may form. But if a connection goes unused for a while, it gets weaker and weaker, and may be lost forever. That is how the human brain/memory learns and forgets.

I am a bit confused now, because I've started to learn about artificial neurons, and their method of learning is very different. We analyze how well the network performed, and before running it again, we update the weights a bit, based on the difference between the returned and the expected value.
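The update rule described above can be sketched as a single artificial neuron that nudges each weight in proportion to the error. This is a minimal illustration only; the function names and learning rate are mine, not from any particular library:

```python
# A single linear neuron trained by the error-driven update the
# question describes: weights move toward reducing the difference
# between the expected and the returned value.

def predict(weights, inputs):
    # Weighted sum of the inputs (no activation, for simplicity).
    return sum(w * x for w, x in zip(weights, inputs))

def update(weights, inputs, expected, lr=0.1):
    # Error = expected - returned; each weight is shifted a little
    # in the direction that shrinks that error on this example.
    error = expected - predict(weights, inputs)
    return [w + lr * error * x for w, x in zip(weights, inputs)]

weights = [0.0, 0.0]
for _ in range(50):                      # repeated training passes
    weights = update(weights, [1.0, 2.0], expected=3.0)

print(predict(weights, [1.0, 2.0]))      # converges toward 3.0
```

Note that nothing in this loop resembles the "use it or lose it" strengthening of biological synapses — which is exactly the mismatch the question is about.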

So my question is: How is a real-world brain capable of learning? And does it have a default configuration?

Nikos M.
Iter Ator

1 Answer


Artificial neural networks are algorithms loosely based on how a brain functions, so they shouldn't be treated as equivalent to brain learning.

The science that studies the details of how neurons and brains learn is computational neuroscience. The question you ask is still largely open, but there are many strong hypotheses.

The method you described, by which artificial neural networks update their weights, is called backpropagation. Some authors, including O'Reilly and Munakata (2000) in Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain, argue that backpropagation is actually the algorithm that drives learning in the brain. In later publications they propose architectures that include the hippocampus to re-create short-term memory.

However, in recent years, with the advent of mainstream deep learning, new ideas have arrived, along with old ideas revisited through new approaches. The ones that aim to replicate the brain and its function, or at least approximate it as closely as we currently can, are:

Spiking neural networks are computationally expensive and hard to tune, and their performance isn't close to what deep learning can achieve. However, biological brains use some form of spiking, which hints at the potential of this kind of model.
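To make the contrast with backpropagation concrete, here is a hedged sketch of a leaky integrate-and-fire neuron, one of the simplest building blocks used in spiking neural networks. All constants (time constant, threshold, input current) are illustrative values I chose, not biological measurements:

```python
# Leaky integrate-and-fire neuron: the membrane potential leaks
# toward rest, integrates the input current, and emits a discrete
# spike whenever it crosses a threshold, then resets.

def simulate_lif(input_current, steps=100, dt=1.0,
                 tau=10.0, threshold=1.0, v_reset=0.0):
    """Return the time steps at which the neuron spikes."""
    v = 0.0
    spikes = []
    for t in range(steps):
        # Euler step: leak term plus external input.
        v += dt * (-v / tau + input_current)
        if v >= threshold:        # fire a spike, then reset
            spikes.append(t)
            v = v_reset
    return spikes

# A steady input produces regular spiking; information is carried
# in spike timing and rate rather than in a continuous output.
print(simulate_lif(0.2))
```

Unlike the weight-update loop of backpropagation, output here is a train of discrete events in time, which is one reason these models are harder to train with gradient-based methods.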

Another interesting mention is Neural Turing Machines, an approach showing how certain brain-inspired processes can be used to compute a diverse range of problems.

wacax