In biology class we learned that neurons are connected to each other, and that when two or more neurons interact with each other often, the connection between them gets stronger and stronger, and new connections may form. But if a connection goes unused for a while, it gets weaker and weaker, and may eventually be lost. That, as I understand it, is how the human brain and memory learn and forget things.
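To check whether I understand the biological side, here is a toy sketch in Python of what I imagine that rule to look like (the variable names, constants, and the exact formula are just my own illustration, not a real model of the brain):

```python
import numpy as np

# Toy Hebbian-style update: connections between neurons that are active
# together get stronger, and unused connections slowly decay.
rng = np.random.default_rng(0)
n_neurons = 5
w = rng.normal(scale=0.1, size=(n_neurons, n_neurons))  # connection strengths

lr = 0.01      # how quickly co-active connections strengthen
decay = 0.001  # how quickly unused connections fade

def hebbian_step(w, activity):
    """activity: vector of neuron activations for one 'experience'."""
    # Strengthen w[i, j] when neuron i and neuron j are active together...
    w = w + lr * np.outer(activity, activity)
    # ...and let every connection decay a little, so unused ones fade away.
    w = w - decay * w
    return w

for _ in range(100):
    activity = (rng.random(n_neurons) > 0.5).astype(float)  # random "experience"
    w = hebbian_step(w, activity)
```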
I am a bit confused now, because I've started to learn about artificial neurons, and their method of learning is very different: we measure how well the network performed, and then, before running it again, we update the weights a little, based on the difference between the value the network returned and the value we expected.
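For comparison, here is roughly how I picture that artificial update: a single linear neuron trained with a delta-rule-style weight update (the data, learning rate, and number of epochs are invented just for this example):

```python
import numpy as np

# Toy error-driven update: run the network, compare its output to the
# expected value, and nudge the weights by a fraction of the error.
rng = np.random.default_rng(1)
x = rng.random((100, 3))             # 100 examples, 3 inputs each
true_w = np.array([0.5, -1.0, 2.0])  # the "correct" weights we hope to recover
y = x @ true_w                       # expected outputs

w = np.zeros(3)                      # the network's weights start at some default
lr = 0.1                             # learning rate

for epoch in range(50):
    y_hat = x @ w                    # what the network returned
    error = y_hat - y                # difference from the expected value
    w -= lr * x.T @ error / len(x)   # update the weights a bit

print(w)  # should end up close to true_w
```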
So my question is: how does a real-world brain actually learn? And does it have a default configuration, something analogous to the initial weights of an artificial network?