Devlog #11: Hebbian learning and no gradient descent

A sensor has a receptive field and some feedback dendrites coming back down from higher neurons. When something appears in the receptive field, the neuron starts firing at a high rate. The signal gets modified as it crosses each layer on the way up, and then some blocks try to find it in memory. If no plasticity is needed, because the pattern is already known, an inhibitory signal travels back down to the sensor, and this slow-down command reduces the firing rate. The signal stops reaching memory, so no extra plasticity is triggered over and over.
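If I were to sketch that loop in code, it might look something like this. Everything here is a stand-in I made up for illustration (Sensor, propagate_up, memory_recognizes, the rate constants); the real thing would be spiking and multi-layered, but the gating logic is the point.

```python
BASELINE_RATE = 100.0    # high firing rate on a fresh stimulus (made up)
INHIBITED_RATE = 10.0    # rate after the slow-down command (made up)

def propagate_up(pattern):
    # Stand-in for the modifications the signal picks up per layer.
    return pattern

def memory_recognizes(memory, signal):
    # Stand-in for the blocks that try to find the signal in memory.
    return signal in memory

class Sensor:
    def __init__(self):
        self.rate = 0.0

    def on_stimulus(self, pattern, memory):
        # Something appears in the receptive field: fire at a high rate.
        self.rate = BASELINE_RATE
        signal = propagate_up(pattern)
        if memory_recognizes(memory, signal):
            # Known pattern: inhibitory feedback comes back down the
            # dendrites, the rate slows, and the signal stops reaching
            # memory, so no extra plasticity is triggered.
            self.rate = INHIBITED_RATE
        # Unknown pattern: the high rate persists, which is what will
        # eventually push the circuit into learning mode.
        return self.rate

memory = {"vertical edge"}
sensor = Sensor()
print(sensor.on_stimulus("vertical edge", memory))  # -> 10.0 (inhibited)
print(sensor.on_stimulus("new shape", memory))      # -> 100.0 (keeps firing)
```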


I don’t know if this explains the previous paragraph in more detail, but one clarification came to me while I was running, about learning with the Hebbian rule. When a signal doesn’t propagate further, certain chemicals accumulate in the neuron, and this condition gets registered in the brain. That phrasing seems to imply a controller observing the behavior, but only if you think in implementation details. In a real brain it happens through chemistry: molecules accumulate in the soma and make it produce something else, which can then be released outside, where it affects nearby neurons in a special way, forcing them to switch into a learning mode. The same happens when a new path is established and confirmed.
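To pin the idea down, here is a toy version of that chemically gated Hebbian rule. The names and numbers (the chemical counter, CHEMICAL_THRESHOLD, the update Δw = η · pre · post) are my own assumptions, and to keep the sketch short the neuron gates itself instead of releasing the modulator onto its neighbors as described above.

```python
import numpy as np

ETA = 0.05                 # Hebbian learning rate (arbitrary)
CHEMICAL_THRESHOLD = 5.0   # accumulation needed before "release"

class GatedHebbianNeuron:
    """Toy neuron whose Hebbian plasticity is gated by a modulator
    released after the signal repeatedly fails to propagate."""

    def __init__(self, n_inputs, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.uniform(0.0, 0.2, n_inputs)
        self.chemical = 0.0        # builds up when the signal dies here
        self.learning_mode = False

    def step(self, pre):
        post = float(self.w @ pre)      # crude rate-style activation
        propagated = post > 1.0         # did the signal go further up?
        if propagated:
            # Signal made it through: no extra plasticity is needed.
            self.learning_mode = False
        else:
            # Signal stalled: chemicals accumulate in the soma.
            self.chemical += 1.0
            if self.chemical > CHEMICAL_THRESHOLD:
                # "Release" the modulator and switch into learning mode.
                self.learning_mode = True
                self.chemical = 0.0
        if self.learning_mode:
            # Plain Hebbian update: Δw = η · pre · post.
            self.w += ETA * pre * post
        return post
```

Fed the same input over and over, the weights stay put for the first handful of steps while the chemical builds up, then the Hebbian updates kick in and grow the output until it crosses the propagation threshold, at which point plasticity switches off again.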

You know that feeling of a fresh idea or realization, the a-ha moment? That’s when special neurotransmitters force the new knowledge to replay in order to strengthen the connection even more. Maybe it’s not just increasing synaptic strength, like bumping a weight from 0.5 to 0.95 as it would work in an ANN, but creating more parallel connections. Duplicates. That way a noisy input can still be recognized. It also lets some connections develop an alternative path on top of this, maybe more optimized, maybe less. It reminds me of a genetic algorithm: it tries different approaches, the one that works better stays in constant use, and the options that didn’t turn out so great stop being used in this process, so they can participate in a new improvement iteration or in another process.
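The duplicates idea is easy to test with a toy model: one strong synapse versus several weak parallel ones between the same pair of neurons, where every synapse can independently fail on a noisy trial. The weights, drop probability, and threshold below are made-up numbers, purely to show why redundancy helps.

```python
import random

def transmit(weights, signal=1.0, drop_prob=0.3):
    # Each parallel synapse independently fails with drop_prob; the
    # post-neuron sees the sum of the surviving contributions.
    return sum(w * signal for w in weights if random.random() > drop_prob)

random.seed(0)
trials = 10_000
threshold = 0.4   # the post-neuron fires if its input exceeds this

single = [0.95]          # one strengthened synapse, ANN-style
parallel = [0.25] * 4    # four weaker duplicate connections

for name, weights in (("single", single), ("parallel", parallel)):
    fired = sum(transmit(weights) > threshold for _ in range(trials))
    print(f"{name:8s} fired on {fired / trials:.1%} of noisy trials")
```

With these made-up numbers the single synapse gets through on about 70% of trials (it fires whenever it isn’t dropped), while the four duplicates manage roughly 90%, because any two of them surviving is already enough.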
