The core problem with building sophisticated artificial intelligence is understanding and modeling all the intricate dynamics behind how neurons work. Neurons, and all their various activities, are extremely complex, and we are still learning new things about them today. Artificial neural networks have been instrumental in the recent advancements in AI, but they are still very crude representations of the actual dynamics that go on inside the brain. If we could effectively model how a neuron works, we could simulate it on a computer and let its natural ability to learn unfold in the simulation.
So, what if we trained a GAN to mimic the dynamics of a neuron?
What if you fed it as much data as you could about the neuron: its activity, how it interacts with and affects other neurons, and how other neurons interact with and affect it? And then tried to get the GAN to model that behaviour?
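To make the idea concrete, here is a minimal sketch of what that adversarial setup could look like: a conditional GAN over state transitions, where the generator proposes the neuron's next state from its current state plus noise, and the discriminator judges whether a transition is real or generated. Every size and shape here (state dimension, layer widths, batch size) is an assumption for illustration, and the training loop is omitted; only the forward passes and the standard adversarial losses are shown.

```python
import numpy as np

rng = np.random.default_rng(0)

STATE_DIM = 16   # assumed size of a neuron-state snapshot (e.g. imaging features)
NOISE_DIM = 8    # latent noise fed to the generator
HIDDEN = 32

def mlp_params(n_in, n_out):
    """Small two-layer MLP, randomly initialised (illustrative sizes)."""
    return {
        "w1": rng.normal(0, 0.1, (n_in, HIDDEN)),
        "b1": np.zeros(HIDDEN),
        "w2": rng.normal(0, 0.1, (HIDDEN, n_out)),
        "b2": np.zeros(n_out),
    }

def mlp(params, x):
    h = np.tanh(x @ params["w1"] + params["b1"])
    return h @ params["w2"] + params["b2"]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator: (state_t, noise) -> proposed state_{t+1}
gen = mlp_params(STATE_DIM + NOISE_DIM, STATE_DIM)
# Discriminator: (state_t, state_{t+1}) -> probability the transition is real
disc = mlp_params(2 * STATE_DIM, 1)

def generate(state_t):
    z = rng.normal(size=(state_t.shape[0], NOISE_DIM))
    return mlp(gen, np.concatenate([state_t, z], axis=1))

def discriminate(state_t, state_next):
    logits = mlp(disc, np.concatenate([state_t, state_next], axis=1))
    return sigmoid(logits)

# One batch of placeholder "real" transitions; in practice these would come
# from recorded neuron activity.
batch = 4
real_t = rng.normal(size=(batch, STATE_DIM))
real_next = rng.normal(size=(batch, STATE_DIM))

fake_next = generate(real_t)
p_real = discriminate(real_t, real_next)
p_fake = discriminate(real_t, fake_next)

# Standard GAN objectives, to be minimised by gradient descent in practice.
d_loss = -np.mean(np.log(p_real + 1e-8) + np.log(1 - p_fake + 1e-8))
g_loss = -np.mean(np.log(p_fake + 1e-8))   # non-saturating generator loss
```

Conditioning the discriminator on the previous state, rather than judging single snapshots, is what would push the generator toward learning the *dynamics* rather than just the appearance of neuron states.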
If you can figure out the general rules behind how neurons interact with each other, you don't have to understand the brain as a whole. You can just simulate the neurons and let the intelligence naturally emerge from their interactions.
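A toy stand-in for "simulate the neurons and let behaviour emerge": a recurrent network with a fixed random connectivity matrix, stepped forward in time. In the actual proposal the per-neuron update rule would be whatever the GAN learned; here it is just a simple leaky tanh rate model, purely to illustrate the simulation loop, and the neuron count, connectivity, and stimulus are all made-up values.

```python
import numpy as np

rng = np.random.default_rng(1)

N_NEURONS = 100
STEPS = 200

# Sparse random connectivity: each entry is a synaptic weight, mostly zero.
weights = rng.normal(0, 0.5, (N_NEURONS, N_NEURONS))
weights *= rng.random((N_NEURONS, N_NEURONS)) < 0.1  # ~10% connectivity

def step(rates, external_input, leak=0.1):
    """One update: leaky integration of recurrent plus external drive.
    A trained model of neuron dynamics would replace this rule."""
    drive = weights @ rates + external_input
    return (1 - leak) * rates + leak * np.tanh(drive)

rates = np.zeros(N_NEURONS)
history = []
for t in range(STEPS):
    ext = np.zeros(N_NEURONS)
    ext[:10] = 1.0  # constant stimulus to the first 10 neurons
    rates = step(rates, ext)
    history.append(rates.copy())

history = np.array(history)  # (STEPS, N_NEURONS) activity over time
```

The interesting behaviour, if any, lives in `history`: the stimulus only touches ten neurons directly, and everything else the network does follows from the interaction rule alone.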
This is footage of a neuron taken using atomic force microscopy, which allows the nanoscale dynamics of neurons to be imaged.
The key to making this work is the training data, and figuring out how best to capture and collate it.
Footage like this could serve as a source of training data for the GAN. Combined with the electrical activity that correlates with it, this footage could make a great dataset from which a GAN could learn to model neuronal behaviour.
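Collating the two streams could start with something as simple as timestamp alignment: pairing each imaging frame with the window of electrical samples recorded during it. The frame rate, sampling rate, frame size, and duration below are all assumed values, and the data is random placeholder noise; the point is only the pairing logic.

```python
import numpy as np

rng = np.random.default_rng(2)

FRAME_HZ = 30        # assumed imaging frame rate
EPHYS_HZ = 10_000    # assumed electrophysiology sampling rate
DURATION_S = 2.0

n_frames = int(DURATION_S * FRAME_HZ)      # 60 frames
n_samples = int(DURATION_S * EPHYS_HZ)     # 20,000 voltage samples
samples_per_frame = EPHYS_HZ // FRAME_HZ   # 333 samples per frame

# Placeholder data: 8x8-pixel frames and a single voltage trace.
frames = rng.normal(size=(n_frames, 8, 8))
voltage = rng.normal(size=n_samples)

def paired_examples(frames, voltage, spf):
    """Pair each frame with the voltage window recorded during it."""
    pairs = []
    for i, frame in enumerate(frames):
        window = voltage[i * spf:(i + 1) * spf]
        if len(window) == spf:  # drop a trailing partial window
            pairs.append((frame, window))
    return pairs

dataset = paired_examples(frames, voltage, samples_per_frame)
```

Each `(frame, window)` pair is one candidate training example: what the neuron looked like, alongside what it was doing electrically at that moment.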