Building brains – how close are we?

I recently finished watching the TV series on Netflix called Westworld. Yes, it came out in 2016, and yes, I was late on the bandwagon. But now I understand why there was so much hype about this Sci-Fi meets Wild West drama series.

It’s set in the Wild West somewhere in the foreseeable future, in a theme park called Westworld. Unlike Disneyland, every host there is actually a super high-tech android that has been taught to think and feel the way humans do. Each one has an intricate backstory and a storyline to take guests on, much as ordinary theme parks have different rides. It’s based on a 1973 film of the same name.

The hosts are spookily human-like, and the line separating humans from robots is often blurry. It’s impossible to watch something like this and not ask the question – is something like this really possible? Could scientists engineer “brains” that think the way that we do? And if it’s possible, where are we on that road to discovery?

This kind of science is being investigated, but we’re still well short of the knowledge we’d need to create a real Westworld. The current focus is mostly on understanding the human brain, and especially how it learns.

The learning curve

Learning is an extremely curious process, because it involves physical changes in the wiring of the brain. The brain’s signalling cells are called neurons, and they pass messages around the brain and through the body. Think of them like streets that connect and branch to form a massive network map. Pathways that are used often become stronger, a phenomenon known as “neuroplasticity”.
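That “use it or lose it” strengthening can be sketched with a toy Hebbian-style update rule. This is a deliberately simplified illustration – the function name, learning rate and decay values are all made up for the example, and real neurons are far messier:

```python
# Toy Hebbian "use it or lose it" rule: a connection strengthens
# whenever the two neurons it joins fire together, and fades a
# little otherwise. All names and numbers are illustrative only.

def hebbian_update(weight, pre_active, post_active,
                   rate=0.1, decay=0.01):
    """Return the new connection strength after one time step."""
    if pre_active and post_active:
        weight += rate            # pathway used: strengthen it
    else:
        weight -= decay * weight  # pathway idle: let it fade slightly
    return weight

w = 0.5
for _ in range(10):               # ten co-activations in a row
    w = hebbian_update(w, True, True)
print(round(w, 2))                # the well-used pathway ends up stronger
```

Run it and the connection strength climbs from 0.5 to 1.5 – a crude stand-in for a neural pathway getting “paved wider” through repeated use.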

There are numerical models that use binary numbers (0 and 1) to imitate the flow of signals through neurons, like traffic lights in the roadmap of neural pathways. But the brain doesn’t run on equations, just as streets are much more than roads and traffic lights. The whole reason we have streets is to carry cars; in the same way, the brain’s networks carry electrical currents. These currents are how neurons talk to one another: each neuron sends out an electrical pulse.

To really understand neural networks in the brain, we need a physical model: one that actually carries electrical impulses the way the brain does.

Instead of executing a pre-programmed function like most computers, these models are progressively configured. It’s almost like they “learn” how to work.

As Meier put it, he’s developing “the first computer that does not compute”, but rather a machine that actually learns its processes as it goes, the way the learning brain does. This means the model will effectively be modelling neuroplasticity, as it configures its own function based on the input it receives.
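The difference between executing a pre-programmed function and configuring your own function can be illustrated with a perceptron-style learning rule – a standard software sketch, not Meier’s hardware. The weights start out meaningless and are nudged after every mistake, so the machine’s behaviour comes from the data it sees rather than from a fixed program (the task, learning logical OR, is just an illustrative choice):

```python
# A perceptron-style sketch of "configuring your own function":
# weights start meaningless and get nudged after each error, so the
# mapping is learned from input rather than pre-programmed.

def predict(weights, bias, x):
    total = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if total >= 0 else 0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]  # logical OR
weights, bias, rate = [0.0, 0.0], 0.0, 0.5

for _ in range(10):                        # a few passes over the data
    for x, target in data:
        error = target - predict(weights, bias, x)
        weights = [w + rate * error * xi for w, xi in zip(weights, x)]
        bias += rate * error               # shift the firing threshold

print([predict(weights, bias, x) for x, _ in data])  # [0, 1, 1, 1]
```

Nobody wrote an OR function here; the correct behaviour emerged from repeated adjustment – a very distant software cousin of a circuit wiring itself up through use.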

Outsmarting ourselves

The advantage of these silicon models is that they are much faster than the brain. The brain’s neurons have a very low conductivity but a high capacitance. It’s like filling a large bucket of water (the capacitance) with a tiny hose (the conductivity). Silicon conducts far better than neurons do, so it can process information, complete tasks and undergo changes similar to neuroplasticity in a much shorter time frame.
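The bucket-and-hose picture has a simple formula behind it: the time to “fill up” is roughly resistance times capacitance (τ = RC). The numbers below are rough textbook orders of magnitude, chosen only to show the scale of the gap – they are not measurements of any actual neuron or chip:

```python
# Back-of-the-envelope RC time constants. tau = R * C is roughly how
# long it takes to charge a capacitance C through a resistance R.
# All values are loose orders of magnitude, for illustration only.

neuron_tau = 1e8 * 1e-10    # ~100 Mohm membrane x ~100 pF -> ~10 ms
silicon_tau = 1e3 * 1e-12   # ~1 kohm wire x ~1 pF         -> ~1 ns

print(f"neuron:   ~{neuron_tau:.0e} s")
print(f"silicon:  ~{silicon_tau:.0e} s")
print(f"speed-up: ~{neuron_tau / silicon_tau:.0e}x")
```

Even with these crude numbers, the big bucket and tiny hose of a neuron come out millions of times slower than a silicon circuit doing an analogous job.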

Instead of trickling information through at the biological brain’s pace, these machines can push enormous quantities of information through in a fraction of the time. As seen in Westworld, machines like these could (maybe one day) be far smarter than us. Imagine what that would mean for society, if we created a being superior to ourselves! Or even a vessel to house our individual brains, and make us immortal…

Remember that Westworld was originally a film from the 1970s, so these ideas of android takeover have been around for a while. It might be many years yet before we see these kinds of technologies emerge.

Until then, these new silicon brain models could change the way scientists investigate the behaviour of neural networks, and help them better understand the biological brain as they work towards artificial cognitive systems. Maybe these models are the key to unlocking what it takes to build a fully functioning human robot, or maybe they’ll just help us build really fancy computers.