Teaching a Computer to Speak

Recently I’ve been wondering if it might be possible to create a computer out of fungus or plant cells. The idea is that it’d work like a “neural network” in computing: you’d supply certain inputs in the form of electrical stimulation and get certain electrical outputs. If you got appropriate outputs for a given input, you’d somehow strengthen the relevant pathways, perhaps by amping up the current to encourage fungal growth along the right lines. If the output was inappropriate, you’d somehow decrease growth along those pathways.
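To make the idea concrete, here’s a minimal toy simulation of that strengthen/weaken scheme in software. Everything here is hypothetical (the class, its methods, and the numbers are all invented for illustration): pathways are just input-to-output weights, and the “current” is a small bump up or down applied to whichever pathways just fired.

```python
import random

class PathwayNetwork:
    """Toy sketch of the strengthen/weaken idea described above.
    Pathways are input->output weights; appropriate outputs boost the
    weights that fired, inappropriate ones shrink them."""

    def __init__(self, n_inputs, n_outputs, seed=0):
        rng = random.Random(seed)
        # every pathway starts at a similar, middling strength
        self.weights = [[rng.uniform(0.4, 0.6) for _ in range(n_outputs)]
                        for _ in range(n_inputs)]

    def respond(self, stimulus):
        # stimulus: list of 0/1 flags marking which inputs are stimulated;
        # the output whose pathways carry the most total strength "wins"
        totals = [sum(self.weights[i][j] for i, s in enumerate(stimulus) if s)
                  for j in range(len(self.weights[0]))]
        return max(range(len(totals)), key=totals.__getitem__)

    def reinforce(self, stimulus, output, correct, rate=0.1):
        # "amp up the current" along the pathways that just fired,
        # or suppress growth if the output was inappropriate
        delta = rate if correct else -rate
        for i, s in enumerate(stimulus):
            if s:
                self.weights[i][output] = max(0.0, self.weights[i][output] + delta)

net = PathwayNetwork(n_inputs=4, n_outputs=2)
stimulus = [1, 0, 1, 0]
for _ in range(20):
    out = net.respond(stimulus)
    net.reinforce(stimulus, out, correct=(out == 1))
print(net.respond(stimulus))  # after training, the stimulus maps to output 1
```

After a few rounds of reinforcement the pathways to the “right” output dominate, which is the whole trick: no explicit program, just repeated feedback reshaping connection strengths.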

But what if this did indeed turn out to be feasible? We’d have a “computer” that functioned somewhat like a human brain, minus the structure. It wouldn’t be good at doing rapid calculations, but it might be good at fuzzy logic of the kind the human brain is good at.

We could imagine teaching such a computer to recognise words, producing output corresponding to the typed version of the word in response to input representing the word spoken by a human being. This is something that digital neural networks can already do.

Beyond this, digital neural networks may suffer limitations that a massively parallel fungus brain would not. So could we go further, feeding the computer (perhaps I should say “brain”) sentences taken from the Internet and getting it to produce, even speak, via appropriate apparatus, suitable responses? Suppose this worked perfectly and we got a machine that could produce a sensible-sounding response to most questions, in effect carrying on a conversation …

The problem is, the machine would be producing appropriate responses but with no context. Ask it what its name is, and it wouldn’t know. Ask it what the weather is like in Rome and it would produce an answer … but surely not the right answer. Could we somehow get the brain to understand context?

Maybe it could be trained to recognise the difference between an approving human voice and a disapproving one. The brain would have to detect approval or disapproval and then engage, via suitable output, a mechanism that strengthens the last pathways that fired (on approval) or weakens them (on disapproval).
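The tricky part of that loop is the delay: the verdict arrives after the pathways have fired, so something has to remember which ones fired recently. Here’s a hedged sketch of that bookkeeping (all names are invented for illustration; in reinforcement-learning terms this resembles an eligibility trace):

```python
from collections import deque

class ApprovalTrainer:
    """Sketch of the delayed approve/disapprove loop described above:
    remember which pathways fired most recently, then strengthen or
    weaken exactly those when the human's verdict arrives."""

    def __init__(self):
        self.strength = {}                    # pathway id -> strength
        self.recently_fired = deque(maxlen=5) # short memory of recent firings

    def fire(self, pathway):
        self.recently_fired.append(pathway)
        self.strength.setdefault(pathway, 1.0)

    def feedback(self, approving, rate=0.2):
        # approving voice strengthens the recent pathways; disapproving weakens
        delta = rate if approving else -rate
        for p in self.recently_fired:
            self.strength[p] = max(0.0, self.strength[p] + delta)

trainer = ApprovalTrainer()
for p in ["hello", "weather", "rome"]:
    trainer.fire(p)
trainer.feedback(approving=True)   # approving voice arrives after the fact
print(trainer.strength["rome"])    # 1.2
```

The fixed-length memory is doing the work here: only behaviour close in time to the verdict gets credit or blame, which is roughly what you’d want the physical strengthening mechanism to approximate.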

How far could such a machine, in conversation with human beings, actually go towards sounding fully human? A baby can go all the way, but of course is extensively pre-programmed by evolution. How far could we go with a completely unstructured fungus brain?

If it did learn to sound actually human in its responses, would we expect that it would then actually have feelings? I’ve argued in my podcasts that a digital computer could never have feelings, but a fungal brain I’m less sure about … after all, we really don’t know why babies end up having feelings.

This experiment might sound wacky, and doubtless progress would be very slow, but I’m inclined to try it. Even a fungus brain that could recognise characters would be a huge achievement … I wonder if it’s possible to grow vats of neural cells? Even plants presumably have neural cells of some kind, otherwise how do plants detect light, and how does Mimosa pudica move in response to touch? Or is this some other mechanism entirely? Time to have a look at Wikipedia …

One thought on “Teaching a Computer to Speak”

Hi, I find this topic really interesting. Language is a gateway to thought and I’d really like to explore it more. I study Psychology and I’m trying to find a way to make it converge with artificial intelligence and to make a computer think (in the imperfect way humans do).
One of the main elements I must include is, as you pointed out, the context and our understanding of our surroundings.
Thanks for this post, I’m going to keep reading interesting stuff on your website.
Cheers!