I am trying to write a sci-fi setting in a not-so-distant future in which analog signal processing (of brainwaves, in this case) is one of the main points of the plot and is pretty much required to explain some of the mechanics of the universe.

Thing is, analog-to-digital conversion is expensive, and it compresses the data into an easier-to-handle finite set of values at the cost of dropping some information in the process. This is something I don't want, since some of the mechanics required to develop the plot depend on subtle differences in a person's brainwaves (used, in this case, as a sort of biometric key). This last part could be avoided by just throwing more resources at a regular digital computer, but that's lazy and not what I want, as an analog computer could allow for more interesting details and implications.

Just how viable would it be for the world to go back to analog? In the past we had analog computers, but we changed to digital because apparently they weren't designed with programmability in mind (they were more like ASICs), and digital soon became better than analog, so there was no reason to keep improving a deprecated technology. Likewise, most of our telecommunication devices operate on waves and analog signals, but these are translated to digital at some point in the process (i.e., at the modem) and lose all the properties an analog signal has.

DARPA tried to build an analog "cellular neural network" CPU (project UPSIDE) for computer vision back in 2012, but there is not much information about it. Apparently it allows for much faster speeds at a lower energy cost, at the expense of occasional errors and what has been described as requiring a very different way of tackling problems. The problem is, nothing says how programmable it is (it apparently is programmable, but there's no mention of whether it's Turing-complete by itself). In addition, it seems to be a hybrid analog-digital computer, which is the concept I initially thought about including in my story.

In the future, could we see the following things? How superior would they be to their digital counterparts? Would they have any limitations?

- A purely analog CPU (could it run the programs we run today? As in, would it still be usable as a PC?)

- A hybrid analog-digital CPU, where the two complement each other depending on the problem at hand

- Analog RAM/storage. May be digital-analog or purely analog. Could it be made persistent, like with memristors? How would this work, anyway?

- Truly analog telecommunications. I know they are impractical due to signal noise, but let's assume we have a reliable way to compensate for that, such as algorithms capable of discerning the real signal

- A holographic CPU? Both digital and analog. I know a digital optical CPU is viable in theory, but I have no idea about analog; I assume it could operate on light frequency/color or something.

- Analog-oriented programming languages. How different would they be from the programming languages we use today, if at all? Could a uniform analog-digital language exist, where a smart compiler decides whether to use the analog CPU or the digital CPU, similar to what hUMA compiler optimizations already do?

Mind you, while this setting isn't supposed to be hard sci-fi at all, it isn't fantasy science either. Whatever the answers are, they should be at least remotely viable in reality and, more specifically, doable within the next 70 years or less, although computer science could always make some breakthroughs. No suspension of disbelief should be required to enjoy the setting, even for someone somewhat knowledgeable about the field.

Note: I guess you could draw some parallels between analog computing and quantum computing, since both seem to work best (or only) with probabilistic algorithms instead of deterministic ones, but this is not about quantum. Quantum technology exists in this setting, but it's extremely rare and only used in some specific contexts, not to mention most of it is extremely experimental and the general public is oblivious to the existence of the somewhat viable prototypes.

Edit: to be more specific, the context and use cases of this technology are that user input is now handled through a matrix of electrodes implanted into the brain, capable of reading the user's brain activity/thoughts. The software handling the output of this matrix already tries to transform brain activity into a sort of "universal brain language" that covers up the differences between human brains, but it still requires an analog/real-numbers/wave signal for fine precision (not as in error-free, but as in descriptive) and high throughput. Analog signals were chosen because the brain can easily recover from small errors and discrepancies, and because they are more similar to the way the human brain works. However, due to limitations of the feedback system, lag and slow transmission are things you would generally not want in your wetware, which is why digital signals were discarded: the electrode array requires a continuous stream of data, so buffering and compressing something before sending it over the network would make the brain "halt" waiting for the next signal, damaging your psyche in the long run. Think of being deprived of your senses or mind-frozen in place every two seconds (which is what digital transmission could do), compared to seeing some static in your viewport every few seconds (which is what you might get on analog).

This is also the reason computers perform operations on their continuous analog output stream concurrently: to reduce the time the user waits for a reply. It is also the reason all algorithms that read directly from the user's wave are concurrent: it is better to update the wave late than to halt it until the answer is processed. In addition, due to the nature of the brain, a thought or sequence of thoughts can be read and predicted as it is being formed, but can't be confirmed until it is fully formed. This detail is extremely important, as the plot device is based around this fact.

Think of the exchange of information between a computer and a human as a regular conversation between two humans (it would be more like telepathy, but for simplicity's sake, let's assume they are just speaking).

The computer just patiently nods as the human talks to it, representing continuous feedback.

What really happened here is that the user made a request to delete a specific file. As the user formed their sentence, the computer had already started doing all the necessary preparations for its execution, much in the way we humans converse: we can identify a word by its lexeme before the word is completely formed, so we can more or less guess what will come next, but we can't completely understand the full implications of the word until we hear all of its morphemes (if any). Likewise, we can make a wild guess about which word may come next and try to understand what the other person is telling us, but we won't be sure about the specifics until the whole sentence is complete. A single sentence might also shed some light on the context of the topic at hand, and so on. After the user's request was completed, the object vanished from the user's viewport in a fraction of a second.

This point is extremely important, because the hackers of the future will attempt to trick the machine into doing something else (or simply bring it to a halt) by surprising it with some sort of "punchline". Since security programs are concurrent, they can't really understand the full scope of the user's actions until it's already too late. Think of it like setting up a trap for the enemy king in chess over several turns: most of the "illogical" moves made earlier start to make sense the moment the king falls. The paragraph talking about cranes was actually talking about birds and not boxes, but the computer could never have seen that coming, since it mostly operates on sentences and not on contexts as big as a paragraph or a short text; generally, the precision of its predictions drops drastically as the scope grows, although it can still operate on bigger schemes if specifically programmed to do so.

To identify what the user is trying to say, modern CPUs incorporate a neural network that allows the OS to retroactively make sense of words after hearing a string of letters. More often than not this is abstracted away from userland programs through libraries and APIs, although programs may get access to the wavestream depending on their permissions.

The "biometric authentication" system I mentioned before actually operates on big segments of the stream. Since the automatic conversion to "universal brain language" reduces (not removes!) the variance between user brains, trying to identify a user by these differences alone is impossible (not to mention that, given the small but random noise on the line, such a level of detail would be unattainable anyway). This is why the user authentication software operates on a larger set of thoughts: it detects the approximate state of mind of the user (excited, angry, relaxed, etc.) and any "mannerisms" they may have. This is more or less the equivalent of a person's accent, or of stylometric analysis of their texts: it identifies them with a high degree of precision, but it's not infallible. Hackers may again try to disguise themselves as the system operator of a device by using meditation techniques to appear to think like the legitimate user of said computer.

This "universal brain language" I talk about would be more or less like any human language (such as English). It encodes information so everyone can understand it, but it's not digital, because the way you speak it may say something more about your message than what the language can express. That means, in the conversation example, the user may be thinking of deleting as symbol A with modifying factor B, while the software translates it to symbol X with modifying factor Y (which may be equal to B, although I haven't thought that through yet; I don't think it actually matters). The modifying factor is what tells the computer that you didn't just think of deleting, but that it also sounds as if the user was somewhat distressed or angry: it is analog metadata that would be difficult to translate to digital without butchering its meaning. Here is where the CPU's neural networks try to guess what this metadata means, much in the same way a human would try to guess what a tone of voice means; it may be easier to guess when the modifying factor is stronger.

What I originally meant with this question is: how could the CPU process this brainwave? Could some technology operate directly on this wave through analog programs, or would conversion to digital be required in all cases? Mind you, the CPU has a digital coprocessor that can handle the problems the analog computer can't process well, although communication between the two may be slightly slower, much as memory-to-on-die-cache transfers are slow. Could the analog CPU be a universal Turing machine, regardless of how practical that would be? Alternatively, if not, would analog emulation on a digital CPU (emulated neural network simulations, like a partial brain simulation) be the only way to tackle this problem? In addition, could information about a wave be persistently stored somewhere? Could said wave actually be stored as a wave, and not as a "parametrization" of a wave?

Welcome to the site. It's a bit of a big question; I hope that helped. You may want to add the hard-science tag, and break this into 2 or 3 separate questions.
– sdrawkcabdear, Feb 6 '16 at 20:08

@sdrawkcabdear Thank you! I was unsure as to whether I should tag the question as hard science or not, since the description of the tag says it implies that all answers should be backed up with cold, hard scientific facts and citations/demonstrations, which is something that would be pretty cool but not really necessary for this question.
– a1901514, Feb 6 '16 at 20:33

This question is too big. Break it up if you want better answers.
– Schwern, Feb 8 '16 at 18:06

You should take a look at quantum computing.
– T. Sar, Feb 3 '17 at 9:48


I'd just like to note that you have a very big misconception of the history of computing and communicating machines. We went from analog to digital (or analog to discrete in the case of communications) not because of programmability but because of improvements to signal-to-noise ratio. Technically most AI is analog, even though that analog system is implemented on digital computers.
– slebetman, Nov 21 '17 at 3:28

11 Answers

I very much enjoy playing with this topic, so I'm going to turn the question on its head. Every computer made today is a hybrid analog/digital computer; you just might not know it! Modern clock speeds are so blindingly fast that we find ourselves doing analog signal conditioning in the middle of our supposedly digital equipment. Modern memory subsystems use analog techniques to crank as many digital bits through the pipe as they can. Physical traces along boards are routed to be the same length as other wires on the bus, and are structured like analog transmission lines on the PCB. CPUs regularly have to worry about parasitic capacitance between logic elements, which limits the "fanout" of one output to multiple inputs. Gigabit Ethernet actually relies on analog superposition of voltages to achieve its extreme speeds. Analog appears everywhere in computing, so it's clearly not the hardware we're talking about when we discuss analog vs. digital.

I think the more important distinction in the analog vs. digital world is found in our models of said computer subsystems. It's not that a CPU doesn't have mixed analog/digital on it, but rather the fact that we model our CPUs as though they were purely digital that really matters. We like to pretend our CPUs are perfect digital structures when we write thousands of lines of code to be executed on it. The gap between analog and digital is in the minds of the developers more than it is in the actual hardware. The hardware manufacturers embraced the analog part of their job a long time ago.

Accordingly, the viability of "analog" computing would be stoutly seated on the desire of programmers to answer questions which are best handled in analog. Once the desire is there, the hardware manufacturers will happily expose analog behavior to them. Then the slow grind towards making useful analog behaviors would begin (this is where that DARPA effort fits in).

So when is an analog model useful? The most important answer I can think of is metastability. Digital circuits really like to resolve every bit of information to either a true or a false state. In fact, if you feed a half-way voltage to many digital logic circuits, you can enter trippy metastable states where the circuit ceases to behave as intended for an arbitrary amount of time after the half-way voltage has been resolved. At the hardware level, we spend a great deal of time preventing our logic circuits from ever "seeing" the metastable points as the voltages swing from low to high, typically using "clocking" to do so.

Where would this be useful? One thing you mentioned was energy. Analog computing can be more energy efficient than digital because it can tailor its signal-to-noise ratios to the moment. Consider a digital number, in binary, 100001. This is the number 33 in decimal. If noise corrupts the leftmost bit, it becomes 000001, which is 1 in decimal. If noise corrupts the rightmost bit, it becomes 100000, which is 32 in decimal. In some situations, the difference in semantic meaning of 33 and 32 is pretty minimal, and you might be willing to accept some error in the rightmost bit, in exchange for being more efficient. You may be less willing to accept error in the leftmost bit, which changes 33 into 1. However, if that 33 is in an equation, say perhaps 33 - 32, suddenly that subtraction makes all of the bits important. The difference between a 1 or a 0 from that subtraction could be a very big deal! Digital models cannot implement such decision making, because they would have to admit an analog model underneath to do so. Meanwhile, your mind has no problem being an inch off when waving your hand left to right saying goodbye to someone, and then cranking up the precision to write something legibly.
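The bit-significance argument above can be sketched in a few lines (the values mirror the 100001 example; this is only an illustration of the arithmetic, not of how any analog hardware would weigh errors):

```python
value = 0b100001  # 33 in decimal, as in the example above

msb_error = value ^ 0b100000  # noise flips the leftmost bit
lsb_error = value ^ 0b000001  # noise flips the rightmost bit

print(value, msb_error, lsb_error)  # 33 1 32

# In isolation the LSB error looks harmless (33 vs 32), but drop the
# corrupted value into the subtraction from the text and every bit
# suddenly matters:
print(value - 32)      # 1
print(lsb_error - 32)  # 0 -- the "small" error flips the whole result
```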

This would be very powerful if you were dealing with highly intricate parallel operations. Right now, if two processes try to write different values to the same location, they must be "deconflicted": one must win, and the other must lose. The process happens in the blink of an eye, far faster than either process is aware of. But what if your two processes wanted to be smarter, and could actually talk through their differences to determine what the final result should be? This would be a natural operation in an analog computer. In a digital computer, we have to jump through many hoops to make that happen (and typically we just choose to eliminate the "race case" entirely and move on).

One final feature of metastability is its ability to be still. A circuit sitting in its metastable state can be ready to spring to life at the slightest sensation. Doing this in digital is hard, because the gaps between low and high signals are so large. We typically have to custom-tailor digital circuits to pass these rapid-fire signals, while an analog computer might handle them intrinsically on all signals. If a digital computer stops, it has to stop in a fixed state; computation ceases. An analog computer can instead decrease its energy consumption more and more, drifting towards a balanced state, while never actually stopping. Then, when the computer is resumed, it may actually have accomplished something while stopped!

Well, ain't that a twist! This is amazing. I guess all we need is a change of mindset in the way developers develop their solutions to problems, which could be provided by some breakthrough in analog interface usability (analog programming languages becoming more like "traditional" programming languages). This breakthrough could be forced by the paradigm shift in the way users interact with their computers with the introduction of reliable brain-to-computer interfaces.
– a1901514, Feb 6 '16 at 22:49


I would consider anything which involves discrete quantization and regeneration "digital". One may need to use various analog techniques to counteract various unwanted parasitic effects, but if the signal will ultimately be processed in such a way that noise below a certain level will be eliminated entirely, I'd consider it a digital system.
– supercat, Aug 22 '16 at 16:08

@supercat This is why I say "The gap between analog and digital is in the minds of the developers more than it is in the actual hardware." The definition of "ultimately be processed" is in the minds of the developers. As an example, consider a DSL modem, which is constantly monitoring the analog SNR of its channels and making adjustments. It never fully lets go of that analog signal noise, but it gives the illusion of a purely digital interface to the computer. The hardware never gave up on the analog, only the developer did.
– Cort Ammon, Aug 22 '16 at 16:23

And it gets more tricky if the DSL modem is willing to provide SNR data to the computer, and it makes decisions based on that SNR data about how much data to transfer...
– Cort Ammon, Aug 22 '16 at 16:23

@CortAmmon: A modem will take its incoming signal and quantize it to some level. The quantization will generally be fine enough that the lower bits deviate somewhat unpredictably from the ideal expected value, but any possible sequence of samples will define an envelope, such that any possible signal which stays within the envelope will be indistinguishable from any other.
– supercat, Aug 22 '16 at 16:33

In the future you might have a breakthrough in developing analog computers which emulate the features and efficiency of a human brain. Much as a digital computer today is paired with a specialized GPU for graphics, the analog brain would have a digital CPU coprocessor to handle the things digital computers are good at. In essence, you'd have a thing that can do what humans are good at (pattern matching, estimation) which can reference a thing computers are good at (very fast, very precise math).

Imagine a robot which could not only debate politics, but while doing so be performing the research and statistics necessary to weigh their decisions.

You can exploit two features of analog vs digital computing in your world: heat and precision vs accuracy. And build around our decades long failure to achieve AI using digital computing, something we've felt we're on the cusp of since the 60s. I imagine a world where, as computers get more and more integrated into everyday life, machines with human-like thinking becomes more and more important and analog computers are better at that.

Human brains differ from digital computers in a few key ways. Digital computers have to be designed, understood, and built by humans, and humans like simple, orderly things. The human brain has evolved over hundreds of millions of years of trial and error and has no such restraint. As a result, human brains can do things in surprising ways. Digital computers are built by humans to be simple: they combine many, many copies of a few basic parts in novel ways. Human brains are very, very, very complex, using thousands of specialized neuron types, each doing multiple specialized tasks and being reused and recombined in surprising ways. The result is that a digital computer is predictable and precise, but at the cost of flexibility and efficiency, while a human brain is unpredictable and sloppy, but very flexible and very efficient.

Your future society will need a reason to transition from precision and predictability to flexibility and efficiency.

Digital computing does one thing really well: it very precisely, very predictably, and very quickly calculates a certain subset of problems. For other problems it is appallingly slow. It does exactly what you tell it to calculate, and that's all it does. And, as we all know, digital computers have a very hard time learning or dealing with anything new; even good AI struggles to carry on a convincing conversation.

The human brain, an analog computer, is sometimes very precise (hitting a baseball), sometimes very sloppy (math and statistics), sometimes very fast (baseball and estimation), sometimes very slow (precise math), but it's going to give you an answer. And it can solve some problems much, much faster and more reliably than a computer can, for example computer vision, path finding problems, and pattern matching. It does it all very, very, very efficiently and it can handle them all. It is a truly general purpose computer. The same brain that can hit a fastball can do calculus, hold a conversation, run over obstacles, and cook a meal.

The human brain is extremely efficient, and we're nowhere close to emulating what it's capable of. Take, for example, the Human Brain Project. Professor Steve Furber of the University of Manchester has this to say on his attempt, SpiNNaker.

With 1,000,000 [ARM] cores we only get to about 1% of the scale of the human brain.

The ARM968 they're using is not exactly top of the line (the Nintendo DSi uses one at 133 MHz), but this speaks to how important parallelism and heat management are in modern computing, and how bad we are at them.

The human brain may actually be extremely inefficient as brains go. Consider parrots and crows. Alex the parrot had the linguistic ability of a human toddler. Google "the girl who feeds crows" for the wild crows that reciprocate her gifts as best they can, bringing her shiny found objects. Both these and many other examples running on brains smaller than walnuts. Parrots and crows are among the few species that pass the mirror test for sentience.
– nigel222, Feb 8 '16 at 11:15

I actually worked on analog[ue] computers in my "gap year", around 1968, at Redifon (the industrial arm of Rediffusion, a cable company avant la lettre).
I guess these were (at that time) pretty much state-of-the-art, but they were not that of which a silk purse is made.
I can't find images, but programming entailed plugging cables into a patchboard of about 100 cm x 80 cm, with maybe 50 x 40 sockets. A complete problem might involve 100-400 cables, not to mention several dozen pages of notes.
Once you had a program working, it gave real-time answers to variations in inputs or parameters, with a speed and accuracy which the digital world only managed around 2000 (30 years later!)
Debugging was one problem (there was no way to trace variables); and stability was another: if one of the op-amps (JFET inputs, bipolar outputs) went unstable then the entire computer became what we called a "Christmas tree" with all overload lamps flashing.

So, your questions:

- Analog CPU: forget it.
- Hybrid CPU: no idea.
- Analog RAM/storage: theoretically possible with [super]capacitors, but not very practical.
- Analog-oriented programming languages: I really cannot see a bridge between wires on a patch panel and the modern concept of programming.

In your story, if you want to parallel real life, you would be better off to use analog-to-digital converters and have as many bits as you need to capture the finest emotions.

FPGAs provide a hint at what an analog-oriented programming language would look like. FPGA programming involves describing connections between logic gates' inputs/outputs; it pretty much IS the programming equivalent of a bunch of cables connecting different elements. Replace the digital logic gates with analog elements and the rest of the programming scheme should hold.
– M i ech, Feb 4 '17 at 5:19

MOSFET vs BJT
The core components of CPUs and memory are electrical switches called transistors. There are two main types: MOSFETs (metal-oxide-semiconductor field-effect transistors), which dominate digital logic, and BJTs (bipolar junction transistors), common in analog circuits. The reason computers are digital is that MOSFETs are much better; analog circuits are only used when the circuit has to be analog.

MOSFETs are more power efficient, since they only consume power when they switch, while BJTs draw power whenever the machine is on. MOSFETs can be made much smaller without introducing errors because they have smaller leakage currents. This means you can get hundreds or thousands of MOSFETs for the same power, space, and heat as one BJT.

Analog CPU and RAM attempts were made using multilevel logic; they were less effective than just adding more digital logic.

There are hybrid analog-digital processors; they usually take analog input, convert it to digital, process it, then convert back to analog output. Check out microcontrollers like the Arduino.

Current computers simulate analog numbers with floating-point numbers and have special digital processors for floating-point operations. If analog processors got better, they might replace the floating-point units in current CPUs.
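That floating-point point is easy to demonstrate: floats are a discrete, finite approximation of the analog/real line, and the gaps show up even in trivial sums. A quick sketch:

```python
import math

# 0.1, 0.2 and 0.3 have no exact binary representation, so the
# "analog" real-number sum is only approximated:
print(0.1 + 0.2)           # 0.30000000000000004
print(0.1 + 0.2 == 0.3)    # False

# The usual digital workaround is comparison with an explicit tolerance:
print(math.isclose(0.1 + 0.2, 0.3))  # True
```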

In theory, an analog data point contains an infinite amount of information if it can be measured with infinite precision: it's 1.23245... volts, and the decimal digits in theory go on forever. But we can never use those digits, because there is random, unpredictable noise. A huge part of the electrical engineering field is trying to cope with noise. Since the noise makes some of the information unusable, why not just start with a smaller, less noisy data set? Information theory shows that removing random noise from a signal is a really hard problem unless you already know the original signal.
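A toy numerical sketch of that point, with made-up voltage and noise figures: the digits of each reading below the noise floor churn randomly from measurement to measurement, which is exactly why digital systems quantize to steps sitting above the noise.

```python
import random

random.seed(0)
true_voltage = 1.23245   # the "infinitely precise" analog value
noise_rms = 0.01         # assumed 10 mV RMS of random line noise

# Five measurements of the same voltage: the leading digits agree,
# the trailing digits carry no usable information.
readings = [true_voltage + random.gauss(0, noise_rms) for _ in range(5)]
for r in readings:
    print(f"{r:.5f}")

# The digital bargain: quantize to steps comfortably above the noise,
# keeping only the digits that actually mean something.
step = 0.05
quantized = round(true_voltage / step) * step
print(quantized)  # 1.25
```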

Optical is cool, but it is still 1000x too large and 1000x too slow to compete with standard CPUs. To put this in perspective: a current CPU has around 3 billion transistors that switch over a billion times per second. A major design concern is that signals have to have enough time to cross the chip between switches, and since they propagate at no more than the speed of light, they only cover an inch or two in that time. The whole thing can cost under thirty bucks. It's hard to compete with that.

Using an analog FPU as a replacement for our current FPU could work. After all, what I am asking for is arbitrary precision of signal, which would require a real-numbers/analog processing unit. I know there are libraries that emulate this, but the problem they have is that they are slow and inefficient, which is why project UPSIDE was done. The idea behind project UPSIDE and this universe's analog CPU is that neural networks should be able to identify what an analog signal may be trying to say, which may later be interpreted as a digital code. Of course, there will be errors from time to time.
– a1901514, Feb 6 '16 at 20:39

In addition, how would telecommunications work with digital arbitrary-precision transmissions? As far as I know they should be linear, which could bring the program to a halt as it waits for the transmission of the next number to finish, which is something I don't want. Transmissions should be as fast as possible, maybe even almost instantaneous.
– a1901514, Feb 6 '16 at 20:43

If there were arbitrary-precision analog transmissions we would have infinite bandwidth. In sending 1 datum it could contain tons of information, so instead of sending 100 Mb/s we could send trillions.
– sdrawkcabdear, Feb 6 '16 at 21:18

@a1901514 Emulating the brain is a great idea; a ton of breakthroughs have come from artificial neural networks running on digital hardware. But the brain also has a lot of noise, so it does not rely on precision either; each neuron has a finite number of states, so it can be done in digital. The real distinction is designing the shape of the network, and the fact that neurons are much more power efficient because they power down when not in use. See science.sciencemag.org/content/345/6197/668.full for an attempt at a brain-like network.
– sdrawkcabdear, Feb 6 '16 at 21:23

The idea would be that, while analog signals have improved bandwidth a lot, noise and user output speed (think of keypresses from the brain) limit the maximum speed telecomms can reach: it IS much faster, but it is not instant. The algorithm interpreting the input would have to rely on previous input to calibrate itself on the fly and try to understand what the actual signal-to-noise ratio is and how accurate the received data is, to actually make an informed guess about what the user is trying to do. As in, you need some previous data to understand the message, like speech.
– a1901514, Feb 6 '16 at 23:04

It might be worth considering what nature has done. "There's nothing new under the sun" isn't true but isn't very far off either.

So digital computing seems to have come first. DNA, RNA, stop codons, an escape codon, epigenetic labelling, error detection and correction mechanisms inside every cell. It seems to have all the characteristics of a digital system. A one-base (bit) error can have no impact, or can completely change the organism (result).

And then hybrid systems evolved. Nerves and neural networks. Brains. We do not fully understand them. A single neuron accepts many analogue-ish inputs from other neurons and seems to generate something like a weighted sum of its inputs. If that sum passes a threshold the neuron "fires". The cell is bistable, binary, firing or not. The weights are modified by the firing of connected neurons. And in a brain neurons are elements of an absolutely enormous network.
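The neuron model sketched above — a weighted sum of analog-ish inputs compared against a firing threshold — can be caricatured in a few lines. The weights, inputs and threshold here are invented illustration values; real neurons and synapses are far messier:

```python
# Minimal threshold-neuron sketch: analog-ish weighted sum in,
# bistable (firing / not firing) decision out.
def fires(inputs, weights, threshold=1.0):
    activation = sum(i * w for i, w in zip(inputs, weights))
    return activation >= threshold  # binary: the cell fires or it doesn't

weights = [0.6, -0.3, 0.9]          # one negative weight: an inhibitory input

print(fires([0.9, 0.1, 0.9], weights))  # True  -- strong excitatory drive
print(fires([0.1, 0.9, 0.1], weights))  # False -- mostly inhibited, stays silent
```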

Synapses, the connections between a neuron's output and another neuron's input, are complex molecular systems. Not simple analogue. Thermal-noisy. Possibly, more quantum-magical than merely noisy. Beware anyone who says single synapses are simple and well understood.

What is missing are pure analogue systems without hysteresis. Nature's systems, if they react to analogue inputs, always seem to be designed with hysteresis: the threshold that turns them on is higher than the threshold that turns them off. Not unlike a plain ordinary electric switch. No halfway.

I remember this example from a long-ago article in Scientific American because it was so completely counterintuitive ("On the Spaghetti Computer and Other Analog Gadgets for Problem Solving," in the Computer Recreations column (1984), http://www.scientificamerican.com/article/computer-recreations-1984-06/). Analogue computing can be, in many instances, faster than digital computing, but apparently only in the actual processing; input and output required a lot of work and could be time-consuming.

The example given was a "spaghetti computer". If each strand of uncooked spaghetti is cut to a length corresponding to an input, in theory you could make a computer by assembling the strands in your hand and slamming the bundle end-on into the counter. The answer would be derived from the pattern of high and low spaghetti ends sticking up at the end of the bundle away from the counter.

So you will have spent minutes or hours cutting and assembling the lengths for your spaghetti computer, and may have to spend some time looking at the "output" to understand the answer, but the actual act of "computation" takes the fraction of a second needed to slam the spaghetti bundle on the counter.
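The spaghetti computer is, in effect, a constant-time sorting gadget, and its digital simulation is sometimes called "spaghetti sort": lower your hand onto the slammed bundle and pull off the tallest strand, repeatedly. A toy sketch:

```python
def spaghetti_sort(lengths):
    """Simulate the spaghetti computer: slamming the bundle aligns one
    end; lowering a hand onto it picks off the tallest remaining strand
    each time, yielding the inputs in descending order."""
    remaining = list(lengths)
    result = []
    while remaining:
        tallest = max(remaining)  # the strand your hand touches first
        result.append(tallest)
        remaining.remove(tallest)
    return result

print(spaghetti_sort([3, 7, 2, 9, 4]))  # [9, 7, 4, 3, 2]
```

Of course, the simulation pays in loop iterations exactly the cost the physical gadget hides in the slam, which is the point of the example.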

While this seems a bit pointless (and scaling up to solve really big problems with a "Super Spaghetti Analogue Gadget" will also be difficult and messy, until you boil it down for the post-computational dinner party), analogue computers are potentially viable for solving NP-complete problems that would lock up digital computers for eons.

So the real crux of the matter is what sorts of problems are being solved that require the special attributes of an analogue computer. Unless you are trying to solve some sort of NP-complete problem or avoid the "halting problem" of a Turing machine, digital computers have been refined to a very high degree of power and accuracy. The only other huge advantage of an analogue machine comes from the fact that most examples I have ever heard of are essentially mechanical or electrochemical in nature, so they will not be affected by power spikes, EMP and other environmental factors that can disrupt digital computers.

An analog CPU running the programs we run today? No (as already answered). Analog computers were not designed for step-wise sequential operations; they were designed to solve specific classes of largely mathematical problems.

Hybrid, sure... no reason a digital computer couldn't be interfaced to an analog computer, though there may not be a lot of benefit in doing so.

Analog RAM... maybe not such a great idea. For example, storing a quantity as a charge can work, until it leaks down and the value you read out is not the same as the value you stored. Pretty much any other concept you might think of for analog storage would be vulnerable to distortion or loss of the stored information. Analog ROM on the other hand is totally practical - the good ol' phonograph master.
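The charge-leak problem above is just RC decay. A sketch of why "analog RAM" forgets (the one-second RC time constant is an invented value for illustration):

```python
import math

def read_back(stored_v, leak_time_s, rc_seconds=1.0):
    """Charge stored on a leaky capacitor decays as V * exp(-t/RC):
    the longer you wait before reading, the further the read-back
    value drifts from the value you stored."""
    return stored_v * math.exp(-leak_time_s / rc_seconds)

v = 0.731  # some analogue value we want to keep exactly
print(round(read_back(v, leak_time_s=0.5), 3))  # already well below 0.731
```

A digital DRAM cell leaks too, but a refresh circuit can snap it back to 0 or 1; an analogue cell has no "correct" level to snap back to, which is the core of the storage objection.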

Totally analog communication - totally practical. That's how we did things up until the 1970s for voice and up to at least the 1990s for video. Even digital data originally went down the wire as analog tones.

The reason general purpose computers are so flexible is the fact that every single piece of the logic inside can be created from a single element: the NAND gate (NOT-AND). This provides functional completeness... any and all other logic functions can be built with just NAND gates.

Now, you can easily make more space and power efficient gates to do those other logic functions... but the key is that everything can be boiled down from one concept, and from that concept everything else can be built.
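That functional-completeness claim can be checked in a few lines: NOT, AND and OR (and, from those, everything else) fall out of the one gate.

```python
def nand(a, b):
    """The single primitive: true unless both inputs are true."""
    return not (a and b)

# Every other gate built purely from NAND:
def not_(a):    return nand(a, a)
def and_(a, b): return nand(nand(a, b), nand(a, b))
def or_(a, b):  return nand(nand(a, a), nand(b, b))

# Exhaustively verify against the truth tables:
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
print("NOT, AND and OR all rebuilt from NAND alone")
```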

So what's the analog equivalent of a NAND gate?

I could imagine a merging between analog and digital circuits; we do this kind of stuff all the time. You've got a sound card... that's what it does. Old-timey video cards used to output analog signals for video. Wi-Fi has to translate between electromagnetic waves and digital signals.

A layer of analog circuits would filter data, which could then be processed blazingly fast on a digital general purpose computer. The power of the future seems to be the merging of analog and digital, embracing the strengths of both and cancelling out each other's problems.

We solve problems in different ways... but Google's assistant voice recognition is amazingly accurate these days. Sure, it uses a massive database of records to identify speech, instead of using a finger sized wad of meat... but that's playing to digital computing's strengths. Processing billions of pieces of data in a blink of an eye.

$\begingroup$In "classical" analog computers, the closest analog to a NAND gate would be an analog integrator. In fact, many analog computers were rated by how many independent integrators they had. However, making a good integrator is much, much trickier than making a NAND gate (at least with current tech.)$\endgroup$
– Catalyst Feb 3 '17 at 10:04

Yes, it is possible to design a CPU that is 100% analog. 3D imaging isn't that hard. What is hard is convincing people that believing everything they've been told about analog computers is exactly why no one has managed to create a full-fledged analog CPU in the commercial sector. It's laziness and incompetence.

3D imaging can basically already be done. Just look at some of the old analog video boards that could "skew" and "stretch" broadcast images at TV news stations in the old days; that's basically it. A simple polygon would just need a few extra inputs and options to skew it properly, and then you add more polygons to the screen. It's that simple. Except you'd have a high-resolution image, you wouldn't be worried about pixels, and it would be really cheap.

There's an old example of an analog computer running a physics car/terrain simulation on YouTube. It's 2D, but then again, it's basically just a stock analog CPU. Here it is, though it doesn't show the actual car, wheels or road in this one: https://www.youtube.com/watch?v=AEJtajaRj_s

$\begingroup$Good to see analog computing getting good support. Digital isn't everything. The differential equations to plot the Apollo Program Moon flights were calculated with analog devices custom-made for the purpose. They varied the inputs, let the circuitry do the rest, and the answers were the outputs.$\endgroup$
– a4android Feb 3 '17 at 5:53

Storage

With analog computers you wouldn't be able to reliably store information, since you would always change the analog signal upon measurement, or just because of a little bit of noise, not to mention that you'd need to invent a way to store it in the first place.

The reason we use digital signals is that you can have, for example, either 0 V or 1 V, and if the voltage deviates a little bit, the system still recognizes it as the correct value.
However, in your question you ask for an analog system that can notice subtle differences in a person's brainwaves.
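That restoring property of digital levels can be sketched in one function (the 0.5 V decision threshold is an invented value for illustration):

```python
def restore(noisy_volts, threshold=0.5):
    """Digital logic snaps a noisy voltage back to an ideal 0 V or 1 V.
    The deviation is thrown away -- and with it any 'subtle differences'
    the analogue waveform carried."""
    return 1.0 if noisy_volts >= threshold else 0.0

print(restore(0.93))  # 1.0 -- the 0.07 V of analogue detail is gone
print(restore(0.08))  # 0.0
```

This is exactly why digital storage is robust and exactly why it discards the fine structure your brainwave-key plot mechanic depends on.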

Solution

What you are after is an expansion card which would take care of the analog information and find a way to store it perfectly in digital form (for example, it could recognize every simple wave that composes a complicated wave and store those components, which are easy to store).
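The "recognize every simple wave that composes a complicated wave" idea is, in essence, a Fourier decomposition. A sketch with NumPy (the component frequencies and amplitudes are invented for illustration):

```python
import numpy as np

# A 'complicated wave': the sum of two simple sine components,
# sampled for one second at 1000 samples/s.
t = np.linspace(0, 1, 1000, endpoint=False)
wave = 2.0 * np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 42 * t)

# Decompose it: the FFT recovers which simple waves it contains.
# The card would store just these few (frequency, amplitude) pairs
# instead of the raw waveform.
spectrum = np.abs(np.fft.rfft(wave)) / (len(t) / 2)
peaks = [(f, round(float(a), 2)) for f, a in enumerate(spectrum) if a > 0.1]
print(peaks)  # [(5, 2.0), (42, 0.5)]
```

The catch, for the story, is the same one the answer glosses over: a real brainwave has infinitely many components, so "100% accuracy" still requires hand-waving somewhere.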

You could even introduce this expansion card with a cool name like Brain Card and never actually explain how it works (or give a simple explanation like I just did), just that people buy it and use it to store their brainwaves with 100% accuracy.

"Analog techniques, involving continuously variable signals rather than binary 0s and 1s, have inherent limits on their precision—which is why modern computers are generally digital computers. However, AI researchers have begun to realize that their DNN models still work well even when digital precision is reduced to levels that would be far too low for almost any other computer application. Thus, for DNNs, it's possible that maybe analog computation could also work.

However, until now, no one had conclusively proven that such analog approaches could do the same job as today's software running on conventional digital hardware. That is, can DNNs really be trained to equivalently high accuracies with these techniques? There is little point to being faster or more energy-efficient in training a DNN if the resulting classification accuracies are always going to be unacceptably low.

In our paper, we describe how analog non-volatile memories (NVM) can efficiently accelerate the "backpropagation" algorithm at the heart of many recent AI advances. These memories allow the "multiply-accumulate" operations used throughout these algorithms to be parallelized in the analog domain, at the location of weight data, using underlying physics. Instead of large circuits to multiply and add digital numbers together, we simply pass a small current through a resistor into a wire, and then connect many such wires together to let the currents build up. This lets us perform many calculations at the same time, rather than one after the other. And instead of shipping digital data on long journeys between digital memory chips and processing chips, we can perform all the computation inside the analog memory chip."
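The "small current through a resistor into a shared wire" trick described in the quote is just Ohm's and Kirchhoff's laws performing a multiply-accumulate. A numerical sketch (the voltage and conductance values are invented for illustration):

```python
def crossbar_column_current(voltages, conductances):
    """One column of an analog NVM crossbar: each input voltage V_i
    drives a current V_i * G_i through its resistor (Ohm's law) into a
    shared wire, and the wire sums the currents (Kirchhoff's current
    law). The physics computes sum(V_i * G_i) -- a multiply-accumulate
    -- in a single step, with no multiplier or adder circuit."""
    return sum(v * g for v, g in zip(voltages, conductances))

V = [0.3, 0.8, 0.1]  # inputs encoded as voltages
G = [2.0, 0.5, 4.0]  # weights stored as NVM conductances
print(round(crossbar_column_current(V, G), 3))  # 1.4
```

One such column per output neuron, all driven at once, is what makes the analog approach parallel: the digital simulation above loops, while the chip's wires all sum simultaneously.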