Yep, I think it may have been this forum where I posted a link to that years ago and someone commented that they thought it was referring to a Harry Potter computer. Harry Porter gives some credit for his circuits to this guy who didn't get the credit he deserves for computer innovation until relatively recently:

Back in the day I worked on a Navy system based on a Sperry Univac computer with 32Kb of magnetic core memory, an 18-bit machine without a single IC chip in it. The CPU was a drawer with hundreds of cards, and each card was a gate or flip-flop. Now you can get more computing power in a wristwatch.

Hmmm... New thing is disintegration, huh? I'd love to see them take it way back and do it with tubes--now that would be a challenge!

Yeah, I'd love to see something based on tubes. I looked for tube computer projects online a while back but didn't find anything other than computer museum restorations. The greater cost, voltages, and power consumption needed to make anything large are probably a major factor working against individual projects. I'll have to search again.

More on the dis-integrated 6502:

"I don't make jokes. I just watch the government and report the facts." - Will Rogers

Troubleshooting the 1959 IBM 1401 Computer at the Computer History Museum

"Monthly rental for 1401 configurations started at US$2,500 (worth about $20,294 today).

IBM was pleasantly surprised (perhaps shocked) to receive 5,200 orders in just the first five weeks – more than predicted for the entire life of the machine! By late 1961, the 2000 installed in the USA were about one quarter of all electronic stored-program computers by all manufacturers. The number of installed 1401s peaked above 10,000 in the mid-1960s. "In all, by the mid-1960s nearly half of all computer systems in the world were 1401-type systems." The system was marketed until February 1971. Commonly used by small businesses as their primary data processing machines, the 1401 was also frequently used as an off-line peripheral controller for mainframe computers."

http://techxplore.com/news/2016-06-w...ssor-chip.html
A microchip containing 1,000 independent programmable processors has been designed by a team at the University of California, Davis, Department of Electrical and Computer Engineering. The energy-efficient "KiloCore" chip has a maximum computation rate of 1.78 trillion instructions per second and contains 621 million transistors. The KiloCore was presented at the 2016 Symposium on VLSI Technology and Circuits in Honolulu on June 16.

The chip is the most energy-efficient "many-core" processor ever reported, Baas said. For example, the 1,000 processors can execute 115 billion instructions per second while dissipating only 0.7 Watts, low enough to be powered by a single AA battery.
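The numbers in that claim can be sanity-checked with some quick arithmetic. A minimal sketch using the figures quoted above; the AA cell capacity is an assumed typical value (~2.5 Ah at a nominal 1.5 V), not from the article:

```python
# Back-of-the-envelope check on the KiloCore efficiency claim.
# Figures for instruction rate and power are from the article;
# the AA battery capacity below is an assumed typical value.

INSTRUCTIONS_PER_SEC = 115e9   # 115 billion instructions/s
POWER_WATTS = 0.7              # dissipation at that rate

energy_per_instruction_j = POWER_WATTS / INSTRUCTIONS_PER_SEC
print(f"Energy per instruction: {energy_per_instruction_j * 1e12:.1f} pJ")  # ~6.1 pJ

# Assumed AA alkaline cell: ~2.5 Ah at a nominal 1.5 V
battery_joules = 2.5 * 3600 * 1.5          # ~13.5 kJ
runtime_hours = battery_joules / POWER_WATTS / 3600
print(f"Runtime on one AA cell: ~{runtime_hours:.1f} hours")
```

So each instruction costs on the order of 6 picojoules, and a single AA cell would (ideally, ignoring conversion losses) run the chip for roughly five hours at full rate.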

Fascinating interview with Steve Furber, the architect of the first ARM Processor at Acorn Computers in Cambridge in the early 1980s. The interviewer's questions are often fairly stupid ones, but Steve Furber does a great job of running with them.

If you don't want to watch the entire video, check out what he built in an attic with his own money during his PhD in aerodynamics. Whereas most students submitted their PhD theses to a professional typist for publication, he wrote and printed his thesis using the hand-wired computer he built and the text editor he wrote for it, outputting to a twin-track (for Greek letters) daisywheel printer he interfaced with it. Realize that this was just a "sideline hobby" of his, since he was studying mathematics and aerodynamics. He was asked to join the ARM processor design team even though he apparently had no formal, high-level training in electronics, and he told them so.

While reading about the new AMD Zen processor line, I read them bragging about a CPU security feature which would allow, for instance, a stolen PC to be disabled. I thought this to be a really stupid idea, investigated, and found that Intel CPUs have had the same "feature" for quite some time. Today, I saw this article about the Intel feature by someone who shares my reservations:

Intel’s Management Engine is the single most dangerous piece of computer hardware ever created

Five or so years ago, Intel rolled out something horrible. Intel’s Management Engine (ME) is a completely separate computing environment running on Intel chipsets that has access to everything. The ME has network access, access to the host operating system, memory, and cryptography engine. The ME can be used remotely even if the PC is powered off. If that sounds scary, it gets even worse: no one knows what the ME is doing, and we can’t even look at the code. When — not ‘if’ — the ME is finally cracked open, every computer running on a recent Intel chip will have a huge security and privacy issue. Researchers are continuing work on deciphering the inner workings of the ME, and we sincerely hope this Pandora’s Box remains closed.

AMD processors have had something similar for a while (post-2012), too:

Excellent seminar with the key designer of every Commodore machine before the Amiga. If, like me, you love hearing insider tech anecdotes and the workarounds/kludges used to fix design problems, you'll love this talk. He jokingly calls the C64 a machine "that just appears to work" while describing all of the kludges that went into getting it to work.

Story of Commodore from the Lead Computer Engineer's Perspective - Bil Herd - 13 Dec 2016

He points out that since 1970, MOS gate transistors (Metal, Oxide, Semiconductor) are really POS gate transistors (Polysilicon, Oxide, Semiconductor), but they probably don't call them that because they don't like the alternate meaning of the acronym POS.

He covers some parts of the Z80 CPU, the 555 timer, the 741 op amp, and the LM7805 voltage regulator.

When I went to engineering school it was fairly common to do integration and differentiation using op-amps.

I think it's seriously cool. I imagine just typing an equation into an analog chip configuration language and having it laid out within the chip in a fashion similar to a (digital) FPGA or PLD. With the extreme low cost and vast power of current digital CPUs and SOCs, I don't see where this could be that useful, but it is apparently so in some cases and, regardless, I find it extremely cool.

No, we did not type an equation. That would be too easy. We calculated sizes of capacitors and/or inductors and placed them in various portions of the feedback and gain loops of plain old op-amps to achieve the results we wanted. Then we wired them up and ran them. https://en.m.wikipedia.org/wiki/Op_amp_integrator
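For anyone who hasn't seen the linked article: the ideal inverting op-amp integrator produces Vout(t) = -(1/RC) ∫ Vin dt. Here's a minimal numerical sketch of that behavior; the R and C values are illustrative, not from this thread:

```python
# Numerical sketch of an ideal inverting op-amp integrator:
#   Vout(t) = -(1/(R*C)) * integral of Vin dt
# R and C values are illustrative choices, not from the thread.

R = 10e3      # 10 kohm input resistor
C = 1e-6      # 1 uF feedback capacitor
DT = 1e-5     # simulation time step (s)

def integrate(vin_samples, r=R, c=C, step=DT):
    """Return Vout samples for the given Vin samples."""
    vout = []
    acc = 0.0
    for v in vin_samples:
        acc += v * step              # running integral of Vin
        vout.append(-acc / (r * c))  # scale by -1/RC
    return vout

# A constant 1 V input for 10 ms should ramp linearly down to
#   -(1/(R*C)) * (1 V * 0.01 s) = -(1/0.01) * 0.01 = -1 V
out = integrate([1.0] * 1000)        # 1000 steps of 10 us = 10 ms
print(f"Vout after 10 ms: {out[-1]:.3f} V")  # -1.000 V
```

The same scaling tricks the analog designers used apply here: choosing R and C sets the gain of the integration, exactly as when sizing the feedback components on a physical op-amp.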

No, I wasn't saying that YOU did so. I was dreaming of an ultimate (for me anyway) programming language for modern Field-programmable Analog Arrays.
