Posted
by
samzenpus
on Friday March 29, 2013 @11:31AM
from the meat-machine dept.

sciencehabit writes "For the first time, synthetic biologists have created a genetic device that mimics one of the widgets on which all of modern electronics is based: the three-terminal transistor. Like standard electronic transistors, the new biological transistor is expected to work in many different biological circuit designs. This should make it easier for scientists to program cells to do everything from monitoring pollutants and the progression of disease to turning on the production of medicines and biofuels."

We need a series of browser plugin virii that will screw up apostrophes, change all plurals to pseudo-Latin form and randomly leave out the Harvard comma. It should also transpose all instances of loose and lose.


I think the vast majority of computer users in this world already have that plugin.

It's a war between building a working biological computer and getting quantum computers to add 2+2 correctly 100% of the time!
Who will win? I'm betting on quantum computers (especially since I would love to see an ansible sometime in my life).

Ethernet speed is limited by the speed of light, so it is not an ansible. Technically, if we can get quantum computers to work, we won't need fiber-optic cable, because qubits are quantum-entangled and their information travels instantly.

This is actually incorrect. The claim is that entangled particles share information faster than the speed of light. We assume the correlation is instantaneous; however, as pointed out above, we have no way to prove that it is indeed instantaneous and not just a huge number of times faster than the speed of light.

I think that this depends on how you define information. ISTR that while data is available, calling it information is questionable. E.g., you may measure something as, say, spin-up, and this will allow you to assert that whenever your opposite number performs the equivalent measurement it won't come out spin-down... and that the time of that measurement could be either before or after yours in some particular reference frame. But calling this a transfer of information is a bit questionable.

There's a good picture in that Science magazine preview of the "simulated results" vs. the results they actually got for an AND gate, along with a relevant paragraph of the summary:

http://news.sciencemag.org/sciencenow/assets/2013/03/28/sn-circuit.jpg [sciencemag.org]

The Stanford team then showed that they could line up multiple transcriptors to carry out logical functions, creating standard logical circuits called AND gates, OR gates, XOR gates, and so on, which combine signals according to certain rules. (A computer's processor is a vast assemblage of such gates.) They also showed that their novel biological circuit designs were adept at producing signals with large amplification and that they could be used to up the expression of a variety of genes, such as the production of fluorescent signals that made it simple to detect cells that were carrying out their programming.
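Treated as ideal Boolean gates, the combinations described above can be sketched in a few lines of Python. This is a toy model only: real transcriptor gates produce analog signal levels (RNA polymerase flow along DNA), not clean 0/1 outputs.

```python
# Toy model of combining gates as ideal Boolean logic.
# Assumption: each biological gate behaves as a clean digital gate,
# which real transcriptors only approximate.

def AND(a, b):
    return a and b

def OR(a, b):
    return a or b

def XOR(a, b):
    return a != b

# Truth table for the AND gate shown in the article's figure:
for a in (False, True):
    for b in (False, True):
        print(a, b, AND(a, b))
```

Composing these functions mirrors wiring gate outputs to gate inputs, which is exactly what the Stanford team demonstrated with transcriptors.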

I wonder exactly how they "assemble" the circuit and keep the components from diffusing or floating away, thus disassembling the circuit. What keeps the "circuit" of DNA strands in place?

Read TFA for the components of this circuit. The DNA part of the circuit is most likely integrated into the cellular genome, so it is effectively stationary in the nucleus. The RNA polymerase component is probably the naturally occurring version of the protein that already exists in the cell. RNA polymerase randomly diffuses around in the nucleus, but there's not just one molecule of RNA polymerase around, there are loads of them, and they can all do the same job. With help from other proteins, they bind to specific DNA sequences.

The summary has nothing to do with the headline. A transistor does not a computer make. To have anything worth talking about (as far as computers go), you would need a stupendous number of these so-called "transistors" interacting. This is no small leap: there is a significant amount of engineering work that goes into a processor even at higher levels of abstraction than gates.

This is like saying that someone's made a car, when all they really did was make a gear or something. It's just sensationalism.

You're wrong. A small computer can be assembled from a few hundred vacuum tubes. I designed a CPU when I was in high school, on paper: Turing-complete, 4 bits, 16 instructions. Not a lot went into that.
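For a sense of how small such a machine can be, here is a sketch of a 4-bit accumulator machine in Python. The opcodes (LOADI, STORE, ADD, JNZ, HALT) are invented for illustration; they are not the poster's actual 16-instruction design.

```python
# Minimal 4-bit accumulator machine: a sketch of how little a "CPU" needs.
# Opcodes are hypothetical, not the poster's paper design.

def run(program, steps=1000):
    acc, pc, mem = 0, 0, [0] * 16
    while pc < len(program) and steps:
        steps -= 1
        op, arg = program[pc]
        pc += 1
        if op == "LOADI":        # load a 4-bit immediate into the accumulator
            acc = arg & 0xF
        elif op == "STORE":      # store the accumulator into a memory cell
            mem[arg] = acc
        elif op == "ADD":        # add a memory cell, wrapping at 4 bits
            acc = (acc + mem[arg]) & 0xF
        elif op == "JNZ":        # jump if the accumulator is non-zero
            if acc:
                pc = arg
        elif op == "HALT":
            break
    return acc, mem

# Compute 3 + 4 the long way: store 4 in cell 0, then add it to 3.
acc, mem = run([("LOADI", 4), ("STORE", 0),
                ("LOADI", 3), ("ADD", 0), ("HALT", 0)])
print(acc)  # 7
```

Even with conditional jumps and a few registers added, a design at this scale stays in the range of a few hundred gates.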

Sequential logic isn't the problem; it seems they've found a good way to propagate the signal. Neither is clock skew: there are ways to deal with that, and there are even clockless computer designs.

The biggest question is whether they can feed a signal back into a circuit to make a flip-flop. A secondary question is reliability: whether you can use the same circuit repeatedly without it self-destructing. Once you've got that (and assuming their signals propagate as well as they think), you have no problem building a computer.
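The feedback question has a classic answer in electronics: cross-couple two NOR gates and you get an SR latch. A quick Python sketch of that feedback loop, assuming ideal gates that settle after a few update rounds (a real circuit settles in gate delays):

```python
# An SR latch built from two cross-coupled NOR gates: the simplest
# demonstration of feeding a gate's output back into a circuit.

def nor(a, b):
    return not (a or b)

def sr_latch(s, r, q=False, qbar=True):
    # Iterate the two gate equations until the outputs stop changing.
    for _ in range(10):
        q_new = nor(r, qbar)
        qbar_new = nor(s, q)
        if (q_new, qbar_new) == (q, qbar):
            break
        q, qbar = q_new, qbar_new
    return q, qbar

q, qbar = sr_latch(s=True, r=False)                    # set
print(q, qbar)                                         # True False
q, qbar = sr_latch(s=False, r=False, q=q, qbar=qbar)   # hold
print(q, qbar)                                         # True False
```

If the biological gates can be wired in a loop like this and keep working, sequential logic (and therefore memory) follows.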

An abacus may be seen as a series of registers. Each string holds beads, and the positioning of the beads on the string sets the value of the register. :>)
The human operator of the abacus is the executor of the machine code; it's the human that
-- reads the value of a register (by looking at it)
-- stores a numeric value in a register (by sliding beads)
-- performs the carry when overflow occurs, by carriage-returning the beads in one row and moving an extra bead in the next (above or below, depending on your abacus).
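The carry rule described above is easy to sketch. Assuming base-10 rods with the least significant digit first:

```python
# Sketch of the abacus-as-registers idea: each rod holds a base-10 digit,
# and adding performs the described carry by resetting a full rod and
# bumping the next one. Rod 0 is the least significant digit.

def add_digit(rods, pos, amount):
    """Add `amount` to rod `pos`, rippling carries upward."""
    rods[pos] += amount
    while pos < len(rods) - 1 and rods[pos] > 9:
        rods[pos] -= 10        # "carriage return" the full rod
        rods[pos + 1] += 1     # move an extra bead on the next rod
        pos += 1
    return rods

rods = [7, 9, 0]               # represents 097
add_digit(rods, 0, 5)          # 97 + 5 = 102
print(rods)                    # [2, 0, 1]
```

The human operator is doing exactly this loop by hand, which is why the abacus-plus-operator system qualifies as a (slow) machine.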


If someone is going to argue that something is untrue, and there is an understood interpretation of the words that makes it true, they are being obstinate. Such semantics should only be argued when someone is holding two mutually exclusive definitions at once.

A transistor is a computer. It just computes exactly one function on exactly one set of inputs. It's a simple finite state machine.

So a transistor is a finite state machine is a computer. Well, I've got a jar of pennies where each one is a finite state machine, and therefore each one is a computer. Or else maybe they should start teaching more engineering and less computer science.

A penny can have more than one state, but a finite count of them. A transistor can have one state unless you add a method to change it, such as a sine-wave generator or another transistor. Then the transistor plus the input generator make a computer. The penny has itself and a small image (signal) that together make it a computer.

You're confusing a transistor with a flip-flop or a flash cell or something, which is slightly more than just a transistor. And by the way, a transistor does not have a finite number of states. Transistors are analog devices with a continuously infinite number of states. It just happens that digital computers use them in arrangements of multiple transistors where we generally only use two states.

"Transistors are analog devices with continuously infinite number of states." You could have just said you don't know how a transistor works. I mean, sure, that sentence conveyed the same information, but it seems like a roundabout way to show off your ignorance.

Perhaps the headline should have said "logic gates" instead of "computer". It didn't say "Core i7" either, though. Babbage's machine was a computer, and programming "graphics processors" (Jacquard looms) with punch cards dates to the early 1800s, so "computer" doesn't imply a modern desktop.

I suspect you'd agree that any processor capable of running Windows is a computer. Therefore, any machine that can run a hypervisor, which in turn runs Windows, is a computer. You probably know where I'm headed: Turing machines. Any Turing machine can emulate a Core processor, and is therefore a computer. Wolfram's Turing machine requires only a few gates, so these researchers can probably build a biological Wolfram Turing computer today.
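To give a flavor of how little machinery a Turing machine needs, here is a generic one-tape simulator. The example rule table is a trivial bit-flipper, not Wolfram's 2,3 machine (whose specific rule table isn't reproduced here); the point is just that states, symbols, and a transition table are the only parts.

```python
# Generic one-tape Turing machine simulator. The example machine below
# flips every bit and halts at the first blank; it is a hypothetical
# demo machine, not Wolfram's 2,3 universal machine.

def run_tm(rules, tape, state="A", halt="H", steps=1000):
    cells = dict(enumerate(tape))
    pos = 0
    while state != halt and steps:
        steps -= 1
        sym = cells.get(pos, "_")                  # "_" is the blank symbol
        new_sym, move, state = rules[(state, sym)]
        cells[pos] = new_sym
        pos += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, "_") for i in range(lo, hi + 1))

flipper = {
    ("A", "0"): ("1", "R", "A"),
    ("A", "1"): ("0", "R", "A"),
    ("A", "_"): ("_", "R", "H"),
}
print(run_tm(flipper, "1011"))  # 0100_
```

A universal machine needs a bigger rule table, but the simulator loop stays exactly this simple, which is why "only a few gates" is a plausible claim for a hardware realization.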

A Core i7 emulator running on an Amiga, or on a PICAXE, is also not realizable in a practical way. Most certainly an Amiga is a computer. The core functionality of a simple PIC micro is a few thousand transistors; IO is several thousand more. If they already have gates with six transistors or so each, combining a dozen gates into modules and a dozen modules into a simple CPU is not a huge leap. It's not a computer yet, but it's a couple of incremental steps away.