Any binary logic equation can be implemented using only NAND gates and also using only NOR gates. So, for now, forget about the 3-to-8 decoder and learn how to implement each of the basic gates using only NAND and also only NOR gates.

The basic gates I am referring to are the one-input and symmetric two-input gates.

1-Input: BUF, NOT
2-Input: AND, OR, XOR, NAND, NOR, XNOR

The first step is to write the truth tables for all eight of these gates. Then work with each one in turn and implement it using only NAND gates. Then do the same using only NOR gates.
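If you want to double-check your hand-written tables, a few lines of Python can enumerate them (the dictionary layout and names here are just my own convention, modeling each gate as a function on 0/1 values):

```python
# Truth tables for the eight basic gates, enumerated so you can
# cross-check the tables you wrote by hand.

GATES_1 = {
    "BUF": lambda a: a,
    "NOT": lambda a: 1 - a,
}

GATES_2 = {
    "AND":  lambda a, b: a & b,
    "OR":   lambda a, b: a | b,
    "XOR":  lambda a, b: a ^ b,
    "NAND": lambda a, b: 1 - (a & b),
    "NOR":  lambda a, b: 1 - (a | b),
    "XNOR": lambda a, b: 1 - (a ^ b),
}

# Print each table: outputs for inputs 0, 1 (one-input gates)
# and for input pairs 00, 01, 10, 11 (two-input gates).
for name, f in GATES_1.items():
    print(name, [f(a) for a in (0, 1)])

for name, f in GATES_2.items():
    print(name, [f(a, b) for a in (0, 1) for b in (0, 1)])
```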

You only have 2-input NAND gates available. How can you implement an inverter, namely a gate that takes a single input, A, and produces an output, Y, that is always the logical complement of A? By "implement" we mean to create a logic circuit using only NAND gates that, if placed in a black box with just the A input and the Y output accessible, would be indistinguishable from having a NOT gate in the box. There are two ways to do it -- try to find both.
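Once you have found them, both constructions can be sanity-checked by brute force. A small Python sketch (the function names are mine):

```python
# Two NAND-only inverters:
#   (1) both inputs tied together: Y = NAND(A, A)
#   (2) one input tied to logic high (the rail): Y = NAND(A, 1)
def nand(a, b):
    return 1 - (a & b)

def not_tied(a):
    return nand(a, a)

def not_rail(a):
    return nand(a, 1)

# Exhaustive check: both behave exactly like a NOT gate.
for a in (0, 1):
    assert not_tied(a) == 1 - a
    assert not_rail(a) == 1 - a
```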

Now do the problem again under the conditions that you only have NOR gates.

Then, once you have done that, when told you can only use NAND (or NOR) gates you can assume that you also have NOT gates because you know how to make a NOT gate using only NAND (or NOR) gates.

The next step is to figure out how to make a circuit that implements the function for a 2-input AND gate using only NAND gates.

And so on and so on until you have each of the eight gates implemented using only NAND and again using only NOR gates.
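As an illustration of how such a check might look for the two easiest cases, here is a Python sketch of the NAND-only AND and OR, verified against the real gates (helper names are my own):

```python
def nand(a, b):
    return 1 - (a & b)

def nand_not(a):
    # Inverter built from a single NAND with its inputs tied together.
    return nand(a, a)

def nand_and(a, b):
    # AND = a NAND followed by a NAND-based inverter (two gates total).
    return nand_not(nand(a, b))

def nand_or(a, b):
    # OR = NAND of the two inverted inputs (De Morgan; three gates total).
    return nand(nand_not(a), nand_not(b))

# Exhaustive check over all four input combinations.
for a in (0, 1):
    for b in (0, 1):
        assert nand_and(a, b) == (a & b)
        assert nand_or(a, b) == (a | b)
```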

If I take A... and put A into both inputs of a NOR gate, that will get me A'.
If I take B... and put B into both inputs of a NOR gate, that will get me B'.

If I then take A' and B' and put them into another NOR gate, that will get me A.B... which is equivalent to the output of an AND gate.

I can now make the 3-to-8 decoder and replace the AND gates with this NOR gate combination... I think.

Does that sound right?


Yes.

The takeaway point is that you can make an inverter out of either NAND or a NOR gate by applying the input signal to both inputs. Another way to do it (and arguably better from a practical standpoint) is to apply the signal to one input and tie the other input to the appropriate rail.

Once you have an inverter, you can trivially make an AND from two NANDs by just following the first NAND with an inverter made out of the second. The same goes for making an OR out of two NORs. The slightly tricky one is to make an AND out of NORs (which you figured out just fine) or an OR out of NANDs (which follows the same idea). To make an XOR or XNOR, you can always just implement the circuit out of ANDs, ORs, and NOTs which, in turn, are implemented using either NANDs or NORs. But you can implement an XOR using four NAND gates and an XNOR using four NOR gates. It's good practice to see if you can figure out how to do it.
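For anyone who wants to verify these constructions without breadboarding them, here is a brute-force Python check of the NOR-based AND from the question, plus the standard four-NAND XOR and its four-NOR XNOR dual (function names are mine):

```python
def nand(a, b):
    return 1 - (a & b)

def nor(a, b):
    return 1 - (a | b)

def and_from_nors(a, b):
    # The OP's construction: invert each input with a NOR,
    # then NOR the results: NOR(A', B') = A.B
    return nor(nor(a, a), nor(b, b))

def xor_from_nands(a, b):
    # Classic four-NAND XOR: share the first NAND's output t,
    # then Y = NAND(NAND(A, t), NAND(B, t)).
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def xnor_from_nors(a, b):
    # The dual wiring with four NOR gates yields XNOR.
    t = nor(a, b)
    return nor(nor(a, t), nor(b, t))

# Exhaustive check against the reference functions.
for a in (0, 1):
    for b in (0, 1):
        assert and_from_nors(a, b) == (a & b)
        assert xor_from_nands(a, b) == (a ^ b)
        assert xnor_from_nors(a, b) == 1 - (a ^ b)
```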


Many people would actually use a GAL/CPLD, or a small microcontroller. All this substitution and optimization has little significance these days. On the other hand, if you write Windows applications, you use a lot of functionality that would take you ages to understand in full detail.

If you are interested in this digital logic stuff, get one of those old CMOS cookbooks, read it through a few times, and play around with the information on paper. Soon you will figure out negation, substitution, and so on. But normally you would not optimize things down to that point very often; silicon is much cheaper these days than it was in the 1970s.


This is an assignment that is pretty clearly aimed at teaching fundamental concepts and understanding at a basic level -- long before it is reasonable to start throwing around programmable logic of any kind. This notion that engineers and technicians only need to know how to use a tool written by someone else and that fundamentals aren't crucial, or even valuable, is poppycock.

And the claim that silicon is much cheaper these days is both true and false. When I started designing ASICs in the mid-'90s, you could fab a test chip on a multi-project run for under $1000. If you needed an engineering lot of whole wafers, you could do that on a mainline process for about, oh, a hundred grand. Now you are looking at mask charges alone commonly exceeding a million dollars. On the other hand, the marginal cost per transistor in production has come down significantly.

This is an assignment that is pretty clearly aimed at teaching fundamental concepts and understanding at a basic level -- long before it is reasonable to start throwing around programmable logic of any kind. This notion that engineers and technicians only need to know how to use a tool written by someone else and that fundamentals aren't crucial, or even valuable, is poppycock.


I always had a sense for linguistic creativity, including the ability to recognize it.
I myself skimmed through the CMOS cookbook in a matter of a few weeks; there you go with the fundamentals.

But the real reason I wrote it is that, to me, it looks like (and yes, I went to computer college some years ago myself) there is far too much grinding on so-called fundamentals, without it ever becoming useful in the real world for most of the students. The fundamental stuff should be mentioned, but I think of it as a prerequisite. The expensive time at college should be used in better ways. Those who disagree, or who can't keep pace, are maybe not meant to become great engineers or programmers anyway.

And in addition, I would not really know how to do a Karnaugh optimization right now from scratch. I would have to look it up again, even though back in college I succeeded with it. I mean, it is not necessary to memorize these fundamentals in all their depth. It is not necessary to master Boolean algebra in every detail. You may still be able to write high-quality software.

And the claim that silicon is much cheaper these days is both true and false. When I started designing ASICs in the mid-'90s, you could fab a test chip on a multi-project run for under $1000. If you needed an engineering lot of whole wafers, you could do that on a mainline process for about, oh, a hundred grand. Now you are looking at mask charges alone commonly exceeding a million dollars. On the other hand, the marginal cost per transistor in production has come down significantly.


Not so many people are actually involved in real-world silicon implementation. It is a small world. Those who are interested will have the ways and means to learn how real-world silicon is actually put together.

My argument is that in many cases they won't discuss such real-world composition, but will instead maintain lengthy discussions and absurd exercises aimed at fundamentals.

I mean, maybe you aren't bright enough to memorize all the fundamentals and reproduce them all over again in a test, but you may still have a genius idea for another technological breakthrough. You may still be able to write high-quality software.

If you don't have a feel for it, maybe it's not the right subject for you.

Examine, for instance, the Intel 4004 layout. How can you master it, and discuss it in class, if you spend so much time on fundamentals? They really should start with things like that and look up the fundamentals as needed.

I could link a PDF about microwave telephony, including some maths, and it is really worlds apart from such exercises.

I apologize for my lengthy writing. But in general, I think the education system has not kept pace with technological evolution. There is a gap spanning at least 20 years, with some exceptions. The ways of teaching are no longer appropriate, considering the large volumes of data engineers have to deal with these days. I have maybe 500K pages of PDFs.

Compare that to the sheets and handouts at university or college. New ways of learning, memorizing, and task sharing are required, ones no longer based on literal memorization.

And I don't think I speak without reason or experience. I use websites to support my programming work, and they are actually a large clutter: all in different styles, formatting, and levels of detail. I have to extract bits and pieces from here and there, even though some really good instructional materials do exist.

And in addition, I would not really know how to do a Karnaugh optimization right now from scratch. I would have to look it up again.


Which is a strong indication that all you ever did was memorize a bunch of steps and mistake that for comprehension. The principles behind a Karnaugh map are so simple and so fundamental that once you truly understand them you should be able to recreate the entire K-map procedure, from scratch, without looking anything up, even after a decade or more without using it -- including the notion of consensus terms, what they are, what their significance is, and how to identify them. Just like someone who understands how long division works can figure out how to do it after decades of using a calculator -- but someone who just memorized a bunch of steps during the one week in middle school where they were "taught" how to divide two numbers without a calculator either recalls the steps from memory or has to go look them up.
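As a concrete illustration of a consensus term (this example is mine, not from the thread): in F = A.B + A'.C, the product B.C is the consensus of the other two terms, so adding or dropping it never changes the function. That redundancy is exactly what a K-map makes visible. A quick exhaustive check in Python:

```python
# Consensus theorem check: A.B + A'.C == A.B + A'.C + B.C
# The B.C term is redundant for the logic function (though in hardware
# it is sometimes kept deliberately as a hazard cover).
def f_minimal(a, b, c):
    return (a & b) | ((1 - a) & c)

def f_with_consensus(a, b, c):
    return (a & b) | ((1 - a) & c) | (b & c)

# Verify equality over all eight input combinations.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert f_minimal(a, b, c) == f_with_consensus(a, b, c)
```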

But I suspect the moderators would probably agree that this thread has been hijacked more than is proper. So let's return it to assisting the OP with their question, at their level, assuming they need any further assistance at all.


Hmm. Maybe I have not estimated the significance of my replies correctly.
The moderators might have been notified already.

On the other hand, it is quite possible to solve the problem easily using a ballpoint pen and some sheets of paper. I have done it quite a few times. Sometimes I did not have all the gates available as required.

Scribble the basic truth tables on paper (NAND is the most common); you can easily see how a NAND turns into a NOT gate. Really spend some time with that until you understand it.

Since I have sometimes used 4000-series chips in the real world, I really know they behave that way. So I can be quite confident when I draw the NAND truth table on paper, and maybe OR as well. They really do behave that way.

When I recall my maths teachers: I can't even add 1 + 1 together. 1 what + 1 what? It does not make much sense to me.

Possible, but given how diligently they monitor the forums, it is really not necessary.

On the other hand, it is quite possible to solve the problem easily using a ballpoint pen and some sheets of paper.


Some sheets of paper? More like one side of a 3x5 index card. But that's for someone who has long ago internalized the fundamentals. The OP is (presumably) a student just being exposed to this stuff for the first time. They are just learning what digital logic is, how different logic functions relate to practical applications, and how problems can be broken into smaller functional pieces.

Since I have sometimes used 4000-series chips in the real world, I really know they behave that way.


They wouldn't be much use if they didn't, now would they?

When I recall my maths teachers: I can't even add 1 + 1 together. 1 what + 1 what? It does not make much sense to me.


So, by all means, use a calculator to do your thinking for you. I've encountered plenty of students who operate at that level. Of course, they are seldom the students who do well, or who get a serious look from companies that want to hire well-paid engineers instead of run-of-the-mill app-monkeys.

A 3 to 8 decoder? Who would need one for real these days?



Gee, I guess they only make and sell them by the millions every year so that people can decorate their Christmas trees with them.

Nearly all designs I have done for the past decade or more have had several decoders in them, ranging from a 1:2 up to a 12:4096 and everything in between.
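For reference, the behavior being described is simple to state: an n-to-2^n decoder drives exactly one of its 2^n outputs high, selected by the binary value on its n inputs. A minimal behavioral sketch in Python (not a gate-level model; the function name is mine):

```python
# Behavioral n-to-2^n decoder: exactly one output line is high,
# selected by the binary value of the n-bit input.
def decode(value, n):
    """Return the 2**n output lines for an n-bit input value."""
    return [1 if i == value else 0 for i in range(2 ** n)]

# 3-to-8 decoder examples: input 0 selects line Y0, input 5 selects Y5.
assert decode(0, 3) == [1, 0, 0, 0, 0, 0, 0, 0]
assert decode(5, 3) == [0, 0, 0, 0, 0, 1, 0, 0]
```

In a gate-level 3-to-8 decoder each output line is just the AND of the three inputs in true or complemented form, which is exactly where the NAND/NOR substitution exercise comes in.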

Some sheets of paper? More like one side of a 3x5 index card. But that's for someone who has long ago internalized the fundamentals. The OP is (presumably) a student just being exposed to this stuff for the first time. They are just learning what digital logic is, how different logic functions relate to practical applications, and how problems can be broken into smaller functional pieces.


And I did not even mention simulation software, WinCUPL from Atmel for instance. When I have to deal with digital logic problems that must be built from gates, I really do draw all the truth tables, including those for the basic gates, and then I experiment with them, including negating inputs and trying to reduce the number of gates. Normally I have to redraw it a few times. But mostly I implement such things in software; small CPLD boards can be had for low prices these days.

I mean, it is much harder to imagine some functionality purely in theory, especially if the description is cryptic and incomplete. From my college experience, handouts tend to be cryptic, incomplete, and sometimes full of absurd problems and questions.

So, by all means, use a calculator to do your thinking for you. I've encountered plenty of students who operate at that level. Of course, they are seldom the students who do well, or who get a serious look from companies that want to hire well-paid engineers instead of run-of-the-mill app-monkeys.


Back in the 1980s we were actually prohibited from using calculators. Later, during high school, we were prohibited from using anything that was programmable; MSX and the like were shot in the back and disappeared after three years. In the early 1990s we were introduced to boring sorting algorithms in BASIC, and we spent nearly half a year on that. Imagine trying to learn Windows programming at that pace.

Gee, I guess they only make and sell them by the millions every year so that people can decorate their Christmas trees with them.

Nearly all designs I have done for the past decade or more have had several decoders in them, ranging from a 1:2 up to a 12:4096 and everything in between.


Yes, but implemented in software? So the only limitations are those imposed by the software and/or the generic logic.

I mean, my point is: why impose this scarcity so much?

Some years after college, I borrowed the CMOS cookbook from a library, bought some 4000-series chips from a local shop, and built a CMOS LCD clock, including a startup sequencer (to reset the registers) and a driving circuit for the LCD backplane. I used some information from the book, but no ready-made schematic. It took maybe two weeks from design idea to working circuit.

Nobody ever introduced me to a reset sequencer. I made one from 4-bit counters!

The same goes for MUX/DEMUX. It is helpful if you have heard about it for maybe half an hour, yes. But if you have creative skills for digital hardware, you'd just build some kind of MUX or DEMUX from whatever silicon you have available, as needed.

I mean, my point is that they spent maybe weeks on that, and imposed this scarcity, as if nothing else existed. I work from the concrete requirement as a starting point, considering the silicon that is actually available on the market.

Maybe you want a concrete reason for why I am writing this.

In college we did digital hardware too. At the beginning, the instructor showed us a "microcontroller" board, which was rather complex! And he said this would become the subject toward the end of the second year!

Really, only one clunky controller board!

Today I use many different controller chips. Normally the circuits I build contain very few components. I could make 50 controller circuits from the stuff I have here!

So why were they withholding this technology from us??
It was already available at that time. It was available in the early 1980s.

Who uses discrete logic these days, including effervescent efforts to substitute and optimize? Very few to no one. They take a small, cheap controller, and one or a few programmers do the software within a very short time. They always use the latest technology; yes, they cost-optimize, and they take the path of least resistance.

Maybe others can be saved from wasting their college time. Maybe they can try to demand things from their instructors. Or they can just buy some cheap development stuff themselves.

Not that a 3-to-8 decoder is difficult as such. If I needed one, I would build one as simply and easily as possible. So I'd say that, regardless of whether they learn all the details about substitution, most of these students will never have the opportunity to use this particular skill.

What would really be helpful is concrete experience solving large projects and taking steps independently. Really, speaking from my experience, we spent far too long on fundamentals. We wrote trivial assembler programs on hopelessly outdated hardware, while a little later I actually learned that C compilers had been in commercial use since the mid-1980s.