Outside of the lower-level classroom activities, binary wasn't used much in any of my networking classes in school. It was just used to show how networks determine where packets should go and what counts as internal traffic vs. external.
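To make that concrete, here's a rough Python sketch of how that internal-vs-external check works at the binary level (the addresses and mask here are just made-up examples, not from any real network):

```python
# A host decides whether traffic is local or external by ANDing both
# addresses with the subnet mask and comparing the network parts.
import ipaddress  # standard library


def same_subnet(ip_a: str, ip_b: str, mask_bits: int) -> bool:
    """True if both addresses share the same network under the given mask."""
    mask = (0xFFFFFFFF << (32 - mask_bits)) & 0xFFFFFFFF
    a = int(ipaddress.IPv4Address(ip_a))
    b = int(ipaddress.IPv4Address(ip_b))
    return (a & mask) == (b & mask)


# 192.168.1.10 and 192.168.1.200 are on the same /24 network...
print(same_subnet("192.168.1.10", "192.168.1.200", 24))  # True
# ...but 192.168.2.5 is external to it.
print(same_subnet("192.168.1.10", "192.168.2.5", 24))    # False
```

Under the hood it's all just AND-ing bit patterns, which is why those classes make you do it by hand in binary first.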

The same thing applies to programmers. We don't program anything in binary (except on rare occasions), but everything gets turned into binary by the computer running it.

__________________
"as a fanboy i refuse to admit it and will pull countless things out of my butt to disprove it"

Hello and welcome to the forum. You have asked a very involved and complex question. In short, yes: binary was and is used to program computers.

Binary is a numerical system based on two states, 0 and 1. We normally work in denary, a numerical system based on ten digits, 0 to 9; we use that system because we have ten digits, or fingers. Computers use binary because every piece of electronics that enables a computer to run is based on a system of switches, and switches have only two states: on or off. The on state is represented by a 1 and the off state by a 0.

Whilst programming in 1s and 0s can be and is done, it can be very time consuming and takes all of your mental skill to bring about a result, so programming languages were developed, such as C, C++, Fortran etc. There are lots of different programming languages, and they tend to be suited to particular programming tasks; they use logical commands rather than a series of 1s and 0s to perform a certain task within a program. But the bottom line is that all programming languages can be broken down to the base language the machine uses, which is binary.
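To illustrate the denary/binary relationship described above, here's a small Python sketch (a hand-rolled converter, just for teaching; Python's built-ins do the same job):

```python
# Each binary digit is a power of two, just as each denary digit is a
# power of ten. Converting denary to binary: repeatedly divide by 2 and
# read the remainders backwards.

def to_binary(n: int) -> str:
    """Return the binary representation of a non-negative integer."""
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits
        n //= 2
    return bits or "0"


print(to_binary(13))    # "1101"  (8 + 4 + 0 + 1)
print(int("1101", 2))   # 13, going the other way with the built-in parser
```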

I'm afraid that is a very simplistic explanation, and really, if you want more knowledge and experience in this sort of stuff, you need to study computer science. I am by no means an expert on these things, but I was brought up using personal computers such as the Sinclair ZX80, which had to be programmed in binary code to perform very simple tasks such as drawing a mono-coloured circle on the screen. That bit of binary coding actually took up a large part of a page in the Sinclair magazine that attempted to teach us computer sprogs how to program. I never did learn how to do it and eventually just went down the road of building, using and repairing computers, letting others do the programming.

Wow, thanks everyone for all the informative replies. I'm currently attending a community college (looking to transfer to a university after I graduate) for IT and am generally curious about programming. I was initially intimidated by it because I'm not great with advanced math, but I'm really eager to learn all that I can about programming and syntax and all that good stuff. I just think it's cool.

No, software engineer with a focus on networking. After that class where we programmed the physical controllers (which needed the binary, and knowledge of how it was used to calculate subnets and such), we worked on making our own protocols in the upper-level classes using our controllers from the lower-level ones.

__________________
"as a fanboy i refuse to admit it and will pull countless things out of my butt to disprove it"

Might be worth pointing out that, while binary is not a programming language, it's not unheard of to program in binary.

It's unlikely that you ever will, though.

Generally, if you look at programming, it'll be from a point of view like this:
(when I say "next step down", I'm talking about getting closer to the silicon.)

Using a drag-and-drop type program (e.g. Media Builder), everything is basically done for you. It's like using ActionScript or JavaScript: you put an element down and you say, on click, play this sound.
You don't need to worry about how files are actually loaded, or how networking works, or anything too in-depth.

Down from this you'll get languages like Visual Basic. Again very drag-and-drop, and you don't need to learn how to draw a window on the screen, but you get a bit more power over what things can and can't do. You're also a bit more involved with creating things, so rather than having a file-selection box as a pre-built thing ready to use, you put in the code to make that.

Down from that you start getting to languages like Java. You need to tell the program where everything goes, and there aren't many graphical (drag-and-drop type) interfaces; you're writing in code. But even though this sounds like really hard work, the computer and compiler are still doing a ton of work for you: you don't need to worry about a lot of memory management or anything like that, and there are still plenty of "pre-built" things. The trouble is (and this might just be personal experience) that most things written in Java seem to be resource-hungry beasts. I guess it's cheaper to buy more memory than to figure out why your notepad application consumes 50MB of RAM. C++ gives you (the programmer) a bit more to worry about than Java, but by and large, lots of things are done for you.

Then you get things like C, where the onus is very much on you to allocate and de-allocate memory. You have to write much more code, but (generally speaking, in my experience) the results are slicker, e.g. using fewer resources. For all intents and purposes, unless you're working on OS kernels or embedded hardware, you may as well consider C a dead language; there aren't that many new programs written in it any more.

All these languages require compilers to move from a human-readable form to machine code.

After this there is another type of programming language.

Often when people talk about assembly languages, they write them using mnemonics: short names like MOV (move) for each instruction.

But (and here is the clever bit) when you read the programming manual for a chip, those instructions and register locations are also given in hex.

so instead of writing MOV A,B

you can refer to the manual and just directly write,
0x0A
0x01
0x02

or you could write that out in binary as

00001010
00000001
00000010
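That hex/binary equivalence is exact: each hex digit corresponds to exactly four bits, which is why chip manuals print opcodes in hex; it's just compact binary. A quick Python sketch using the same three bytes from above:

```python
# Print each byte from the example in both hex and binary to show they
# are the same value in two notations.
for byte in (0x0A, 0x01, 0x02):
    print(f"0x{byte:02X} = {byte:08b}")
# 0x0A = 00001010
# 0x01 = 00000001
# 0x02 = 00000010
```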

So the long and the short of it is:
Yes, you can program in binary. Not only is that possible, it used to be the ONLY way to program, where program instructions were literally entered with a bank of 8 switches and a GO button. You'd set the switches according to line 1 of the program, then press GO; then you'd set the switches to line 2 and press GO; then set the switches to line 3 and press GO.

This is still possible with some very small chips (though most chips now load their program over a serial bus, so parallel programming is all but dead!)

Not quite binary, but programming a Z80 in machine code (hex bytes like 0x0a, 0x01, 0x02) is, or at least was, on the A-level (16-18 year old) electronics syllabus in the UK. I was given a question paper and a programming manual, and you had to hand-write programs in hex under exam conditions.

Writing a program in hex to create a signal generator was part of my first-year electronics degree programme...

Even though it's highly unlikely that you'll ever need to program directly in hex or binary (though people still do for small embedded controllers), that doesn't mean it's not worth learning. It's quite useful to be able to count in binary (and hex!), so at least you understand where you need to use int, small int etc. Many programs (especially encryption-based ones) use binary operations to change data: stuff like bit shifting and performing XOR operations.
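Those bit operations are easy to play with in Python, since it accepts `0b...` binary literals directly. A small sketch of the shifting and XOR mentioned above:

```python
# Shifting left multiplies by powers of two, shifting right divides;
# XOR flips exactly the bits that are set in the mask.
x = 0b00001010                 # 10 in denary

assert x << 1 == 0b00010100    # shift left one place = x * 2 = 20
assert x >> 1 == 0b00000101    # shift right one place = x // 2 = 5

key = 0b11111111
assert x ^ key == 0b11110101   # XOR with all-ones inverts every bit
assert (x ^ key) ^ key == x    # XOR-ing twice restores the original
```

That last property (XOR twice gets you back where you started) is exactly why XOR turns up so often in encryption-style bit twiddling.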

So, to answer your question:
Is binary a programming language? No.
Do programmers use it? Yes.

Weirdly, even though binary is not a programming language, you can program in binary (where "program" means physically sitting at a box of switches entering code into a device!). It's not the same as programming in Java or C etc.

My advice: if you are hazy on the concepts of numbering in computer systems, try to get as much committed to memory as you can. That way you know the grass-roots type stuff and can devote more time to understanding the more advanced stuff when required.

__________________
I didn’t fight my way to the top of the food chain to be a vegetarian…
Im sick of people saying 'dont waste paper'. If trees wanted to live, they'd all carry guns.
"The inherent vice of capitalism is the unequal sharing of blessings; The inherent vice of socialism is the equal sharing of miseries."

Yes Root, I did some programming of an HP 3000 minicomputer in a college class back in 1976 or so, using only the front-panel switches. However, the front-panel switches represented octal, not binary, so we had to convert the instructions into octal in order to set the switches. Once the program was completely entered, you hit the "Run" button. The front-panel lights would flicker for a few seconds and then stop; if the lights then showed a result of 0 (a return code of zero), the program had run successfully, otherwise a non-zero return code meant your program had failed.
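Octal works for front-panel entry because each octal digit encodes exactly three bits, so switches grouped in threes let you set a binary word while thinking in octal. A quick Python sketch (the value 152 here is just an arbitrary example, not a real HP 3000 instruction):

```python
# One octal digit = three bits, so octal 152 maps directly onto nine
# switches: 001 101 010.
word = 0o152                  # octal literal, 106 in denary
print(f"{word:09b}")          # 001101010, three bits per octal digit
assert int("001101010", 2) == 0o152
```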

After that class, I couldn't understand why people were so awestruck by computers, since I thought it was way too much work to set the switches for each instruction and then run the program only to hope for a return code of zero. It was another 8 years before I used a PC with a keyboard for input and a monitor for output, where I could finally see the value of using a computer. In between, I took a programming class where we used punched cards for input, another method that I felt was way too complicated to be worthwhile.

It's still sort of programming in binary, insofar as you can "see" electrical connections being made. And yes, a hell of a journey to see a light blink, or to add two small numbers together that you could do in your head!

I guess the answer is still the same: you can do it, but ordinary people would choose not to!

HP3000 is a little before my time!!

__________________
I didn’t fight my way to the top of the food chain to be a vegetarian…
Im sick of people saying 'dont waste paper'. If trees wanted to live, they'd all carry guns.
"The inherent vice of capitalism is the unequal sharing of blessings; The inherent vice of socialism is the equal sharing of miseries."

One thing I haven't seen mentioned (which is actually where I've had to drop down to raw 1s and 0s the most) is reverse engineering of comms protocols (particularly serial protocols).

I've spent the best part of the last week in my day job reverse engineering an IR protocol used on laser-tag guns... It turned out to be a form of RS232 over IR, but with the start and stop bits backwards (don't get me started on the stupidity of that last bit!). That required an oscilloscope, a logic analyser and many hours of staring at highs and lows on the scope (and translating them into 1s and 0s accordingly) before we figured out what was going on.

Now that's figured out, we're down to analysing the protocol at a higher, packet-based level, and the same applies: you still have to work with the data at either a binary or a hex level to work out which bits change based on different parameters.
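A common trick for that kind of analysis is to capture two frames that differ in one known parameter and XOR them byte by byte; any non-zero result marks exactly the bits that parameter controls. A Python sketch (these frame bytes are made up for illustration, not from the actual laser-tag protocol):

```python
# XOR-diff two captured frames to find which bits a parameter changes.
frame_a = bytes([0x5A, 0x21, 0x00, 0x7F])   # e.g. captured with setting 1
frame_b = bytes([0x5A, 0x23, 0x00, 0x7F])   # e.g. captured with setting 3

for i, (a, b) in enumerate(zip(frame_a, frame_b)):
    diff = a ^ b
    if diff:
        print(f"byte {i}: {a:08b} ^ {b:08b} -> changed bits {diff:08b}")
# byte 1: 00100001 ^ 00100011 -> changed bits 00000010
```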

Before that I was doing a similar task on a circuit board designed to drive ultrasonic rangefinders - same story.

Before that I was doing the same thing on an atomic clock receiver with a UART (drivers were windows 3.1 only, not very useful today but the receiver itself works great!) Again, same story with dropping back to raw binary to work out the protocol.

__________________
Save the whales, feed the hungry, free the mallocs.