"In this tiny ebook I'm going to show you how to get started writing 6502 assembly language. [...] I think it's valuable to have an understanding of assembly language. Assembly language is the lowest level of abstraction in computers - the point at which the code is still readable. Assembly language translates directly to the bytes that are executed by your computer's processor. If you understand how it works, you've basically become a computer magician." More of this, please.

ASM was part of the first-semester classes when I was an undergraduate at technical college, along with binary. Then a few years later, at university, they didn't even seem to know it still exists. They think it's something that is behind us, and that to look forward they must teach more important things like design patterns and how to write layer upon layer of abstraction... What a shame.

You were lucky to have valves! Back in my day, computing was done using our heads. We had to work 28 hours a day, 8 days a week, supply our own pens and paper and hope that the foreman didn't hit us for writing with it!

You had self-inking pens? In my day we had to whittle our own quills from the neighbour's goose feathers.

We had an Apple ][+ at school back in the early 80s, and I wanted to get as much as I could out of it; the only way was machine code.

I remember using CALL -151 so many times (3D0G to get back to Applesoft).

I got an Apple 2c in '84 that had the 65C02 chip, which gave you such things as BRA, which I thought was the coolest thing ever (my friend's C64's 6510 was even nicer, but that's another story).

I wrote code to put the //c into double hi-res, then using the video interrupt, wrote a small sprite routine that put a mouse cursor (copied pixel perfect from a Mac 128k image I had from a Byte magazine at the time). I even wrote a faux menu system that copied the Mac look and feel, right down to the Chicago font.

And that niceness was in a much less expensive package, but that's anot... hm, no wait, something like this goes on also in present ;p

Times have changed.

Didn't change totally; similar coding style still goes on with many microcontrollers (tons of them around), including some DIY stuff - this AVR "console" for example: http://en.wikipedia.org/wiki/Uzebox

Me, one day... I will make an Atari 2600 game - 128 bytes of RAM and racing the beam, here I come!
(I'm not really in a hurry though - in fact, I think that doing it on a half-century mark would be even more curious, so still 15 years to go; notably, also a 6502 variant)

{ // begin rant
As a software engineer by trade, I have on a *few* occasions needed this. It's not a purely academic exercise. Sure, even as developers, you will rarely (and I mean virtually never) these days be employed as an assembler coder. That being said, it's still a valuable skill set to have. It's definitely crucial when you get stuck in those deep debugging sessions with gdb and need to understand "WTF is at this memory address? I didn't put that there!" Being able to understand at the low level what is really happening is valuable. Or if you're doing some low-level performance optimizations, a working knowledge of assembler is all but required to use the SSE intrinsic functions on x86_64 architectures effectively.

While it may almost never be your full-time job to write assembler code, having a moderate ability to do so is necessary for that "every once in a while" when you do need it.
} // end rant

Compilers are good at what they do, even great, if not astounding. But they are not magic. Every piece of software, even gcc, is written by a bunch of people just like us (speaking to fellow programmers here), ugly warts and all.

Perhaps ARM assembly is more appropriate. It is a contemporary architecture and it is used in 100-epsilon percent of all mobile devices.

A cheap board like the Raspberry Pi has IO pins that let you control stuff where timing is important and assembly could be appropriate.

Another possibility is the Arduino board, which uses a completely different architecture, the Atmel AVR ( http://en.wikipedia.org/wiki/Atmel_AVR ). There is no OS on top, only a small bootloader. And assembly is very appropriate to use here.

ARM assembly is much more complex than 6502.
Also, learning assembler on a Linux machine is a pain.
How much code do you have to write to get a single pixel on the screen?
Best is to have a small eval board with lots of LEDs and a Cortex-M3 on it, plus a debugger.
E.g. the Stellaris boards, which also often come with an LCD.
Also, the Raspberry Pi uses an ARM11, which is not widely used these days.

...then it's quite a jump to say "not widely used" - it still ships a ton (just not any more in top smartphones, for example). "Quite outdated" doesn't mean much if it's perfectly good enough (and much less expensive, given how ARM licenses older cores).

Well, I am a bit biased. When talking about 6502 and Raspberry Pi, I am thinking of hobby or embedded projects.
So far, besides the mobile phone market, I've really never seen or heard of projects using the ARM11.
But of course, my view to the embedded market is limited.

Mobile tech is typically counted as embedded ...overall, not very prominent - who knows really what ends up where, on large scale.

Sure, ARM11 is no longer dominant in ~top smartphones like it was for some time, but it didn't disappear; it continues to be used in many mass-market devices - many lower-end Android phones are still built around ARM11-based Qualcomm or Mediatek SoCs ...quite possibly still most Android phones (it's just that tons of cheap Chinese devices aren't really noticeable in the few most visible "premium" markets, yet they largely power worldwide Android adoption). Then add so-called "feature phones" and Nokia Symbian devices, even the most recent ones. It's not impossible that most smartphones, and/or mobile phones in general, still ship with an ARM11.
Which reminds me, there's more than the main CPU - radio modules typically have some older ARM, and will likely continue that trend.
Also, Nintendo 3DS seems to have ARM11.

And that's only the visible stuff. My quite recent wifi router has some old ARM core - not sure which, but the point is: older cores continue being attractive for most scenarios.

*sigh* this is only half true. The design of the 6502 was very sweet, much more efficient, etc., and can be seen to have generally influenced the design goals behind the ARM family. However, ARM is a RISC processor while the 6502 is pretty much CISC (though some argue this point). Nothing in the actual ARM architecture shows any real influence from the 6502. In fact, there's zero compatibility, and knowing 6502 assembler gives you no great advantage in learning ARM.

Maybe the confusion comes from this: the ARM based Archimedes range of computers (which is where ARM originates from) were designed to be the direct drop in replacement for the 6502 BBC range used in schools in the UK. The main selling point initially was that they came with a very similar BASIC (same capabilities, but with a lot bolted on top) and that they could run *some* BBC software using the included software emulator. Schools in the UK had bought in to Acorn big style, and the BBC micro is very much the British Apple 2 (being that most kids from the 80's started their computing in School on a BBC.) Many would argue that the Acorn range of computers ended up crippling the UK school system, as they'd bought in to a dud and the dominance of PC in the rest of the world was already in place. But that's another day's battle.

I guess the ARM add-on for BBC Micro also played a role in the confusion?

Schools in the UK had bought in to Acorn big style, and the BBC micro is very much the British Apple 2 (being that most kids from the 80's started their computing in School on a BBC.) Many would argue that the Acorn range of computers ended up crippling the UK school system, as they'd bought in to a dud and the dominance of PC in the rest of the world was already in place.

The BBC Micro came out before the IBM PC, right? And while one can easily argue that the PC's victory was already clear by the mid-80s ( http://arstechnica.com/features/2005/12/total-share/4/ and the next page, 5; though those stats are probably mostly for North America - the article doesn't even mention the Spectrum or the Micral), it hadn't yet happened.

In the meantime, UK had one of the more vigorous ~computer (also education) landscapes - rest of the world didn't really have a dominance of PC yet, it didn't have much of anything.

One can accuse the Amigas of pretty much the same misplaced hope against the onslaught of the PC (just look at the graphs), but in the meantime they served well. And the UK still has one of the most vigorous ~computer landscapes.

6502 was fun in its day. So was 68000. But as someone else already pointed out, compilers are damn good these days. Plenty of people get along just fine without any asm knowledge. Also, I would hope people understand how a computer works if they're going to program. I don't know any great (at least imo) programmers who don't.

6502 was fun in its day. So was 68000. But as someone else already pointed out, compilers are damn good these days.

... but can be out-performed on many machines by a decent assembler programmer.

Plenty of people get along just fine without any asm knowledge.

This might be true for database or banking software. But when it comes to embedded programming, it is always painful to work with programmers who don't know the machine they are programming.
Also, assembly programming teaches good Boolean algebra.

Also, I would hope people understand how a computer works if they're going to program. I don't know any great (at least imo) programmers who don't.

I have seen lots of code like this from guys who never coded a single line of assembly:

uart_format = _8BITS_PER_BYTE || ENABLE_PARITY;

I don't mean that every programmer needs to be a perfect assembly crack, but it's the same as with a car. Knowing how to drive it just isn't enough. You need to know where to fill in the fuel (or today: plug in the cable) and _what_ kind of fuel you need :-)

I have seen lots of code like this from guys who never coded a single line of assembly:

uart_format = _8BITS_PER_BYTE || ENABLE_PARITY;

I'm assuming that was supposed to be a bit-wise or instead of a logical boolean or? Or (no pun intended) was that the point - that those programmers that don't know assembly don't know the difference between the | and || operators?

I myself used to make that same mistake, and I learned C doing a tutorial that was heavily geared towards graphics and mixed in a lot of assembly (the Asphyxia set of tutorials, if anyone remembers that demo group).

That said, 6502 assembler is kinda cool. I learned it by trying to implement an NES emulator (turns out this is way harder than I thought it would be because of various quirks in the NES hardware), albeit that was a modified 6502. If I recall correctly, the D (binary coded decimal) flag didn't actually do anything. But I digress.

"6502 was fun in its day. So was 68000. But as someone else already pointed out, compilers are damn good these days.

... but can be out-performed on many machines by a decent assembler programmer. "

As CPUs became more orthogonal, I bet the advantage eroded pretty quickly. Compilers can just keep track of more information to make better decisions on optimizations, such as efficient register scheduling.

And it's poor software engineering to dive straight into assembly without finding the bottlenecks first. Even small micro-controllers like PIC are better programmed in C first with a directed migration to assembly based on performance requirements.

:-) I did not want to start the old ASM vs. C vs. C++ war again. But there are good reasons to write assembly code. And yes, a bad algorithm in assembly stays a bad algorithm :-)
And sometimes, you just can't speed up things anymore.

Yea, GCC often produces subpar code in my experience. Depending on how tight a loop needs to be, hand-crafted assembly can bring decent gains. Sometimes we can get away swapping in intrinsics, other times GCC just refuses to output good code.

ICC is supposed to be an excellent code optimiser though.

It's all relative though, computers have gotten so fast we're usually waiting on I/O anyway.

Yea, GCC often produces subpar code in my experience. Depending on how tight a loop needs to be, hand-crafted assembly can bring decent gains. Sometimes we can get away swapping in intrinsics, other times GCC just refuses to output good code.

It needs to be said that coding in asm does not have automatic benefits of any kind. The quality of asm depends on the knowledge and capability of the person programming it. I've seen plenty of terrible asm. There _can be_ benefits to asm, but it isn't a given and shouldn't be overstated.

It's all relative though, computers have gotten so fast we're usually waiting on I/O anyway.

Actually, I may be using poor semantics, but I consider memory limited processes (whether from CPU or GPU) to be "IO" bound. After all, those requests have to be serialised over the system bus almost like any other IO. Obviously most programmers treat "memory" as though it's different from IO. While memory IO clearly plays a specific role for the CPU, from a databus-oriented point of view it's not all that special.

Semantics aside, it can be difficult to make SMP systems scale well if memory & IO are the real bottlenecks. I'd consider a CPU-bound process one that makes minimal use of the system bus, including system memory. I find many multithreaded advocates proposing to subdivide problems among tiny light weight threads, but very often they shift a problem from being CPU-bound (which is good for SMP) into one that is memory-bound which offsets the benefits of parallelism.

Even caching is problematic in that x86 processors must implement very strict cache coherency models, which severely limits SMP scalability.

Anyway, those demos don't change anything about what ilovebeer said - nobody seriously targets 3-decade-old computers any more.
Plus, they are just that, demos - not actually useful applications or interactive games, which basically looked nowhere close.

The ROM images for the 8-bit components required to make VICE are copyrighted, and you install them at your own legal risk. Tulip Computers, in the Netherlands, appears to hold the copyright to them, and is being somewhat persnickety about who uses them.

Ah, so they're doing something, well, illegal (honestly, "and you install them at your own legal risk" is silly - it would likely never result in prosecution of individual users; OTOH they, VICE, choose to distribute those ROMs, and via SourceForge of all places) - hoping it will fly under the radar... I wonder how long that can last for by far the most popular C64 emulator - maybe Tulip isn't that persnickety after all.

PS. It seems Tulip is even non-existing by now... http://en.wikipedia.org/wiki/Tulip_Computers (historically ironic how Tulip just copied the BIOS of IBM PC) - and they got rid of Commodore stuff even earlier. Oh well, I guess the whole legal mess with (slices of) C= corpse might be not so bad.

It's an active forum with some 6502 grognards. One thread at the moment is about the progress of a guy tracing the circuits of the 6502 from a micrograph of the chip.

Other threads involve some folks who are working on the "65Org16" CPU, which is essentially a 6502 with everything stretched to 16 bits. They have running examples in FPGAs.

Others are working on 65816 boards.

Over Christmas break, I decided to write my own 6502 simulator and an assembler, and I've got Fig-Forth running on it, so the trials and tribulations of that are posted over there. I'm in the process of adding ANSI support to my simulator's terminal so I can make a screen editor for the Forth.

Simply, if you're interested in the 6502 (not necessarily the Apple/C64/Atari), it's a great place to hang out.

However, I must recommend the Altirra Hardware Reference manual, available from www.virtualdub.org/altirra.html, which is just chock full of really crazy low level stuff that the developers of this Atari 800+ emulator discovered. You learn a lot about the nuances of hardware design when you try to reverse engineer it.

I'd also be remiss not to mention visual6502.org, which is a transistor-level simulation of the chip. If you want to know what happens when the NMI pin fires during a BRK instruction, this is the place to go.

The 6502 still has a pretty active following; the chips are still in production, it's a good chip, and it has a cool legacy to it.