That was how I wrote my first published game back in the 80's. I have no complaints. Everything was new back then and even though the "wheel hadn't yet been invented", programming was still exciting and it was some of the most fun coding I have ever done.

I like to imagine every new programmer has that amazing sense of euphoria as they begin to uncover all the major algorithms for themselves, and begin developing a sense of just how much is possible with programming.

Then it's your job. To give the end-user some uninteresting but necessary layer of data connectivity.

As I've said numerous times: kids shouldn't be learning mathematics without the most powerful way to directly apply it: programming. Seriously, the #1 complaint when teaching a kid math more advanced than long division is "I'll never use this in the real world." Change that. Teach algebra via computer programs and kids could actually DO STUFF by applying their knowledge immediately. That's how I'm able to turn any kid flunking out of math into the head of the class.

Back then it was actually easier to read through large amounts of code, flipping between different sections, etc when it was on paper.

The listing wasn't used for paper-and-pencil emulation; we had quite nice integrated editors and debuggers to see what was going on (e.g. the LISA 6502 assembler). The listings were for reading and understanding, and were used somewhat like tablets today: you could take a listing anywhere, flop down on the couch, and start reading...

And how. I recall once refactoring some code written in a relatively low-level language. I printed it off, physically cut it up, and played with the pieces until I could see what I wanted to do. Seriously, more than once, giving code a physical form has helped me solve a complex problem.

It didn't involve pencil and paper for long on the Apple II. I remember reading about a step-trace 6502 debugger for the Apple II back then. I didn't have any money to buy it so I wrote my own (in assembler of course) to ease debugging of a video game I was writing. It wasn't a hard job; the 6502 instruction set is small and straightforward and the CPU only has three registers.

Reading 6502 assembly is easier than reading some of today's bloated and convoluted Java/Perl/FP/what-have-you code. It's not like the assembly of modern CPUs, with out-of-order execution, branch prediction, and all such complexities.

Also, from a technical perspective, publishing source for 6502 machine code wasn't that big a deal. You could recreate reasonable assembly source from the machine code by spending some time with a disassembler (unless the code does goofy things like writing over itself and such). In fact, the Apple II monitor had a nifty disassembler built in.


I'm sure there are a lot of us that remember "CALL -151"...:-)

I remember that. For whatever reason, 3D0G would get me out of it. I was just a kid and had no idea what to do with the gibberish the monitor would spit out at me. I just knew how to get out and back to my prompt.

That is true, but growing up on the Apple ][ I didn't have a good paint program, so I created my graphics by filling in squares on graph paper, making a list of the coordinates on lined paper, and then typing them in.

Audio was worse, because you had to translate the tones into frequencies, and (attempt to) account for the time of your algorithms when deciding on the note timing.

I wish that Apple, and other companies, would provide deep legacy support all the way back. Software from the Apple II should be able to run on Mac OS X and iOS; the computational power is there to do the necessary emulation.

Ooh, then I could dig up the old 5 1/4" diskettes (made double-sided with a hole punch) with my pirated (yes, I was 13 for a year) copy of Karateka on them, and take that for a spin again.

Actually I'm pretty sure MAME or one of its associated projects will emulate the ol' Apple II, and I think also the C64 and maybe even the TI-99/4A. So while Apple doesn't support it directly, you probably could get that, at least on your OS X machine. Now the Amiga was a sexy little box, but I haven't seen an emulator project for it.

Screw Amiga Forever. If you still have a physical Amiga then just dump the ROMs from it (or download them if you don't care about legality) and run it in WinUAE, which is actually the base emulator in Amiga Forever (except newer, as it's still in development).

Could be worse. I interviewed a lady for a SE position several months back that, when asked about whether she ever did any programming at home, fondly recalled programming on her C64 way back when, and how much she missed seeing that amber "C:\" prompt. The interview didn't go much further.

Meh... While I can see the value, this is exactly the problem that Windows is stuck in. Although they aren't completely backwards compatible, they try to be backwards compatible for a lot of stuff, which means they have to hold on to libraries which are poorly designed, and in some cases incorrect implementations because so much software depends on the incorrect implementation. MacOS is much cleaner because it has maintained less backwards compatibility. If you want to run old software, do it in a virtual machine, and allow the OS itself to evolve and drop the baggage of keeping the compatibility. Not to say that everything should be changed every OS iteration, but there needs to be a process for getting rid of the cruft.

Some sort of virtual machine is the correct way to do legacy support. In some cases full virtualization is the answer, in others, a thinner layer that looks like the old OS to the application and like a modern app to the outer OS might be more appropriate.

The MS approach of keeping the severely broken APIs around forever is NOT the answer.

One problem with virtualization (e.g. XP Mode) or paravirtualization (e.g. WOW64) is that it's likely to support only those applications that use peripherals supported by the operating system's bundled class drivers. It's far less likely to support applications that use a custom driver, such as an EPROM programmer.

There is no reason they couldn't let the VM handle the device transparently.

Other than that a lot of programs (ab)using the LPT port as a GPIO are fairly timing-sensitive. And other than that Microsoft wants to control who has the right to market Windows-compatible hardware through the Windows Logo program, and it's likely to make VM I/O passthrough difficult for this reason, especially for a freely licensed VM such as VirtualBox.

In theory, sure. In practice, when you want applications to talk to each other and share data, virtualization doesn't really work very well. Also, Microsoft don't keep around broken APIs *forever*. A long time, yes (2-3 business upgrade cycles, so say 10-15 years, sometimes more, sometimes less) - but not forever.

That's pretty much how Linux does it as well, for libraries that do backwards compatibility at all. You provide a file that tells the ELF linker which version of an exposed API symbol links to which internal implementation. The linker embeds the library version linked against into the executable and voila, your program can run against a newer version of the library with no expensive, bloated VM infrastructure required.
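The "file" in question is a linker version script. A hypothetical one for a shared library exporting a symbol `sum` in two versions might look like this (all names invented for illustration); it's passed to the link with `gcc -shared -Wl,--version-script=demo.map`:

```
# demo.map -- hypothetical version script for libdemo.so
LIBDEMO_1.0 {
    global:
        sum;       # the original exported symbol
    local:
        *;         # hide everything else
};

LIBDEMO_2.0 {
    global:
        sum;       # a newer, incompatible implementation of the same name
} LIBDEMO_1.0;     # new node inherits from the old one
```

Inside the library's C source, `.symver` assembler directives bind two differently named internal functions to `sum@LIBDEMO_1.0` and `sum@@LIBDEMO_2.0`. Old executables keep resolving the 1.0 symbol; newly linked ones get 2.0, which is exactly the "keep the old behavior without a VM" trick the comment describes.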

Meh... While I can see the value, this is exactly the problem that Windows is stuck in. Although they aren't completely backwards compatible, they try to be backwards compatible for a lot of stuff, which means they have to hold on to libraries which are poorly designed, and in some cases incorrect implementations because so much software depends on the incorrect implementation. MacOS is much cleaner because it has maintained less backwards compatibility. If you want to run old software, do it in a virtual machine, and allow the OS itself to evolve and drop the baggage of keeping the compatibility. Not to say that everything should be changed every OS iteration, but there needs to be a process for getting rid of the cruft.

No, what happens is that Windows has to work around everyone else's bugs. A lot of nasty developers don't do things the proper way, and Windows suffers. It's why "C:\Documents and Settings" still exists on Windows Vista/7/8: so many developers hard-code that string (including the "C:\" part!) that not having that hard link would break their programs.

Apple decided to take the other method - basically dictating that if you do not use just the published APIs, your programs will probably break. Yes, you can use private APIs. But as per the warning, Apple has full right to change the private APIs as they see fit.

Which is better? There's no consensus. Microsoft's approach means your programs keep working, crappy coding and all, but you have to live with the fact that you still have a window named "Program Manager", and that if you use a localized version of Windows, you'll eventually have a "Program Files" folder show up (yes, it's normally localized) because some program hard-coded it, etc.

Apple's approach means a leaner system because all these hacks don't need to exist: private APIs are not fixed in stone but can change and be updated as time goes on and deleted when necessary, rather than having to hang around because some app uses them.

Kind of like how every version of Windows was an unstable POS until they dropped all of the DOS underpinnings from it? Or should companies that have realized their products have outlived their utility be forced to continue supporting them with their current products even decades later?

Back in the day, the source code for Atari DOS was included in a published book that explained exactly how it worked. That's one of the things that was great about that platform--so much information was readily available.

It was all written in 6502 assembly. Anyone that cared would disassemble it themselves, so it's not like there were any big proprietary secrets to protect. I'm surprised that this wasn't published 30 years ago.

At 35, I feel like I'm just on the cusp: I remember when many, if not most, of the consumer electronics my parents bought when I was a kid came with schematics. I mean, my father was no electrical engineer; he was one of those guys who knew just enough to avoid the capacitors in the back of the TV, to identify fuses, and to resolder a bad connection... but not enough to analyze logic or signals and really fix a non-trivially broken TV.

IIRC, the change occurred in the mid to late 90's, as software and hardware got complex enough that a lot of it started being subcontracted, and storage got large enough that you could store the entire set of plans digitally, making both the plans and the documentation much more mobile. However, the shift really began in the mid 80's, when the increasingly complex manuals started being "available" instead of provided by default.

Some examples include the Apple IIGS being the first Apple-based PC (as opposed

Back in the day, the source code for Atari DOS was included in a published book that explained exactly how it worked. That's one of the things that was great about that platform--so much information was readily available.

Yes, but possibly in spite of, rather then because of, Atari themselves. According to the book "Hackers" by Steven Levy, the Atari 800 was treated as a closed platform in the early days, and Atari wouldn't divulge documentation on its inner workings;

Transferring his new assembly-language skills to the Atari was difficult. The Atari was a "closed machine". This meant that Atari sequestered the information concerning the specific results you got by using microprocessor assembly-language commands. It was as if Atari did not want you to be able to write on it. It was the antithesis of the Hacker Ethic. John would write Atari's people and even call them on the telephone with questions; the voices on the phone would be cold, bearing no help. John figured Atari was acting that way to suppress any competition to its own software division. This was not a good reason at all to close your machine. (Say what you would about Apple, the machine was "open", its secrets available to all and sundry). So John was left to ponder the Atari's mysteries, wondering why Atari technicians told him that the 800 gave you only four colors in the graphics mode, while on the software they released for it, games like "basketball" and "Super Breakout", there were clearly more than eight colors.

Of course, it's true that all this stuff was *later* very well-documented, but how much Atari helped in that is open to question (*). It's certainly well-known that Atari were assholes in general in their late-70s/early-80s heyday, and they definitely tried to suppress third-party development of VCS games. So though I've heard enough people disputing aspects of "Hackers" not to take it as gospel, it does seem to tie in with what I've heard about Atari at the time.

The Atari DOS [atariarchives.org] book doesn't appear to have been published by Atari themselves, and whether it was with their blessing, I don't know. "Mapping the Atari" wasn't an official publication either.

While Atari released documentation, I suspect it was at the level *they* wanted people to be using the machine at. And for all their plus points, the 400 and 800 were clearly intended as more closed, consumer-oriented machines. The 800 did have some good expansion capabilities, but this was clearly meant to be done via its official ports and interfaces designed for that use. The lower-end version, the Atari 400 had far less official expansion capability, e.g. it was never originally designed to support RAM expansion- it was possible, but apparently required far less friendly hardware modifications and installation directly onto the motherboard.

The 1200XL was notoriously even more closed (and flopped massively). FWIW, the BASIC "manual" that came with my 800XL was a paltry pamphlet, and the official DOS 3 manual was nicely-presented, but certainly not deep.

Of course, it all worked out in the end, but I guess what I'm saying is: let's not romanticise the original intentions of companies like Atari back then, who'd have been happy to sit on those secrets and not release them to their users (whom they viewed as potential competition).

(*) Those early days (1979 onwards) were before my time- I got my 800XL in 1986, so I can't speak from personal experience.

It was published by Compute! Books, not Atari. DOS was written under contract for Atari by a third party, so I'm not sure how much Atari had to do with it. But importantly, it was the original code with comments and extended explanations.

De Re Atari was published by Atari in 1981, and was probably the most detailed programming manual. It didn't have listings for the OS, but it provided enough detail that you wouldn't need them to understand it.

The big difference now, besides the general complexity, is that software is written in higher-level languages and compiled. Sure, you can disassemble it and try to make sense of it, but good luck with reverse compilation to get something resembling the original code. And making sense of compiler-generated assembly is a pain. (Yes, I do it often when debugging systems code, but when a problem is easy to recreate, I recompile the offending module without optimization for easy debugging.)

Seriously, this is cool and all, but, why wasn't this done over a decade ago? In fact, Apple should have done it themselves *before* ending the manufacturing of the Apple//, to inspire people to find new ways to hack this machine and utilize it in ways never intended by Woz.

It's sad that one of the best hacking platforms out there is the Raspberry Pi, and not the much simpler to figure out Apple// -- although to be fair, people are doing amazing things with the Pi, I just wish there was a popular 8-bit machine out there for the young'ns to get them started.

Let me try to rephrase: What some modern machines lack is the ability to become a specialist yourself, should you desire so, without beaucoup bucks of up-front costs and recurring certificate renewal costs.

Apple should have done it themselves *before* ending the manufacturing of the Apple//

While I personally find Apple fans' stylisation of the Apple II and III names in ordinary text (*) somewhat cutesy, self-conscious and contrived, I'd say that if you *are* going to do it, surely it's meant to be rendered as "Apple ][" ? I thought that the slanty "/" belonged on the Apple III, sorry.... Apple///:-)

Actually, having checked, apparently the Apple IIe used the slanty lettering, so I guess you're allowed to use it there, but *only* for the Apple//e, not the original Apple ][.

Seriously? 6502's and punched-cards together? What a wretched anachronism.

FTA: “DOS was written on punch cards. I would actually hand-write the code on 80-column punch card sheets. A guy at Shepardson named Mike Peters would take those sheets and punch the cards. The punch cards would then be read into a National Semiconductor IMP-16 and assembled, and a paper tape produced. The paper tape was read into the Apple II by a plug-in card made by Wozniak, and I would proceed to debug it. As the project got further along and the code was all written, and it was debugging and updating, I would mark up a listing and give it to Mike Peters who would then change whatever was necessary and deliver me a paper tape and I’d start again.”

There was Beneath Apple DOS [apple2history.org], a fabulous book from the time which was invaluable for figuring out what was going on. My understanding was that Don Worth and Peter Lechner disassembled the shipped code and sorted out how things worked, with great explanations. It was a great guide and helpful for writing all kinds of software. I suspect that a similar effort these days would not be resolved without legal intervention; I have no idea if they even asked permission, or if it would have occurred to people to ask.

There was Beneath Apple DOS, a fabulous book from the time which was invaluable for figuring out what was going on. My understanding was that Don Worth and Peter Lechner disassembled the shipped code and sorted out how things worked, with great explanations.

One thing that made their task easier was a program supplied with Apple DOS called FID (File Developer). That program hooked into a mid-level part of DOS called the File Manager. FID spent a lot of time populating a data structure called the "File Manager Parameter List" and then calling various lower-level routines. Worth and Lechner, however, did a wonderful job of explaining Apple DOS at all levels, from how the disk hardware works all the way up to the command processor.
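The parameter-list style of call is easy to illustrate: the caller fills in one block of memory, then jumps to a single entry point that dispatches on a command code and writes a status back into the block. The field names and command codes below are invented for illustration; they are NOT the real File Manager Parameter List layout.

```c
#include <stdint.h>

/* Hypothetical command codes -- not Apple DOS's actual values. */
enum { CMD_OPEN = 1, CMD_READ, CMD_CLOSE };

/* A caller fills this block, calls the single entry point, and reads
   back the status field -- the pattern the comment describes. */
typedef struct {
    uint8_t command;    /* which operation to perform */
    char    name[30];   /* file name (space-padded on the real system) */
    uint8_t status;     /* result code written back by the "OS" */
} ParamList;

/* Single entry point, like JSRing into the File Manager. */
void file_manager(ParamList *pl) {
    switch (pl->command) {
    case CMD_OPEN:
    case CMD_READ:
    case CMD_CLOSE: pl->status = 0; break;  /* pretend success */
    default:        pl->status = 1; break;  /* "invalid command" */
    }
}
```

The nice property, and why FID made a good Rosetta stone for the book's authors, is that every operation flows through one well-defined structure, so watching that block tells you most of what the layer above is doing.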


My memory was that the scrolling Terminator listings were assembly source code from Nibble magazine. I'm not sure which particular program, but it was a very recognizable format even when it just flashed on the screen briefly. I think there was some checksum code that came with the printed Nibble magazine that you could use to make sure you'd typed things in correctly. So I was probably one of the few people in the theatre who was amused that just as the Terminator robot was about to hunt and kill something (or whatever it was), he appeared to be doing a quick check to make sure that the "Hunt and Kill Something" code that had been typed in from the magazine was typed correctly.

The internet is good at these types of things: here [pagetable.com] is a site with screenshots from the Terminator movie, and indeed it was Nibble magazine source code, and the checksum program was KeyPerfect. At a quick look, the source appears to be for some kind of disk utility, perhaps a RAMdisk or something. The code seems to be named OVLY (overlay?), and I recognize VTOC as the Volume Table of Contents of a disk.

It's really not that bad, although 6502 assembly was (for me, anyway) more challenging than 68K because the 6502 didn't have many registers to work with. You just have to decompose the problem down another level, and as long as you're paying reasonable attention to detail, it's really not that much worse than writing something in C.

Some might say that zero page was 256 registers. I don't see it that way, since you couldn't operate on these "registers" with a single instruction the way you could with the real ones. But the instructions that accessed zero page did use one fewer byte and one fewer cycle than their absolute-addressed equivalents.

And I'm one of them. Seriously. I learned 6502 assembly on the Apple II, and most of the skills transferred to programming the Nintendo Entertainment System. In addition to my hobby coding for NES [pineight.com], I now have my name in a commercially published NES cartridge [infiniteneslives.com], where I wrote the menu and three of the games.

Lately it has become common for companies that own copyright in decades-old video games to rerelease the games in an emulator that runs on a modern platform. If a video game for Apple II requires Apple DOS, the game's copyright owner has two options. It can license Apple DOS in order to distribute it as part of the game's disk image bundled with the emulator. Or it can change the emulator to use high-level emulation for the BASIC integration, file system, and RWTS (block device driver) that make up Apple DOS.
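The second option, high-level emulation, can be sketched briefly: instead of executing the OS's own code, the emulator traps when the emulated program counter hits a known OS entry point and services the request natively. Everything below is hypothetical — the trap address, the request layout, and the 35-track/16-sector geometry are stand-ins for illustration, not Apple DOS's real vectors or structures.

```c
#include <stdint.h>
#include <string.h>

#define RWTS_ENTRY 0xB7B5   /* hypothetical trapped entry address */

/* Hypothetical request block the emulated program would have filled in. */
typedef struct { int track, sector, write; uint8_t *buf; } DiskReq;

/* Host-side disk image: 35 tracks x 16 sectors x 256 bytes. */
static uint8_t disk_image[35][16][256];

/* Called by the CPU core when the emulated PC hits a trapped address.
   Returns 1 if handled at host level, 0 to let the real code run. */
int hle_trap(uint16_t pc, DiskReq *req) {
    if (pc != RWTS_ENTRY) return 0;
    uint8_t *sec = disk_image[req->track][req->sector];
    if (req->write) memcpy(sec, req->buf, 256);
    else            memcpy(req->buf, sec, 256);
    return 1;   /* skip the OS routine entirely */
}
```

The appeal for a rights holder is exactly what the comment says: the game's own code runs unmodified, but none of Apple DOS's copyrighted bytes need to ship in the disk image.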

Copyright should end after 10 years max. Whatever paltry profits Apple may stand to gain from hoarding things like this pale in comparison to the history lost if such things are destroyed before they're ever released to the public.

Have you considered the possibility that Apple simply wouldn't release the source code at all, if there were no copyright protection?

To keep companies from "hoarding," as you put it, would require a sort of negative copyright, where they are forced to escrow their source code for public release at the end of the copyright term (which would also need to be reduced). This is an interesting idea; if you want copyright protection, you have to vouch that you will release what is being protected at the end of the term. Sounds fair to me. If you don't like it, you'll have to rely on trade secrecy instead.

Have you considered the possibility that Apple simply wouldn't release the source code at all, if there were no copyright protection?

To keep companies from "hoarding," as you put it, would require a sort of negative copyright, where they are forced to escrow their source code for public release

I don't see why escrow would be so critical when someone with more time than money could just disassemble, document, and distribute a program. This already happens underground [romhacking.net]. Absent copyright enforcement, there would simply be no formal negative consequences for doing so.

If you don't like it, you'll have to rely on trade secrecy instead.

You'd have to keep the binary secret too in order to thwart a disassembly attack. Good luck keeping a binary secret from end users who run it on hardware that they own. The sort of DRM seen in, say, game consoles has always failed within a few years.

With productivity and efficiency supposedly increasing, the rate of innovation supposedly increasing, the costs of distribution going down, and the reach of distribution increasing, shouldn't changes to copyright and patent terms make them shorter instead of longer?

Copyright terms that last more than a century prove something is wrong.

Copyright should expire with the author, or earlier if the author so decides. Copyrights are different to patents: there's nothing stopping you re-implementing an idea that is under copyright.

What gives you or anyone else the right to force me to do anything with my own code?

Copyright should end after 10 years max. Whatever paltry profits Apple may stand to gain from hoarding things like this pale in comparison to the history lost if such things are destroyed before they're ever released to the public.

Whether copyrights should or should not last more than 10 years is an interesting question, but chances are they will always be much longer than that, especially as life spans continue to increase. Meanwhile, for something more unfailing than Moore's law, check out the "how old is Mickey" [techdirt.com] copyright curve.

Say you wanted to reproduce OS X or iOS by starting from Darwin and GNUstep. We have Darwin source code, including the XNU kernel, but we have no guarantee that the binaries were compiled from the Darwin source code.