Posted by Soulskill on Friday May 18, 2012 @04:52PM
from the no-you-don't-eat-it-listen-to-me dept.

CowboyRobot writes "Opening with the line, 'To me, a personal computer should be small, reliable, convenient to use and inexpensive,' Steve Wozniak gave his system description of the Apple-II in the May 1977 issue of BYTE. It's instructive to read what was worth bragging about back then (PDF), such as integral graphics: 'A key part of the Apple-II design is an integral video display generator which directly accesses the system's programmable memory. Screen formatting and cursor controls are realized in my design in the form of about 200 bytes of read only memory.' And it shows what the limitations were in those days, 'While writing Apple BASIC, I ran into the problem of manipulating the 16 bit pointer data and its arithmetic in an 8 bit machine. My solution to this problem of handling 16 bit data, notably pointers, with an 8 bit microprocessor was to implement a nonexistent 16 bit processor in software, interpreter fashion.'"
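That software 16-bit processor was SWEET16. The core trick, synthesizing 16-bit operations out of 8-bit arithmetic with explicit carry handling, can be sketched in Python. The opcodes and encoding below are invented for illustration and do NOT match SWEET16's real instruction set:

```python
# Toy illustration of the SWEET16 idea: emulating a 16-bit register
# machine on a host that only has 8-bit arithmetic. The opcodes and
# encoding here are invented; they do NOT match the real SWEET16.

def add16(lo_a, hi_a, lo_b, hi_b):
    """16-bit add done byte-at-a-time with carry, as a 6502 would."""
    lo = lo_a + lo_b
    carry = lo >> 8
    hi = (hi_a + hi_b + carry) & 0xFF
    return lo & 0xFF, hi

class Tiny16:
    def __init__(self):
        # Sixteen 16-bit registers, each held as a (lo, hi) byte pair.
        self.reg = [(0, 0)] * 16

    def run(self, program):
        for op, *args in program:
            if op == "SET":        # SET Rn, value -- load 16-bit constant
                n, val = args
                self.reg[n] = (val & 0xFF, (val >> 8) & 0xFF)
            elif op == "ADD":      # ADD Rd, Rs -- Rd += Rs
                d, s = args
                self.reg[d] = add16(*self.reg[d], *self.reg[s])

    def value(self, n):
        lo, hi = self.reg[n]
        return lo | (hi << 8)

vm = Tiny16()
vm.run([("SET", 0, 0x12F0), ("SET", 1, 0x0020), ("ADD", 0, 1)])
print(hex(vm.value(0)))  # 0x1310 -- a 16-bit add, done 8 bits at a time
```

The pointer-heavy parts of Apple BASIC ran as bytecode on an interpreter like this, trading speed for much smaller and simpler code.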

Well spotted. I recently learned how the paddle interface worked when reverse-engineering an old Apple II game. Even though I cut my teeth on an Apple II, I never knew how the circuit actually worked. When I saw the 6502 paddle code in the game it made no sense to me until I examined the Apple II's schematics. Then my mind was slightly blown. Just another one of those brilliantly simple hacks that riddle the Apple II's design and make it an almost magical device to me.

Your mind is easily blown. How is a single-slope A/D that's been standard practice since before Woz was even born a "brilliantly simple hack"? Jesus, I've got computer and electronics books from 1962, yellow and brittle, that describe these circuits. Woz has a bit of an overinflated reputation IMO. Every single hardware engineer of the era worked the same way. Yes, even at Atari and Commodore.

The 555 stuff isn't really that amazing, but Woz did some fairly amazing things. For example...

Integrating the DRAM refresh with the video display on the original Apple ][ was pretty clever, as was the 1/2-phase pixel shift to get cheap color without fancy sub-carrier modulation.

The original Apple ][ floppy drive subsystem used "raw" drive mechanisms from Shugart and implemented the controller in 5 chips and some software (soft sectoring avoiding the punch-hole detector, no track-0 detector, no head-load solenoid, a 5/3 software group-coder allowing 13-16 sectors/track instead of 10 when others were using MFM, etc.). This when other vendors at the time had quite inferior, yet more expensive, floppy disk drives.
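The group-coding trick deserves a sketch. The drive electronics could only reliably read bytes with the high bit set and without long runs of zeros, so raw data is chopped into small groups and each group mapped through a table of "safe" disk bytes. This is a hedged illustration: the real 5-and-3 table and its exact bit constraints are more involved than the simple rule assumed here.

```python
# Hedged sketch of the group-code idea; the real 5-and-3 table and
# its exact bit rules differ from the constraint assumed here.
def valid_disk_byte(b):
    """Assumed constraint: high bit set, no two consecutive zero bits."""
    return bool(b & 0x80) and "00" not in format(b, "08b")

safe = [b for b in range(256) if valid_disk_byte(b)]
assert len(safe) >= 32          # enough safe codes to carry 5 bits each

encode = {v: safe[v] for v in range(32)}          # 5 data bits -> disk byte
decode = {disk: v for v, disk in encode.items()}  # and back on read

group = 0b10110                  # one 5-bit group of user data
on_disk = encode[group]          # the byte actually written to the floppy
assert decode[on_disk] == group  # recovered losslessly on read
print(f"{group:05b} -> {on_disk:08b}")
```

Packing 5 data bits into each 8-bit disk byte is what bought the extra sectors per track over the 10 that contemporary hard-sectored schemes managed.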

Sure it isn't rocket science, but it is still good engineering wizardry, not just "plugging resistors".

Woz came up with so many improvements over prior art: cutting-edge stuff. As you noted: video-retrace DRAM refresh, the well-designed interpreted SWEET16, a very efficient BASIC, group encoding for the floppy disk, color shift without subcarriers.

Not rocket science? That is VERY conservative science. Nobody else was innovating at Woz's pace. There was little, if any, prior art: microprocessors were just too small for "serious study" in most institutions. Woz was, and has always been, the king of the tech in

??? Mr/Ms AC, I didn't change any examples, that was my first post. Perhaps you are referring to another poster?

I can't speak for anyone else, but my first statement of my post was "The 555 stuff isn't really that amazing" and I finished with "Sure it isn't rocket science..."

Do you have some issue with these statements?

Or are you (Mr/Ms AC) just so filled with Woz hate that you have to attack everyone that says anything even remotely positive about Mr Woz with a hair-trigger post? Are you pissed that he wasn't eliminated before your favorite Dancing with the Stars celebrity? Fan of Holly Madison, or a Go-Go's fan maybe? Is that why you are posting AC? ;^);^)

Of course Mr Woz isn't god (despite what some OTHER posters may have gushed about), but he seems to have been a damn good engineer. However, sometimes the best role models are not the ones so far beyond us that we can never aspire to be them (scientists or researchers who create a new paradigm), but, for some of us lowly engineers, someone we hope we can hold a candle to on a good day: more relatable, a bar we might actually reach some day if the stars align...

Is it literally too hard for you to let people have their own heroes instead of foisting yours upon others? Something to think about, Mr/Ms AC...

But to answer your question (if it was directed to me and not the other poster), what Woz did with the 55x timer is very vanilla and probably could have been copied out of a Fairchild or National app-note, but what Woz did with the disc controller was pretty much wizardry. He single-handedly designed an amazingly cheap floppy disc controller (5 chips where others used 40) that was also more advanced in storage capacity and access speed than any other in the industry at the time.

Doing so allowed Apple to sell a disc drive for under $500 with a BOM of $150 (eventually reduced to $80), enabling Apple to practically mint money with this product. In several interviews, Mr Jobs and other Apple and (some disbelieving) Shugart contemporaries credit this floppy disc controller design by Woz as the major growth driver at Apple, probably more important than the computer itself in launching the Apple IPO. Basically, Woz didn't have any background in floppy disc controller theory; he read some data sheets, figured it out, and beat the best in the industry at the time. He also laid out the controller circuit board to minimize the feedthroughs, improving reliability and manufacturability: a soup-to-nuts holistic designer. That's engineering wizardry (to me anyhow, as a lowly engineer)... something I might aspire to someday... But even the best designer needs to crank out a 55x-esque circuit sometimes. I'm sure all your heroes had a few pedestrian accomplishments along the way too.

Excellent. I hope I was able to inform you and generate some interest in the history of technology. You want your mind blown? Try the history of the proximity fuze, invented in WWII. You know, back when all you had was vacuum tubes. They managed to cram an entire Doppler radar into the nose cone of artillery shells that had to survive 20,000 G acceleration and 100,000 RPM rotation when fired. Not only that, but be safe to handle and store, and come up to power within milliseconds after being fired. You'd find that hard to

I was a hardware engineer back in the day, and yes, that is how engineers worked when facing these problems. Yes, I did build floppy controllers which used comparable hacks, and so did others.

I am not saying Woz was not a good engineer, I am saying that he was not the only good engineer, and he was doing what good engineers do. In those days, you could not get a patent on bending a piece of wire, or some other triviality.

Here's the very strange thing about that error. I have a scan of that issue of Byte, and it does indeed say 553 there. The article also has a circuit diagram, again showing a 553. If you look at the original Redbook schematics, they also show a 553 quad timer. There is even an advert for 553 quad timers on page 174 of that issue of Byte. I've also seen a post online from someone with a 553 chip in an apparent timer circuit asking about its identity. All that, and yet no datasheet or cross-reference for a 553 quad timer can be found. My best guess is that '553' comes from an imprinting error on actual 558 chips.

Note that the 558 is not retriggerable. This led to bugs in joystick (or game paddle) reading in many Apple II programs. If you trigger the 558 to read input 0, then want to read input 1, do NOT just trigger it again. Make sure that input 1 has timed out first, before triggering it to be read. Otherwise you'll read an incorrect value. Or trigger once, then read all inputs you're interested in at the same time.
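The read loop and the non-retriggerable pitfall can be modeled in a few lines. The tick counts here are invented round numbers, not real Apple II cycle timings, and the class is a simplified stand-in for the 558 monostable:

```python
# Simulation of the Apple II paddle read (single-slope conversion):
# triggering starts the timer, then software counts how long the
# timer's output stays high. Tick counts are invented round numbers,
# not real Apple II cycle timings.

class PaddleTimer:
    def __init__(self, resistance):
        self.resistance = resistance  # paddle position sets the RC delay
        self.remaining = 0            # ticks until this timer expires

    def trigger(self):
        # A non-retriggerable 558: triggering while still timing out
        # does nothing, which is exactly the bug described above.
        if self.remaining == 0:
            self.remaining = self.resistance

    def read_and_tick(self):
        high = self.remaining > 0
        if high:
            self.remaining -= 1
        return high

def read_paddle(timer):
    timer.trigger()
    count = 0
    while timer.read_and_tick():
        count += 1
    return count

pdl = PaddleTimer(resistance=100)
print(read_paddle(pdl))    # 100 -- full count after a clean trigger

pdl.trigger()              # timer had expired, so this trigger takes
for _ in range(60):        # ...but we go off and do other work first
    pdl.read_and_tick()
print(read_paddle(pdl))    # 40 -- the re-trigger was ignored, short read
```

The second reading comes back short because the re-trigger inside `read_paddle` was silently ignored; waiting for the timeout first (or reading all paddles off a single trigger) avoids it.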

As much as I think of Woz as one of the all-time outstanding engineering types and a hacker extraordinaire (in the good sense), making him head of Apple would be one of the worst things that could happen to it. He had/has no real business sense or skill whatsoever, something he himself has admitted. The man's approach is that he hacks; he doesn't design.

But to keep it simple and give a nice hypothetical: if Woz had been in charge and Apple managed to live that long under his leadership, the iPod woul

Having used some, I must admit it is good-quality hardware. Just not such good quality that it's worth twice the cost of a similar-spec PC. I've only got a macbook because they were the only manufacturer left that use a decently high-resolution screen. A few others used to (Dell, I recall) but they no longer offer WUXGA+, and I wanted the vertical pixels.

Scrolling down the list I come across QSXGA, FWVGA, WHUXGA..... I mean, WHUXGA sounds like a province in China or something. There's probably a systematic set of rules, but it doesn't make the names any more helpful-looking for non-autistic people:-/

I was in high school working in a retail computer store in 1978 when the Apple ][ and its competitors were taking hold in the market. The Apple was the only computer with high-resolution color graphics for under $5000. I could tell just by looking at its motherboard that its design was something special - having built a video display board from scratch with my brother, I knew how much circuitry is usually required.

The right question to ask would be how expensive were other *computers* when the Apple II was announced? The II came out in 1977, when the only real options on the nascent PC market were the II, the Commodore PET, and the TRS-80. (I won't start any religious wars here over the relative price/value ratios of each) I think Woz's point wasn't that the II needed to be the cheapest product in this category (the II was the most expensive of the three), but that this type of product needed to be in a range that

It was a long time ago that's for sure. I was a teenager. I remember we first got an Apple IIe. Then we went IBM, and have stayed that way ever since. You're probably right on the dates though. Still, back then the PC you wanted always cost around $5000, regardless of brand.

At the time the Apple II was released, there were only two other non-kit microcomputer systems available: the Radio Shack TRS-80 and the Commodore PET. Both were well under $1000, while the Apple II was about twice as much for equivalent memory. Of course, the Apple II could do a lot more than the other two systems, especially in regards to graphics. However, as the technology improved and competitors offered more powerful systems at lower prices, Apple never reduced their prices. At the peak of the microcomputer golden age, an Apple II system cost nearly 10 times as much as an equivalent Commodore 64 system.

When Apple released their floppy disk drive, they priced it at $550. People asked why they priced it so high. Apple responded, "Because we can."

Apple pricing at the dawn of the PC era (Fall 1981): fresh out of graduate school, the university that hired me offered me $10K in start-up funds for my research lab. I knew I wanted a microcomputer system, but didn't know if the newly-introduced IBM PC was going to be anywhere nearly as well-supported as the Apple ][. So I took that $10K and bought an Apple ][+ with 64K RAM, a Z80 card, CP/M, 3 floppy drives, a monochrome (green) monitor, a color monitor, an Epson MX80 dot matrix printer, a Diablo daisy-wheel printer, Apple Pascal, Microsoft Fortran, and WordStar. I think there were even a few dollars left over. The next spring, I decided I wanted a system for myself, so I spent $2200 on a Basis 108 (a German-made Apple ][ clone with a built-in Z80 card and a monstrously heavy case) with 2 floppy drives and a monochrome monitor.

"How expensive were other personal computers when the Apple ][ was released?"

What other personal computers? There really weren't any.

There were hobby kits like S-100 systems, and for $2500 more you could get something like color graphics, 256x256x8 or something, and a command-line-based CP/M. Some guys had LSI PDPs that had been cast off, with RT-11 or something. There were other kits and products, but nothing else really looked and felt like a consumer product; that's sorta the point.

Note that prices came down over time, especially due to decreases in RAM prices.

So, I'd say that there was something of an "Apple tax" even back then, but it wasn't really so much. When you considered how much expansion capability you got with the basic unit (which for other systems was either an add-on or simply not possible), it was actually a good deal.

Do Apple critics still use that old canard? For the most part, Apple devices have been pretty price competitive for many years now, even the Macs. I remember when the iPad was rumored to cost $1000. It's like trolls ran out of every other schtick, so they remain stuck on the most recent one they had, which was price.

Dude, what planet are you on? Let me guess, one with a bite taken out of it?

Depends how you compare. If you're trying to compare, say, a MacBook Pro with a netbook, then yeah, Macs are more expensive. Or even a MacBook Air against a netbook, ignoring that an Atom is in no way competitive with a Core2Duo, never mind the i5, the SSD, memory, etc.

OTOH, if you try to compare like with like (as much as possible), they're quite competitive. The usual explanation for deviations is the use of cheaper, bigger, heavier laptops in place of svelte ones (e.g., comparing a MBP against some much heavier, much larger Dell model instead of Dell's more expensive, smaller, more portable ones).

And displays as well: some fail to account for upgrading a 15" laptop from a 1366x768 display to, I think, the 1440x900+ that Apple puts in the 15" (never mind the 1920x1200 on the 17").

Heck, even the Air is standing on its own compared to the Ultrabooks Intel's trying to bring out (hint: they're all a joke. First pass: no manufacturer wanted to make an ultrabook because they couldn't be competitive. Second pass: with Intel subsidies, they got the price to match the Air, but with iffier specs (i3 vs. i5, slower, heavier, etc.). Third pass (current): Intel relaxed the specs even further, so you can find 14" ultrabooks that are 1" thick or so, basically a "small laptop").

Of course, this holds true pretty much only for the first couple of months of Apple's refresh cycle. After that, it's not competitive anymore. Given that the current MacBooks all need a refresh, they are uncompetitive. Once Apple releases its Ivy Bridge laptops (WWDC?) they'll be competitive again. That's because Apple doesn't drop prices as time goes on, nor do they have sales.

Who says the $100 price difference is solely down to the extra cost of the higher density NAND? They cost more because a) that's what the market will bear and b) making different models of a similar device on a mass scale does not always enable the economics of said devices to merely come down to the raw delta in the cost of the pieces.

Cool, you're in 1996. Buy Yahoo stock, then sell it in 2000.
In fact, start giving up all your tired opinions in 2000, as they will start to become invalid around then. By 2012, the entire industry will struggle for years to compete with Apple on price, and fail.

It really depends - the consumer ones are pretty competitive for what you get. They're a little underpowered right now, but that's because Apple doesn't incrementally update them, so they're better value when they've just been refreshed.

They can't compete with el cheapo plastic boxes or whitebox self assembled machines, but they are not meant to.

Where they really fall down is the Mac Pro, which is simply woefully overpriced for what it is, since it hasn't been updated for 2 years and still costs the same, a

Macs price competitive? Since when? When I was looking for a laptop, an equivalent MacBook with the same hardware would have cost me FIVE times more than a windows machine. MacOS is great, but it's not worth over $1000 price difference on a non-Mac machine.

Whether an Apple laptop is price competitive depends on how long after release you look at it. Apple generally updates their laptops every 250-350 days, and AFAIK the price stays the same during that period.

The laptop I purchased was around $400. For the equivalent processor power, memory, display size, etc., I would have needed a $2,000 MacBook Pro. I did my research. I really did want the MacBook, but I just couldn't justify to myself the massive price difference.

Let me see. The Mac Pro isn't a fair one to look at, being a high-end professional workstation, so how about something consumer. Say, the Mac Mini. That's £529 for an i5 dual-core 2.3GHz, 2GB RAM, 500GB HD and Intel HD graphics. That's their entry-level desktop computer. Now, if I go to ebuyer... they don't actually have anything with only 2GB RAM, so I'll have to get a 4GB system. But for £512 - slightly *less* than a Mac Mini - I can get an HP with 4GB RAM, 500GB HD and a *quad* core 3.0GHz pr

Size matters, and it costs. The mini is comparatively expensive because it has to use notebook components to fit that form factor. It also has the lowest idle power of any mainstream computer, so it's lower cost to operate.

When you look at notebooks, size, weight, battery life, display quality and resolution, and durability all matter. Where are the non-Apple notebooks that are competitive with the MacBook or MBP in these factors? There are several, from HP, Dell, Sony, Asus, etc., and they're all in the sa

The people who say that Apple is no more expensive for "comparable" hardware ignore that most people don't need anything directly comparable. The people who want some super high end Dell laptop represent a small fraction of the market. For everyone else, "Apple is expensive" is a fair comment - all those extra features aren't worth the extra cost for most people, which is why they don't buy the equivalent PC products either.

I think that the PC industry is one of the very few market segments where "you get what you pay for" is ignored. Sure, you can buy a cheap Dell laptop (as opposed to an expensive Dell laptop - they do make higher end ones) and then you spend the rest of the machine's life cursing the poor build quality with the creaky plastic case and the poor cooling solution that is barely adequate for it, so that it spins up that tiny, noisy fan every 30 seconds, then eventually breaks due to the case flexing slightly so

Why does it have to be mutually exclusive? Just because there have been cooling issues with some Mac designs? You'll note that I said that you could get well made Dell laptops too - they just cost more.

There certainly have been some overheating issues with Apple laptops (almost exclusively related to Nvidia GPUs causing the whole thing to fail), but otherwise Apple's laptops have been pretty successful on that front. I know that the aluminium unibody ones can get quite hot and some people dislike that, but

Many times, the thermal shutdown has nothing to do with the cooling design. I have never seen a laptop that bad, but generally it is caused by the heatsink getting gummed up with dirt, hair, whatever. I have had this issue with MacBook Pros as well; many times they are ones that have been in service for a while, but when the unibody came out, it happened to nearly every one of them. I do deal with many Macs, as the company I work for is addicted to them. I have seen at least one from each series, and i

To take it a bit further: we have arrived at the 'good enough' stage of computing. For most practical uses the differences between the HP and Apple machines are truly minimal. Raw performance is ALMOST a secondary concern now.

You won't compare an all-in-one like the iMac yet you will compare Apple's "entry level" (your description) Mac Mini with a Midi Tower.

Right. Legit.

The Mac Mini is not sold as an entry level machine designed to compare to small towers, it's designed to be an extremely small HTPC-type computer. You're not comparing like with like at all. Now, if you *do* compare the Mac Mini you'll find it is still more expensive than other machines in the same form factor, but that is mainly down to component choice (eg, th

I called it 'entry level' because it's the cheapest computer that Apple sell.

Ok, let's try something else. How about something which owes nothing to Apple manufacturing like, say, a hard drive. One drive is much like another, really. Interchangeable, so long as the numbers match. So, how much does Apple charge for one of their drives with an Apple sticker on?

http://store.apple.com/uk/product/MC730ZM/A?n=internal&fnode=MTY1NDA0Nw&s=topSellers : £254 for a 7200rpm 2TB 3.5". How much for a 7200 2TB

The PC I picked for comparison didn't come with a monitor. Nor does the Mac Mini. Thus a fair comparison. To compare an iMac fairly, I'd have to compare it with a PC which integrated all components into a monitor of equal resolution, and that would be hard to find. One thing I really do like about Apple is their continuing dedication to high-resolution displays at a time when every other PC manufacturer seems to think nine hundred vertical pixels is good enough.

FFS, when will "you" people understand the basic concept that follows: at equal component level, a PC is not less expensive than the equivalent Apple. I mean, you can't blame a company for not wanting to sell low-end computers, can you?

I recently bought a high-end PC laptop... a MacBook Pro would have been even less expensive with pretty much the same internal components.

Oh, really? Apple wants $2500 for a 15" MacBook Pro with a 1680x1050 screen and 4GB RAM and a "2.4GHz CPU". They don't specify what actual model that is, and I'm not going to bother to look it up elsewhere. If somebody wants me to buy from them, not only do they have to have decent prices, but they need to actually define what I'm buying. I paid less than $1300 for a 15" laptop with a i7 2760 for the extra virtualization features, 1920x1080 screen, 8GB RAM, Blu-Ray drive, DisplayPort, powered eSATA/USB 2.0

Commodore bought MOS in 1976, and it was renamed the Commodore Semiconductor Group. That is why CEO Jack Tramiel was able to undersell the competition at about half the cost: he gave the 6502s to himself for free within Commodore, while charging Atari and Apple regular price for their computer/disk-drive CPUs.

"Business is war" was his motto, and he used every advantage to make the C64 the best-selling computer of all time (~20 million units). It's a shame that he died and alm

God knows what he might have come up with to save the Apple II if he hadn't had the accident.

Possibly nothing. Yes he is a brilliant guy. But it is entirely possible that his (hypothetical) next act would have been a failure. Woz was the right guy in the right place at the right time. Maybe he would have continued to pump out brilliant products. Maybe not. It's quite possible he was forced to quit while he was ahead. I appreciate your optimism but his first act was a pretty hard act to follow and he hasn't really pumped out much technology of note since.

Woz was/is good at the 'clever hack': getting something for nothing. As in the Apple II, where DRAM needs to be refreshed, and video needs to read memory in a systematic pattern, so let's just make sure the video access read pattern satisfies the refresh requirement, and never worry about refresh after that. Likewise the color video via 'color artifacts' instead of adding an honest color sub-carrier to the video. Another thing I particularly liked about the Apple II is that a certain area of ROM space
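The refresh-for-free idea in miniature. Every number here (row count, refresh window, address-to-row mapping) is invented for illustration, since real Apple II timing differs; the point is only that a regular video scan can touch every DRAM row within the refresh deadline without any dedicated refresh logic:

```python
# Miniature model of video-driven DRAM refresh. DRAM forgets a row
# unless it is accessed within some refresh window; the video circuit
# reads the frame buffer continuously, and the scan order is arranged
# so those reads happen to cover every row in time. All numbers and
# the address-to-row mapping are hypothetical.

ROWS = 128      # DRAM rows that each need periodic refresh
WINDOW = 256    # max ticks allowed between refreshes of one row

def video_scan_rows(n):
    """Video fetches sweep the frame buffer; low address bits hit rows."""
    for t in range(n):
        yield t % ROWS  # hypothetical address-to-row mapping

last_touch = {row: 0 for row in range(ROWS)}
ok = True
for t, row in enumerate(video_scan_rows(10_000)):
    if t - last_touch[row] > WINDOW:
        ok = False      # this row would have decayed
    last_touch[row] = t

print("every row refreshed in time:", ok)  # True
```

In the model the scan revisits each row every 128 ticks, comfortably inside the 256-tick window, so refresh costs zero extra hardware and zero extra memory cycles.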

And one trick he missed that could have been done cheaply... if the video vertical sync pulse had been made available someplace in the I/O space as a bit you could test, then it would have been trivial to know when you were in the vertical blanking interval so that you could flip video buffers cleanly.

I wrote a number of utilities for the Apple ][. One of which was a replacement garbage collection utility. The garbage collector in the Apple ROM would basically kick off when there was no more available memory and then "freeze" the machine for about 30 minutes while it dumped the garbage. I wrote one that could be run from the Ampersand &GC in Applesoft Basic. If your application used a lot of strings and reassigned those strings the heap would fill up really fast. My utility would run in seconds as opposed to the 30 minutes. I made about $1000 as a 16 year old kid selling this utility in Nibble magazine.

One other comment. Woz was a genius, but his shortcut for color graphics was based on 7-pixel bytes. Each byte in the $C000 address space used a nibble encoding scheme to display color. $C000+$200 (I think) would move to the next line 7 pixels down. This 7-byte math drove us developers nuts. To draw on the screen you would either use FP math (very slow) or pre-populate a lookup table to know where in memory to poke to get a color to show up on the right row.

I've not done assembly language since those days. It sure was fun and challenging though. Now everything is so bloated I rarely see tight efficient code anymore. I'm not suggesting that we go back to developing in assembly. I'm just pointing out that you were forced to be disciplined when you coded which made for more efficient code.

The $C000 address space was used for softswitches. The areas you're thinking of are $2000-$3FFF and $4000-$5FFF, hi-res pages 1 and 2 respectively.

Each page held 192 scanlines of data, with each scanline described from left to right in a continuous line of 40 bytes. The upper left corner of the screen was described in the first byte ($2000 or $4000).

So far, so good - but everything after that is madness.

$2000 is the upper left corner, yes. But the next scanline down is located at $2400. The one after
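The whole interleave collapses into one well-known formula. Here is a sketch of the standard page 1 base-address calculation:

```python
# Standard hi-res base-address calculation for page 1 ($2000-$3FFF).
# Three interleave levels: lines within a group of 8 step by $400,
# groups of 8 within a group of 64 step by $80, and the three screen
# thirds step by $28 (40 bytes, one scanline of data).
def hires_base(line, page=0x2000):
    return (page
            + (line % 8) * 0x400         # line within its group of 8
            + ((line // 8) % 8) * 0x80   # group of 8 within its 64
            + (line // 64) * 0x28)       # which third of the screen

assert hires_base(0) == 0x2000   # top-left corner
assert hires_base(1) == 0x2400   # the "madness": next line is $400 away
assert hires_base(8) == 0x2080
assert hires_base(64) == 0x2028
print(hex(hires_base(191)))      # 0x3fd0 -- the last scanline
```

This is exactly why the lookup-table approach mentioned above was so common: evaluating the formula per pixel on a 6502 was slow, while a 192-entry table of precomputed base addresses made it two loads.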

...that not all was well-designed and pretty. For example, portions of graphics memory were also used for slot/peripheral I/O [wikipedia.org]. These were called "screen holes" and greatly complicated every Apple II program that used those areas of memory -- there were literally tons of one-offs you had to write into your code, special cases depending on what peripherals were installed or used. This applied to both lo-res (GR) and high-res (HGR) modes. Here's an example [apple2.org.za] (hope you can read 6502).

I wish I could have afforded one at the retail price of $666.66. Too rich for my blood back then. Still, I got to hack them eventually at school and summer camps, but that's not the same thing as actually owning one of those $666.66 masterpieces. Sigh.

Engineering is optimally solving problems given your constraints, and in that sense the Apple ][ is an engineering master course.

I remember reading the available docs and being completely bowled over by two things: The video display doing the DRAM refresh for free and the workings of the Disk ][ encoding. It was mostly software driving very basic hardware, which was way ahead of its time. DOS 3.2 was kind of ugly, but since it was mostly software, he could upgrade it, and DOS 3.3 was a major improvement! It's hard now to appreciate how revolutionary this was at the time.

Even Woz could make mistakes: his sector interleaving wasn't optimal. In the time it took to process a sector, the next one was already past, so each sector took an entire rotation of the disk. But it was software, so various alternate DOSes just adjusted the sector interleave, so instead of sectors 1 2 3 4 5 you had 1 8 2 9 3 10, and you could copy the entire damn disk in 19 seconds. At least an order of magnitude better than the pokey C64 drive, which used the hardware-über-alles model.
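A toy model shows why that skew matters. The sector counts and processing time below are invented for illustration and do not match real DOS 3.3 timing, but the shape of the result is the same: no skew forces nearly a full revolution per sector, while a 2:1 skew reads the track in just over one revolution.

```python
# Toy model of rotational latency; numbers are invented, not real
# DOS 3.3 timing. phys_of_logical[k] gives the physical slot on the
# track where logical sector k is stored.
def revolutions_to_read_track(phys_of_logical, process=1, sectors=16):
    pos = 0       # current rotational position, in sector-times
    total = 0
    for logical in range(sectors):
        target = phys_of_logical[logical]
        wait = (target - pos) % sectors         # latency until it arrives
        total += wait + 1                       # wait, then read one sector
        pos = (target + 1 + process) % sectors  # head position after processing
    return total / sectors                      # whole-track cost, revolutions

no_skew = list(range(16))                             # sectors laid out 0,1,2,...
skew2 = [(2 * i) % 16 + (i >= 8) for i in range(16)]  # 0,2,4..14,1,3..15

print(revolutions_to_read_track(no_skew))  # 15.0625 revolutions
print(revolutions_to_read_track(skew2))    # 1.0625 revolutions
```

With the skewed layout the next logical sector is always just arriving under the head when processing finishes, which is the whole trick behind the fast-copy DOSes.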

But his engineering prowess doesn't really work for Apple's current positioning. He's unabashedly pro-consumer and pro-tech, where Apple is (wisely) in the business of providing devices that do a fantastic job of hiding the tech as much as possible, since Grandma or arts majors don't care what the hell the tech is as long as it works like they expect.

And his charming naivete doesn't really work with a big corporate environment, which is why Jobs was able to cheat him out of so much of the money they got.

There was something else I wanted to mention here - there are no sensors in the drive other than for the write protect notch.

How do you know where the read/write arm is? You don't! You just slam it back to home from wherever it is by moving it long enough (which causes the grinding noise when it hits the physical stop). Then you assume you're at zero and move from there. How do you know where sector zero is on the track? You don't! You just read till you see it encoded in the header. Similarly, you don't ca

I am strangely reminded of the one-wire bus used for low-power sensing.

The one-wire bus uses three wires.

Why the name? Well, one is supposed to be optional power, but in practice you won't get more than one sensor on the bus without it. And the other is ground which, I assume, doesn't count for some reason.

You kid, but in all seriousness, SWEET-16 probably does qualify as prior art for a few dozen currently litigated patent claims. Except you couldn't really call the Apple II "mobile". You could fairly call it a "limited resource computing device", though (a phrase found in one of Apple's iPod patents)
