How can it take 3GHz to emulate a Super Nintendo? The man behind a major SNES …

Emulators for playing older games are immensely popular online, with regular arguments breaking out over which emulator is best for which game. Today we present another point of view from a gentleman who has created the Super Nintendo emulator bsnes. He wants to share his thoughts on the most important part of the emulation experience: accuracy.

It doesn't take much raw power to play Nintendo or SNES games on a modern PC; emulators could do it in the 1990s with a mere 25MHz of processing power. But emulating those old consoles accurately—well, that's another challenge entirely; accurate emulators may need up to 3GHz of power to faithfully recreate aging tech. In this piece we'll take a look at why accuracy is so important for emulators and why it's so hard to achieve.

Put simply, accuracy is the measure of how well emulation software mimics the original hardware. Apparent compatibility is the most obvious measure of accuracy—will an old game run on my new emulator?—but such a narrow view can paper over many small problems. In truth, most software runs with great tolerance to timing issues and appears to be functioning normally even if timing is off by as much as 20 percent.

So the question becomes: if we can achieve basic compatibility, why care about improving accuracy further when such improvement comes at a great cost in speed? Two reasons: performance and preservation.

First, performance. Let's take the case of Speedy Gonzales. This is an SNES platformer with no save functionality, and it's roughly 2-3 hours long. At first glance, it appears to run fine in any emulator. Yet once you reach stage 6-1, you can quickly spot the difference between an accurate emulator and a fast one: there is a switch, required to complete the level, where the game will deadlock if a rare hardware edge case is not emulated. One can imagine the frustration of instantly losing three hours of progress and being met with an unbeatable game. Unless the software does everything in the exact same way the hardware used to, the game remains broken.
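As a schematic illustration of how such a deadlock can hinge on one obscure hardware behavior, here is a toy model. Everything in it is invented; this is not the actual Speedy Gonzales code, though SNES "open bus" reads (where an unmapped address returns the last value driven on the data bus) are a classic source of exactly this kind of bug:

```python
# Toy sketch (hypothetical, not the real game code): a game loop polls an
# unmapped address. On real hardware the read returns "open bus" -- the
# last byte on the data bus -- so the poll eventually sees a nonzero
# value and exits. A fast emulator that returns a constant 0 for
# unmapped reads loops forever.

def poll_switch(read_unmapped, max_iters=1000):
    """Return True if the wait loop exits, False if it deadlocks."""
    for _ in range(max_iters):
        if read_unmapped() != 0:
            return True
    return False

# Inaccurate emulator: unmapped reads are always zero -> deadlock.
assert poll_switch(lambda: 0) is False

# Accurate emulator: open bus echoes the last fetched byte, which here
# is the nonzero opcode of the polling instruction itself.
last_bus_value = 0xAD  # e.g. an LDA-absolute opcode just fetched
assert poll_switch(lambda: last_bus_value) is True
```

The point is that both emulators run the other 99 percent of the game identically; only the accurate one models the obscure behavior the level depends on.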

Or consider Air Strike Patrol, where a shadow is drawn under your aircraft. This is done using mid-scanline raster effects, which are extraordinarily resource intensive to emulate. But without the raster effects, your aircraft's shadow will not show up, as you see in the screenshot below. It's easy to overlook, especially if you do not know that it is supposed to be there. But once you actually see it, you realize that it's quite helpful. Your aircraft can drop bombs, and this shadow acts as a sort of targeting system to determine where they will land, something that's slightly more difficult without this seemingly minor effect.
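The cost difference can be sketched with a toy renderer (the register and values here are hypothetical; real PPU state is far richer):

```python
# Sketch of why mid-scanline effects are expensive (names hypothetical).
# The game writes a PPU register partway through a scanline. An emulator
# that latches registers once per line renders the whole line with the
# old value; a dot-based emulator re-checks the register at every pixel.

SCANLINE_WIDTH = 256
WRITE_AT_DOT = 128   # the game changes the register at this dot
OLD, NEW = 1, 2      # register values before/after the mid-line write

def render_line_per_scanline():
    reg = OLD  # latched once at the start of the line
    return [reg for _ in range(SCANLINE_WIDTH)]

def render_line_per_dot():
    line = []
    reg = OLD
    for dot in range(SCANLINE_WIDTH):
        if dot == WRITE_AT_DOT:   # the CPU write lands mid-line
            reg = NEW
        line.append(reg)
    return line

fast, accurate = render_line_per_scanline(), render_line_per_dot()
assert fast == [OLD] * 256                  # the effect is lost entirely
assert accurate[:128] == [OLD] * 128
assert accurate[128:] == [NEW] * 128        # second half shows the change
```

The per-dot version touches emulated state 256 times per line instead of once, and that multiplier is where the cost comes from.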

The second issue is preservation. Take a look at Nintendo's Game & Watch hardware. These devices debuted in 1980, and by now most of the 43 million produced have failed due to age or have been destroyed. Although they are still relatively obtainable, their scarcity will only increase, as no additional units will ever be produced. This same problem extends to any hardware: once it's gone, it's gone for good. At that point, emulators are the only way to experience those old games, so they should be capable of doing so accurately.

But this accuracy comes at a serious cost. Making an emulator twice as accurate will make it roughly twice as slow; double that accuracy again and you're now four times slower. At the same time, the rewards for this accuracy diminish quickly, as most games look and feel "playable" at modest levels of emulator accuracy. (Most emulators target a "sweet spot" of around 95 percent compatibility with optimal performance.)
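As a rough sketch of why each gain in accuracy costs speed, consider how many chip-synchronization points the host must process per frame at different granularities (approximate NTSC SNES cycle counts; the granularity choices are illustrative, not any emulator's actual design):

```python
# Back-of-envelope model: halving the synchronization interval between
# emulated chips roughly doubles the number of sync points per frame.
# ~1364 master cycles per scanline * 262 scanlines ~= one NTSC frame.

CYCLES_PER_FRAME = 1364 * 262

def sync_points_per_frame(interval_cycles):
    return CYCLES_PER_FRAME // interval_cycles

per_frame_sync  = sync_points_per_frame(CYCLES_PER_FRAME)  # once per frame
per_line_sync   = sync_points_per_frame(1364)              # once per scanline
per_opcode_sync = sync_points_per_frame(6)                 # every few cycles

assert per_frame_sync == 1
assert per_line_sync == 262
assert per_opcode_sync > 50_000   # tens of thousands of sync points
```

Each sync point carries fixed overhead on the host, so tightening the interval buys accuracy at a roughly proportional cost in speed, matching the "twice as accurate, twice as slow" rule of thumb above.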

There's nothing wrong with less accurate but speedy emulators, and such code can run on lower-powered hardware like cell phones and handheld gaming devices. These emulators are also more suited for use on laptops where battery life is a concern. But there's something to be said for chasing accuracy, too, and it's what I've attempted to do in my own work. Here's why it matters to me.

Doing it in software

Back in the late '90s, Nesticle was easily the NES emulator of choice, with system requirements of roughly 25MHz. This performance came at a significant cost: game images were hacked to run on this emulator specifically. Fan-made translations and hacks relied on emulation quirks that rendered games unplayable both on real hardware and on other emulators, creating a sort of lock-in effect that took a long while to break. At the time, people generally didn't care how the games originally looked and played; they just cared how they looked and played in this arbitrary and artificial environment.

These days, the most dominant emulators are Nestopia and Nintendulator, requiring 800MHz and 1.6GHz, respectively, to attain full speed. The need for speed isn't because the emulators aren't well optimized: it's because they are a far more faithful recreation of the original NES hardware in software.

Now compare these to the older N64 emulator, UltraHLE, whose system requirements were a meager 350MHz Pentium II. To the casual observer, it can be quite perplexing to see Mario 64 requiring less processing power to emulate than the original Mario Bros.

My experience in emulation is in the SNES field, working on the bsnes emulator. I adored the ideal behind Nestopia, and wanted to recreate this level of accuracy for the Super Nintendo. As it turns out, the same level of dedication to accuracy pushed requirements up into the 2-3GHz range, depending on the title.

TimeCop, on two very different emulators

Nestopia caught on because its system requirements were paltry for its time, but I have no doubt that releasing it in 1997 would have been disastrous. Since my emulator ultimately required more computing power than half the market had, I've seen first-hand the effect of high system requirements and the backlash they cause. It's easier to blame the program than to admit your computer isn't powerful enough, but the reality is that faking an entire gaming console in software is an intensive process.

Why accuracy matters

So if an emulator appears to run all games correctly, why should we improve upon it further? The simple answer is that it fixes the problems we don't yet know about. This is particularly prominent in less popular software.

As an example, compare the spinning Triforce animation from the opening of The Legend of Zelda: A Link to the Past on the ZSNES and bsnes emulators. On the former, the Triforce pieces complete their rotations far too soon, because the emulated CPU runs well over 40 percent faster than a real SNES. These are little details, but if you have an eye for accuracy, they can be maddening.
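The arithmetic behind that complaint is simple (numbers invented for illustration):

```python
# If an animation advances one step per unit of CPU work and is tuned to
# last ten seconds on real hardware, a CPU emulated 40 percent too fast
# finishes it in about seven seconds.

steps = 600                            # tuned for 60 steps per second
real_seconds = steps / 60.0
fast_seconds = steps / (60.0 * 1.4)    # emulated CPU runs 1.4x too fast

assert real_seconds == 10.0
assert round(fast_seconds, 2) == 7.14  # nearly 3 seconds shaved off
```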

I've encountered dozens of titles with obscure quirks. Sometimes the correct, more accurate emulation actually produces a "wrong" result. Super Bonk's attract-mode demo desynchronizes, causing Bonk to get stuck near a wall on most real systems. And Starfox suffers from significant slowdown throughout the game. These are certainly not desirable attributes, but they are correct nonetheless. We wouldn't round pi down to 3 simply because irrational numbers are inconvenient, right?

I don't deny the advantages of treating classic games as something that can be improved upon: N64 emulators employ stunning high-resolution texture packs and 1080p upscaling, while SNES emulators often provide 2x anti-aliasing for Mode 7 graphics and cubic-spline interpolation for audio samples. Such emulated games look and sound better. While there is nothing wrong with this, it is contrary to the goal of writing a hardware-accurate emulator. In fact, these enhancement techniques typically make it harder even to offer accurate emulation as an option.

Another major area where accuracy is a benefit is in fan-created works from translators, ROM hackers, and homebrew developers. Few of them have access to run code on real hardware, so they will often develop their software using emulators. Unfortunately, speed-oriented emulators will often ignore hardware limitations. This is never a problem for a commercially developed game: upon required testing on real hardware, the bug would quickly be discovered and fixed. But if you can only test on a specific emulator, such bugs tend to persist.

I can name a few examples. The fan translations for Dragon Quest 1&2, Dual Orb 2, Sailor Moon: Another Story and Ys 4 all suffered invisible text issues as a result of writing to video RAM while the video processor had it locked out for rendering the screen. Only half of these titles have subsequently been fixed.
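The failure mode behind those invisible-text bugs can be sketched like this (a schematic model with invented names and addresses; the real SNES VRAM access rules are more nuanced):

```python
# Sketch (hypothetical API): on real hardware, writes to VRAM during
# active display are ignored; they only land during vblank. A lax
# emulator accepts the write at any time, so a patch that uploads text
# mid-frame looks fine in that emulator but shows nothing on hardware.

class PPU:
    def __init__(self, accurate):
        self.accurate = accurate
        self.in_vblank = False
        self.vram = {}

    def write_vram(self, addr, value):
        if self.accurate and not self.in_vblank:
            return            # hardware behavior: write silently dropped
        self.vram[addr] = value

lax, hw = PPU(accurate=False), PPU(accurate=True)
for ppu in (lax, hw):
    ppu.in_vblank = False              # mid-frame: screen is rendering
    ppu.write_vram(0x1000, 0x41)       # try to upload a text glyph

assert lax.vram.get(0x1000) == 0x41    # text shows up in the lax emulator
assert hw.vram.get(0x1000) is None     # invisible text on real hardware
```

A translator testing only on the lax emulator would never see the problem, which is exactly how those four fan translations shipped broken.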

We've known about this hardware limitation since 1997, and supporting it is a one-line code fix, but the most popular emulator still does not implement this behavior. As a result, translations made solely for this emulator continue to cause problems and lock-in. Who would want to use a more accurate emulator that couldn't run a large number of their favorite fan translations?

It doesn't stop there, though. The original hardware had a delay upon asking the math unit for multiplication and division results. Again, any commercial game ever released would respect those delays, but fan hacks led to a Zelda translation's music cutting out and to the Super Mario World chain-chomp patch going haywire.
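Here is a schematic sketch of that latency bug (the cycle count and register behavior are invented for illustration; the real SNES math unit differs in detail):

```python
# The CPU's multiply/divide unit needs several machine cycles before the
# result is valid. Reading too early yields an in-progress value on real
# hardware, while a lax emulator computes the answer instantly.

class MathUnit:
    LATENCY = 8   # cycles until the result is valid (illustrative)

    def __init__(self, accurate):
        self.accurate = accurate
        self.result = 0
        self.ready_at = 0
        self.cycle = 0

    def multiply(self, a, b):
        self.result = (a * b) & 0xFFFF
        self.ready_at = self.cycle + self.LATENCY

    def read_result(self):
        if self.accurate and self.cycle < self.ready_at:
            return 0xFFFF   # stand-in for a partial/garbage value
        return self.result

lax, hw = MathUnit(accurate=False), MathUnit(accurate=True)
lax.multiply(123, 45)
hw.multiply(123, 45)
# Buggy fan code reads back immediately, with no delay loop:
assert lax.read_result() == 5535    # "works" in the lax emulator
assert hw.read_result() == 0xFFFF   # garbage on (emulated) hardware
hw.cycle += MathUnit.LATENCY        # well-behaved code waits it out
assert hw.read_result() == 5535
```

Commercial games always inserted the wait; hacks developed on lax emulators often didn't, which is how the music and chain-chomp bugs above crept in.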

Or an emulator might ignore the fact that the sound processor writes echo samples into shared RAM. Not a problem until you wind up with hacks that use wildly unrealistic echo buffer sizes, which in turn end up overwriting the entire audio program in memory, crashing and burning in spectacular fashion. This one issue single-handedly renders dozens of Super Mario World fan-made levels unplayable.
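A minimal model of that crash, with invented sizes and addresses (the real APU memory map and echo mechanics are more involved):

```python
# The SNES sound CPU has 64KB of RAM shared between the audio program
# and the DSP's echo buffer. With an oversized echo region, the DSP
# steadily overwrites the program itself.

APU_RAM = bytearray(0x10000)          # 64KB of shared audio RAM
PROGRAM_START, PROGRAM_SIZE = 0x0200, 0x2000
APU_RAM[PROGRAM_START:PROGRAM_START + PROGRAM_SIZE] = b"\xE8" * PROGRAM_SIZE

def run_echo(start, size, accurate):
    """Model the DSP writing echo samples into shared RAM."""
    if not accurate:
        return                        # lax emulator: echo kept off to the side
    for offset in range(size):        # accurate: echo really lands in RAM
        APU_RAM[(start + offset) & 0xFFFF] = 0x00

# A hack configures a huge echo buffer that reaches into the program:
run_echo(start=0x0000, size=0x8000, accurate=True)
program = APU_RAM[PROGRAM_START:PROGRAM_START + PROGRAM_SIZE]
assert program == b"\x00" * PROGRAM_SIZE   # audio program wiped -> crash
```

On a lax emulator the oversized buffer is harmless, so the hack author never notices; on hardware (or an accurate emulator) the audio engine destroys itself.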

Joking aside, I'm honestly shocked by the response. I can link you to some absolutely vile 4chan/emucr threads about this subject. I almost threw in the towel over it last year. It didn't seem like many cared: ZSNES has 26 million downloads for the latest release, while I average ~10,000 per release.

Now, I don't do this for praise or anything, but it really goes a long way motivationally to see people enjoying my efforts. Can't thank you enough for publishing this story, and everyone else for the awesome responses.

> Are there any SuperNES emulators that take the opposite approach, to the extreme? I'd love to try Starfox in 16:9 at 1080P at a rock solid 60 FPS.

Unfortunately, no. The SuperFX does not act like a traditional DSP. The SuperFX CPU executes instructions right off of the ROM image itself and acts as a general coprocessor. It has instructions that plot individual pixels, one at a time, to individual tiles that are later transmitted by the CPU to video RAM for display on the screen.
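To make that concrete, here is a sketch of what a single plot ends up doing to SNES-style planar tile data (2bpp for brevity; real SuperFX modes, row caching, and pixel caching are far more involved, and this code is illustrative, not bsnes's):

```python
# Each 8-pixel row of an SNES tile is stored as bitplane bytes, so
# plotting one pixel is a read-modify-write across several bytes. This
# per-pixel, per-bitplane work is one reason the chip can't simply be
# replaced by HLE polygon rendering.

def plot(tile, x, y, color):
    """Set pixel (x, y) of an 8x8 tile to a 2-bit color in-place."""
    row = y * 2                       # 2 bitplane bytes per row at 2bpp
    mask = 0x80 >> x                  # leftmost pixel is the high bit
    for plane in range(2):
        if (color >> plane) & 1:
            tile[row + plane] |= mask
        else:
            tile[row + plane] &= ~mask & 0xFF

tile = bytearray(16)                  # one blank 8x8 2bpp tile
plot(tile, x=0, y=0, color=3)         # both bitplanes set for pixel 0
assert tile[0] == 0x80 and tile[1] == 0x80
plot(tile, x=7, y=0, color=1)         # only plane 0 set for pixel 7
assert tile[0] == 0x81 and tile[1] == 0x80
```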

I'm not saying it's impossible to do, but you cannot employ traditional HLE techniques to give you scaled polygons with custom textures or any of that sort of thing. With DSPs like the Cx4 or the major N64 video microcodes, there are fixed programs with generalized functions to do common operations.

Given the nature of the video, some anti-aliasing filters wouldn't be too terribly difficult, at least.

However, you can certainly overclock the hell out of the SuperFX, even with bsnes. It stops having much effect after 30MHz or so (the bottleneck becomes the CPU); but it definitely helps with Starfox, Vortex and Stunt Race FX.

I'd be happy to get as technical as anyone wants (and then some), but how many people would be interested in the result? The article was targeted toward the general audience without computer science degrees. Is there value in going into detail about the SuperFX row-caching and secondary pixel cache behavior? It'd probably end up being 20+ pages long, and even then only scratch the surface. It would also require the reader to know a lot about hardware and software design.

To be quite honest, I think an in depth technical discussion would be a little bit too much. I think Ars has a very technical audience, but getting into pixel caching might be a bit too much! I would be quite interested in reading how a lot of this knowledge is obtained, though. Or perhaps an "Emulation 101" technical walk through that most CS grads and students would understand, but without overwhelming with in depth discussion on bit registers and super low level items.

Quote:

It didn't seem like many cared: ZSNES has 26 million downloads for the latest release, I average ~10,000 per release.

Well, I would be willing to bet that most of those ZSNES downloads are purely people looking to pirate old SNES games, and not necessarily about accuracy. I'd find it comforting that at least 10,000 people might share the same mentality as you.

Speaking of which, I'm curious about how emulator authors must feel about some of the people who download your work, who don't appreciate it. The amount of technical expertise and work that goes into an emulator is probably lost on most of the people who just want to play Mario on a cell phone, and don't care about the amount of work put into the emulation process. Do you have any feedback on keeping your momentum going in spite of the "vile 4chan/emucr threads"?

Judging by the readership numbers after one half of one day and the tweets and facebook likes, that group is in the hundreds of thousands. This is going to be one of the most popular stories we publish this month unless something amazing happens in the next three weeks. The group you're describing is wider than you think. :-)

Tweeting and Facebook "likes" are mindless actions that are forgotten some time later.

> It would have been much more interesting if there was any technical explanation at all of what was involved in creating the emulator. How was "multithreading and just-in-time synchronization" used? Why? Etc?

I'd be happy to get as technical as anyone wants (and then some), but how many people would be interested in the result? The article was targeted toward the general audience without computer science degrees. Is there value in going into detail about the SuperFX row-caching and secondary pixel cache behavior? It'd probably end up being 20+ pages long, and even then only scratch the surface. It would also require the reader to know a lot about hardware and software design.

Well, you pick your intended audience. Many of us started reading Ars back when the content was much more technical. I am sorry to say, recently Ars is slowly but noticeably sliding, to the extent that one might as well be reading the NYTimes technology page.

Your article, while interesting, is neither here nor there. It seems to be aiming at the rather narrow intersection of people who care about old games but are not more technical than "3GHz vs 300MHz". A pretty exotic group they must be :-)

Bullshit. I've read Ars since 1998 and this type of article is typical and always has been. Yes, Ars has some more technical articles, and it has also had some less technical ones. This is hardly comparable to a NYT technology page article; it's not even in the same league.

I also think you are full of it when you say that he is aiming at an exotic group. There are plenty of old game enthusiasts who are not coders or hardware engineers. In fact, I'd wager that they vastly outnumber the technical crowd you seem to believe this article should be focused on, even on Ars itself.

It is intriguing to me that the highest accuracy mode describes the system requirements as "A super-computer cooled by LN2".

However, the article text states: "Unfortunately, using this new hyper-accurate rendering engine incurred a nearly 40% reduction in emulation speed, once again making full-speed gameplay difficult-to-impossible, even on state-of-the-art hardware (at the time of this writing, including Intel i7 CPUs)."

I am guessing that the 3 GHz number mentioned several times applies to the Compatibility mode. But if maximum accuracy only causes a 40% reduction in speed, isn't it just a matter of a few years before your average new desktop can run it with ease?

It is just amazing how quickly computing power increases, rendering these kinds of arguments moot. If your goal is long-term preservation, does it really matter that your emulator requires a fast system today?

I also think that the comments about display technology are very relevant. The fact that the SNES does not output square pixels means that I'll never be able to achieve pixel perfect output and aspect ratio on my LCD projection TV. It might not even be noticeable, but it still bugs me in the same "pursuit of accuracy" kind of way :).

(With SNES output resolution of 256x224 and assuming pixel aspect ratio of 7/6, I guess I would need around a 1792x1344 resolution screen, or some integer multiple thereof.)
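The commenter's arithmetic checks out; a quick sketch under the stated 7:6 pixel-aspect-ratio assumption:

```python
# With a 256x224 frame and a 7:6 pixel aspect ratio, the smallest
# square-pixel grid that maps each SNES pixel to a whole number of
# screen pixels is 7 wide by 6 tall per SNES pixel.

from math import gcd

width_px, height_px = 256, 224
par_w, par_h = 7, 6                   # pixel aspect ratio 7/6
screen_w = width_px * par_w           # 1792
screen_h = height_px * par_h          # 1344
assert (screen_w, screen_h) == (1792, 1344)

# Sanity check: the display aspect ratio comes out to 4:3, as expected
# for a standard-definition TV.
g = gcd(screen_w, screen_h)
assert (screen_w // g, screen_h // g) == (4, 3)
```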

To me there is a fundamental difference between emulating something to preserve it, and emulating something just so that users can play it. Only one of these I find admirable. Unfortunately, hardware and software have a very limited shelf life. It would be nice to have some sort of preservation of these games in their original form before we lose them altogether. While games like Mario Bros. and Street Fighter will probably always be around in some form or another, there are plenty of games that you will never be able to see or play other than in emulated form. These games should be represented and archived as closely as possible to how the original developers intended (or did not intend, as is the case with bugs in the original game). While emulators like ZSNES are great because they give the layman a chance to experience games they can no longer access, I find their goal a bit one-dimensional and, dare I say, shady. To me, emulation is not about playing some pirated old game; it should be about preservation, which I find infinitely more important in the long run.

Some of the publishers of the original games have no choice but to emulate now, because even they have lost the original code or assets, and/or the ability to get in touch with the original creators. It should be a top priority that they get as accurate a portrayal of the original game as possible.

My hat's off to the bsnes developer and all emulator devs who focus on accuracy and archival.

Your article, while interesting, is neither here nor there. It seems to be aiming at the rather narrow intersection of people who care about old games but are not more technical than "3GHz vs 300MHz". A pretty exotic group they must be :-)

Judging by the readership numbers after one half of one day and the tweets and facebook likes, that group is in the hundreds of thousands. This is going to be one of the most popular stories we publish this month unless something amazing happens in the next three weeks. The group you're describing is wider than you think. :-)

On a totally different point, Ben (PS: this is a great article, unbiased and clearly not an Apple press release): why did the story you released on that puzzle game not mention it was available on Android as well as Apple iOS and HD iPad? Was it a cut-and-paste job from an Apple press release? I expect far more from the gaming editor of Ars Technica. I at least expect a degree of neutrality: obviously not the net neutrality of packets of information, but the release of articles as independent when they clearly originate from Apple rather than from the maker of the game, who would list all the formats (including Android) the game can be played on.

I expect a lot of people have no idea about accurate vs fast emulation. Before this, I had a vague idea about more-accurate emulators, but no clue the detail involved or the extra processing power needed. Of course, a lot of those people still wouldn't care - they just want to play the games from their childhood, or even classics they missed, and they aren't around in an Arcade Pack.

...To me, emulation is not about playing some pirated old game; it should be about preservation, which I find infinitely more important in the long run.

I'm just a little confused about this though. What is the point of preservation, if not to play the games? Preservation of game code is only meaningful if people are actively experiencing said code. Imagine an art museum where all the paintings are kept under sheets, whilst locked away in vaults that no one is allowed to enter. The paintings are being preserved, but what does it matter, if they can't be seen and appreciated?

Great to hear someone is that dedicated to the SNES; it was a great machine, and I have definitely been meaning to try out bsnes. However, it will have to wait a while until I get the right machine to use it with. Until then, the others will have to do.

> I'd find it comforting that at least 10,000 people might share the same mentality as you.

It is, most definitely. I didn't mean to sound ungrateful. I guess I was initially hoping for reception along the lines of Nestopia, and at least being able to get some of the most basic problems with other emulators fixed by giving them a bit of true competition. Up until 2010 or so, the ZS/9X teams were the same devs, and were extremely complacent. As it stands, I'm having trouble just getting people to adopt the proper file extension for SNES images. Most are still using "Super Magicom" images.

> Do you have any feedback on keeping your momentum going in spite of the "vile 4chan/emucr threads"?

Not really. I spent the first six years passionately responding to every critique as nicely as I could, and eventually burned out after responding to the same comments for the billionth time. I was pretty rude about it for a few months. Then ultimately I just stopped caring. It's a kind of desensitization through over-exposure, akin to today's youth and violence.

I guess the most important point is: don't do anything like this unless you also get personal enjoyment out of the effort. Never work on a project solely for popularity, or you will burn out very quickly.

> It is intriguing to me that the highest accuracy mode describes the system requirements as "A super-computer cooled by LN2".

That was a joke.

> I am guessing that the 3 GHz number mentioned several times applies to the Compatibility mode.

The situation is fuzzy. It depends on the profile, the game, and the CPU architecture. Some real numbers:

On my Core i7 2600K overclocked to 4.8GHz (air cooling, by the way), for the accuracy core, I can get 100fps on Super Mario World 2's title screen, which is the most demanding thing there is. I get about 180fps in Zelda 3. I get about 140fps and 280fps with the compatibility core on the same system. I get about 220fps and 500fps with the performance profile.

I can hit 80fps in Zelda 3 with the performance profile on my 1.6GHz Atom CPU.

AMD CPUs tend to be about 20-40% slower per clock than the latest Intel CPUs. Pentium 4s are garbage; you should divide their clock rates in half.

I found out that in ZSNES, Star Ocean's battles actually run about 50-100% faster than they should. Considering battles in SO are real-time, tactics are thrown out the window because you don't have enough time to react.

It's actually how I discovered bsnes, when someone recommended I try it instead. I've been using it ever since.

The PSP remakes had a similar reliance on hardware latency, interestingly enough. They took into account the latency of the UMD drive when loading up battles, so if you run off a memory stick instead you get random encounters at a vastly increased rate because the reading is so much faster.

People said largely the same thing about the work of Vincent van Gogh while he was still alive. The presence of his works in the world's most famous museums (and four of them on my own living room wall) would suggest that none of us is equipped to know exactly what will and will not become "the stuff of history."

I get that, I do, and I'm not telling the guy (or you) to give it up, just to lighten up the tone. People take the stupidest things ever so seriously - games don't need the "life or death" hyperbole. They really, really don't, your taste in painters notwithstanding.

...To me, emulation is not about playing some pirated old game; it should be about preservation, which I find infinitely more important in the long run.

I'm just a little confused about this though. What is the point of preservation, if not to play the games? Preservation of game code is only meaningful if people are actively experiencing said code. Imagine an art museum where all the paintings are kept under sheets, whilst locked away in vaults that no one is allowed to enter. The paintings are being preserved, but what does it matter, if they can't be seen and appreciated?

Well, obviously, you can play the game using bsnes. In fact, the point is that you are playing it in its purest form. It's like comparing the original Mona Lisa to the print sold in the museum gift shop; sure, the print looks like the Mona Lisa, and can be enjoyed like the Mona Lisa, but it's not quite the same thing.

byuu - great article. I just might have to try to get up to speed on some old-school emulators again when I can find the time. For someone who actually played all those consoles from the beginning, finding good emus through the years has always been a hit-or-miss game: Sega Master System and Genesis, Amiga 500, Vectrex, NES and SNES, and so forth.

zonnked wrote:

I miss real CRTs! Especially when driven as vector graphic displays!

Before we decided to run a profitable business, we thought we'd design and build arcade related hardware and the first thing I came up with was this thing: http://www.zektor.com/zvg/ (This is an archived link -- these are no longer being made.)

It allowed you to use an emulator (like MAME) to play vector games on real vector monitors. Most vector games had smoked glass over the screens that was so dark you wouldn't be able to see a raster game through the glass. But when placed over a vector monitor, the dark glass and bright vectors made for a very high contrast ratio that a raster display cannot even come close to emulating. It's a very cool look, and I'm guessing half the people reading this have never seen a real CRT-based vector monitor playing a game like Tempest, and that's a shame!

That sounds like a really cool thing. I actually started my gaming with a Vectrex as my first "console". I still have the Vectrex and 20 or so games in a box at my father's vacation house. I probably haven't turned it on for almost 10 years, but it might be time to try it out soon. I just hope it doesn't explode, considering it's what, 27 years old right now?

Although this article got a little too technical for me to understand totally, I still completely enjoyed it and found it a fascinating read. The SNES will always be my favorite system, and emulation has been a hobby for years, so getting a peek at how it all works is awesome. I'm curious about the author and what he does for a living with his skill-set. I'm guessing bsnes isn't paying any bills.

Hudson and some others did it with PocketNES for the GBA. They didn't even give them credit.

That sucks. It's awful that a corporation will use public-domain software without even acknowledging it in the credits.

I know a bunch of people who wrote Action Replay codes for the GameCube. Datel actually took their codes and sold them as their own without giving any credit to the original hackers. They're a very unfriendly company, and thankfully the Wii hacking scene has exactly zero need for Datel.

EDIT: I just noticed you said you get about 10k downloads. That's pretty awesome...the remote debugger I work on gets about 10-30 downloads per minor release, and I think some major releases broke 150 once or twice. Like you said, if you're doing it for yourself then it doesn't matter how many people download your work.

Do we need transistor-level emulation to accurately reproduce Will Crowther's "Adventure", a text-based interactive adventure game? No. How about chess games? No! Isn't it enough to just have the original logic of the chess program, a CPU emulator, and its initial start-game and end-game states? If you want visual accuracy, that can be achieved by simulating how old machines drew raster graphics.

"Video games" in the sense that we think of them differ from other kinds of computer games in that there is some kind of real-time interactive feedback. It's the real-time component that kills us: humans aren't cut out for thinking in 4D (the fourth dimension, of course, being time). So emulation of time-critical games is difficult on hardware that isn't designed to handle it.

But it's possible, theoretically at least, to reverse-engineer the original game with all of its timing behavior in detail so it can run natively on modern hardware. Take a detailed software description of the physical game console and use it as a "compiler", and treat each ROM image as a "program". When the emulator-compiler compiles the ROM image of the game, it should produce a binary image that, when run on Linux with your AMD Phenom II and latest Radeon graphics card, behaves identically to the original.

The "compiler" itself may require a supercomputer to execute, perhaps running Monte Carlo simulations of someone actually playing the game. The compiler might have to construct a tree for every possible if-then statement in the game, analyze state changes that depend on certain clock values, and analyze the contents of the audio and graphics buffers along with the chain of events that produced them.

But the goal should be to deduce a state-machine which executes on modern hardware but is exactly equivalent to the original game. So rather than run an emulator, you run a less CPU intensive program that faithfully recreates the original gaming experience. If we do this, we no longer worry about the job of creating accurate emulators, and worry more about the job of creating correct "compilers" that produce programs that behave like perfect emulators.
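A toy illustration of that idea, shrunk to a three-instruction hypothetical ISA (everything here is invented; real static recompilation must also handle indirect jumps, self-modifying code, and timing, which is what makes the proposal so ambitious):

```python
# "Emulator as compiler": translate a trivial bytecode once into native
# (here: Python) code, instead of dispatching opcodes in an interpreter
# loop every time the program runs.

PROGRAM = [("load", 2), ("add", 40), ("mul", 3)]   # hypothetical ISA

def interpret(program):
    acc = 0
    for op, arg in program:            # dispatch on every run
        if op == "load":
            acc = arg
        elif op == "add":
            acc += arg
        elif op == "mul":
            acc *= arg
    return acc

def recompile(program):
    lines = ["def compiled():", "    acc = 0"]
    for op, arg in program:            # translate each opcode once
        expr = {"load": f"acc = {arg}",
                "add":  f"acc += {arg}",
                "mul":  f"acc *= {arg}"}[op]
        lines.append("    " + expr)
    lines.append("    return acc")
    namespace = {}
    exec("\n".join(lines), namespace)  # "compile" to a native function
    return namespace["compiled"]

# Both paths agree on the result; only the compiled one skips dispatch.
assert interpret(PROGRAM) == recompile(PROGRAM)() == 126
```

The compiled function pays the translation cost once, up front, which is the commenter's point; the hard part the sketch omits is proving the translation equivalent for every possible input and timing-dependent path.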

Not that there is no place for high-accuracy emulators like you describe, but I doubt they'll really be mainstream until the hardware they emulate is laughably obsolete....

-The hardware being emulated is already laughably obsolete. Indeed, that is the whole point: to preserve this part of our history before it disappears. If you wanna play SNES on your phone then all well and good, but that is a whole different thing IMHO.

Not only do the battles run twice as fast, the game constantly crashes all over itself. The game has actually earned a widespread reputation for being notoriously buggy. But let's be honest, it would have been recalled if it were that bad. In truth, while the game is far from perfect, it's extremely playable. The crashes are caused by ZSNES.

Ha, no wonder it seemed so hard! I have a copy of the sequel on PSX CD-ROM, so I wanted to play the original first... yeah, that was kind of crazy. I'll have to try it with bsnes.

I can confirm this. To this day my gf still plays OoT and she's quite proud of the fact that her Ganon's blood is red.

They also changed the music in the Fire Temple. I believe it contained some Islamic chants that got some people riled up.

Yep, Gametrailers covered that in their Pop-Fiction series. Still kind of a mystery why it was ever included in the first place, IIRC.

Things like this come up so often with popular media and you would think getting an answer would be as simple as shooting off a few emails. :\ I mean I think of situations like this where the people involved "mysteriously disappear" or "just cannot remember" and then think of places like ID Software where you can read so much about what went on behind the games you love.

TLDR: I wish developers would keep more diaries/documentation but I really doubt they would have time for that.

Of course, now that I brought that up... I wonder if Carmack has any copies of the old RPGs I read about him making in "Masters of Doom". They sounded pretty cool.

Next in the series, an article about the summer intern who wrote the marginal Speedy Gonzalez code for level 6. Turns out he had some reaaally old milk with his Cap'n Crunch that morning, and spent most of the morning "debugging" in the 3rd floor restroom. During this session, he realized there was only time to flip some hexdump bits and pray before the QA meeting at 1. A dangling DMA trigger in the music bounced out of the deadlock, saved his job, and consequently saved his boss's Jag a few nasty scratches.

He'd be honored to know that so much concern has gone into inspecting the deeper meaning of his botulism-inspired code. Unfortunately, he tragically passed away in a multilevel filing-cabinet collapse 3 years after the episode, to the day.

byuu - great article. I just might have to try to get up to speed on some old-school emulators again when I can find the time. For someone who actually played all those consoles from the beginning, finding good emus through the years has always been a hit-or-miss game: Sega Master System and Genesis, Amiga 500, Vectrex, NES and SNES, and so forth.

zonnked wrote:

I miss real CRTs! Especially when driven as vector graphic displays!

Before we decided to run a profitable business, we thought we'd design and build arcade related hardware and the first thing I came up with was this thing: http://www.zektor.com/zvg/ (This is an archived link -- these are no longer being made.)

It allowed you to use an emulator (like MAME) to play vector games on real vector monitors. Most vector games had smoked glass over the screens that was so dark you wouldn't be able to see a raster game through the glass. But when placed over a vector monitor, the dark glass and bright vectors made for a very high contrast ratio that a raster display cannot even come close to emulating. It's a very cool look, and I'm guessing half the people reading this have never seen a real CRT-based vector monitor playing a game like Tempest, and that's a shame!

That sounds like a really cool thing. I actually started my gaming with a Vectrex as my first "console". I still have the Vectrex and 20 or so games in a box at my father's vacation house. I probably haven't turned it on for almost 10 years, but it might be time to try it out soon. I just hope it doesn't explode, considering it's what, 27 years old right now.

Vectrex has to be the coolest game console of all time! Maybe not the most playable, and maybe its games haven't aged well, but a portable console with a built-in vector display is still cooler than the 3DS. That and Vectrex was doing 3D (with goggles) nearly 30 years ago, making it possibly the first home 3D gaming console: http://www.youtube.com/watch?v=imhbTB8AXKs

I have two Vectri (Vectrexes?) sitting on a shelf behind me. One for playing, and one used as a display for the ZVG. The ZVG could drive the Vectrex's vector display. Here are some pics of Tempest, Star Wars, and a few others, running on the MAME emulator and using a Vectrex as the display: http://www.zektor.com/zvg/zvg_vpix.htm

Of all the things I've done in my hobbies and career (and I'm one of the older guys reading Ars -- and only an "old" guy would start a sentence with "Of all the things I've done...", as well as putting a run on sentence right in the middle of another, derailing any train of thought), the reverse engineering of the Cinematronics Arcade games to allow them to be emulated, and the hardware / firmware design of the ZVG were two of my favorite projects.

At the time there were no low cost MCUs that were fast enough to drive the vector hardware, and the ZVG ended up using two MCUs (one for I/O and one for vector timing). Everything was written in assembly language, where every instruction cycle had to be counted. I had so little time between vector draws, that the firmware routines used to setup the hardware registers were dual purposed as timing routines. All vectors are drawn to the accuracy of a single MCU clock cycle using a background hardware timer sync'd to the foreground firmware timing. Since much of the control firmware was also being used for timing, conditional branches had to be written in such a way as to use the same amount of clock cycles, regardless of the branch taken.
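To make the cycle-balancing trick concrete, here is a toy model of the idea (this is not the ZVG firmware, and all of the instruction costs below are invented for the example): both paths through a conditional branch are padded so that they consume exactly the same number of clock cycles, keeping the firmware in lockstep with the hardware timer.

```python
# Toy model of cycle-balanced branching: both paths through the
# conditional must consume the same number of (simulated) clock
# cycles, so the firmware never drifts against the hardware timer.
# All opcode costs here are invented for illustration.

class Cpu:
    def __init__(self):
        self.cycles = 0

    def op(self, cost):
        """Execute one 'instruction' that takes `cost` cycles."""
        self.cycles += cost

def draw_step(cpu, beam_on):
    start = cpu.cycles
    cpu.op(2)                  # test the condition (2 cycles)
    if beam_on:
        cpu.op(3)              # branch taken (3 cycles)
        cpu.op(4)              # write the DAC register (4 cycles)
    else:
        cpu.op(2)              # branch not taken (2 cycles)
        cpu.op(1); cpu.op(1)   # NOP padding to match the taken path
        cpu.op(3)              # write the blanking register (3 cycles)
    return cpu.cycles - start  # 9 cycles either way
```

In real cycle-counted assembly the same balancing is done by hand: you tally the documented cycle cost of every instruction on each side of the branch and pad the shorter path with NOPs.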

It could all be done today on an ARM processor, using built in peripherals (DACs and timers), and could probably be written in 'C' and run in the background, with enough time left over to run an emulator in the foreground.

But as byuu says, you don't do these things for the praise or money (though there is more praise than money), you do them because they're damn fun!

"And Starfox suffers from significant slowdown issues throughout the game. These are certainly not desirable attributes, but they are correct nonetheless. We wouldn't round pi down to 3 simply because irrational numbers are inconvenient, right?"

I stopped reading right here.

Yeah. The bible says pi is three so it is. I'm gonna use some evil-utionary emulator.


Your 'RealVideo stream of the moon landing' example is interesting because: a) the original video quality of 'the moment' is shit (and therefore not accurate itself), and b) a LOT of the original documentation for the project and event (including videos of the 'one small step' moment) is lost, because no one, especially NASA engineers, thought it would be that important. Exactly what words Armstrong said is famously a matter of (minor) debate, because the recordings are crap and Armstrong himself doesn't remember. IIRC, the audio logs of communications between Mission Control and the Eagle for Apollo 11 are also among the lost items. Somehow, the text transcripts are still around - quite fascinating. I don't really see anyone forgetting that achievement because of lost or inaccurate/grainy documentation, though, so I can't really sympathize with the argument that for something to be appreciated, it needs to be completely accurate.

All I can say is, maybe ask the guy funding the project (Wayback Machine? Ars had a recent article on it) to both digitize all books AND store paper copies of them (just in case, you see), to maybe give you some funding so you can pay some people to help with the scanning, or just buy a batch scanner yourself. Great job keeping the article at just the right technical detail. People claiming it's not enough detail are *ahem* 'crazy'. I doubt most readers of Ars today, and even many of its editors/authors, understand transistor delays, buses, and registers.

Am I the only one here who still has an actual SNES? Even with all this perfect synchronization, I'm still using a keyboard. I like the experience of using the actual controller, having to blow dust out of the cartridges and the connector in the console, even if I'm playing on a screen 3x bigger than what they expected when it came out (a 50"+ TV).

Nothing beats the real thing. Try it sometime.

Reminds me of the guys down the hall who have an NES hooked up to a 65" plasma in a break room here. Kinda funny to see some guys playing Double Dragon or Tecmo Bowl on a screen that size

Just think, if we cannot even simulate an SNES at the circuit level, how hard would it be to simulate the brain? That's what Blue Brain is trying to do on Blue Gene supercomputers.

Could a supercomputer simulate an N64 in real time? Something like Blue Gene? I'm just curious... not saying it's a practical thing to do!

Also... I still like ROM carts much better than optical discs. Optical discs are slow and get scratched. A cart... well... Mario 64 had almost zero apparent load times! And putting a co-processor into the game cartridge blows my mind! I had wondered if this had ever been done... now I know. Imagine what could be done for AI or physics if modern consoles had this kind of capability?

As far as density goes... well... I think a well-designed and supported ROM cart format could have both high density and be quite cheap due to economies of scale. And best of all... the capacity is infinite in the sense that as carts get denser, you are not limited to old density standards. A Blu-ray reading machine will never surpass the capacity of Blu-ray discs. A cart-based system's game size would be arbitrary and would increase as ROM cart density increased at the manufacturing stage.

Also... stacking ROM cells up in 3D, like MetaRAM does for RAM, could also massively improve ROM density.

I would say that the problem isn't the simulation of an SNES at the circuit level, but doing it in real time to match the performance of the original hardware. So the large factor here is time.

As for simulating a brain: that will probably, at some point, be done either on custom hardware or on a cluster of off-the-shelf hardware. And such a simulation doesn't necessarily have to run in real time, notwithstanding how a brain would interpret time anyway. Would a simulated brain be more intelligent if it was thinking faster?

But man, this thread has everything: in-depth technical discussion, opposing schools of thought on how to do things, brain simulation, nostalgia that reaches as far back as vector displays.

zonnked wrote:

Vectrex has to be the coolest game console of all time! Maybe not the most playable, and maybe its games haven't aged well, but a portable console with a built in vector display is still cooler than the 3DS. That and Vectrex was doing 3D (with goggles) nearly 30 years ago, making it possibly the first home 3D gaming console: http://www.youtube.com/watch?v=imhbTB8AXKs

Light pen, 3D ability, a great analog controller, multiplayer, "portable". Yeah, it was a great thing at the time, especially considering you didn't have to also bring a TV on the side.

zonnked wrote:

Of all the things I've done in my hobbies and career (and I'm one of the older guys reading Ars -- and only an "old" guy would start a sentence with "Of all the things I've done...", as well as putting a run on sentence right in the middle of another, derailing any train of thought), the reverse engineering of the Cinematronics Arcade games to allow them to be emulated, and the hardware / firmware design of the ZVG were two of my favorite projects.

At the time there were no low cost MCUs that were fast enough to drive the vector hardware, and the ZVG ended up using two MCUs (one for I/O and one for vector timing). Everything was written in assembly language, where every instruction cycle had to be counted. I had so little time between vector draws, that the firmware routines used to setup the hardware registers were dual purposed as timing routines. All vectors are drawn to the accuracy of a single MCU clock cycle using a background hardware timer sync'd to the foreground firmware timing. Since much of the control firmware was also being used for timing, conditional branches had to be written in such a way as to use the same amount of clock cycles, regardless of the branch taken.

It could all be done today on an ARM processor, using built in peripherals (DACs and timers), and could probably be written in 'C' and run in the background, with enough time left over to run an emulator in the foreground.

But as byuu says, you don't do these things for the praise or money (though there is more praise than money), you do them because they're damn fun!

Sounds like a cool project, and the ZVG page is really nice, especially that you have all the schematics and everything. Having built a few things myself in my time, albeit on a much smaller scale (mostly stereo amps, where lots of home/DIY electronics guys start out), it seems like a whole lot of work went into it.

This is by far the BEST article I have read since the in-depth articles that laid out what happened between Anonymous and HBGary! Such a wonderful read and intelligent comments too! I thought these days of the internet were dead and done! Thank you to everyone who made this happen.

This debate between performance and accuracy is almost as funny as the debate over what is better: MP3 or FLAC. Of course, each area has its strengths and purposes. It is not a question of what is better, only the why's and how's of each situation's progress and purpose! We have a lot of computing power to spare, and I'm glad that accurate emulation is one of the things it is being used for! Historical accuracy is important in and of itself, just as much as, if not more than, whether I can get Mario to jump on my iPhone.

The laments for long-lost technology were also fascinating. CRTs; vector graphics; there is real value lost in the phasing out of those technologies. I'm not saying that the replacement technologies aren't also, in a word, awesome. But there are aspects of the old tech that can never be experienced again. And the fact that our hunger for technology is just to get the latest and greatest (maybe only BECAUSE someone said it was the latest and greatest) makes me very afraid that we have not learned, remembered, or preserved nearly enough of the road that got us here. Painters have had thousands of years to perfect and share their art. Electronic computing has had half a century, and what was accomplished in that half century is beyond phenomenal!

So to byuu, I tip my hat. To Ars, I salute you for supporting and presenting this opportunity. And to all the co-commenters, it is an honor to share the same pixel space with you.

> They took into account the latency of the UMD drive when loading up battles, so if you run off a memory stick instead you get random encounters at a vastly increased rate because the reading is so much faster.

Great anti-piracy technique, at least. I had trouble getting over the non-optional voice acting in the PSP release.

> I'm curious about the author and what he does for a living with his skill-set.

I have no college degree, so I worked my way up from retail to call centers to help desks to software development, over the course of six years. Mostly business-related, I just love that I can mostly code in C++, and don't have to touch its retarded cousin, Java.

> I'm guessing bsnes isn't paying any bills.

Nope. I accepted donations to pay for Dr. Decapitator's supplies, and accept game/box/manual donations for preservation, but have never taken personal money from it. I don't find that ethical.

> But the goal should be to deduce a state-machine which executes on modern hardware but is exactly equivalent to the original game.

o.O ... yeah, good luck with that.

> How long has Byuu been writing for Ars Technica

This is just a guest piece.

> Does anyone know why Nintendulator's requirements are nearly 2x Nestopias if they both strive for cycle accuracy?

Nestopia is written with speed in mind, and has various shortcuts applied that do not affect accuracy but make the code harder to work with.

Nintendulator is more of a gigantic state machine. Like bsnes without the cothreads. It is technically more accurate than Nestopia, especially as it is still being developed.

Speed can be affected by the talent of the developer, but I think for the most part, if you can write an emulator, you can probably write decent enough code. The former gives you a lot of info on how the low-level stuff works, so you know how to optimize your own code.
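To give a rough idea of what the "gigantic state machine" style looks like (a toy sketch, not Nintendulator's or bsnes's actual code; the component names and clock dividers here are invented): one master loop advances every chip a single clock tick at a time, which keeps them perfectly synchronized without cothreads.

```python
# Minimal sketch of a cycle-stepped emulator core: a single master
# loop steps every component one clock tick at a time, so the chips
# stay synchronized without cooperative threads. Component behavior
# is a stand-in, not any real console's logic.

class Component:
    def __init__(self, divider):
        self.divider = divider  # master ticks per component tick
        self.ticks = 0

    def step(self, master_clock):
        if master_clock % self.divider == 0:
            self.ticks += 1     # one cycle of real work would go here

class Machine:
    def __init__(self):
        self.cpu = Component(divider=6)  # e.g. CPU at master clock / 6
        self.ppu = Component(divider=4)  # e.g. PPU at master clock / 4
        self.clock = 0

    def run(self, master_ticks):
        for _ in range(master_ticks):
            # Stepping per master tick is what makes the timing exact,
            # and also what makes this style so CPU-hungry on the host.
            self.cpu.step(self.clock)
            self.ppu.step(self.clock)
            self.clock += 1

m = Machine()
m.run(24)
```

The speed cost is visible right in the loop: every component gets polled on every master tick, even when it has no work to do, which is exactly the trade-off between this style and the shortcut-heavy, speed-focused approach.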

> Next in the series, an article about the summer intern who wrote the marginal Speedy Gonzalez code for level 6.

The worst part is, when I started on bsnes, I was thinking about how fun it would be to take apart the code to Super Mario World, Zelda 3, Final Fantasy 6, etc and see how it all works to improve my emulation.

... nah, those games ran well pretty quickly, they were written with a lot of love. It was those cheap shovelware games that I had to debug: Speedy Gonzales, Toy Story, Mecarobot Golf, Harukanaru Augusta - Master's New Golf, WWF Wrestlemania, Pebble Beach Golf, Koushien 2, Jumbo Osaki no Hole in One Golf, Jumpin' Derby, on and on.

> He'd be honored to know that so much concern has gone into inspecting the deeper meaning of his botulism-inspired code.

Oh believe me, it's garbage and I never enjoyed it. I know it was a bug on their side. But it has to be done if you want 100% compatibility =)

> Also, did you really blow $5000 on game manuals?

I've spent around $10,000 for: every USA game ever released, all but 35 boxes, and all but 120 manuals. Once I've redumped them all, and perhaps completed the missing items, I'd like to sell the set. As long as I don't lose more than $2k, I'll be happy.

We've also spent a combined $2,500 on special chip decapping, so that we could use pure LLE.

> @Ben, now that you've presented us with a dedicated emulator developer, you really should do an article about Dr. Decapitator

Likewise. ROM still costs more, but the instant load times, attention to detail on the content, and much longer shelf life of the media itself make it worth it. It's really amazing that Super Mario World was a 512KB game.

> opposing schools of thought on how to do things

Thomas' ideas are completely correct and plausible. It's more a matter of preference. If I had all the time in the world, I'd love to make my ideal, clean-source emulator; and another optimized for performance. Since I don't, I'd rather leave a better documentation of the hardware, and spend more time on improving the emulation.

> Do you think you will ever be "done"? Since you are striving for a specific level of accuracy, it seems like a project with a very well-defined, calculable, objective.

I do not believe perfection is possible, as we can never test every possible edge case. I do believe that the differences between a real SNES and the SNES Jr. (basically official clone hardware) are about as great as, if not greater than, the differences between the real SNES and bsnes at this point.

I don't like to think I'll ever be finished, but realistically, I mostly am. There's the ST-0018 coprocessor, once the Dr. is done dumping it, then there's some BS-X stuff I can improve, then I want to time out a lot of cycle-exact cachings for raster-level PPU timings (something only Air Strike Patrol uses, and for that only for a plane shadow), but after all of that? It's pretty much complete.

> This debate between performance and accuracy is almost as funny as what is better: MP3 or FLAC.

That's a great analogy. I had a section on that in the rough-draft. Some people who are really passionate about things want perfection, yet most are content with good-enough.

MP3 vs FLAC, MPEG-2 vs HuffYUV, Honda Civic vs Acura (Slashdot car analogy, oh yeah, I went there), etc. The last bit of quality is always the most expensive, and thus the most easily passed up.

Why I think it's different in emulation is that most people have PCs fast enough for more accurate emulation, yet a lot of people stick with the older stuff anyway due to a combination of familiarity and a desire to conserve virtual resources.

While I am all for 'to each their own', the accuracy-focused emulators really need support to find bugs and fix them. Someone remarked that they'd be found eventually, but that's not true. We only found the Speedy Gonzales bug a year and a half ago, after two and a half years of believing the emulator to be bug-free. There just aren't a lot of people playing obscure titles like that to completion in emulators. And I am not going to be around forever to fix the bug we spot in some far more obscure Japanese title in 30 years.