...you're not part of the intended audience. Admittedly, there's a lot of necessary hardware support to get these kinds of results, but still... full A/V in a space less than the banner image of most websites. Makes you wonder what could be done with similar techniques and, say, a megabyte of space.

Well, Farbrausch did release .kkrieger [farbrausch.de], which is a 96k FPS with Doom 3-class graphics. Not much in the gameplay department, but it's still a revolutionary example of what can be accomplished within a small filesize.

Gaia Machina was exactly 65536 bytes. We did EVERYTHING to keep the size down, including some dirty tricks. The 4k demos are the new vogue in the demoscene, but there are still groups like us going for the 64k. Most Amiga games used all the RAM. My synth in Gaia Machina is 8k, but it uses 150MB of RAM when it unpacks...

That's 64KB of code and data, so it's still fairly impressive. Even if you're using a completely off-the-shelf 3D engine, fitting all of the geometry and textures for a virtual environment into 64KB is nontrivial.

I saw a couple of demo files years and years ago. DOS-based stuff. I think they were probably 16k files. I was amazed at how long the animations and music lasted from a 16KB EXE file. The demo just went on and on, for like ten minutes. Had some fairly impressive animations too. But it was all line-based sorts of things, like old screen savers.

But this... this is insane. I can't even believe what I'm seeing. I'm downloading the 720p version of the first video in MP4 format, and it's 91MB. A 91MB full-motion

I'd say the demoscene is the perfect example of what can be done, software-wise. I'd guess you've never peeked at demo source code, or you wouldn't generically call them "real programmers" (incidentally, there is usually much talent behind the code, but no structure or discipline whatsoever).

Actually, there is structure and discipline. But it's not academic structure or discipline. It's the structure and discipline of hard limitations and necessity. For example, in the 4, 16 and 64KiB compos, pretty much every single byte matters. In the 4KiB compos, a huge focus is on the packers and all the math surrounding them, where the packer is optimized for two things: size and speed. And since you're not showing just one or two static effects, but animations, preferably with multiple effects, there's

During the nineties, I did a lot of scene-related assembly code (mostly graphics and infrastructure: extenders, memory managers, etc.), so I'm well aware of the limitations imposed and the usual workarounds. Most source code I've seen is an ugly mess, even if it works. There are some truly clever and elegant algorithms in the lower categories (128/256b and 4k), but what I've learned since then is that computer processing power is cheap, but software lasts. One of my favourite demos of all time is Heaven7, mostly because it implements a realtime raytracer that runs smoothly on a P200. But many of the optimizations that allowed it to run smoothly on such a limited processor probably aren't valid for a P3. Or a P4 (hand optimization for any of the P4 lines is a _nightmare_). So, if it were a maintained application, the highly optimized algorithms would probably have to be rewritten to take advantage of the latest features and keep "pushing the boundaries".

As an example (and more in line with the nineties), a lot of effort was put into highly optimized Bresenham line algorithms, because traditional implementations implied a div operation per pixel, and integer division was awfully slow (around 40 cycles on a 486). So, even if Bresenham's requires extra instructions, it would still be a lot faster. Well, on a Pentium, not only did the div instruction get much cheaper, but some instructions could be paired for execution (the processor had two execution pipes), so a Bresenham implementation often ends up slower than just using div. If you had to write maintainable software, would you worry about implementation details that could double your development time, but be obsolete on the next processor to be released? I guess not.

Demos are a kind of development of their own. They require no user interaction and no external data pulling (other than already packaged resources), have no error checking whatsoever, and are usually buggy as hell: slightly different hardware may give you completely different results, or the demo may not run at all. So, that's why I don't like to call scene coders "real programmers". They are more a class of "code artists", and yes, they deserve more merit than "regular" coders, not only because of their algorithmic skills, but also because of their creative way of applying them.

Modern ARM (v7) CPUs already have an integer divide instruction (and many other low-power architectures have them too, including modern versions of traditional 8-bit microcontrollers and microprocessors, such as the Z80). Most of the optimization tricks you'd use on x86 aren't very good on a RISC platform (I have no experience with ARM, but I've played with Intel RISC processors). Versions with SIMD support are widely available, and it is a far more important feature than FP for most intensive algorithms. A

Tiny corrections :D
- Traditional line drawing uses a div per pixel. Check the pseudocode at the Wikipedia page (http://en.wikipedia.org/wiki/Line_drawing_algorithm). There are some "hybrid" algorithms that do one division per line, but they are usually a variation of Bresenham's algorithm with better precision.
- I'm not familiar with Heaven7's internals, but it obviously uses tricks. As do most other RTR implementations out there, and some non-realtime raytracers. If you can simplify global

I wouldn't say that's the case for everybody. Yes, some people think it's not important to make your code nice, but when I code (and many others with me), we do it with pride and produce libraries that would be well suited for public release if we wanted.

There is a very high bar that we set onto ourselves.

The flip side of the coin is that sometimes we make demos really fast, and the actual _demo code_ isn't so well structured, but you can be sure the libraries behind it are.

I guess I was misinterpreted. I regard scene programmers more as "code artists". The software product itself is unique, in the sense that it is not usually maintained or ported. Choosing long-term algorithms and performance compromises is usually not a problem, since it is made to be awesome on a specific combination of hardware. The algorithms have to be fast enough to implement the desired functionality at a specific point in time (the party or the release date) and usually on a specific processor, not on

/. is really making me feel old these days -- I was writing demos in the early 90's. I don't know if it's my overall grumpy-old-man mentality or not, but as impressive as these are, they're powered by a crap-ton of software running behind them. There's not 64k of assembly pumping bytes into a framebuffer and twiddling the PC speaker port to synthesize digital audio.

One thing I couldn't find in there (and I've been out of the scene for a LONG time, so I don't know how this works on new-fangled fancy computers...) -- do these write directly to the video hardware? Or do they use OS services like DirectX11, etc? When they say 64k, is it a 64k executable using up another dozen meg of OS DLLs?

I have to give it to them, they are very impressive. But are people still getting down and counting clock cycles?

No, that would be impossible on modern computers (unless you would only run the demo on one gfx card, and maybe even only on one specific firmware version at that). The closest I know of is the 4k demo Elevated [youtube.com] by RGBA and TBC. Everything except audio is done by shaders on the graphics card. Even the camera movements are done there (the CPU doesn't even know what the terrain looks like).

IIRC they started with an OpenGL initialization to get the shader code to the GPU, but later switched to DirectX, because tha

No, that would be impossible on modern computers (unless you would only run the demo on one gfx card - and maybe even only on one specific firmware version at that).

Admittedly it's been a while, but the VGA interfaces are still on those cards. If you needed higher rez, the VESA interfaces were pretty broadly supported ten years ago; I assume they're still buried down deep in those cards. But I grant you that to utilize more than just the framebuffer, you need to code to specific cards. During the mid-90's, though, that was pretty common -- you saw lots of demos coded for specific cards like the early ATI Rage or (much more commonly) the 3dfx Voodoo cards. Same with audio har

Since modern operating systems won't let you directly access hardware, the only way to do what you want is to bypass the OS, and thus write drivers for any hardware you may want to use. And the resulting demo will only run on your particular machine, or one extremely similar to it. Good luck with that.

As for counting cycles, I don't think it's even possible in modern CPUs due to how they run instructions out of order.

No, that would be impossible on modern computers (unless you would only run the demo on one gfx card - and maybe even only on one specific firmware version at that).

If by "modern" you mean Linux, there is fbdev, which has a semi-standard kernel interface for providing drivers that export the framebuffer. You mmap the local fbdev device into your process. It's fairly cross-platform and cross-video-card (for many older cards it uses the VESA API). Sure it won't let you twiddle the PC

Yes... same with VESA. And another poster pointed out that even in the DOS days people depended on DOS for file I/O and for setting up the initial execution environment.

That said, the closest thing in Windows (that I'm aware of) to fbdev in Linux is the deprecated DirectDraw hardware surface (which is a darn lot like SDL!). As I understand it, DirectDraw is completely emulated in software since Vista. Even back in the w2k days it didn't give you full access to the hardware framebuffer, instead giving y

I'm not sure I agree with this 100%. In general, no: most apps don't need, or shouldn't have, access to an actual framebuffer. That's assuming something that even makes sense as a framebuffer is available.

On the other hand, the topic of mirror drivers comes up fairly frequently on ntdev, because windows lacks the ability to access the framebuffer directly. Sometimes people with legitimate needs want to know what parts of the screen are being updated, etc..

That was exactly my question. If it's not 64K running on bare metal, it's cheating.

Running on bare metal is hardly feasible these days, as hardware is so varied. You need the driver APIs to get anything done. Or you can watch Amiga-demos from the same compo (some incredibly good stuff there as well), but remember that they're also "cheating", employing Denise, Paula and Agnus as they are. For shame, they probably even use the blitter instead of doing it in CPU!

And, did you actually *see* "Gaia Machina"? There is no way they can pack all that geometry pre-made into 64KB, far less texture

Running on bare metal is hardly feasible these days, as hardware is so varied.

A tiny demo that runs on exactly one piece of hardware is more impressive than a 100MB demo that runs on everything.

Or you can watch Amiga-demos from the same compo (some incredibly good stuff there as well), but remember that they're also "cheating", employing Denise, Paula and Agnus as they are. For shame, they probably even use the blitter instead of doing it in CPU!

The idea of demos and intros shifts over time. Originally, intros weren't designed with any sort of 'competition' in mind at all, so the whole idea of 'cheating' vs. 'not cheating' couldn't even apply.

Nowadays, the idea of "showing lots of polygons, texture-mapped, etc." isn't very interesting, because the hardware can do this so well that there's nothing interesting to show by being clever. So the demoscene has moved on: demos are typically more about doing something visually creative, w

Of course, unless you've created your own demo platform (all the way from mining your own minerals... Oh, and making your own pick to mine the minerals.. And...), you're just using what others have done for you already.

You have to set a limit somewhere. You just seem gruff because you've arbitrarily set the limit in a different place than the competition holders did. Most of the old DOS demos used BIOS routines, after all.

The compo rules for the Windows 64k intro:

Windows:

* The compo machine will be installed with Windows 7 64-bit; however, all entries have to be supplied in 32-bit exe format and be able to run in a 32-bit environment!

* We will provide the then-latest DirectX redistributables.

* The "Media" and "Music samples" directories will be deleted from the compo machine.

* .NET entries: We will install the then-current version and patches of the latest .NET runtime.

* And in case someone might still have this mad idea: We will not install any drivers or virtual machines to provide or improve DOS compatibility.

64k intro:

* Maximum file size is 65536 bytes for the executable. All other files in the archive will be deleted before showing the entry in the competition.

* Maximum running time: 8 minutes (including loading/precalc).

* We will not install any additional Runtimes, SDKs, Codecs, Drivers etc on the compo machine. This means that, among others, msvcr70.dll, msvcr71.dll and msvcr80.dll will not be available.

* You may not use the contents of the Windows "Media" or "Music Samples" directories. These directories will be deleted on the compo machine.

We crammed a LOT of code into 64k. When it decompresses, it explodes into about 800 megs of calculations before it starts. But you have a point, and I'd like to see you prove it by releasing a 64k demo that beats one of ours.

There's not 64k of assembly pumping bytes into a framebuffer and twiddling the PC speaker port to synthesize digital audio.

Of course. But all the creative work is squeezed into 64K.

One thing I couldn't find in there (and I've been out of the scene for a LONG time, so I don't know how this works on new-fangled fancy computers...) -- do these write directly to the video hardware? Or do they use OS services like DirectX11, etc?

They use DirectX, because that is the only way to support a reasonable range of hardware. (Also, you can't hit the hardware without installing a new driver or exploiting a kernel bug. Neither of which is very friendly.)

But are people still getting down and counting clock cycles?

Cycle counts aren't even documented today. Now it's all about avoiding cache misses and cache invalidation.

But in your days it was easy, you could count the clock cycles on the fingers of one hand and if you wanted a bit flipped you just climbed inside the computer with a hammer!

Anyway, you weren't all that impressive, you relied on a blacksmith for a hammer and a miner for the coal to fire your machine. You were just the slave master benefiting from the slave labor of others.

The rule is that the executable must be 64k and not use any data files. DLLs are allowed, however, and all of these demos use DirectX for video and sound.

The objects and textures are procedurally generated, with a bunch of pixel shaders thrown in. Often they take minutes to load even on a high-end machine as they do all the generation. No one counts clock cycles because the target hardware is abstracted anyway.

A long way from the way things used to be, hitting the bare metal. In fact the original limit was 4

I'm meaty. I coded the synth in Gaia Machina. We link against OpenGL, but we had to do some real dirty shit to throw out stdlib, and don't get me started on the manifest. There were oldskool demos at the party on A500s and C64s, but don't get us wrong... we had to do some crazy shit to get it down to 64KB. It was exactly 65536 bytes after some work. Try getting something like that down to 64k and you will see.

I think I know why North American readers may never have heard of it. (The USA and Canada represent well over two-thirds of the population of industrialized anglophone countries.) From the article:

in Saarbrücken, Germany

For some reason they never have demo parties like this in North America. Why is that?

In Atlanta, GA, a friend of mine threw a cheese-and-demo party every year and showed a bunch of demoscene stuff from the past year. There was also a bunch of cheese tasting involved. That's the closest to the demoscene I've experienced in the States.

I'd say that the closest the US got to having "a demoparty like this" (meaning: with such good releases and turnout) was NVScene in 2008 which I helped organize. The event was documented in this now-severely-outdated blog, if you're interested in catching up: http://demotrip.blogspot.com/

I think I know why North American readers may never have heard of it. (The USA and Canada represent well over two-thirds of the population of industrialized anglophone countries.) From the article:

in Saarbrücken, Germany

For some reason they never have demo parties like this in North America. Why is that?

I'm Canadian, and I know of the demoscene and all the related terminology the GP doesn't. While I never had the opportunity to attend a party, I loved watching what some of these groups put out. The Amiga era was a few years before my time; I got into it in the mid-90's when names like Renaissance, Future Crew and Triton were the big guns.

There are caps of some of the big name demos from back in the day available on YouTube, with Future Crew [youtube.com]'s probably being the most popular. I still remember the awe I experienced when I first saw Triton's (later Starbreeze Studios) Into the Shadows [youtube.com] demo. I purchased their subsequent game, Enclave, because of that demo (and they were the guys who wrote FastTracker 2).

For those who don't know what the big deal is, way back when PC hardware was pretty crappy these groups were putting out some of the most demanding and advanced programs, stuff that put a lot of what the game companies were pushing to shame. The aforementioned Into the Shadows demo was released in 1995. This is long before 3D accelerators and hardware floating-point math were standard. It really was impressive at the time, and it was being done by groups of kids.

Out of points or I'd mod you up for mentioning Future Crew. I still have a 3.5" floppy floating around with some great old demos from that era.

I'm not surprised when someone hasn't heard of the demoscene any more, but it's easy to describe: fit the most impressive graphics and sound you can into a very small footprint -- 64KB is the common limit now, apparently, but I seem to recall some good "anything you can fit on a disk" rules and some impressive 4k demos.

Other than raw coolness, the point is to "do more with less" -- push creativity, efficiency, and algorithm design by artificially limiting resources. It's very impressive stuff, and having been out of touch for a long time, I'm AMAZED at the quality these vids are putting out. Has anyone been able to run the .exes to verify? My rig keeps failing on them -- I imagine they require specific hardware and software versions.

Many people from the demoscene go on to work for game companies later.

For instance, Alex Evans (known as Statix, a famous demoscener from the 90's) went on to work for Lionhead Studios (founded by Peter Molyneux, famous for Populous on the Atari/Amiga) and created the popular game Black & White... then he went on to co-found Media Molecule, who created LittleBigPlanet.

I am a demoscener myself, and I did some programming on a Texas Hold'em game in the early days of online poker, but I went off to work wi

Purple Motion released a CD sometime back, I bought a copy of it for happy fuzzy memories (and because it was good). I think Skaven is kind of still about.

I think Scream Tracker 3 was FC's last release, if you don't count the compilation they did last year. They fizzled out because a lot of the older members of the group hit national service age (most, if not all, of them were Finnish). Futuremark, Bitboys and I think Remedy all had FC members on their start-up teams.

For those who don't know what the big deal is, way back when PC hardware was pretty crappy these groups were putting out some of the most demanding and advanced programs, stuff that put a lot of what the game companies were pushing to shame. The aforementioned Into the Shadows demo was released in 1995. This is long before 3D accelerators and hardware floating-point math were standard. It really was impressive at the time, and it was being done by groups of kids.

These kinds of people need to get together and make games, or even better a game engine. Inefficiency is rampant in the programming world today and is especially prevalent in the gaming world. Hell, even our OSes have these kinds of problems. I tried an Amiga at a friend's house, and the OS was snappier and more responsive than any GUI from the past 10 years that I've used.

For some reason they never have demo parties like this in North America. Why is that?

Hello, I'm meaty. I coded the synth for Gaia Machina. There were some Americans there; one was saying to me that you're all up for it, but spending months/years working on it isn't an American thing. We spent 3 years rewriting our engine since "ephemera", which pretty much maxed out our last engine. I'm British; the rest of the group is Swedish. There ARE American groups who do stuff. And Canadian -- Northern Dragons kick ass. We may even visit an American party with a release just to kick your backsides and get you to do some fucking work :-) There are loads of tracker musicians in America, but you never hear music done in a tracker that took more than 2 days to write! You guys have the talent, you have the place, just pull your fingers out and get working!

PS. A Linux port of Gaia Machina is coming. It already compiles OK. We need to check it first.

I'm surprised it ran in Wine... we did some REALLY dirty shit to keep the size down, including using Windows system calls to avoid linking against stdlib :-) The Linux version is similar... we call kernel traps.

I forgot to mention: this stuff is also all rendered in real time. It's not a movie. The music is also composed/tracked; it is not a recording. Here's another impressive entry in the 64K competition: http://www.youtube.com/watch?v=6CiF034IhgY&hd=1 [youtube.com] It's mind-blowing where these guys have gone over the years. I thought the demoscene would have died as computers became more powerful and anyone could create effects witho

For PC-based demos, Win 7 is used (Linux is also available). The latest versions of DirectX and .NET are present and available for use. Other OSes and hardware will have their own support packages as well. So yes, third-party libraries are available, but that does not make the outcomes any less impressive.

Here is a link to where the info above came from with additional details:

The demoscene was not always just about making the most impressive output from a given piece of technology. As I recall, it evolved from game pirates marking their cracked games with elaborate intros, using only the space left unused by the game. I also recall hearing that some teams would occasionally use routines from the actual game as part of their animations. With that history in mind, how can one object to using libraries now? The original spirit of the art, where your own code had to fit into

And people before that did it without DOS. Go all the way back and people were flipping switches to input their code. And they found it boring to rewrite the same stuff over and over again, and so created common libraries that soon became an OS and everything else.

If you make a cake from scratch, do you grow your own wheat? Then you are using the library of nature/god! Slacker! I create my own universe for every sandwich, Big Bang all the way, or you are just a faker!

It was amazing "back then" and even today I still think it's highly awesome. All of that in 64k.

Yeah, that's a classic, complete with awful scene poetry and all :)

I have a couple of friends from Andromeda (Hyde and Archmage), they came in second to farbrausch's "debris" at Breakpoint 2007. Archmage simply stated that "losing only to farbrausch is still a victory".

That said, to me Gaia Machina surpasses .the .product in technical quality and polish. Also check out some procedural 4k images; the best ones really boggle the mind.