Posted 2008-08-02 02:49:46 by
Jim Crawford
So I've pulled the rug out from under another project of mine in favor of the new hotness: a potentially exciting web project based around Google Web Toolkit, which is essentially a Java-to-JavaScript compiler plus a minimal AJAX library.

GWT is interesting and potentially useful, but it seems really raw and beta even at version 1.4. For one, if you try to build a project with malformed XML in the configuration file, you get a stack trace rather than an error message. For another, there's this little joy of an alert that I get when trying to run a simple RPC test from GWT In Practice:

That's what I get: an exception identifier and a content-free snippet of HTML. No stack trace. No additional information in the server window, no information in the console I ran the whole shebang from. No logs anywhere, as far as I can tell. No mention of anything like this in the troubleshooting section of the book the example is from. Google searches for that exception identifier are largely fruitless. I could break out the big hammer at this point and start tracing through the machine-generated JavaScript in Firebug, but give me a fucking break.

So I post the problem to the GWT Google Group, leaving out the parts about how GWT seems really raw and beta and how they should give me a fucking break. I subscribe to the group and get a couple digests that don't contain the answer to my question.

I stop getting digests after 10am today, go back to the group page to see what's up, and it turns out I'm banned. Not clear why, no recourse to contact the administrator to find out. Since I'm banned, I can't even look at my post to see if I was inadvertently rude somewhere, though it wouldn't surprise me if the post vanished as part of the ban.

I'd love to start throwing out conspiracy theories, but my guess is actually that the admin saw the <script> element in the error message I posted and assumed I was trying to post an XSS exploit or something.

Posted 2008-06-27 00:41:23 by
Jim Crawford
Panda3D didn't work out as a demo platform for me because it also can't seem to use GL extensions on my machine. So Python is pretty much out the window. My current idea for a demo is Foolish 2 in 64k, using Farbrausch's libv2 softsynth library and the Windows Speech API. And OpenGL to render the ASCII animation, so we can bust into 3D ASCII animation when it's most dramatic.

The only problem with this idea, as I see it, is that NVScene isn't actually holding a 64k intro competition, just the 4k intro and 64 meg demo varieties. But, you know, fuck it. I don't need their imprimatur.

Posted 2008-05-25 23:30:24 by
Jim Crawford
I ordered a book on C# a few days ago. It was odd, to want to get a technical book again; lately I've just been -- and this seems to be the general programmer zeitgeist nowadays -- relying on Google for just about all my technical questions. But the question I had in this case wasn't a technical question, it was a style question, and style questions don't fit search engines. Not only because it's harder to translate such a question into a keyword search, but also because, given a decree from a random blogger or forum poster or what have you, it's much harder for me to verify whether it's sensible.

I've been programming in C# for a couple years now and I still don't know what, you know, good code is. Not the way I did for C and C++. I think partly this is because C# style is still evolving; it's still a relatively new language. But partly I think it's because the way I've been learning languages changed.

For C I had a mentor, a really brilliant guy. For C++ I think the big driver was working with Tim, who had a very strong, and rather incompatible, sense of style from his university studies. Coming to understand each other's extreme positions on style, and coming to a compromise between them, was really the C++ code style boot camp neither of us knew we needed. The three solid months of coding in the same room on the same project helped in that regard too.

For C#, what I basically have is Adam's old codebase for RECE, the RosArt External Combustion Engine, a very large and very general website back-end which was architecturally solid but very much a work in progress when he left RosArt for greener pastures.

Python actually makes an interesting contrast -- when I started working in Python I felt like I was solid, stylistically, very quickly. Python lends itself well to elegant code, has a tiny syntax so it's quick to pick up, and due to its hacker roots has a user base that can be trusted to a much greater degree to know what makes good code. Another factor to consider, though, is that coming from a C++ and Perl background, Python is such a pleasure to work with that you could probably spew out any old nonsense and it would be as much a pleasure to work with as well-written C++ and Perl programs are.

By itself, this creates a 16x16 square of empty space. The trickery is in the style definition for arrowRight:

    background: transparent url(/img/master.gif) no-repeat scroll -423px -20px;

Take a look at master.gif. They're setting that mother of an image as the background image of the 16x16 space, and offsetting it such that the only 16x16 pixels visible are the ones containing an arrow pointing right.

I imagine the reason they're doing this is to cut down on the bandwidth and time overhead the browser usually incurs by retrieving each image in a separate HTTP request. Another advantage is that putting all the images in the same GIF allows the compression algorithm to find redundancies across all the images.
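To make the trick concrete, here's a minimal self-contained sketch. Everything in it except /img/master.gif and the -423px -20px offset is hypothetical -- the class names and the second offset are made up for illustration:

```html
<!-- One sprite sheet, many icons: each element is a 16x16 window
     onto the same image, selected by background offset. -->
<style>
  .icon {
    display: inline-block;
    width: 16px;
    height: 16px;
    background: transparent url(/img/master.gif) no-repeat scroll 0 0;
  }
  /* -423px -20px is the right-arrow offset from the original rule;
     the left-arrow offset below is invented. */
  .arrowRight { background-position: -423px -20px; }
  .arrowLeft  { background-position: -407px -20px; }
</style>
<span class="icon arrowRight"></span>
```

One HTTP request fetches every icon, and changing which icon shows is just a class swap.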

For the record, disadvantages include: the additional bandwidth and time overhead incurred by the lengthier code required to refer to each image, breaking the site for browsers that don't support CSS or don't support it properly, and the ongoing cost of implementing and maintaining a more complex solution.

Only the last disadvantage is even a little compelling in this instance, and I'd imagine it's easily offset by the coder morale boost of working with such a neat trick.

Posted 2007-11-07 15:45:04 by
Jim Crawford
Chris Lamont has produced a PDF explaining the Excel formatting bug that led to certain values near 65,536 being displayed as 100,000. On the second page, he writes: “The bug seems to be introduced when the formatting routine was updated from older 16-bit assembly code used in previous versions of Excel to a presumably faster 32-bit version in Excel 2007. It is surprising such a bug slipped through, but to anyone thinking they can write an IEEE 754 floating-point to text routine using only bit twiddling and integer math with no 'sprintf' cheating, please try to write one and see how hard it is to get right!”

The Excel team deserves no sympathy here. The real mistake wasn't the bug: it was making a tradeoff that drastically increased the likelihood of bugs -- and drastically decreased the maintainability of their code -- for a negligible speed gain. The mistake was exactly their decision to implement a custom, assembly-optimized decimal print routine rather than just calling sprintf. Or “cheating,” as Lamont puts it.

He later goes on to offer the rationale that “converting floating-point values to text needs to be high performance for Excel.” I'm sorry, but you're not going to sit there and tell me they ran a profile and sprintf came up near the top. The people who write standard library implementations are not chumps. It would take tens of thousands of calls during a single screen update to generate a noticeable hitch on CPUs manufactured in this millennium, and I would be stunned if Excel peaked at a tenth of that. And if it turns out the standard library was implemented by chumps, you're Microsoft and you can afford to switch library providers.

But that's just a basic premature-optimization lesson. As evidenced by the involvement of 16-bit assembly language, this decision was very old. Maybe they hadn't invented profilers back then, who knows? There's another, more interesting way to look at this mistake, and a subtler lesson to learn. As the master put it: “byzantine code paths extract costs as long as they exist, not just as they are written.”

Case in point: a coder on the Excel team took a look at this particular 16-bit assembly routine sometime during the past year or so, failed to realize that it was fucking 2007 already, and decided it was high time to rewrite the code in 1986-era assembly language rather than 1982-era assembly language.

Eye tracking is a technology I've wanted to work with for a while now. I had initially thought that it would be a drop-in replacement for the mouse, but I wasn't thinking of controls like scroll bars, which you are definitely not looking at during use. It's still possible to use eye tracking as a mouse with some minor modifications, but to get the most out of it, it would be best to scrap the mouse paradigm and work with an entirely new UI paradigm.

I'd like to work on one of those paradigm things myself, but they don't give a price for the hardware. They're selling a computer for quadriplegics for $17,000, though, and judging by the rest of the computer, the eye tracking hardware is probably at least $15,000.

Posted 2006-01-24 11:18:32 by
Jim Crawford
I've recently read a few interviews with Bill Roper, who is part of a team made up of a bunch of ex-Blizzard folks. Their upcoming game Hellgate: London promises to be even more generated than Diablo II was, with random events as well as levels and items. I like how he says “We're getting a load of weapons, armor and monsters in right now” as if they just dump them into the database and the level generation algorithm starts spitting out levels with that stuff in. Because I'm hoping that's how it is!

On events: “For instance, you could be battling along a street when a Templar comes running out of the Underground pursued by demons. If I help him fight off the monsters he'll then stick around and I'll get a companion.”

This might not be obvious from my lack of faith in the Erasmatron, but I'm really looking forward to seeing how far they can push generated content. If they can make it really work, we get rid of all these goddamn level designers, reduce development costs by millions, and save the entire games industry. Phew!