
Eric S. Raymond wrote in to tell us that he has updated The Magic Cauldron (his essay on the economics of Open Source) to contain an appendix on common arguments for keeping device drivers closed (pay attention, Creative Labs!). He also says "The argument turns on the fact that drivers are small pieces of code, easy to disassemble if need be. This argument would be considerably strengthened if I could point readers at a working set of tools for disassembling Windows drivers into recompilable source (or even just assembly) code. I would appreciate pointers to any such tools."

Here's another ancient memory ping. I transferred, via serial port, the sources from the Commodore to a dual-floppy DEC Rainbow running a CP/M-86 6502 cross-assembler. Intel hex files were then shoveled into a serial EPROM programmer. I used the C64 as an initial embedded development system, then moved everything over to the Rainbow and CP/M-86 before getting "wise" and moving on to the PC and beyond. It was during the Rainbow period that I became acquainted with DEC Ultrix on a MicroVAX I. And it was all mine.

My favorite disassembler is IDA [datarescue.com]. It handles a lot of other targets as well. Having said that, I'd like to point out this argument is flawed. It's not always possible to disassemble something and then reassemble it. Case in point: I made a program [jitit.com] that sort of makes this task impossible with a traditional disassembler.

Even without using something like this, disassembly of programs cannot be 100% correct. A lot of the information needed to reconstruct a program is thrown away by the compiler. For example, you can't determine where a jump table (switch statement) or function pointer (virtual function) will lead without actually executing the program, and even then it's not possible to cover all of the cases that could occur. IDA does a pretty good job of guessing, though. IDA also has hacks for C code, like looking for procedure frame headers (push ebp; mov ebp, esp) and termination code (pop ebp; ret). My solution to correct disassembly is to use emulation. This ensures you never have an incorrect interpretation, but you will miss code that is never executed, and that can be important for understanding how errors and exceptions are handled.
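The jump-table problem can be shown with a toy example. This is only a sketch over an invented mini instruction set, not a real disassembler, but it captures why static disassembly and emulation each miss something:

```python
# Toy ISA: each instruction is (opcode, operand). "jmp_table" jumps to
# table[reg], where reg only gets its value at run time -- so a static
# disassembler cannot enumerate the targets.
program = {
    0: ("mov", 2),                   # reg = 2 (only known at run time)
    1: ("jmp_table", [10, 20, 30]),  # indirect jump via table index reg
    10: ("ret", None),
    20: ("ret", None),
    30: ("ret", None),
}

def static_reachable(start):
    """Recursive-descent disassembly: follow direct control flow only."""
    seen, work = set(), [start]
    while work:
        pc = work.pop()
        if pc in seen or pc not in program:
            continue
        seen.add(pc)
        op, arg = program[pc]
        if op == "mov":
            work.append(pc + 1)
        elif op == "jmp_table":
            pass  # target unknown statically; all three entries are missed
        # "ret" ends the path
    return seen

def emulate(start):
    """Execution resolves the jump, but only along the one taken path."""
    pc, reg, trace = start, 0, []
    while pc in program:
        trace.append(pc)
        op, arg = program[pc]
        if op == "mov":
            reg, pc = arg, pc + 1
        elif op == "jmp_table":
            pc = arg[reg]
        else:  # ret
            break
    return trace

print(static_reachable(0))  # {0, 1}: the jump-table targets are invisible
print(emulate(0))           # [0, 1, 30]: only the executed target appears
```

Static analysis never sees locations 10, 20, or 30; emulation finds 30 but never 10 or 20, which is exactly the "code that is never executed" problem described above.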

Just because you can disassemble something doesn't mean you can understand it. It takes a huge amount of time to work out the uses of unnamed memory locations and the layout of data structures. In the case of a driver this is compounded by the fact that you are dealing with a lot of hidden code and data (the logic on the hardware). Not all drivers are "small pieces of code." Having looked at several vendors' Direct3D driver source code, I can tell you each was extremely large and complex. Writing a driver with full knowledge of everything is hard enough; disassembling one and understanding it would require superhuman willpower. There are all sorts of commands that do different things depending on the state of the card, so you have to understand which previously executed code was important to getting the card into that state.

With respect, Bruce, I don't think this is an omission on ESR's part. The percentage of the market that cares enough about open-source drivers is so tiny as to be statistically insignificant for hardware vendors. No Gateway, Dell, or (shudder) Packard Bell is going to quibble about whether or not their newest OEMed machine is carrying open-sourced device drivers.

For the open source community, this is a different story. I occasionally dream that the entire computer buying community suddenly gets wise and starts asking for real change in the industry, but I'm not holding my breath.

First off, I want to make it clear that I'm not arguing against open source drivers. I'm arguing that a) open source drivers are necessary but not sufficient for getting hardware running on different OSes, and b) there are costs in resources and $$$ for a company in supporting an open process for drivers.

I was thinking in the context of drivers for things like new-generation 3D boards, which are not reverse-engineerable in any useful sense.

You don't really "port" a 3D video driver from Windows to Linux; you write a brand new one, basically from scratch. Source code would be nice, but you still need the docs, plus actual human beings to ask questions of. (Yeah, Glide helps.)

3dfx has done a great job of this, but it took some amount of liaison work (== $$$) on 3dfx's part to make it happen. Every hour the 3dfx engineers spend talking to the Linux driver guys is an hour a) they get paid and b) they aren't working on something else. It was probably worth it for them (hey, I bought one), but it was an "allocation of resources" problem just like in SimCity.

Another reason open sourcing drivers might cost a company: the drivers for 3D boards are an area of competition (the drivers can be as important as the chip design for producing good benchmark results).

Again, I _like_ the idea of an open driver development process. I bought and very happily use a 3dfx Voodoo3/2000 because of it. I hear nvidia is doing a great job as well.

I'm not exactly in the device driver writing business, but I'd imagine that maintenance of drivers for old hardware costs a fair bit of time and money. Once you make the device driver open source and get it past the 1.0 stage, you can go on to bigger and better things.

Companies like Matrox and Adaptec have, as far as I can remember, been very good about supporting even their oldest devices with drivers and jumper settings and such. As time goes on they accumulate more and more stuff that their customers expect them to support just as well as they ever did. Open source fixes their growing legacy support problem. They don't have to worry about whether MS will include their driver in the next OS release; it will get ported to the next version of Linux almost automatically. The only way that would not happen is if no one is using that card anymore. If that happens, who cares if the driver dies?

The time-to-market argument certainly fails for novel wacky stuff in the drivers. But as ESR notes, if you do have clever proprietary novel wacky stuff it belongs in the ROMs on the board. If you've got clever wacky stuff that can only live at the driver level, release interface specs and let other folks write less clever drivers anyway.

The more basic argument he is making is: hardware vendors don't want to be in the software business, particularly in the boring part of it that's about porting drivers to x, y, z, and w random architectures. If they open the specs, other people can do the porting work for them. There's no reason that has to touch on any wacky proprietary innovation of theirs.

Fundamentally, having proprietary stuff at the interface level is silly. More to the point, it doesn't much happen. That, I think, is the basic argument here.

There's no better example of why having good low-level specs out matters than the GUS. It's 1999 and I *still* can't play the audio in new 64k intros. In fact, it's not uncommon to see requirements like "PII 400 and 1 MB GUS". Of course, Sound Blaster took over, and everyone knew the low-level specs for that...

"Subtle mind control? Why do all these HTML buttons say 'Submit'?"

I use Sourcer. It does direct disassembly into assembly and is pretty good after 8-9 passes. I think the newer versions also do disassembly into C, although I have an old version and it works OK on DOS.

First off, they don't need to release documentation, write specs, and give support. Just release the source, dammit. Isn't it funny that Linux has come this far without support from hardware companies? Network cards, for example, work much better in Linux than in Windows. (Thanks, Donald Becker.)

You say that it costs them resources. How much will it cost them when no one buys their hardware? And you are forgetting the bug fixes and enhancements they will get back. Give and take, you know...

Funny that you mention 3dfx. I own a Voodoo 2 card, and it has given me much joy. But my next card is not going to be from 3dfx. Why, you say? Because Matrox has released their specs, and NVIDIA is writing open source drivers for Linux. And yes, the other cards are technically superior as well. 3dfx is going to lose... bigtime.

Crackers should not be classified as warez doods. The only skill warez doods have is being part of a group in which somebody has a bunch of bandwidth to send warez around. Crackers, on the other hand, are talented assembly programmers who reverse-engineer programs to remove the copy protection. Certainly they are several steps above warez doods.

I found the situation with disassemblers for the x86 relatively bleak, since there is only one tool that suits my needs (offline disassembly). Every development tool contains a debugger that also does some disassembly of the current instruction stream, but they all lack the ability to define data structures.

I did find "the one true" (at least for me) disassembly tool: IDA, from http://www.datarescue.com. It is an interactive disassembler that does background disassembly and lets the user identify additional code and data sections (and correct its mistakes :) ). IDA supports a whole slew of input files (DLL, VxD, NLM, COM...) for a lot of processors (Z80, x86, Java VM, ...). It has working versions for OS/2, DOS, Win32...

There are only two catches: it is payware, and no source code is available. And it is virtually impossible to find warezed copies on the net. I paid for my copy and I'm still happy with IDA (in fact, I was happy with the trial version and found it well worth the money to upgrade). The copy protection also seems to be relatively strong, at least from what I found (or rather, didn't find) by looking at it.

That defensive do-this-or-you're-a-moron tone isn't going to make any friends in the corporate world. Is ESR beginning to lose his temper a bit? Understandable, given the total idiocy level out there, but that appendix could use a bit of calming down.

This reminds me of the bad old days, and four stories in particular. The first was reverse engineering Commodore ROM BASIC back to commented source code on the Commodore 64. Once reversed, I ported it to an embedded 65C802 system I was working on. The tool I was using was called Sourcerer. Needless to say it worked, and I had my own kernel with a BASIC running on top of it.

The second story was a 68000 monitor and debugger that I had to write a disassembler for in order to move it to a 68010 SBC. The disassembler was hand-made by me.

The third involved the original 8052 BASIC52 that Intel had produced. It had some bugs and missing features, so I wrote a small assembly routine to dump the contents of the chip out its built-in serial port. The 8052 output was captured on an IBM PC, and I wrote a disassembler in MSC 4 to turn it back into 8052 assembly language. I then fixed the bugs and added the features I needed, after which I programmed it back into an 8752 for further development. I sent the code back to Ciarcia, and I even got a little note back.

But the biggest tool I used was V Communications' Sourcer [v-com.com]. I got started using it to reverse IBM PC ROMs to hack the drive tables, and then started to add little features and fix little bugs. I used it commercially to reverse engineer the IBM RTIC communications card DOS drivers. I did that so we could port the card to Novell NetWare 2.15 as a VAD (IBM had no plans at the time to port it, and we needed it to provide a Burroughs data link between a NetWare server and a Burroughs mainframe). Sourcer was, and still is, the cleanest disassembler on the market, capable of giving you back highly commented source code from binary files. I haven't updated my license lately since I prefer open systems to Windows, but if I had to do that again I would go back to Sourcer.

The point is that reverse engineering is equal parts art and science, and if you need it bad enough and know assembly well enough, you'll find and/or build the right tools for the job. ESR needs to calm down a bit.

I actually read it as a matter-of-fact position. I didn't read it as a do-this-or-else stance at all. ESR is known for stating his opinion clearly, without the usual candy coating or bullshit you get from other writers.

So if a competitor can trivially disassemble drivers to reverse-engineer them (which is not the case anyhow), what is the added value in doing their reverse-engineering work for them and releasing nicely formatted source code with symbols and #defines and everything intact, with no optimizer obfuscation, and possibly even comments? Sounds like a gift to the competition if I ever heard one.

My argument has been that it's unlikely anyone will reverse engineer your hardware from your software, but perhaps this is not the case?

ESR seems to neglect the classic argument for open-sourcing the drivers: the users will make them better. From personal experience, I have noticed that device drivers (especially new ones for new products, like video cards released by the company that made the product) tend to be buggy as hell and can cause frequent crashes of a Windows machine. (Alex St. John has ranted about this many times in his "Maximum PC" opinion pieces.) By opening the driver source to its users, a company could get fixes to these problems much more quickly than by keeping it closed. This seems to me an even more convincing argument than the "some-kid-will-disassemble-them-eventually-anyway" one.

It seems that ESR's recent writings are both voluminous and comprehensive, and if he is taking suggestions for after-the-fact mods, perhaps we are beginning to see an era in which we "open source" our documentation, theory, and textbooks. Yeah, I know, books are open, you can read them. But I am talking about development projects, where the original author throws up a basic outline of the work, with a few sections filled in, and then manages the submissions. So I want to propose the _YALS_ project, standing for "Yet Another Love Story". We need some web space and a few core developers, and we'll be ready to go. The basic outline will be: 1) Character_One meets Character_Two, 2) Character_One loses Character_Two, 3) Character_One gets Character_Two. We should try to flesh the plot out by August 20th, aiming for a Version 0.1 publication by January 1st. Check the site at http://taz.eng.ua.edu:1138/crutcher/yals [ua.edu]

Forget the open source movement for a moment. The lifetime of computer hardware is incredibly short. By keeping the source to driver software closed, manufacturers accomplish the following:

1) Decrease the life of the product, shortening support obligations.

2) Hide any shortcomings of their hardware, saving on warranty obligations.

3) Force the consumer to upgrade by preventing product 'enhancements' or the fixing of 'problems'.

4) Avoid revealing design shortcomings to the competition.

Some wise man said that there are two reasons someone does something: a good reason, and the _real_ reason. The argument that it protects proprietary technology is in most instances completely false, as many chip makers release 'reference' designs to many hardware manufacturers, who in turn make only small changes to the basic design. And any valid 'proprietary' enhancements there may be are quickly undone when the next competing product hits the market. So companies are left with the real reason: less hassle and more money.

I do not think it is enough to argue politely about the benefits of open source. Consumers are going to have to get tired of being on the endless upgrade mill of expensive hardware and the expensive, bloated code to use it. How much power do you need to write a letter or balance a checkbook?

Maybe then we would see real innovation in the computer industry instead of creeping featurism.

The only way you are going to see open source from a manufacturer is if it is legislated by Congress, answering the demand of angry consumers who catch on. IBM's Aptiva MWAVE is a classic example of the above. Cut and paste the link below into your browser.

ESR makes the argument that product cycles are not as long as they were. This is true, though it misses the point that important bits can stay the same between generations. E.g., say you invent a whizz-bang method to help 3D graphics, no one else figures it out, and it gives you a big lead; for your next-gen cards you might just use a more refined version of the same algorithm. By publishing the original specs, you could give your competitors a helping hand. Of course, it'd still take them another year or more to actually bring something new to market, by which time you've got your 3rd gen out, but if your competitors have a lot more money for research than you, they could catch up.

However, even this is not a real excuse - if you have a situation like this, publish as much as you can, and guard your most treasured secrets if you want to be that way.

For example, Matrox released the specs for their G200 graphics cards, except for their triangle setup engine (for whatever reason), and they've also released most of the specs for their next-gen G400, even before it started shipping!

http://fravia.org (which seems to be down half the time) is all about reverse engineering (cracking). It isn't just a simple how-to-crack (software cracking, not bad-"hacker" stuff) site; he really is interested in reverse engineering. There are even pages about Linux cracking. If his site is down, just search for +HCU or +ORC along with +fravia. When I was into that type of stuff, before I got into Linux, I used W32Dasm, which there are shareware versions of, but just as I stopped doing that, IDA became really big. IDA seems to be better, but W32Dasm worked well. And for a Windows debugger, you can't beat SoftICE, by NuMega. No other debugger for Windows even comes close.

My favorite disassembler is IDA. It handles a lot of other targets as well. Having said that, I'd like to point out this argument is flawed. It's not always possible to disassemble something and then reassemble it. Case in point: I made a program that sort of makes this task impossible with a traditional disassembler.

And EZIP is different from PKLite how? If a program was PKLite'd, one would only have to run it in a debugger until it had unpacked itself into memory, then dump the memory contents. Eventually, tools such as UnPKLite were written to automate the process. Compressing a file just adds one more level that will take all of 10 minutes to eliminate.
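The unpack-then-dump trick is easy to sketch. Here zlib stands in for PKLite's compressor, and all the names are invented for illustration; the point is only that the packed program must reconstruct the original bytes in memory before it can run:

```python
import zlib

# Pretend machine code for the original, unpacked program.
original_code = b"\x55\x89\xe5\x90\xc3" * 20

# "Packing" wraps the compressed payload with a small unpacker stub.
packed = (b"UNPACK-STUB", zlib.compress(original_code))

def run_in_debugger(packed_program):
    """Run the packed program just past its unpacker stub, then dump memory.

    The stub's only job is to decompress the payload into memory before
    jumping to it, so a memory dump taken at that moment IS the original
    program, ready for a normal disassembler."""
    stub, payload = packed_program
    memory = zlib.decompress(payload)  # what the stub does at startup
    return memory                      # the "dump"

dumped = run_in_debugger(packed)
print(dumped == original_code)  # True: packing added no real protection
```

Whatever compressor or encryptor the packer uses, the stub has to undo it before execution, which is why dumping after the stub runs always works.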

Even without using something like this, disassembly of programs cannot be 100% correct. There is a lot of information needed to reconstruct a program that is thrown away by the compiler. For example you can't determine where a jump table (switch statement) or function pointer (virtual function) will lead to without actually executing the program and then it's not possible to cover all of the cases that could occur. IDA does a pretty good job of guessing though.

Also IDA has hacks if it is looking at C code, like looking for procedure frame headers (push ebp; mov ebp, esp) and termination code (pop ebp; ret). My solution to correct disassembly is to use emulation. This ensures you never have an incorrect interpretation, but you will miss code that is never executed, and this can be important for understanding how errors and exceptions are handled.

You won't miss code. When you come to a branch, take BOTH routes into the code. For ISRs and such, knowing how and when your target processor handles them will allow you to find them easily. It too can be automated.
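Taking both routes at a branch is just a worklist traversal. A minimal sketch over an invented two-way-branch instruction set (not a real processor) shows how both arms get recovered without executing anything:

```python
# Toy ISA: ("bcc", taken_target) either falls through to pc+1 or jumps
# to taken_target; a static explorer simply follows both.
program = {
    0: ("bcc", 5),
    1: ("op", None),
    2: ("ret", None),
    5: ("op", None),
    6: ("ret", None),
}

def explore(start):
    """Follow BOTH successors of every conditional branch."""
    seen, work = set(), [start]
    while work:
        pc = work.pop()
        if pc in seen or pc not in program:
            continue
        seen.add(pc)
        op, arg = program[pc]
        if op == "bcc":
            work += [pc + 1, arg]  # fall-through AND taken path
        elif op == "op":
            work.append(pc + 1)
        # "ret" ends the path
    return seen

print(sorted(explore(0)))  # [0, 1, 2, 5, 6]: both arms recovered
```

This handles direct branches completely; it is the indirect jumps and interrupt vectors, as the post notes, that need extra knowledge of the target processor.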

Just because you can disassemble something doesn't mean you can understand it. It requires a huge amount of time to try to comprehend uses for unnamed memory locations and layout of data structures.

Here, we agree. However, I've never come across code I was reverse engineering that I didn't eventually figure out. It just takes time, patience, and a good reference library. :)

All coders know the finer points of how to reverse-engineer code, crack copy protection, and fix bugs. Back in the Amiga days, I had the pleasure of using very fine reverse-engineering programs, so I would clip portions of demos / games and such and turn them into relative assembly code.

Indeed, it always seemed weird when the program was written in C. However, C is a low-level language, and a good programmer can figure it out. As of '99 things have changed a bit. First of all, the complexity of programs has increased in both size and functionality [so we can talk about Kolmogorov complexity ;)].

Now imagine hardware whose specs aren't available. In the case of the RIVA TNT, I was really pissed off and almost went to work on cracking the Windows GL drivers open and writing Mesa stuff for it before the free stuff came in. Guess what! I never started it; let's say it didn't turn me on. The real reason, of course, is that it would have been an almost impossible task. Although one could argue that modularity in the hardware design would lead to modularity in the driver code, and thus comprehensible assembly, I wouldn't be so jumpy about it. It's still an undecidable problem, folks!

Sincerely, I don't think I could accomplish that without knowing the card's specs. I recall that it was quite difficult to find out the Amiga AGA's specs by disassembling the AmigaOS 3.0 ROM. And that was just 2D.

Precisely Bruce, hardware vendors that don't divulge the source to device drivers these days are really only shooting themselves in the foot! An awful lot of my friends and acquaintances now list Linux compatibility as not just a contributing factor, but as a deciding one in all of their hardware purchases. This big change, only in recent years, affirms to me your comment that we are indeed a market force now, a fast growing force, and one to be reckoned with.

Information on how to reverse engineer is all over the web. Fravia, the first site listed, is by far the most detailed, has been around a very long time, and has at least 6 mirrors: in Europe, Asia, a couple in the USA, etc.

The more interesting question is: given that most of these sites have been around so long, why don't we see more reverse engineering of software going on? I think the availability of all this information (especially on Fravia) weakens ESR's argument significantly. Regardless of what he says, reverse engineering is really, really difficult, even for small pieces of code. I don't think he'll convince very many people based on that argument.

As for the appropriate tools, a while ago I found copies of WDasm and SoftICE using FTP search (remember, one version of SoftICE was a fully operational time demo which could be cracked with itself). Much of the other stuff you might need that I've seen is freeware or shareware.

So if a competitor can trivially disassemble drivers to reverse-engineer them (which is not the case anyhow), what is the added value in doing their reverse-engineering work for them and releasing nicely formatted source code with symbols and #defines and everything intact, with no optimizer obfuscation, and possibly even comments? Sounds like a gift to the competition if I ever heard one.

It's better to release source and win the respect of a community who will then go out of their way to buy your product than to keep your product closed. Either way, if someone wants to reverse engineer your product, they will.

My argument has been that it's unlikely anyone will reverse engineer your hardware from your software, but perhaps this is not the case?

I can only point to a couple of examples where a company wouldn't release specs, so people took it upon themselves to reverse engineer the drivers. One recent one would be the GATOS project, which reverse engineered the ATI television cards fairly well, thus forcing ATI to cooperate in the support of Linux/FreeBSD. The other was the DOS graphics/sound demo group Renaissance, who reverse engineered the Advanced Gravis UltraSound wavetable sound card, thus forcing Advanced Gravis to release the low-level card info and a free SDK.

Of course, there's also the arcade emulator scene as an example. They reverse engineer the game ROMs, usually without access to or much info on the arcade hardware except maybe the make of CPU, and come up with complete working arcade emulators in a week or so. Just take a look at MAME. It's up over 1400 emulated games, if I'm not mistaken.

Not to start up the whole hacker/cracker thing, but even in America I've long heard the term "cracker" used for someone who cracks a game's protection ("cracked by xxxx"), rather than someone who hacks into systems. Apparently the old-school "hackers" are partially successful in getting their word back, but I dunno...

Enterprising hardware manufacturers please note: there is currently only one DV+analog video capture card on the market, and that product lacks Linux support.

Fast Electronics [fastmultimedia.com] is the only manufacturer of this type of card, viz. the Fast DVMaster [fastmultimedia.com], a PCI card which has 2 DV inputs, 1 DV output, and analog S-VHS input and output with both NTSC and PAL support (SECAM extra). The card has no Linux drivers available yet, but customer pressure could apparently change this. Requests for Linux drivers are being solicited by Anuschka.Schweizer@fastmultimedia.de [mailto], who is the product manager for the DVMaster. Alternatively, an enterprising competitor might like to enter the market.

Digital Video capture cards are a limited interim solution because they lack legacy analog video compatibility. A DV+analog capture card, which captures both analog and digital video, is what many people who buy DV capture cards later realize would be much more useful to them: they'd get backward compatibility with (S)VHS and real-time conversion to Digital Video for their old collections of (S)VHS tapes.

I gathered that he was saying do this or your potential customers will call you a moron.

As one of those people who influence buying decisions, I can confirm that position. Vendors who won't release specs ARE clueless, and I will strongly recommend against them if any reasonable alternative is available.

Eric's arguments are valid, but he's missing the primary motivator for hardware vendors. If you don't open the source for your device drivers, we will not buy your hardware. We are a market force now, and don't have to rely on threats of reverse-engineering any longer.

Chip vendors don't like this because they don't want their competitors to use their own documentation to produce drop-in replacements for their components. They use trade secrets because that's the cheapest form of protection. However, if they have to open their interfaces to get their chips designed into Linux systems, they will publish their documents and use patents and copyright as protection against their hardware competitors.

Exactly. I do not want to give my money to a company that does not open their hardware specs, and I would rather give it to a company that has open source drivers. What would be useful in this context is a web-based hardware database, where one could find quick info about how open a specific piece of hardware is before buying it.

like there could be 5 categories:

1) proprietary hardware; company does not give out specs
2) company allowed people to write binary-only drivers under NDA, but for free
3) company releases binary-only drivers
4) company published detailed documentation
5) company wrote an open source driver, or funded/helped in its development

I would only buy hardware in category 4 or 5, but often it is not easy to find out whether the person who wrote the drivers got support from the company. Sure, a grep through the Linux kernel helps sometimes, but that is not something the marketing departments will consider a threat to the image of their product.

Having such a database, where one could directly compare the openness of various vendors, would put some market pressure on those companies.
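Such a database could start very simply. Here is a sketch using the five categories proposed above; the device and vendor entries are made-up examples, not real data:

```python
# The five openness categories from the post above.
OPENNESS = {
    1: "proprietary, no specs",
    2: "binary-only drivers written under NDA",
    3: "binary-only drivers released",
    4: "detailed documentation published",
    5: "open source driver written or funded",
}

# Invented example entries, purely for illustration.
hardware = [
    {"device": "FooCard 64",   "vendor": "FooCorp", "category": 1},
    {"device": "BarSound Pro", "vendor": "BarInc",  "category": 4},
    {"device": "BazGfx G1",    "vendor": "BazLtd",  "category": 5},
]

def worth_buying(db, minimum=4):
    """Filter to hardware open enough to buy (category 4 or 5, per the post)."""
    return [h["device"] for h in db if h["category"] >= minimum]

print(worth_buying(hardware))  # ['BarSound Pro', 'BazGfx G1']
```

Even a flat list like this, published on the web, would let buyers compare vendors at a glance and apply the market pressure described above.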

4. company releases open source driver (nVidia)
5. company releases detailed specifications (hopefully Matrox, soon)
6. company releases detailed specifications AND has some employees help with development (no one yet that I know of, at least in gfx cards)

I remember reading once that someone claimed that if the entire internet worked together, they could write a book and have it ready for publication in one day. This was a few years ago; I don't remember who said it or where I read it. If anyone can back me up on this, I'd appreciate it.

The only way you are going to see open source from a manufacturer is if it is legislated by Congress, answering the demand of angry consumers who catch on.

I hope you're not seriously advocating something like this. I can see it now: a computer hardware market as regulated as the medical device industry, with clowns like Ralph Nader riding on top, 'directing' hardware manufacturers in how to 'innovate' like good citizens. Costs would skyrocket, and the whole industry would devolve into a chickensh*t pack of me-too companies.

... is what supporting an open source driver is for a manufacturer. Now, it might be worth it, but there's a lot of support work that goes with releasing driver information. There's some generic stuff that goes with the chip, but each board is potentially different, and many board manufacturers simply don't have the resources to support an external driver porting effort. The people who know enough to write the docs are busy writing the next-generation drivers. And in case you were wondering, no, just releasing a couple of binderfuls of specs is not generally enough to write a good driver.

Are you missing the point that they could simply open the driver source? Yes, without good docs we might have a hard time reading it, but at least we could port it. And reading it would not be any more difficult than reverse-engineering is now.

I am not an electrical engineer, but how impractical would it be to start an open source hardware movement? Linux and the GPL were created because someone got sick of having to use software that was created for the sole purpose of making money. This has led to a new revolution in the way some people look at software development. Why not start a similar movement in hardware? I know that hardware is much harder to build, as far as manufacturing goes, but Linux was started on a lot of donated machines... why not get some donated manufacturing equipment? Just an idea, but why not make the entire computer market an open source market? Nobody likes disassembling and porting (not normal people, anyway :P), so get rid of it. I don't know if there is anything out there like this; if so, I would be interested in hearing about it.