Thursday, September 10, 2009

Random Things: Volume #7

Moreover, the code line from all of the major vendors is quite elderly, in all cases dating from the 1980s. Hence, the major vendors sell software that is a quarter century old, and has been extended and morphed to meet today’s needs. In my opinion, these legacy systems are at the end of their useful life. They deserve to be sent to the "home for tired software."

I'm not quite sure why old code is necessarily bad. From the comments:

And of all the valid criticisms of a model or a technology, "elderly" and "tired" are worse than useless. Do we believe that technology builds on prior discoveries, or that new technology throws older discoveries away? By such a standard, we would stop teaching Boolean logic, Turing machines, and all the other things that predate us.

Also in the comments you learn that Mr. Stonebraker is the CTO of Vertica. Makes perfect sense...I guess. It would have been nice to see that up top, though.

An Idea?

Since everyone's all abuzz about 11gR2, I think it's a good time to bring this up. Back in June, I was having problems debugging an INSERT statement. I kept getting "not enough values" (ORA-00947).
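For context, ORA-00947 fires when an INSERT names more columns than its VALUES clause supplies. The original Oracle statement isn't shown in the post, so here is a minimal sketch of the same shape of mistake using Python's sqlite3 (SQLite reports its own variant of the error; the emp table and its columns are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (id INTEGER, name TEXT, dept TEXT)")

# The column list names three columns but only two values are supplied --
# the same mismatch that raises ORA-00947 ("not enough values") in Oracle.
try:
    conn.execute("INSERT INTO emp (id, name, dept) VALUES (?, ?)", (1, "Smith"))
except sqlite3.OperationalError as e:
    print("insert failed:", e)
```

Counting the columns against the VALUES placeholders is usually the whole fix; the error shows up most often when a column is added to a long INSERT's column list but not to its values.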

4 comments:

The tendency to deride "old" code is odd. If an old snippet of C code that works reliably is at the heart of a system, why would anyone rewrite it? At some point, larger systems will be built on a layer of code that no one ever thinks about. An extreme example comes from a book, A Fire Upon the Deep, set in the far future. The author makes a reference (if memory serves) to the fact that the near-sentient machines of the time use an offset counter from 1980 to determine the time of day... And why not? Some far-future Terminator will probably report its latest human kill to the Mainframe through a service, using a protocol built from layers of protocols on top of TCP. And the kill report will probably be written as a row in an Oracle-based database...

Unless Stonebraker can show me something that works better, I am sticking with the tried and true. Just because Google or some hot new technology has something like MapReduce doesn't make it applicable to everything. There is a reason they use what they do.

Secondly, take the search engine. It provides a way to search large chunks of data, but is it accurate? Does it protect data integrity? In many cases Google can live with errors, as long as they are minor. In the DB world, people go with something like Oracle because THEY NEED transactional integrity, etc. I just hate when people say it should be thrown out but can't show me anything viable to replace it.
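The transactional-integrity point above can be made concrete with a small sketch, using Python's sqlite3 as a stand-in for a full RDBMS (the accounts table and transfer helper are hypothetical): either both sides of a transfer commit, or the whole thing rolls back and the data is untouched.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE accounts ("
    "id INTEGER PRIMARY KEY, "
    "balance INTEGER NOT NULL CHECK (balance >= 0))"
)
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 50)])
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically: either both updates land or neither does."""
    try:
        # The connection context manager commits on success and
        # rolls back the whole transaction on any exception.
        with conn:
            conn.execute(
                "UPDATE accounts SET balance = balance - ? WHERE id = ?",
                (amount, src))
            conn.execute(
                "UPDATE accounts SET balance = balance + ? WHERE id = ?",
                (amount, dst))
    except sqlite3.IntegrityError:
        pass  # overdraft: the CHECK constraint fired, nothing was changed

transfer(conn, 1, 2, 500)  # would overdraw account 1; rolled back entirely
print(conn.execute("SELECT balance FROM accounts ORDER BY id").fetchall())
# prints [(100,), (50,)] -- both rows unchanged
```

A search index that drops or duplicates a document is an annoyance; a banking system that debits one account without crediting the other is a disaster. That is the gap the commenter is pointing at.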