Java discourages intelligent use of database technology: Discuss.

Not trying to provoke a major conflict with all you good people at Javaranch, especially as I'm still learning lots of great Java goodness from all of you, but the above statement encapsulates a common experience I've had on several recent projects, and I'd like to see if others have had the same experience.

I'm an old Oracle database developer with some (about a year) experience of J2EE etc, plus a keen interest in various other technologies. I'm interested in Java/J2EE and other OO tools, but most of my career has been spent doing RAD with various (mainly Oracle) tools on an Oracle RDBMS platform. So I'm pretty comfortable with databases and in particular I have learned to appreciate the tremendous advances in relational technology and the immense power of a well-designed database application.

But in recent years I have found myself working on the database side of several n-tier J2EE applications, where I have encountered almost universal ignorance of (and even contempt towards) the power of RDBMS tools on the part of Java designers/developers. As a result, these projects have tended to produce poorly designed, inefficient, over-engineered, top-heavy J2EE solutions to problems that would not even have existed if they'd made proper use of the RDBMS tools available.

This is a major change from just a few years ago, when businesses were still very aware of the importance of their investment in powerful and effective technologies for data storage and manipulation. Today, this often seems to have been forgotten, even though there's an obvious reason why the database is called a "persistence" layer: because the data will stick around long after this year's J2EE application has been mothballed and replaced with next year's fashionable technology. If you want to be able to use your data in future, you'd better put some thought into getting it right at the outset.

My impression is that this growing (and often expensive) ignorance stems from the lack of in-depth experience with relational design/technology at university and in the early years of a junior developer's career, combined with the typical background of many more senior Java developers/designers i.e. they often got into Java from a strongly procedural (C/C++) background, again without any real exposure to the very different, set-oriented thinking required for large-scale database development.

The result is that several J2EE projects I've seen basically regard the database as a glorified spreadsheet and design/build their systems accordingly, treating the database as a purely passive dumb store, instead of taking advantage of the powerful tools available within the database for storing and manipulating massive volumes of data efficiently and flexibly. So you get people fetching thousands of records individually from the DB, modifying them in the Java tier and then storing them again individually, instead of calling a single SQL update statement, for example.
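To make that concrete, here's a rough JDBC sketch of the two styles (table and column names are invented for illustration, not from any real project):

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class DiscountUpdate {

    // Anti-pattern sketch: fetch each row, modify it in Java, write it
    // back individually. For N rows this costs roughly 2N + 1 network
    // round trips.
    static void rowAtATime(Connection con) throws SQLException {
        try (PreparedStatement select = con.prepareStatement(
                 "SELECT order_id, price FROM orders WHERE status = 'OPEN'");
             PreparedStatement update = con.prepareStatement(
                 "UPDATE orders SET price = ? WHERE order_id = ?");
             ResultSet rs = select.executeQuery()) {
            while (rs.next()) {
                BigDecimal discounted =
                    rs.getBigDecimal("price").multiply(new BigDecimal("0.90"));
                update.setBigDecimal(1, discounted);
                update.setLong(2, rs.getLong("order_id"));
                update.executeUpdate();   // one round trip per row
            }
        }
    }

    // Set-based alternative: one statement, one round trip, and the
    // RDBMS plans and executes the whole operation itself.
    static final String SET_BASED =
        "UPDATE orders SET price = price * 0.90 WHERE status = 'OPEN'";

    static int setBased(Connection con) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(SET_BASED)) {
            return ps.executeUpdate();    // returns the number of rows changed
        }
    }
}
```

The second version also gives the database a chance to optimise the whole operation as a single unit of work.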

This is kind of like spending tens of thousands of dollars on a Ferrari (which probably costs about the same as a large-scale enterprise Oracle RDBMS), driving it home, taking the wheels off and keeping the chickens in it. You can do it, sure, but why on earth would you think it was a good idea?

There is also a lack of understanding of good relational design, with Java designers simply mapping their classes onto tables without paying attention to issues such as relational integrity, identifying proper natural keys, normalisation etc. There is widespread misunderstanding about the fundamental difference between OO modelling and relational data modelling, which leads to OO designers imposing inadequate data models on the underlying RDBMS layer.

The relational data model is based on sound mathematical principles, and the RDBMS represents a practical implementation of that model, so any relational designer is essentially working on the same theoretical basis (allowing for the requirements of a particular system or for variations such as star schemas etc). If two skilled data people try to model the same business domain under the same constraints (business rules etc), they will typically produce essentially the same data model, because they are applying the same (pretty rigorous) theoretical model.

OO modelling is not based on any single underlying theoretical model, but on a growing library of recommended practices - design patterns. These patterns are needed precisely because otherwise there is no clear theoretically justified solution to many common problems in OO design. So the quality of OO design tends to be much more variable and subjective than relational design. If you're a real expert, then you'll produce a great design. But if not, well...

It's kind of like the difference between medieval architects who built cathedrals based on ad hoc rules and practices developed after many years of often catastrophic blunders, and modern architects or civil engineers, who can base their designs on a sound foundation of materials science, engineering, physics and so on.

Unfortunately, the medieval OO approach appears to be spreading in many J2EE projects, as the importance of sound relational design is increasingly neglected in favour of ever more complex J2EE "solutions" to old problems that only arise through a refusal to take advantage of existing solutions from the boring old relational world. This is perhaps partly due to the fact that J2EE is inherently complex, so that many systems designers tend to focus so much on this complexity that they forget about the simpler and more robust alternatives available just over the O/R fence.

But whatever the reasons, the results often seem to be pretty much what you'd expect if you let a medieval builder design your 21st century skyscraper...

Anyway, that's my experience and my impressions of why this seems to be happening.

But what about you guys, have you seen similar developments at the OO/relational interface, and if so, why do you think that is happening?

Or are you all fortunate enough to work on projects where OO and relational technologies are being used intelligently and appropriately to produce gleamingly efficient, robust and flexible n-tier systems that are the envy of all your neighbours?

Well, I am a Jr. Developer and I didn't get my degree in CS; I am mostly self-taught. My only exposure has been to J2EE projects that are more reporting-based (query-oriented) rather than DML-heavy. I have a question about this part:

So you get people fetching thousands of records individually from the DB, modifying them in the Java tier and then storing them again individually, instead of calling a single SQL update statement, for example.

When you say "storing them again individually", does that mean they wrote thousands of UPDATE statements, each with a different primary key in the WHERE clause?
And are you suggesting they should instead use one UPDATE statement to apply all the changes the user made?

My intention is to learn some of your intelligent use of database technology since I am only a Jr. level person; I am not here to criticize your point, because you have a lot more experience than I do.

I have a suggestion, if you feel so strongly about this issue maybe you should write a book about it.

Like David, I'm also a Jr. developer, so my point is not to criticize you or prove you wrong. But one thing I'm wondering about: have you used any ORM framework, like Hibernate or iBATIS, which might fit your criteria for better use of the RDBMS than plain JDBC?

Many programmers (not only Java programmers) know a lot about programming, but don't know a lot about relational databases. When they need to design a system that has to store data, they very much look at it from the programming point of view, and treat the database as nothing more than a black box where you persist your data in. They probably know some basic SQL, but they don't really understand the principles of designing a database with a correct and efficient structure, integrity constraints, normalisation etc.

Things like ORM frameworks such as Hibernate try to solve the OO - relational mapping for you, but also encourage you to design the database in a certain way that reflects the OO design of your program. That database design might not be the best from a relational database point of view.

I must say that I'm not a database expert myself, and I never had any formal training on relational databases (what I know is self-taught, mostly on the job). If I'm in a project where I need to design a non-trivial database, I'll find a database expert to help with it.

I do not know anything about Hibernate or anything related; however, I did learn quite a bit about referential integrity and normalization. Are you seeing poor DB design, like tables that should have been broken into smaller tables, and similar issues? What do you tell the others when you see poor design? I think it would be very helpful if you could provide us (people with less experience than you) some tips to avoid bad design; that would be awesome.

This seems to be a rant about how ORMs are misused and OO and relational databases are not linked together correctly to produce the desired results. Some of it is true.

Many programmers (not only Java programmers) know a lot about programming, but don't know a lot about relational databases.

We have to define 'a lot' here. I know about joins, table spaces, types of indexes, query planners and tweaking them, stored procedures, synonyms, sequences, transaction isolation levels etc. However, my job is only to design the DB. If we are hit with a performance problem that calls for tweaking a DB setting, as opposed to a design or transaction-related problem, I would have no option but to turn to a DBA. I would not consider myself a database expert either, so I guess we are on the same page on this one.

I have used hibernate on one small project so far. I never design the classes first and then map them to a DB table. It might simply not make sense. I design the tables first and relate them up. After this I reverse engineer the tables to classes and annotations.

I think the inefficiencies that you talk about are related to fetching unnecessary data, incorrect joins, inefficient queries that an ORM might write, etc. I fear using ORMs for this very reason. Here is a comparison of two projects.

Project 1

This uses a plain JDBC connection pool managed by the container. Applications take the connection from a pool and use a JDBCHelper class I wrote to get data from the DB and put them into objects. I found this to be very flexible but it was tedious to write. It took more time to write the VO and DAO classes.

Project 2

This project uses Hibernate to map tables to classes. It was a simple project, so I decided to give Hibernate a chance. I used some persistence tools that were Eclipse plugins. The tools automatically converted the tables to classes and wrote the VO and DAOs for me. It took 10 minutes, versus several hours to write the code by hand. However, at times the code fetched data that was not needed, updates required entire objects to be populated, and inserts were taking longer than expected.

It would be unfair to jump in and say Hibernate was introducing the inefficiencies. I find that with ORMs, getting to the end is easy, but fixing the stuff that pops up at the end, like performance problems, is a real pain. This is because you will end up tweaking ORM parameters and debugging the internally generated queries to fix the problem. The time taken to fix the problem will make you wonder if sticking to JDBC + manual DAOs + VOs is a better idea. I guess the benefits would come to light as you understand the ORM a little more. Flexibility is also a great boon when doing things manually. With ORMs, though, moving to another DB should take significantly less time if you were to change the dialect that it uses.

Overall I would use an ORM again only after I understand it a little deeper. The inefficiencies that are introduced probably stem from not understanding how the ORM should be tweaked.

You're blurring the lines between programmer, database designer, and DBA (database admin). Not all programmers, regardless of the platform, know how databases work, how to design good databases, or how to administer them. On large enough projects, these kinds of things are handled by different departments and experts. For example, while I consider myself an expert in JDBC and Java, if I were integrating with an Oracle database I'd take the advice of a certified Oracle DBA quite seriously.

Part of what you're missing is that programming is no longer a single guy sitting in his room making an application from scratch. Software development can involve hundreds of people in teams across the world. It's a business now, and likewise, it's better to have specialists for certain aspects of the system. In other words, a division of labor (analyst, designer, DBA, developer, tester) has worked for other industries for years; software development was just slower to catch up. Is it great to have a single guy who can do everything? Sure. Do they do well in corporate environments where they are told they can only do 1-2 things? Not all that often.

Lastly, I'm not a huge fan of ORM because it keeps me from dealing with database performance myself. The idea, though, is that if the ORM software is smart enough, it can solve some of the performance issues for you. It's not that the Java developer is prevented from getting good performance from an object mapping tool; it's that there are teams of developers devoted to making that object mapping tool faster. Either way, you're ignoring the obvious benefits of having software that's completely free of SQL code. Take it from someone who has had to review 10-20 year old SQL queries: most of the time you wish you could throw them out. Having an ORM tool means that you never have to edit code to rewrite SQL queries, and that in and of itself is a wonderful thing for long-term code maintenance.

Well, in my opinion a large proportion of the software business is still in the medieval phase which you describe.

If you're an engineer in Britain and you want to build a bridge, there are 16 volumes of guidelines which you must follow to do that. (I stumbled across that in Wikipedia a couple of weeks ago but I couldn't give you the link.) Whereas if you are an IT person in Britain and you want to build a website, there aren't any guidelines which you must follow.

Instead there's a variety of approaches. There's the Michelangelo approach, where one person and a few fairly talented apprentices build the thing. This isn't especially scalable and it involves the problem you alluded to, where Michelangelo is a great painter but doesn't know anything about wall-building, so he chooses a bad wall to paint on and it starts to flake off after a few months.

Or there's the Egyptian Pyramid approach, where you have a small group of technicians designing the thing and a horde of ignorant serfs paid minimal wages to build the thing. This ends up with something which implements a small and simple task but can't be changed to do anything else, and is not at all beautiful. Frequently it goes broke and you end up with nothing, although in the computing world it's a lot easier to get rid of a failed project than to get rid of half a real pyramid.

But so far there isn't a British Engineer approach. Ironically frameworks like Java EE are meant to be that, but so far I don't think they have succeeded. Still, it took several hundred years for people to learn how to build cathedrals, and the road-builders have two thousand years of experience to draw on, so perhaps it's too early to expect competent engineering in the software field.

I'm with Jesper and Deepak here. I've used JDBC (and EJB 1.1 entity beans - aargh!), and I've used Hibernate a little (and Grails), and I know that if I'm trying to get some Java code to write stuff to the database etc, the ORM makes things easier. But it doesn't make it right, especially once you move beyond a fairly trivial app of the kind you often encounter in tutorials etc and start looking at enterprise-scale databases. But this distinction tends to be overlooked in many J2EE projects that I've seen. As Jesper and Deepak say, we all need to recognise the limits of our own expertise and know when to turn to people who understand the stuff that we do not. That's why I keep coming back to Java Ranch, after all!

Paul:

Glad to hear I'm not a lone voice in the wilderness, and I like your Michelangelo example. But increasingly we seem to be stuck in the land of the pyramids...!

Davie Lin:

I can't really give you a Relational Databases 101 course here, so I'd recommend that if you're interested or expect to be working on an enterprise database application, you should pick up and read a good textbook on the subject. You can also learn more about the theory and practice of relational databases via distance-learning courses with various universities (e.g. the UK's Open University offers a good one), if you're really serious. Of course, you can't become an expert just by reading/studying, but it often helps to understand more about the principles of an unfamiliar area of technology, and also to realise how much more there is to know. That's certainly been my experience with Java/OO - I know enough to get me started, but also enough to know when I need to ask for help.

Also, if you're using a particular RDBMS at work, try downloading a "lite" version to work on at home (such as Oracle XE for example), and push your boundaries to find out more about how your particular RDBMS really works, and what can be done with it beyond simple Hibernate/JDBC queries.

Meanwhile, some of the common mistakes I've seen in Java-style DB applications include:

"Chatty" interactions e.g. fetching lots of records via a SQL SELECT, modifying them, then writing them back via individual UPDATEs, instead of just running one UPDATE with the appropriate WHERE clause.

Over-reliance on surrogate keys e.g. treating an ID as a real key, when it is actually a "surrogate" key i.e. a convenient alternative to your natural primary key (as in Hibernate etc, where this is how you usually access your records). Surrogates are fine in themselves, but if you cannot find a genuine attribute (or combination of attributes) that uniquely identifies a single instance of your class (or record in your table), then you don't yet understand your data properly.

Refusal to take advantage of other platform-specific DB tools such as Oracle's PL/SQL language, which allows you to perform many data-oriented operations far more efficiently than using the equivalent combination of Java and JDBC/SQL.
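As a rough illustration of the surrogate-key point, here's a hedged, Oracle-flavoured DDL sketch (all names invented): you can keep the surrogate id the ORM wants, while still declaring the natural key so the database enforces it.

```java
public class KeySketch {

    // Hedged, Oracle-flavoured DDL sketch (all names invented): keep
    // the surrogate id that the ORM wants as the primary key, but still
    // declare the natural key UNIQUE so the database itself enforces
    // what genuinely identifies a row.
    static final String DDL =
        "CREATE TABLE employees ("
        + " employee_id  NUMBER PRIMARY KEY,"    // surrogate key for Hibernate etc
        + " payroll_no   VARCHAR2(10) NOT NULL," // the natural identifier
        + " start_date   DATE NOT NULL,"
        + " CONSTRAINT uq_employee_natural UNIQUE (payroll_no)"
        + ")";
}
```

If you can't write that UNIQUE constraint because no such attribute exists, that's usually a sign the data analysis isn't finished yet.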

Scott:

Interesting points, but I have to disagree to some extent. I'm not calling for a return to the days of the pizza-fuelled Guru programmer sitting alone in his cubicle cranking out Genius Code, far from it. Like you, I recognise the importance of having the right mix of specialists on a large project. I've worked that way for much of the last 20 years anyway.

The problem I'm seeing is that while my recent projects have been infested with J2EE architects, J2EE designers, J2EE developers and so on, the value of database skills is rapidly being relegated to a post-production maintenance issue for DBAs locked in some underground basement, rather than treated as an integral part of the system.

If this approach worked, then maybe that would be OK, but every one of the projects I've seen taking this approach has been a near total disaster. As I said above, I'm mostly a database developer, so if I wanted to build a J2EE application, I'd make sure I talked to and listened to some J2EE experts. But the opposite does not seem to be the case: too many J2EE Architects/Designers/Whatever seem to think that because they know all about J2EE, they know all they need to know to build enterprise class systems that are ultimately about processing data and storing data in a database.

Incidentally, I've worked on several systems where a "legacy" database was being used as the basis for a new front end application (using various GUI technologies including Java), precisely because the data was regarded as the valuable asset, while the front-end apps come and go. Data persists, that's where the real long term value lies for many businesses and that's where you need to invest some effort in making sure your database is well designed and well managed.

"Take it from someone who has had to review 10-20 year old SQL queries, most of the time you wish you can throw them out."

Yeah, me too. Especially when they've been written by procedural C/C++/Java people

As for the idea that ORMs will replace SQL, nice dream!

Either way, you're ignoring the obvious benefits of having software that's completely free of SQL code...

Why do you think that would be a good idea? That would be like going through your toolbox and throwing out all the screwdrivers and wrenches.

SQL is really well designed for what it does - e.g. Groovy-meister Scott Davis has described SQL in fashionable terms as "a DSL for databases" - and if you want to do difficult things with a database, you need to know how to use the appropriate tools. ORMs are a well-intentioned fudge to allow people who don't understand relational databases to do relatively easy things with an RDBMS without learning to use those tools properly. But you cannot apply a one-size-fits-all generic approach to serious problems that will be specific to a particular RDBMS platform, application or data domain. For that, you need to really understand your data and the technology that is used to store and manipulate that data.

SQL rocks, dude!

And remember, there is far more to a modern enterprise class RDBMS than mere SQL processing, so knowing SQL is just a small part of the job of developing serious database applications. Just as there is a lot more to J2EE than simply understanding the syntax of Java.

chris webster wrote:Not trying to provoke a major conflict with all you good people at Javaranch, especially as I'm still learning lots of great Java goodness from all of you, but the above statement encapsulates a common experience I've had on several recent projects, and I'd like to see if others have had the same experience.

Why Java in particular? Because you are asking at JavaRanch? Because you've worked with Java developers? I fail to see why Java would have a bigger problem with this than .NET (which also has ORM frameworks).

Deepak Bala wrote: The tools automatically converted the tables to classes and wrote the VO and DAOs for me.

I've seen this more often than tables being designed to match the code. It's a matter of which side the design starts from. Both fail to deal with the impedance mismatch well.

I think lack of experience is a problem anywhere. When my team was new, we made large monolithic SQL statements that calculated everything for us. They were too slow (mainly due to not being cached for various numbers of parameters). Then we switched to simple queries that did one thing only, and the network traffic killed us. We finished by batching them.
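For what it's worth, that batching step can be sketched roughly like this in plain JDBC (table and column names invented, not our actual code):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Map;

public class PriceBatch {

    // Sketch of the batching compromise (table/column names invented):
    // instead of one round trip per UPDATE, queue the parameter sets
    // locally and send them to the server in a single batch.
    static int[] updatePrices(Connection con, Map<Long, Double> newPrices)
            throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                 "UPDATE products SET price = ? WHERE product_id = ?")) {
            for (Map.Entry<Long, Double> e : newPrices.entrySet()) {
                ps.setDouble(1, e.getValue());
                ps.setLong(2, e.getKey());
                ps.addBatch();            // queued client-side, no round trip yet
            }
            return ps.executeBatch();     // one round trip for the whole batch
        }
    }

    // Rough illustration of why this matters: per-row execution costs
    // about one network round trip per statement; batching collapses
    // them into (roughly) one.
    static int roundTrips(int rows, boolean batched) {
        return batched ? 1 : rows;
    }
}
```

It's a middle ground between the monolithic statement and the chatty one-query-per-row approach.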

But Codd defined the relational database paradigm back in the 1970s. That's over 30 years ago. So why the lack of experience? It's hardly a new field which hasn't had time to attract practitioners yet.

Jeanne Boyarsky wrote: Why Java in particular? Because you are asking at JavaRanch? Because you've worked with Java developers? I fail to see why Java would have a bigger problem with this than .NET (which also has ORM frameworks).

Hi Jeanne,

All of the above, to some extent I guess. Again, I must stress I'm not "anti-Java". I think it's more a combination of the dominance of Java in commercial software development, combined with the issues I've described above.

There's also the fact that Java is the first programming language that was actually fashionable. I remember when Java first came out, and it was definitely "cool" to be a Java developer in a way that never really applied to, say, COBOL. Java arrived at a time of massive growth in the IT industry, cost nothing, was surrounded by a lot of hype, and rapidly took over large areas of the industry. Today, though, Java is very much the mainstream, and the "cool" people seem to be looking at dynamic languages etc. As for .NET, it will never be "cool" in the same way because of the common antipathy towards its owner, plus .NET is still mostly a Windows-only platform in practice, whereas Java by its nature runs on pretty much any OS.

Also Java, and especially J2EE, is pretty complex, which means people have to invest a lot of effort ( = money) in making their J2EE applications work. This was particularly true in the early days of J2EE and is still the case today, despite the welcome arrival of tools such as Spring, ORMs etc. So I think there is also a tendency in Javaland to think (a) that difficult solutions must be intrinsically better than easy solutions (whatever happened to "K.I.S.S."?), and (b) that because you've invested so much in your J2EE development you have to do everything with Java to justify the cost. Yet in practice, this often leads to over-engineering, over-spending and over-reliance on the tools available within Java/J2EE instead of recognising the benefits of other technologies.

Finally, for those of us who remember the arrival of 4GL productivity tools in the late 1980s/early 1990s, the arrival of Java actually represented a step backward towards the old 3GL (C/C++ etc) world, although at least we got rid of all those ****** pointers! Java is great for geeks and people who really love getting deeply involved in code, but not for people who care about productivity. Even today, Java still doesn't offer anything like the productivity that was available with platform-specific tools such as Oracle Forms etc. I know that (with help from Java experts) I could do anything with Java that I used to do with Oracle Forms, for example, and more, but I also know it will take 3 times as long to do it, even allowing for my comparative lack of Java experience.

Of course, those 4GL tools also had their own limitations e.g. 2-tier client-server only, proprietary, expensive, Windows-only, and so on. And these days many of them have disappeared or turned into "legacy" platforms. For example, Oracle hasn't really done much with Forms in the last 10 years, as it has invested primarily in the Java platform - they liked it so much they bought the company!

So far as I can see, Java/J2EE is a great technology, but it should be just one tool in the toolbox. The problem is that many factors in Javaland seem to encourage a blind faith in the idea that all you need is Java. Which is roughly where I came in...

Jeanne Boyarsky wrote:Then we switched to simple queries that did one thing only and the network traffic killed us. We finished by batching them

Just read your interesting post about this issue. I'd suggest one small difference of emphasis, though. On using stored procedures you say:

However, it ties you to one database and requires you to split logic between the application server and database server. If your architecture already uses stored procedures, this is typically the best solution

This concern about being tied to one database is often raised in Javaland, but how often does a company really shift its entire system from one RDBMS to another? Many Java people I have worked with seem to regard this as a key issue, as if the company might decide to swap its expensive enterprise Oracle RDBMS for a MySQL server (or vice versa) at any moment. Yet my impression is that this is actually fairly rare when you're building serious enterprise applications, as people have usually chosen a specific RDBMS platform for specific reasons that are unlikely to change. In any case, moving an enterprise RDBMS application to another platform will almost always involve a lot of re-work on the DB side. If you've encapsulated your SQL inside stored procedures, at least it will be easy to find and change it all in one place, and in the meantime you will have all the benefits of using the available tools intelligently. As for splitting the logic, some things are just much easier and more effective if you do them with SQL etc, so you're going to have to do this anyway.
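As a rough sketch of what that encapsulation looks like from the Java side (the procedure name and its parameters here are invented for illustration):

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.Date;
import java.sql.SQLException;
import java.sql.Types;

public class ArchiveOrders {

    // Hypothetical stored procedure call (the procedure name and its
    // parameters are invented): all the SQL lives inside the database,
    // so porting to another RDBMS means rewriting the procedure body
    // in one known place rather than hunting through the Java tier.
    static final String CALL = "{call archive_closed_orders(?, ?)}";

    static int archiveBefore(Connection con, Date cutoff) throws SQLException {
        try (CallableStatement cs = con.prepareCall(CALL)) {
            cs.setDate(1, cutoff);                     // IN: archive rows older than this
            cs.registerOutParameter(2, Types.INTEGER); // OUT: number of rows archived
            cs.execute();
            return cs.getInt(2);
        }
    }
}
```

The Java tier only knows the procedure's name and signature; the set-based SQL behind it can be tuned, or ported, without touching the application code.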

So unless you have a concrete reason to do otherwise, I would always recommend taking advantage of whatever tools your specific (and often very expensive) DB platform provides to make the job easier and ensure you produce a robust and efficient database application, instead of restricting yourself to a vanilla lowest-common-denominator approach that will not perform well. Otherwise, it's the keeping-the-chickens-in-your-Ferrari syndrome again.

But Codd defined the relational database paradigm back in the 1970s. That's over 30 years ago. So why the lack of experience? It's hardly a new field which hasn't had time to attract practitioners yet.

Ok. So suppose you are in college. You take a class which covers relational databases (if you are lucky - in some schools it is an elective.) They cover how to write queries and the core concepts. They don't cover things like when to cache in your application because it is a class in relational databases. Everything is done in SQL there. Then you get out into the real world and you know what to do, how?

The same applies to Java or anything else. You take classes in programming but don't learn why it is important to use proper Java idioms, write unit tests, write readable code, etc.

chris webster wrote:So far as I can see, Java/J2EE is a great technology, but it should be just one tool in the toolbox. The problem is that many factors in Javaland seem to encourage a blind faith in the idea that all you need is Java. Which is roughly where I came in...

I see. Thanks for explaining all that.

chris webster wrote:This concern about being tied to one database is often raised in Javaland, but how often does a company really shift its entire system from one RDBMS to another?

We did. We switched from DB2 to Oracle. We still had to rewrite some queries that were using DB2-specific syntax, though, so the argument isn't that strong.

chris webster wrote: If you've encapsulated your SQL inside stored procedures, at least it will be easy to find and change it all in one place, and in the meantime you will have all the benefits of using the available tools intelligently. As for splitting the logic, some things are just much easier and more effective if you do them with SQL etc, so you're going to have to do this anyway.

There's also version control issues. If your SQL changes with every release, you have to make sure the database is correct.

I'm not anti-stored procedure. And we do use stored procedures - when doing something where it makes sense. However, I don't think it should be necessary to use a stored procedure for a single select statement which is where my article comes in.

Also, my quote about stored procedures only recommends them if you're already using them. The first time you use any new technology, there is an associated cost. For example, there is the cost of coming up with a process to keep the stored procedures and application in sync. I think this cost should be absorbed by a feature that actually needs a stored procedure, rather than one like select statement batching. You may disagree. That's the nice thing about opinions.

Well, at the highest level, OO is really not well suited to an RDBMS. There is an impedance mismatch between the rectangular tables that any RDBMS likes and the inherently hierarchical nature of an OO tree. While I have not personally used Hibernate, I have written perhaps a dozen mapping layers between various OO languages and various RDBMS packages over the past three decades. And the short answer is, it's never pretty.

Part of the problem is that businesses move slowly on something as critical as a DBMS. It took a very long time for RDBMS to be accepted. We were using flat files on mag tapes at least 15 years into the RDBMS revolution. It costs a ton of money to convert applications that are in production.

While it is true that a relational database has a mathematical model that is complete, that doesn't mean it's great. What is technically correct is that early relational theory (specifically the relational calculus) defines a basis set of operations that is theoretically complete. In English: you can do anything you want to do with a database using the functions of the relational calculus. It's proven, done.

But SQL is not relational calculus. SQL is more accessible, easier to learn and program. It's close enough for most usages, but when was the last time you saw a programmer doing a "project" function against a database?
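For anyone who hasn't met the term: "project" is the relational-algebra operation that keeps a subset of attributes and discards duplicate tuples - in SQL it is roughly SELECT DISTINCT with a column list. A toy in-memory sketch (table as a list of maps; names invented for the example):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Toy illustration of relational "project": keep only the named
// attributes of each tuple and remove duplicates, as relational
// algebra defines it. Purely in memory; no database involved.
public class Project {

    static List<Map<String, Object>> project(List<Map<String, Object>> rel,
                                             String... attrs) {
        return rel.stream()
                .map(row -> {
                    Map<String, Object> out = new LinkedHashMap<>();
                    for (String a : attrs) out.put(a, row.get(a));
                    return out;
                })
                .distinct()   // true projection is a set operation
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        var emp = List.of(
                Map.<String, Object>of("name", "Ada", "dept", "R&D", "salary", 100),
                Map.<String, Object>of("name", "Bob", "dept", "R&D", "salary", 90));
        // Equivalent SQL: SELECT DISTINCT dept FROM emp
        System.out.println(project(emp, "dept")); // [{dept=R&D}]
    }
}
```

The point of the sketch is the duplicate elimination: SQL's default SELECT keeps duplicate rows, which is one of the well-known places where SQL quietly departs from the pure relational model.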

Another huge issue is vendor lock-in. It's true that a lot of new Ferraris can be bought for the price of an enterprise-wide Oracle license. And I'm not picking on Oracle; the other vendors charge similar prices. To get the best performance out of any large-scale Oracle system, you have to use the Oracle features and extensions over the SQL standard. Once you do that, if you have a few hundred developers working a few years on the project, you can't consider switching to another RDBMS package: you are locked in.

This may or may not be an issue for your company, but often when the light bulb goes off for the top-level managers, namely that they have to keep using package O or S or whatever, and pay many hundreds of thousands of dollars (pounds, euros) each year for maintenance licenses, they get grumpy. So they issue an order that no more lock-in is allowable. So then we write to the ANSI SQL standard, or use JDBC, or whatever other minimal common subset we can use.
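A small, concrete instance of that trade-off is limiting a result set to N rows, which every vendor historically spelled differently. The sketch below (dialect names and the helper are my own, for illustration) contrasts the classic vendor forms with the SQL:2008 standard FETCH FIRST clause that modern versions of these databases also accept:

```java
// Illustration of the lock-in trade-off: "give me the first N rows".
// The vendor-specific spellings below are the classic historical
// forms; SQL:2008's FETCH FIRST is the portable modern equivalent.
public class RowLimit {

    enum Dialect { ORACLE_LEGACY, MYSQL, ANSI_2008 }

    static String topN(String table, int n, Dialect d) {
        return switch (d) {
            case ORACLE_LEGACY -> "SELECT * FROM " + table + " WHERE ROWNUM <= " + n;
            case MYSQL        -> "SELECT * FROM " + table + " LIMIT " + n;
            case ANSI_2008    -> "SELECT * FROM " + table + " FETCH FIRST " + n + " ROWS ONLY";
        };
    }

    public static void main(String[] args) {
        System.out.println(topN("orders", 10, Dialect.ANSI_2008));
    }
}
```

Multiply this by every dialect difference (sequences, outer-join syntax, date arithmetic, hints) and a few hundred developer-years of code, and the lock-in becomes very real.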

Using a minimal common subset can make sense, but if you are not literally keeping chickens in your Ferrari, you are using it to go to the corner grocery store.

Orthogonal to this: there are other DBMS systems; the world is not all relational. There are times when a tree-structured DB package is a better fit. There were good reasons the CODASYL databases were popular: they worked for many real-world business problems. But they lost out on flexibility, because an RDBMS can do anything, while there are queries you might want to run against a CODASYL database that are impractical if not impossible. So we have moved from a great design that works quickly for 95% of our use cases to one that is more flexible but less efficient 90% of the time, yet can do that last 5% better.

So far, all object-oriented databases have failed in the market. And the RDBMS market is limited to just a tiny handful of players. This is not good on many levels.

chris webster wrote: If you've encapsulated your SQL inside stored procedures, at least it will be easy to find and change it all in one place, and in the meantime you will have all the benefits of using the available tools intelligently.

I'm not anti-stored procedure. And we do use stored procedures - when doing something where it makes sense.

I'm anti-stored procedure, only because 85% of the time I see them used incorrectly. Stored procedures should be used for queries where it would be too time-consuming or costly to transfer the data back to the client. For example, queries that involve complex searching, such as searching multiple tables, are best done by a stored procedure to reduce the number of round trips to the database as well as to prevent the client from downloading intermediate data it doesn't really care about. The majority of queries (simple updates, selects, joins, etc.) can be done via a single SQL statement from the client, and I rarely see the need to put them in stored procedures.
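To make the round-trip point concrete: a lookup of N rows by key doesn't need a stored procedure, because it already fits in one round trip as a single parameterized statement. A minimal sketch (table and column names are hypothetical):

```java
import java.util.Collections;
import java.util.List;

// One parameterized statement with an IN list replaces N separate
// SELECTs (N round trips). The ? placeholders keep it safe to use
// with a JDBC PreparedStatement.
public class BatchSelect {

    static String selectByIds(String table, List<Long> ids) {
        String placeholders = String.join(", ",
                Collections.nCopies(ids.size(), "?"));
        return "SELECT * FROM " + table + " WHERE id IN (" + placeholders + ")";
    }

    public static void main(String[] args) {
        System.out.println(selectByIds("customers", List.of(1L, 2L, 3L)));
        // SELECT * FROM customers WHERE id IN (?, ?, ?)
    }
}
```

If the client can express the whole request in one statement like this, wrapping it in a stored procedure buys nothing; the procedure earns its keep only when there is intermediate data or multi-statement logic that should stay on the server.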

As far as using stored procedures as a management tool and, as Chris said, to keep everything in one place... I wish it worked out that way. In practice, a number of nightmare anti-patterns and supportability issues often creep into the code, making it a lot harder to manage rather than easier. The most common issue is that the name of the stored procedure often doesn't reflect what it does, either because of poor naming (see: The Daily WTF) or because someone updated the logic to do something completely different without changing the name (see: The Daily WTF for a similar issue). There are often dual-maintenance issues involved as well (see: The Daily WTF again).

Personally, though, my issue with stored procedures is that it's nearly impossible to move them between database platforms, since the syntax isn't standard. And yes (for all the usual naysayers), I have in my career switched an enterprise application's underlying database multiple times. It does happen, although it's not for the faint of heart. That's not to say you should never use stored procedures, but IMHO you should use them sparingly, when you really need them. Using them to replace a single query just to have control over it often leads to less control in the long run.

Scott Selikoff wrote: I'm anti-stored procedure, only because 85% of the time I see them used incorrectly.

Personally, though, my issue with stored procedures is that its nearly impossible to move them between database platforms since the syntax isn't standard.

They were invented specifically to make lock-in more secure. Sybase had them first when they entered the market with a "we are the fastest" claim. I talked to Bob Epstein about this and other topics in '84 at DECUS, before he announced his product. And for a while, they were a lot faster than Ingres, Informix, Oracle and the rest. But each revision from each vendor includes performance fixes, and over time they all end up being about the same for the same thing. And when one vendor invents a cool technique, it's only theirs for a while; the next few revisions bring the other packages up to speed.

I hate stored procedures for the same reasons as others, plus, the development tools for them are completely non-standard. You have to debug your application, and get it working and then move some of the more complex SQL into a stored procedure and debug it all over again.

In case anybody is interested, there's a great talk by Rod Johnson on lessons learned from the evolution of Java and J2EE, including some fairly trenchant comments about J2EE and the "complexity industry", and the negative effects of some of the historic ideology in Javaland.

Frank Silbermann wrote: Distributed computing is hard, and I don't think we've found a good solution yet. Today's most widely adopted web technologies are just as ugly and complex.

The ugly and overly complex part is the heart of his presentation.

He touches lightly on CORBA and J2EE's excessive distribution model. I've never seen a real-world use for all that distributed remote invocation stuff. I've built a lot of systems that are distributed for scaling, but we pass messages, not objects. That skips all that evil marshalling and unmarshalling cruft.
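The "messages, not objects" style can be as simple as this sketch: instead of marshalling a remote object graph, the nodes exchange small self-describing text messages. The pipe-delimited format and the names here are invented for the example, not any real wire protocol:

```java
// Sketch of "pass messages, not objects": nodes exchange flat,
// self-describing text messages instead of marshalled object graphs.
// Format and names are made up for illustration.
public class Message {

    record PriceQuote(String symbol, long cents) {}

    static String encode(PriceQuote q) {
        return "QUOTE|" + q.symbol() + "|" + q.cents();
    }

    static PriceQuote decode(String wire) {
        String[] parts = wire.split("\\|");
        if (!"QUOTE".equals(parts[0]))
            throw new IllegalArgumentException("unknown message: " + wire);
        return new PriceQuote(parts[1], Long.parseLong(parts[2]));
    }

    public static void main(String[] args) {
        String wire = encode(new PriceQuote("ORCL", 11725));
        System.out.println(wire);         // QUOTE|ORCL|11725
        System.out.println(decode(wire)); // PriceQuote[symbol=ORCL, cents=11725]
    }
}
```

A flat message like this can cross languages, versions, and transports with no stub compilers and no marshalling layer; the trade-off is that you write the encode/decode yourself and get no remote-call illusion, which in my experience is the right trade.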

I don't even understand what the original attraction of CORBA was. I had a team of smart folks working on it in the early '90s, and it drove us nuts.

Just writing proper concurrent algorithms for the multi-core systems we have today is very hard. Intel is talking about 64-core systems in the coming months.
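Even the smallest example shows why. A plain "count++" is a read-modify-write, so two threads can interleave and lose updates; the fix here uses AtomicLong so the increment is a single atomic operation (a minimal sketch, not a benchmark):

```java
import java.util.concurrent.atomic.AtomicLong;

// Minimal example of why concurrent code is hard: "plain++" on a
// shared field is a read-modify-write, so two threads can lose
// increments. AtomicLong makes the increment atomic.
public class Counter {

    static long plain = 0;                              // racy
    static final AtomicLong atomic = new AtomicLong();  // safe

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                plain++;                  // may lose increments under contention
                atomic.incrementAndGet(); // never loses increments
            }
        };
        Thread a = new Thread(work), b = new Thread(work);
        a.start(); b.start();
        a.join(); b.join();
        System.out.println("plain  = " + plain);        // often less than 200000
        System.out.println("atomic = " + atomic.get()); // always 200000
    }
}
```

And this is the easy case: one variable, one obvious fix. Real algorithms have many shared structures, and the bugs only show up under load, on particular hardware, some of the time.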
