
Data modeling tools have come a long way since the 1970s, but many of today's data modeling professionals say that projects still require good old-fashioned craftsmanship.

Data modeling tools and resources -- which range from automated modeling software to prefabricated data models -- fit nicely into today's fast-paced business environment and certainly get the job done faster than purely manual efforts. Even so, experts say that data modeling projects often suffer because the tools pay too little attention to detail and to the specific needs of individual organizations.

When combined with a lack of human involvement, this inattention to detail can have dire consequences for an organization over the long term, according to William G. Smith, principal of William G. Smith & Associates, a Jackson, Wyo.-based consultancy that specializes in data modeling and information management.

Smith said there is a host of CASE (computer-aided software engineering) tools on the market with some data modeling capabilities built in. The tools are usually necessary, especially for data modelers who want to remain competitive in a fast-moving field. But when relied on too heavily or used improperly, he said, the tools can create more problems than they solve.

"The tools are very frequently something that the user has to work around or fight against," Smith said, "more than the thing [that] is productive and helpful to them."

But the issue isn't so much that data modeling tools perform poorly. According to Smith and other data modeling experts, the bigger problem centers on an IT culture that tends to reward speed over quality.

"Companies [and government agencies] tend to just build stuff quickly," Smith said, "without sufficient thought, design or attention to the longer-term life of what is being built."

The basic idea behind data modeling is to organize information into useful structures that can be implemented in a database management system.

Data modeling consists of analyzing the data objects used by a business or other organization and identifying the relationships among those objects. Data modeling is also a first step in object-oriented programming, because it allows developers to define the classes that provide the templates for program objects.
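The idea that data objects, their relationships, and class templates all describe the same structure can be sketched in a few lines. This is an illustrative example with hypothetical Customer and Order entities, not drawn from any model in the article:

```python
from dataclasses import dataclass

# A toy conceptual model: two data objects and one relationship.
# In object-oriented terms, each class is a template for program objects;
# in database terms, each maps naturally to a table, and the foreign-key
# attribute captures the relationship between them.

@dataclass
class Customer:
    customer_id: int
    name: str
    address: str

@dataclass
class Order:
    order_id: int
    customer_id: int  # relationship: each Order belongs to one Customer
    total: float

alice = Customer(1, "Alice", "12 Main St")
order = Order(100, alice.customer_id, 59.99)
assert order.customer_id == alice.customer_id  # the relationship holds
```

The same two classes could be drawn as an entity-relationship diagram or declared as two related tables; the modeling work is identifying the objects and the relationship, not the notation.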

In recent years, several differing approaches to data modeling and its notation were combined into the Unified Modeling Language, which experts say is becoming a standard modeling language.

"Most of the data modeling tools that have existed and that do exist today have an orientation of 'let's slam-dunk database tables as fast as we can,'" Smith said. "And in the companies that I work with, one of the main problems is that they've just slam-dunked more tables than they've had the ability to control or maintain, and the tools that they're using just make more mess quicker."

Dire consequences?

Poor data modeling practices and a culture of mismanagement can lead to serious consequences for an organization, Smith said. A former client of his, a large bank that had built and separately maintained 196 known customer databases, saw that firsthand.

Smith asked himself why the bank would go to such great lengths to redundantly build out separate customer databases when, for more than 35 years, it had been technologically possible to build a single customer database -- a single source of the truth -- that all parts of an organization could call upon.

"The simple answer is that this is gross mismanagement of a very critical resource of the business -- its data resource," he said. "My sister-in-law happened to be a customer of this bank. She moved and tried to notify the bank of her change of address [and] it took her over five years to get all the statements and other communications she received from the bank to come to her new address."

Out of anger, Smith's sister-in-law ultimately stopped doing business with the bank.

"They basically had 196 different places where her address could have resided redundantly -- a classic data mess," Smith said. "I can predict the future of this bank: Their mismanagement of their precious data resource is of course replicated in their mismanagement of almost every other resource, and it will eventually drive most customers to leave, especially if there is even one reasonable and better alternative."
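The failure mode Smith describes is a classic update anomaly: when the same fact lives in many places, changing one copy leaves the rest stale. A minimal sketch, using plain dictionaries to stand in for the bank's 196 separate databases (the bank's actual systems are of course unknown):

```python
# Hypothetical illustration of redundant storage: the same customer's
# address held independently in 196 separate "databases" (here, dicts).
databases = [{"name": "Alice", "address": "12 Main St"} for _ in range(196)]

# The customer notifies one system of her move...
databases[0]["address"] = "34 Oak Ave"

# ...but every other copy still holds the old address.
stale = sum(1 for db in databases if db["address"] == "12 Main St")
print(stale)  # 195
```

With a single customer database, the update would be applied once and every consumer of the data would see it immediately; with 196, each system must be tracked down and corrected separately, which is why the change of address took years to propagate.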

An evolution in progress

Two of the most popular data modeling tools on the market are CA's ERwin Data Modeler and Embarcadero Technologies Inc.'s ER/Studio, according to Dr. Terry Halpin, vice president of conceptual modeling and distinguished professor at Neumont University in Salt Lake City. Halpin said these tools and others like them essentially work by allowing users to enter into the system a high-level conceptual or logical model, which the software then translates into a relational model for use in a relational database.
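The model-to-schema translation Halpin describes can be reduced to a toy version: walk a high-level description of entities and attributes and emit the corresponding relational DDL. This is an illustrative sketch of the general technique, not how ERwin or ER/Studio actually implement it; the `conceptual_model` structure and `to_ddl` helper are hypothetical:

```python
import sqlite3

# A hypothetical high-level model: entity name -> attribute -> column type.
# "Order" is a SQL keyword, hence the Order_ spelling.
conceptual_model = {
    "Customer": {
        "customer_id": "INTEGER PRIMARY KEY",
        "name": "TEXT",
        "address": "TEXT",
    },
    "Order_": {
        "order_id": "INTEGER PRIMARY KEY",
        "customer_id": "INTEGER REFERENCES Customer",
        "total": "REAL",
    },
}

def to_ddl(model):
    """Translate each entity in the model into a CREATE TABLE statement."""
    stmts = []
    for entity, attrs in model.items():
        cols = ", ".join(f"{col} {ctype}" for col, ctype in attrs.items())
        stmts.append(f"CREATE TABLE {entity} ({cols});")
    return stmts

# Forward engineering: realize the model as tables in a database.
conn = sqlite3.connect(":memory:")
for stmt in to_ddl(conceptual_model):
    conn.execute(stmt)
```

Real tools do far more (naming standards, index and constraint generation, normalization checks), but the core step is the same mechanical translation from logical description to physical schema.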

The tools have added capabilities over the years, Halpin said. Early data modeling tools could perform only forward engineering, generating a physical database schema from a model; vendors later added reverse engineering, which derives a model from an existing database. Some vendors provide "round-trip" engineering as well, keeping the model and the database synchronized as either one changes.
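Reverse engineering can be shown in miniature by reading a database's own catalog back into a model, the inverse of generating DDL. A sketch using SQLite's `sqlite_master` table and `PRAGMA table_info` (the table and columns here are hypothetical):

```python
import sqlite3

# An existing database we want to recover a logical model from.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT, address TEXT)"
)

# Reverse engineering: query the catalog for tables, then each table's
# columns. PRAGMA table_info rows are (cid, name, type, notnull, dflt, pk).
model = {}
for (table,) in conn.execute("SELECT name FROM sqlite_master WHERE type='table'"):
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    model[table] = {col[1]: col[2] for col in cols}  # column name -> declared type

print(model)  # {'customer': {'customer_id': 'INTEGER', 'name': 'TEXT', 'address': 'TEXT'}}
```

Round-trip engineering amounts to running this extraction and the forward-generation step in a loop, diffing the two and reconciling the differences.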

"When data warehousing became popular, most of the tools added options for generating dimensional designs for data warehousing with controlled normalization," he said. "But most of them are still not much more than a little bit of sugar on top of relational models, to be honest."
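The dimensional designs Halpin mentions typically take the form of a star schema: one central fact table of measurements joined to denormalized dimension tables. A minimal hand-built example of the pattern (the table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Star schema: fact_sales in the center, dimensions around it.
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, day TEXT, month TEXT, year INTEGER);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer,
    date_key     INTEGER REFERENCES dim_date,
    amount       REAL
);
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Alice', 'Jackson')")
conn.execute("INSERT INTO dim_date VALUES (20240101, '01', 'Jan', 2024)")
conn.execute("INSERT INTO fact_sales VALUES (1, 20240101, 59.99)")

# The payoff: analytical queries are simple joins from the fact table
# out to whichever dimensions the question needs.
row = conn.execute("""
    SELECT c.city, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    JOIN dim_date d     ON f.date_key = d.date_key
    GROUP BY c.city, d.year
""").fetchone()
print(row)  # ('Jackson', 2024, 59.99)
```

The "controlled normalization" Halpin refers to is visible here: the dimensions deliberately repeat descriptive attributes (city, month, year) that a fully normalized operational model would factor out, trading storage for query simplicity.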

Prefabricated data models: Pros and cons

One approach to data modeling that has gained in popularity in recent years is the use of prefabricated or template-based reference models.

Some consultants have criticized template models, saying they can encourage laziness in data modeling, but Len Silverston, a well-known data modeling book author and the creator of many reference models, disagrees.

Silverston says that when reference models are used as just that -- references -- and not as an all-out replacement for homegrown models, they can add to both the speed and the quality of data modeling projects.

"I think in the future that is going to be a huge trend," Silverston said, "where there [will] be all sorts of templates available and all sorts of reusable models, just like there is in application code."
