I think the best you are going to be able to do is to come up with some kind
of "meta model", a model that models models. I see your proposed solution
as the first step in this direction. It's highly flexible, but almost
useless. The next step, as I see it,
is to come up with a good model of the CURRENT state of the subject matter,
given the CURRENT information supplied by sensors.
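To make that concrete, here is a toy sketch. I'm assuming your proposal looks roughly like a generic entity-attribute-value fact store (every name below is made up); "current state" then just means the latest value per sensor and attribute:

```python
# Hypothetical sketch: a maximally generic store where every observation
# is just (entity, attribute, timestamp, value). Flexible, but it tells
# you nothing about the domain -- hence "almost useless" on its own.
readings = [
    ("sensor-7", "temperature_c", "2006-10-24T13:00:00", 21.5),
    ("sensor-7", "humidity_pct",  "2006-10-24T13:00:00", 43.0),
    ("sensor-9", "wind_speed_ms", "2006-10-24T13:00:00",  5.2),
]

def current_state(rows):
    """Latest value per (entity, attribute): the CURRENT state."""
    latest = {}
    for entity, attr, ts, value in rows:
        key = (entity, attr)
        # ISO-8601 strings compare correctly as text
        if key not in latest or ts > latest[key][0]:
            latest[key] = (ts, value)
    return {k: v for k, (_, v) in latest.items()}

state = current_state(readings)
```

The point is that deriving the current state from such a store is mechanical; the hard part is deciding what the state *means*, which is the modelling step.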

This should be a straightforward exercise in data analysis and database
design, even if it takes a while. I'll call this the normalized database.
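One plausible shape for that normalized database, sketched with SQLite purely for illustration (the tables and columns are my guesses, not a prescription):

```python
import sqlite3

# A guessed-at normalized design: sensor metadata lives in its own
# tables, separate from the observations themselves.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sensor_type (
        type_id  INTEGER PRIMARY KEY,
        name     TEXT NOT NULL UNIQUE,
        unit     TEXT NOT NULL
    );
    CREATE TABLE sensor (
        sensor_id INTEGER PRIMARY KEY,
        type_id   INTEGER NOT NULL REFERENCES sensor_type(type_id),
        location  TEXT
    );
    CREATE TABLE observation (
        sensor_id INTEGER NOT NULL REFERENCES sensor(sensor_id),
        observed  TEXT NOT NULL,   -- ISO-8601 timestamp
        value     REAL NOT NULL,
        PRIMARY KEY (sensor_id, observed)
    );
""")
conn.execute("INSERT INTO sensor_type (name, unit) VALUES ('temperature', 'celsius')")
conn.execute("INSERT INTO sensor (type_id, location) VALUES (1, 'roof')")
```

Whatever the final design looks like, the key property is that the schema, not the rows, carries the domain knowledge.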

After that you need to come up with processes (programs) that will transform
input data in the form you proposed into the form I proposed. Most likely you
will have to make your programs work on incremental inputs in order to keep
up.
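The usual trick for incremental processing is a high-water mark: only rows newer than the last timestamp you processed get transformed on each run. A hedged sketch (the input tuple shape is assumed, as above):

```python
# Incremental transform: convert only rows past the last processed
# timestamp, so the job can keep pace with a continuous feed.
def transform_incremental(rows, high_water_mark):
    """Return (new_normalized_rows, new_high_water_mark)."""
    out, new_mark = [], high_water_mark
    for sensor_id, attr, ts, value in rows:
        if ts > high_water_mark:          # skip already-processed rows
            out.append((sensor_id, attr, ts, float(value)))
            if ts > new_mark:
                new_mark = ts
    return out, new_mark

rows = [
    ("s1", "temperature_c", "2006-10-24T13:00:00", 21.5),
    ("s1", "temperature_c", "2006-10-24T14:00:00", 22.0),
]
out, mark = transform_incremental(rows, "2006-10-24T13:00:00")
```

You would persist the returned mark between runs; how you do that is an operational detail.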

Then you need to make metadata updates to the normalized database as simple,
easy, and automatic as you can. You will need to make metadata updates
whenever a new kind of sensor is introduced.
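"Simple, easy, and automatic" mostly means idempotent: registering a sensor type twice should be harmless. A toy version, again with made-up table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sensor_type (
    type_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL UNIQUE,
    unit    TEXT NOT NULL)""")

def register_sensor_type(conn, name, unit):
    """Idempotent metadata update: add a sensor type only if unseen."""
    row = conn.execute(
        "SELECT type_id FROM sensor_type WHERE name = ?", (name,)
    ).fetchone()
    if row:
        return row[0]                     # already known, reuse it
    cur = conn.execute(
        "INSERT INTO sensor_type (name, unit) VALUES (?, ?)", (name, unit)
    )
    conn.commit()
    return cur.lastrowid

first = register_sensor_type(conn, "wind_speed", "m/s")
second = register_sensor_type(conn, "wind_speed", "m/s")
```

That way the ingest pipeline itself can call it for every row it sees, and a brand-new sensor kind just appears in the metadata.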

After that, you might want to build a reporting database along some kind of
OLAP principles that allow your data analysts to adapt more or less
automatically to the changing world the sensors depict. ETL between the
normalized database and the reporting database will be a real bear, because
of the ongoing metadata updates to the normalized database.
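For what it's worth, the mechanical half of that ETL is the easy part; here's a deliberately tiny sketch that rebuilds a daily-summary fact table from the normalized observations (all names invented, and a real job would be incremental rather than a full rebuild):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- normalized side (assumed layout)
    CREATE TABLE observation (sensor_id INTEGER, observed TEXT, value REAL);
    -- reporting side: a simple daily fact table
    CREATE TABLE fact_daily (
        sensor_id INTEGER, day TEXT, n INTEGER, avg_value REAL,
        PRIMARY KEY (sensor_id, day));
""")
conn.executemany("INSERT INTO observation VALUES (?, ?, ?)", [
    (7, "2006-10-24T13:00:00", 21.5),
    (7, "2006-10-24T14:00:00", 22.5),
    (9, "2006-10-24T13:00:00",  5.2),
])

def etl_daily(conn):
    """Rebuild the daily fact table from the normalized observations."""
    conn.execute("DELETE FROM fact_daily")
    conn.execute("""
        INSERT INTO fact_daily (sensor_id, day, n, avg_value)
        SELECT sensor_id, substr(observed, 1, 10), COUNT(*), AVG(value)
        FROM observation
        GROUP BY sensor_id, substr(observed, 1, 10)
    """)
    conn.commit()

etl_daily(conn)
```

The bear is not this query; it's that every metadata change upstream forces you to revisit queries like it.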

How many years do you plan on working before retirement? ;-)
Received on Tue Oct 24 2006 - 13:34:59 CEST