From: Erik Naggum
Subject: Re: is CLOS reall OO?
Date: 1998/05/20
Message-ID: <3104637922021222@naggum.no>
X-Deja-AN: 354878824
References: <354DEDA4.2592E916@ecst.csuchico.edu> <6jrtc6$ve6$1@reader1.reader.news.ozemail.net> <0qn81.1590$DM1.1198054@news.teleport.com>
mail-copies-to: never
Organization: Naggum Software; +47 8800 8879; http://www.naggum.no
Newsgroups: comp.lang.lisp
* Mike McDonald
| The constraint that Y2K programs fell under was that disk space was
| expensive and extremely limited.
this is historically false. none of these "space-saving" arguments are
actually supported by evidence from the time they were supposedly made.
these arguments are only made up now that we face the problem, but they
are incredibly convenient arguments: "it was somebody else's faulty
assumptions a long time ago", but the danger in believing these false
reasons is that we won't learn anything and won't change our ways.
our whole culture has historically only been concerned with centuries
very close to their beginnings and ends ("fin-de-siècle" is even a
well-established psychological effect on people near a century's end, and
it's even worse now, with journalists crying the end of the world because
of computer glitches -- but they would cry the end of the world, anyway,
if this Y2K problem didn't exist, just look at newspapers and books
from 1898), because people don't plan very far ahead and they can always
figure out which century was meant for anything that happens in their
lifetime, anyway. Y2K is a people and culture problem, not a computer
problem at all. the solution is a people and culture solution, too:
DO NOT ABBREVIATE INFORMATION!
if you look at 16th, 17th, 18th, 19th and 20th century art, you'll find a
predominance of artists who signed and dated their works with only two
digits. I have just as hard a time believing they did this to save space
as I have believing 20th-century programmers or system designers did.
centuries do not matter to people who live less than 100 years, unless
they stop and think carefully about it.
and _if_ they had wanted to save memory, they would have stored a date in
_much_ less space than they would require for 6 characters or even 6 BCD
digits. think about it: with millions if not billions of records, if you
could use a date encoding that spanned 179 years in 16 bits (or 717 years
in 18 bits) simply by counting the number of days since some "epoch", you
would save yourself millions if not billions of bytes or half-words, at
the cost of a table lookup and some arithmetic when reading and printing
dates, but also with the gain of simplifying a lot of other calculations
with dates. you can bet hard cash that _somebody_ would have thought of
this if "saving space" was as important to them as people would have it
these days in order to "explain" this problem away in a way that does not
affect them.
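
the day-count encoding described above is simple enough to sketch. this
is an illustrative version in Common Lisp -- the epoch of 1900-01-01 and
the helper names are my own choices for the example, not anything from
the era; only encode-universal-time and decode-universal-time are
standard:

```lisp
;; sketch: a date packed as days since an assumed epoch of 1900-01-01.
;; 2^16 days is about 179.4 years, so the count fits in 16 bits from
;; 1900 well into 2079.
(defconstant +epoch+ (encode-universal-time 0 0 0 1 1 1900 0))
(defconstant +seconds-per-day+ 86400)

(defun date->days (day month year)
  "Pack a date into a day count that fits in 16 bits for ~179 years."
  (floor (- (encode-universal-time 0 0 0 day month year 0) +epoch+)
         +seconds-per-day+))

(defun days->date (days)
  "Unpack a day count back into (values day month year)."
  (multiple-value-bind (sec min hour day month year)
      (decode-universal-time (+ +epoch+ (* days +seconds-per-day+)) 0)
    (declare (ignore sec min hour))
    (values day month year)))

;; (date->days 20 5 1998) => 35933, which fits in 16 bits with room
;; to spare, and day arithmetic (differences, weekdays) becomes plain
;; integer arithmetic.
```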
however, _every_ person who writes years with two digits is part of the
cultural problem that resulted in the Y2K mess. the only _permanent_
solution is that people start writing years with all four digits. the
other solutions require moderate amounts of intelligence in all software
that reads dates from humans, and we know what _requiring_ intelligence
from today's crop of unskilled software workers will result in: massive
system failure. when a computerized packaging plant's quality control
discards freshly canned tunafish that were date-stamped to be fresh for
two years because it believes the cans are 98 years old, it's not _just_
a case of braindamage by design, it's the culture of abbreviated
information that is at fault. fix that, and your Y2K problems go away.
then never, _ever_ store dates or timestamps as anything but a
non-lossy textual form like ISO 8601 (which includes the timezone, which
people also remove in the belief that it can easily be reconstructed,
which it cannot) or as a linearly increasing number of units of time from
some epoch like Common Lisp's Universal Time (and with a timezone to make
the information complete).
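
as a sketch of what "non-lossy" means in practice, here is one way to
render a universal time with its timezone as an ISO 8601 string in
Common Lisp -- the formatting helper is mine, and note that Common Lisp
time zones count hours _west_ of GMT, so zone -2 means UTC+02:00:

```lisp
;; keep the timezone with the timestamp instead of discarding it.
;; target form: YYYY-MM-DDThh:mm:ss+hh:mm
(defun iso-8601 (universal-time zone)
  "Format UNIVERSAL-TIME in time zone ZONE (a rational, hours west
of GMT as Common Lisp counts it) as an ISO 8601 string with offset."
  (multiple-value-bind (sec min hour day month year)
      (decode-universal-time universal-time zone)
    (multiple-value-bind (zone-hours zone-frac) (floor (abs zone))
      (format nil "~4,'0D-~2,'0D-~2,'0DT~2,'0D:~2,'0D:~2,'0D~:[+~;-~]~2,'0D:~2,'0D"
              year month day hour min sec
              (plusp zone)                 ; west of GMT => negative offset
              zone-hours (round (* zone-frac 60))))))

;; (iso-8601 (encode-universal-time 0 0 12 20 5 1998 -2) -2)
;; => "1998-05-20T12:00:00+02:00"
```

the year comes out with all four digits and the offset survives, so the
string can be parsed back without guessing at centuries or timezones.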
with all the "pattern" talk going around these days, I'm surprised that I
cannot find any "patterns" for dealing with dates and times in the right
ways, only a massive number of botched designs.
the really sad thing about this is that people will now store dates in
their databases with 8-digit numbers, requiring 32 bits of storage if
they store a parsed integer value, or 64 bits if they store characters, when
16 bits would _still_ be enough for 80 more years if they had really
_wanted_ to save space and started counting days from 1900, or 179 years
from whenever they started counting -- surely a flight control system
wouldn't live for 179 years. so the "save space" argument curiously
requires _really_ stupid people who didn't realize what savings they
could make. if they had been _smart_ when they wanted to save space, we
wouldn't have had any problems with dates for yet another lifetime.
I think the Y2K problem is just a very convenient outlet for the public
_need_ for fin-de-siècle hysteria. with many managers behaving as if
there won't _be_ any year 2000 so they won't have to deal with this, I
don't think we need to look for old reasons why people don't "get it" --
we need to look for reasons why they are _still_ solving it the wrong way
-- and that's the cultural problem that won't go away.
also, I'd be curious to see how many historical archives suddenly lose
all their material from this century when we roll over into the next
because "5/20/98" will be parsed as 2098-05-20 and no data exists for
that date¹. as far as I have seen, people are mostly concerned with Y2K
as it relates to treating "now" correctly. they have indeed learned
nothing about the root cause of this tragic cultural problem.
#:Erik
-------
¹ (encode-universal-time 0 0 0 5 20 98) will produce the right date in
Common Lisp until 2048, because it treats years in the range 0-99 as
being within 50 years of "now".
--
"Where do you want to go to jail today?"
-- U.S. Department of Justice Windows 98 slogan