I short-circuit on datelines, I hate them

There's something I should have done a few weeks ago. It's a simple project, really. But they put a dateline on it. Every time I see a dateline, my brain short-circuits. I can't help but stare at my documents with no idea where to start. It's really hard to focus... All I can think about is the dateline approaching. I can't concentrate on the task at hand.

I quit my first real IT job as a developer because of the datelines. I hate datelines. I hate them, I hate them, I hate them!!!

EDIT: Just 'cause I realized that soon some smart-arse Texican is gonna post that this thread is due in MMDDYYYY, just to watch my brain fry for fun. :S

I bet you left out the dependencies and forgot to account for weekends and holidays when determining the duration.

You are talking about programming ease; I think the others were talking about processor ease.

In other words, YYYYMMDD can be compared with one (hardware) instruction. MMDDYYYY cannot. You can, on some machines, rearrange the sequence in one instruction and do the compare in another, but that still isn't equivalent in cost (even ignoring the additional storage references). It would be possible, with microcode changes, to invent an instruction that compared different date sequences in one instruction, but I don't know of anyone who has done that. These days it hardly seems worthwhile, as processors are pretty dang fast.
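To make the one-compare point concrete, here's a minimal C sketch (my own illustration, nothing from the thread): pack the digits into a plain integer and YYYYMMDD sorts chronologically for free, while MMDDYYYY has to be rearranged before every compare.

```c
#include <stdio.h>
#include <stdint.h>

/* MMDDYYYY must be rearranged into YYYYMMDD before comparing:
 * extra work the YYYYMMDD layout never needs. */
static uint32_t mmddyyyy_to_yyyymmdd(uint32_t d)
{
    return (d % 10000u) * 10000u + d / 10000u;   /* YYYY*10000 + MMDD */
}

int main(void)
{
    uint32_t a = 20231231;   /* 31 Dec 2023, YYYYMMDD */
    uint32_t b = 20240131;   /* 31 Jan 2024, YYYYMMDD */
    printf("YYYYMMDD: a < b is %d (correct, one compare)\n", a < b);

    uint32_t c = 12312023;   /* 31 Dec 2023, MMDDYYYY */
    uint32_t d = 1312024;    /* 31 Jan 2024, MMDDYYYY */
    printf("MMDDYYYY raw:        c < d is %d (wrong order!)\n", c < d);
    printf("MMDDYYYY rearranged: %d (correct, but extra instructions)\n",
           mmddyyyy_to_yyyymmdd(c) < mmddyyyy_to_yyyymmdd(d));
    return 0;
}
```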

As for YYYYMMDD vs. DDMMYYYY, from a computer's point of view YYYYMMDD may make sense, but from a human point of view DDMMYYYY may make more sense, as one only has to read as much as necessary; e.g., just the DD may be enough if you already know the MMYYYY. Your brain is pretty good at dismissing useless stuff pretty fast.

Actually, overloading those suckers is a pain in the arse... but since Saturdays are my S.A.P. days (Strawman Argument Practice), I will say this...

<strawman>
That obsession with making it easy on the processor was what gave us the darn Y2K bug! Just because DDMMYY was faster than DDMMYYYY...
</strawman>

Wow, I did?

PITA aside, it still only has to be coded once but is executed every time.

And the pain with Y2K was not that YY was faster; it was that YY took less storage. Storage used to be a much more expensive resource than it is now.
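For flavor, a sketch of that trade-off in C (my own illustration; the record layout and the pivot value are assumptions): two-digit years saved two columns per date, and the cheap Y2K-era repair was a pivot "window" rather than widening every record.

```c
#include <stdio.h>

/* A made-up card-image layout: YY saves two bytes per date,
 * at the price of century ambiguity. */
struct card_record {
    char cust[6];
    char due_yy[2];   /* "99": 1999 or 2099? that ambiguity was Y2K */
    char due_mm[2];
    char due_dd[2];
};

/* Windowing repair; the pivot of 70 is an assumption. */
static int full_year(int yy)
{
    return yy >= 70 ? 1900 + yy : 2000 + yy;
}

int main(void)
{
    printf("%d %d\n", full_year(99), full_year(24));   /* 1999 2024 */
    return 0;
}
```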

Storing dates in either of the preceding formats is no longer favored. Databases, for example, store dates internally as a number of units elapsed since some arbitrary epoch. The number of days since January 1, 1900 is one example; if time must also be included, it might be the number of seconds since January 1, 1900. I used January 1, 1900 only as an example; the epoch depends on the database architecture. These formats allow a variety of calculations to be done quickly, and the database (usually) provides a myriad of format options for the user interface that satisfy all locales.
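A minimal sketch of that epoch idea, leaning on the C library's mktime() and its own epoch rather than 1900 (the exact epoch is arbitrary, as noted above):

```c
#include <stdio.h>
#include <time.h>

/* Dates stored as day counts since an epoch: date arithmetic
 * becomes plain integer arithmetic. */
static long days_since_epoch(int y, int m, int d)
{
    struct tm tm = {0};
    tm.tm_year = y - 1900;   /* struct tm counts years from 1900 */
    tm.tm_mon  = m - 1;      /* and months from 0 */
    tm.tm_mday = d;
    tm.tm_hour = 12;         /* midday sidesteps DST edge cases */
    return (long)(mktime(&tm) / 86400);
}

int main(void)
{
    long gap = days_since_epoch(2024, 1, 31) - days_since_epoch(2023, 12, 31);
    printf("days between: %ld\n", gap);   /* 31 */
    return 0;
}
```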

My excuses for the broken English. Too many 12-hour night shifts in a row, plus a lot of painkillers (of the narcotic kind), tend to turn off my internal ETM (English Transcoder Module). I was indeed talking about my troubles with deadlines. Date lines never caused me any stress at all. I think they call them Escort Services nowadays...

I always reasoned it was because most of the systems that needed updating were created back when an extra 8 bytes or so per record would have been a moderately big deal, but I was only a toddler back in 2000.

The extra storage was a big deal.

Up through the mid-'70s, most input was in the form of 80-column cards. Any columns you could conserve were worthwhile.

Programmers back then thought of disk drives in terms of track capacity. One had to choose record and block sizes that fit optimally on a track.
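Some back-of-envelope blocking arithmetic makes the tuning visible. The device numbers here are assumptions for illustration (a 13,030-byte track in the 3330 class, and roughly 135 bytes of inter-block gap per block):

```c
#include <stdio.h>

int main(void)
{
    const int track = 13030, gap = 135, lrecl = 80;  /* card-image records */

    /* Bigger blocks waste fewer bytes on gaps, so more records
     * fit per track, up to the point where a block no longer fits. */
    for (int recs_per_block = 1; recs_per_block <= 64; recs_per_block *= 2) {
        int blksize = recs_per_block * lrecl;
        int blocks  = track / (blksize + gap);
        printf("BLKSIZE %5d: %2d blocks/track, %4d records/track\n",
               blksize, blocks, blocks * recs_per_block);
    }
    return 0;
}
```

With these assumed figures, unblocked 80-byte records give 60 records per track, while 8-record blocks give 128; that kind of arithmetic is what the tuning was about.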

Internal storage (a.k.a. memory) was very expensive. I remember my company spending a million bucks for 1/2 meg of internal storage (yes, 1/2 meg). I can't remember how many cards it was on; a rack full, perhaps 32. The smaller your record size, the more records you can fit in storage.

And there's a reason not talked about much that may be the most important: back then, we never dreamed the programs we were writing would still be in use 25 years later.