# random
s
Stupidest thing I ever saw in programming: I inherited a C application storing dates and other data in a binary format. By looking at the code, I realized that dates were being stored as follows: subtract 2000 from the current year, then store the result in the upper four bits of a byte (the lower four bits were for the month, of course!). The worst part is that this code was obviously written AFTER the whole Y2K debacle, and yet the person who wrote it was worried about squeezing a year into four bits to save on disk space. It was already 2012 when I discovered this.
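For illustration, here is a minimal C sketch of what that packing scheme might have looked like; the function names and exact layout are my guess, not the original code. Note that four bits for the year offset only covers 2000 through 2015, so the format had about four years left when it was discovered in 2012.

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical reconstruction of the scheme described above:
 * year offset from 2000 in the upper nibble, month in the lower nibble. */

static uint8_t pack_date(int year, int month)
{
    /* Only 0-15 fits in four bits, so any year past 2015 silently wraps. */
    return (uint8_t)((((year - 2000) & 0x0F) << 4) | (month & 0x0F));
}

static void unpack_date(uint8_t packed, int *year, int *month)
{
    *year  = 2000 + ((packed >> 4) & 0x0F);
    *month = packed & 0x0F;
}

int main(void)
{
    int y, m;

    uint8_t d = pack_date(2012, 7);   /* 2012-07 still fits: offset 12 < 16 */
    unpack_date(d, &y, &m);
    printf("packed=0x%02X -> %04d-%02d\n", d, y, m);

    d = pack_date(2016, 1);           /* 2016 wraps back around to 2000 */
    unpack_date(d, &y, &m);
    printf("packed=0x%02X -> %04d-%02d\n", d, y, m);

    return 0;
}
```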
n
My first development role was mainframe coding in PL/1 (yes, I am that old 🤭), and dates stored much like that were commonplace. As I recall, the main reason was that storage was incredibly expensive. It was a complete pain to convert to and from, but when looking at data in printouts you did get used to it and developed a feel for the date from the ASCII.
s
I sort of get that, and in a 20th-century app it was commonplace. I started programming in the '90s, so I certainly saw my share of it, especially with low-level languages. But to see someone doing that sort of thing after the year 2000 was a bit unexpected, especially given that storage costs have been so much more reasonable in the last 20 years than they were before.
n
Oh for sure, it's all kinds of bizarre!