Stupidest thing I ever saw in programming: I inherited a C application that stored dates and other data in a binary format. Reading through the code, I realized dates were stored like this: subtract 2000 from the current year, then stash the result in the upper four bits of a byte (the lower four bits were for the month, of course!)
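In case the scheme isn't obvious, here is a rough sketch of what that packing amounts to. This is a reconstruction from memory, not the original code; the function names and layout are mine:

```c
#include <stdio.h>
#include <stdint.h>

/* Reconstruction of the encoding: year offset from 2000 in the upper
 * four bits, month in the lower four. Four bits for the year can only
 * represent 2000 through 2015. */
static uint8_t pack_date(int year, int month)
{
    return (uint8_t)(((year - 2000) & 0x0F) << 4 | (month & 0x0F));
}

static void unpack_date(uint8_t packed, int *year, int *month)
{
    *year  = 2000 + (packed >> 4);
    *month = packed & 0x0F;
}

int main(void)
{
    int y, m;

    uint8_t d = pack_date(2012, 7);   /* fits: 12 goes in the 4-bit year field */
    unpack_date(d, &y, &m);
    printf("%04d-%02d\n", y, m);      /* prints 2012-07 */

    d = pack_date(2016, 1);           /* overflows: (2016 - 2000) & 0x0F == 0 */
    unpack_date(d, &y, &m);
    printf("%04d-%02d\n", y, m);      /* prints 2000-01 -- the rollover */
    return 0;
}
```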
The worst part is that this code was obviously written AFTER the whole Y2K debacle, and yet whoever wrote it was still so worried about saving disk space that they squeezed the year into four bits. Four bits only covers 2000 through 2015, and it was already 2012 when I discovered this.