Talk:2697: Y2K and 2038
:::{{w|Binary-coded decimal}}... Loads of interesting uses (including precision decimal fractions), but of course largely fallen out of favour for various technical and logistical reasons. [[Special:Contributions/172.70.86.61|172.70.86.61]] 14:44, 14 November 2022 (UTC)
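The packed form of binary-coded decimal mentioned above stores one decimal digit per 4-bit nibble, two digits per byte. A minimal sketch (the function names here are illustrative, not from any real system):

```python
def to_packed_bcd(n: int) -> bytes:
    """Encode a non-negative integer as packed BCD, two digits per byte."""
    digits = str(n)
    if len(digits) % 2:
        digits = "0" + digits  # pad to an even number of digits
    # High nibble holds the first digit of each pair, low nibble the second.
    return bytes((int(a) << 4) | int(b) for a, b in zip(digits[::2], digits[1::2]))

def from_packed_bcd(b: bytes) -> int:
    """Decode packed BCD back to an integer."""
    return int("".join(f"{byte >> 4}{byte & 0xF}" for byte in b))

# A nice property: the hex dump of BCD data is human-readable.
assert to_packed_bcd(2022) == b"\x20\x22"
assert from_packed_bcd(b"\x20\x22") == 2022
```

That readability in hex dumps is one reason BCD lingered so long in financial and mainframe formats despite wasting bits compared to plain binary.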
:The first computerised passport system for the UK had a Y2K issue. In fact, it was designed in, because the system was supposed to be replaced before 1999. Unfortunately, progress with its replacement was running late. We thought that we could get away with two digits for certain dates because the software was going to be thrown away before the end of 1999. And yes, two-digit years were common in COBOL programs because decimal numbers coded using ASCII or EBCDIC were the default for numeric data. [[User:Jeremyp|Jeremyp]] ([[User talk:Jeremyp|talk]]) 15:32, 13 November 2022 (UTC)
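The failure mode with those two-digit "YY" fields is simple arithmetic: subtraction works within one century and then silently goes wrong at the rollover. A hypothetical sketch (not the passport system's actual code):

```python
def years_between(yy_start: int, yy_end: int) -> int:
    """Naive interval calculation on stored two-digit years,
    as many COBOL-era record layouts effectively did it."""
    return yy_end - yy_start

# Fine within one century:
assert years_between(66, 99) == 33    # 1966 to 1999
# Broken across the Y2K boundary: 1999 to 2000 is one year, not minus 99.
assert years_between(99, 0) == -99
```

Remediation typically meant either widening the field to four digits or "windowing" (interpreting, say, 00-49 as 20xx and 50-99 as 19xx), which only postpones the same ambiguity.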
:1. Having done programming since 1966, I know that much data was stored on 80-character cards (and way before that year and the IBM System/360) and using 2 characters (2.5% of the card) to store the "19" was not acceptable. As processes moved into the tape and disk world, human nature tended to not expand the field to 4 characters (the future is a long way off until, suddenly, it isn't). [[Special:Contributions/172.70.178.65|172.70.178.65]] 07:57, 13 November 2022 (UTC)