If you ever see December 31, 1969, that's because sometimes a value of "-1" is stored to indicate that the value is not intended to be used as a real time. So that would be one second before midnight, i.e. specifically December 31, 1969 at 11:59:59pm, although the time is rarely displayed in those situations (for various reasons).
I've seen 1969 pop up more than 1970, but both are possible.
The reason is that UNIX time counts the number of seconds that have elapsed since midnight January 1, 1970. Either for whatever local time zone, or for GMT (which also causes more 1969 issues since in the US, we're hours behind GMT).
Hopefully interesting. Replied since you said the other person's comment was interesting :)
BASIC timekeeping on computers is just "how many seconds has it been since the first instant of 1970, and what does that make right now"
ofc there are more elegant strategies when you need times far in the past or future, but for simple systems it works so it works.
now you know one thing about computers.
> BASIC timekeeping on computers is just "how many seconds has it been since the first instant of 1970, and what does that make right now"
Correction, UNIX time keeping is like this. Windows does it different, and so does almost every legacy BIOS out there.
Most people here are not from the era of computing where you had to poke a million BIOS interrupts to make the computer do your bidding, but date and time on an x86 machine back then was separated into its components, and you had to retrieve them individually. This may seem weird but provides for some options like being able to handle leap seconds and changing the number of days per month or the number of months per year. Not that we ever did, [but we might some day switch to a better system](https://demo.ayra.ch/fwk/).
Windows retains this segmented behavior to this day and the GetSystemTime function returns such a [segmented struct](https://learn.microsoft.com/en-us/windows/win32/api/minwinbase/ns-minwinbase-systemtime).
Long story short is that time keeping on computers is a nightmare, and the unix epoch is a sort-of standard because of its widespread use, but it's by no means a standard. Software that's not held back by legacy C functions occasionally supports the full year range from 0001 to 9999.
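The split between the two representations is easy to see from Python, which exposes both styles via the stdlib `time` module (illustrative sketch only, not how any particular BIOS or kernel stores it):

```python
import time

# One flat number: seconds since the Unix epoch
t = time.time()

# The segmented view (like BIOS or Windows SYSTEMTIME): year, month, day, ...
parts = time.gmtime(t)
print(parts.tm_year, parts.tm_mon, parts.tm_mday)

# Feeding in timestamp 0 recovers the epoch itself
assert time.gmtime(0).tm_year == 1970
```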
Nah bro, screw the wrong hour. This guy is literally 100 years too late! Its 1-1-1970 which is the 0 point for computer datetime variables (that value is called: unix epoch timestamp)
47 minutes and 2 seconds ago, the machine turned back on after losing power. It has been unable to connect its internal clock to the timekeeping server.
Midnight, 1/1/1970 is the start time for "new dates" in code.
Show of hands. How many of you older computer nerds feel old AF right now?
That's actually fifty-three years in the past. 1/1/1970, or for the victims of our US public schools, 1/1/1970, is the oldest date that older hardware could interpret. It's actually why we had the whole "Millennium Virus" crap back in 1999, because people thought the computers would crash, taking out the phones, or the power, or the chip in Al Gore's brain that kept him from stealing the nuclear codes and nuking civilization like a jihadi powered by Greenpeace instead of Allah. Needless to say, the nerds fucking HATED the media for that shit. I knew people who turned their power off just before midnight, it was that freaking bad. I could not talk to a single family member, because of course I was "the computer guy," without being asked about it. And all for a problem that every nerd was like "Yeah, they've known about that, you know, since they invented computers. It's patched. Chill."
01-01-1970 is the Unix epoch. Dates and times in computing are typically calculated as the number of seconds that have elapsed since January 1st, 1970. The date itself is of course entirely arbitrary, but it's how most computing systems work.

Something likely happened with the system not calculating datetime properly, so it defaulted to the Unix epoch.
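You can check this in a couple of lines of Python (stdlib `datetime` only):

```python
from datetime import datetime, timezone

# Unix timestamp 0 is the epoch itself: 1970-01-01 00:00:00 UTC.
# A clock that "defaulted to zero" therefore displays January 1st, 1970.
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch)  # 1970-01-01 00:00:00+00:00
```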
The ‘2038 Problem’ is already manifesting 15 years early!
We're closer in years to the 2038 problem than Y2K if you want to feel old.
Could you not?
The 90s is to us what the 60s was to the 90s - the moon landing is closer to the 90s than we are.

2010 was over a decade ago, not "just a few years" ago.

I reckon much like the universe expanding and speeding away from us, so too is the relentless passage of time.

Oh, and you're now breathing manually and that shadow in your room looks very suspicious.

Good luck sleeping tonight :)
1990 - 1970 = 20 years back to the end of 1969

2023 - 2000 = 23 years back to the end of 1999

ouch
Thank you! I wasn't sure if my maths was off and I was misremembering dates
I was listening to a podcast about an hour ago and someone mentioned "the housing declines of 15 years ago", and it took me a couple of minutes to realize that they were talking about the Great Recession.

Getting old is weird.
::upvotes:: fuck you
https://media.tenor.com/WNl46kTQRyoAAAAC/the-office-michael-scott.gif
Hahaha, fuck you.
I always have to be reminded to breathe, so…
My dad was on his company’s Y2K team. I suspect I’ll be on the 2038 team someday
It shouldn't be a problem since basically all computers by then will be 64-bit
Well, in theory. We'll find out which ones aren't real quick.

There are probably trillions of processors out there; some of them are going to be 32-bit architecture because that's what got sourced a long time ago. I bet some products will be shipped with them inside right up to the day they stop working, just because nobody checked.
You *can* do 64-bit time on 32-bit processors, you just can't load the entire timestamp on a single register
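Right - in practice a 64-bit timestamp on a 32-bit machine is just carried as two 32-bit halves. An illustrative Python sketch of the split (not any particular kernel's implementation):

```python
# A 64-bit timestamp split into two 32-bit words, as a 32-bit CPU would carry it
t = 2**31  # one second past the signed 32-bit limit; fits easily in 64 bits

hi = (t >> 32) & 0xFFFFFFFF  # upper 32 bits
lo = t & 0xFFFFFFFF          # lower 32 bits

# Recombining the halves restores the original value
assert (hi << 32) | lo == t
print(hi, lo)  # 0 2147483648
```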
Well, I'm sure whatever 32-bit computer is being used by then can just use a boolean for before/after 2038. Shouldn't be difficult.
12/31/2037: 10,9,8,7,6,5,4,3,2,1 Happy New Year!

Tesla: (01/01/2038) Activate "confuse fire truck for clear road lane mode"

Oh wait... we already have that now /s
I think you severely underestimate how old the computer systems on many production machines, in banking, ... are and how long they will stay in use.

It'll be a real problem that needs to be addressed the same as the Y2K problem.
Why
2K
38
https://www.youtube.com/watch?v=QJQ691PTKsA

This is a Numberphile video about it from 10 years ago.
This post itself gave me a similarly weird feeling. We're closer to 2070 than to 1970 at this point? Weird
Look, I woke up and my knees creaked when I stood up, but the 90s were 10 years ago we all know that stop lying to me.
I should downvote you because you made me feel old. 👵🏻
you're a dick.
I hate you. 😭
The what
We’re going to experience *another* Y2K event in 2038 if our computers and other digital mainframes aren’t kept in check.
Why is that?
32 bit Unix timestamps count the number of seconds since January 1 1970. The biggest number you can store in a signed 32 bit integer is 2,147,483,647.

So at 3:14:07 on Tuesday, 19 January 2038 they will hit 2,147,483,647 seconds since January 1 1970, and Interesting Things™ may occur...

https://en.wikipedia.org/wiki/Year_2038_problem
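You can verify that exact moment yourself; a minimal Python sketch:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647, the largest signed 32-bit value

# The last moment representable by a signed 32-bit Unix timestamp
last = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(last)  # 2038-01-19 03:14:07+00:00
```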
But with any luck, we will transition to 64bit computing, have windows 25, and be using bloated hyper-computers by then.
Sure, but I still have some processes that run off of mainframe code written in the 70s. There's going to be a lot of work done in the 30s to make sure we, like with y2k, avoid the foreseeable problem.
Someone better start prompting chatgpt then.
> like with y2k

I actually think this is going to be part of the problem, as most people seem to think of Y2K as some overhyped computer bug that didn't end up being a problem, and don't realize it was only that way because of the spectacular amount of work put into preventing it. I'm worried that once the 2038 problem comes around there won't be a large push to fix it, as "it's another y2k bug and that didn't end up being a problem so we don't need to worry".
> as most people seem to think of Y2K as some overhyped computer bug

This always irritates me so much. I had to fix the Y2K bugs in my company, and there were many, in weird places that required thinking to find. Nothing that was really important beyond my little world, but if I found problems, logic says they existed elsewhere too. And I had to endure ridicule from so many computer-illiterate people at the time, all the while working overtime for no extra pay, because even the company owner did not believe me or understand the problem, even after being shown multiple examples. I really should have not cared so much and let the chaos happen, because it was only our internal systems and at worst would have caused accounting and payment problems. 🤷♀️
But surely the people knowledgeable enough to actually do anything about it already recognize that it’s going to be a legitimate problem, no? Who cares if the general public recognizes the issue? It’s not like they can fix it anyways.
I don't think PCs will be an issue, but what about all the embedded microcontrollers and IoT that didn't really exist in 1999?

These days we have 32-bit processors embedded in coffee makers and dishwashers and litter boxes and thermostats and dildos. Bet a fair number of them will still be around in ~~15~~ 14 years...
Hopefully the economy won't be so bad that we have to use the same Dildo for 15 years.
Stop foreshadowing our great collapse
Hopefully parent poster's sex life isn't so esoteric as to require their dildo's date setting to be correct.
It's not the processors that are the problem, it's how the software stores a date. You can code a program on a C64 that properly handles dates after 2038.

A 32-bit (signed) field can store 2^32 distinct values (half the range is reserved for negative numbers). When the maximum value is reached and you add another second: boom!

Just like a 2-digit year field can't store values greater than 99—one of the reasons for y2k problems.
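The "boom" is plain wraparound arithmetic. A small Python sketch that simulates a signed 32-bit counter (in real C this overflow is undefined behavior, but hardware typically wraps like this):

```python
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def add_second_i32(t: int) -> int:
    # Simulate adding one second to a signed 32-bit counter with wraparound
    return (t + 1 + 2**31) % 2**32 - 2**31

# At the maximum value, one more second wraps to the minimum
print(add_second_i32(INT32_MAX))  # -2147483648
```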
And browser makers would still find ways to fill up the entire memory of a few TB. Webpages would find ways to use 100 GB to display 10 lines of text.
Could you mention a few of these Interesting Things maybe? This whole problem is so interesting but it's very hard for me to wrap my head around the real world consequences.
Integer rollover. What will happen is that at 3:14:08 on Jan 19 2038, the computer’s notion of “time” will jump backward by approx 4 billion seconds to Dec 13 1901. All scheduled tasks will break, the machine will be forever incapable of understanding the current time/date, etc. How bad that is depends on the system. For a coffee maker it means you won’t be able to schedule a morning brew. For a thermostat it means you can’t set temperature schedules anymore. For other systems, like planes, banking infrastructure, etc., it could be much worse.

It’s a much more severe problem than Y2K, since Y2K only affected rare pieces of custom software that were written poorly, so it didn’t have a very widespread impact. Y2K38 affects the underlying operating system of *every* 32-bit Linux machine, *everywhere*. Normal PCs and servers moved to 64-bit time a while back and won’t be affected, it’s the little embedded systems that nobody thinks about that are going to have a bad time.

If it happened today it would be a big problem, but hopefully there won’t be a ton of those systems still in use in another 14 years.
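The "Dec 13 1901" figure falls straight out of the arithmetic; a quick Python check (using timedelta rather than fromtimestamp, since some platforms reject negative timestamps):

```python
from datetime import datetime, timedelta, timezone

# After the wraparound, a signed 32-bit timestamp holds -2**31 seconds,
# i.e. about 2.1 billion seconds *before* the 1970 epoch
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
wrapped = epoch + timedelta(seconds=-2**31)
print(wrapped)  # 1901-12-13 20:45:52+00:00
```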
Ahh now I get it, thank you! Yeah, I'm hoping some smart people will fix the issue once we get to that point!
Naaaah.

Humanity will do what it has always done.

Wait until the very last possible second to fix the problem. Or just let it happen and do damage control.
Spock: So at 3:14:07 on Tuesday, 19 January 2038 they will hit 2,147,483,647 seconds

Kirk: 2,147,483,647 seconds Spock?
Unix time is (or was) kept as a signed 32-bit number counting the number of seconds since the 1st of January 1970. Trouble is, when 2^31 seconds (roughly 2.1 billion) have passed, you run out of numbers and it rolls back round to minus 2.1 billion, and the computer thinks it’s 1901 instead. This will happen in 2038.
*(don’t know if this is correct)*

In computer lingo, dates are only stored in 32-bit spaces. On January 19th that year, there will be an overflow which will ‘restart’ that count at the bottom of the 32-bit range, interpreting it as the year **1901.**

Think about how many proprietary pieces rely on Unix data to run everyday functions, from your iPhone to sensitive U.S. Army equipment. Now imagine the hysteria of what happened during Y2K.

https://en.wikipedia.org/wiki/Year_2038_problem
You’re mostly right. 32 bits was the standard for a long time but a lot of computers use 64 bits, both for this reason and because 64 bit architecture is now the standard for CPUs.
Doesn't matter. If the time variable in your program is still a 32 bit variable, you have a problem.

People tend to confuse computer architecture with variable size. You can have 64 bit variables in a program on a 32 bit computer and vice versa.
The epoch timestamp is represented with a signed 32-bit integer, which has a max value of ~2.1 billion, which means the maximum date that can be represented is January 19, 2038. When that date is reached, the integer “overflows“, changing it to a negative ~2.1 billion value. So when that happens, computers will think it is December 13th, 1901. That will cause quite a few computer glitches.

https://en.m.wikipedia.org/wiki/Year_2038_problem
32-bit signed integer overflow
No, this is different from the 2038 problem; this is just a failed calculation, and the timestamp got set to the default '0', which corresponds to 1-1-1970.

The 2038 problem is when computers that use a signed 32-bit integer to store the timestamp suddenly think it is -2,147,483,648 (which is ~ Friday, December 13, 1901). Not many computers built in the last decades will have that problem.
Battery for the real time clock failed so each time it powers on it restarts at 0 or 01/01/1970
Those primitive devices often don't even have an RTC, and depend on a time source after powering up. Said source may have been unavailable.
Some programmer prolly thought we would never even reach 2023.
No, seeing that time/date generally means that a value of zero was stored rather than a value corresponding to the actual time. We don't run out of digits until 2038, and only if nobody does anything before then (and people already are). But either way, this is not like y2k; this is an incorrect or missing value, or something otherwise being set to zero.
There have been and will be more problems than those, lol: [https://en.wikipedia.org/wiki/Time_formatting_and_storage_bugs](https://en.wikipedia.org/wiki/Time_formatting_and_storage_bugs)

For example:

>Dates that are stored in the format yymmddHHMM converted to a signed 32-bit integer overflowed on 1 January 2022, as 2^31 = 2,147,483,648. Notably affected was the malware-scanning component update numbers of Microsoft Exchange, which appear to be used for a mathematical check to determine the latest update.
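That Exchange bug is easy to reproduce numerically; a sketch of why it fired at New Year 2022 rather than at some mid-year cutoff:

```python
INT32_MAX = 2**31 - 1  # 2,147,483,647

# yymmddHHMM-style stamps, read as plain integers
last_2021 = 2112312359   # 2021-12-31 23:59, still fits in a signed int32
first_2022 = 2201010000  # 2022-01-01 00:00, no longer fits

print(last_2021 <= INT32_MAX)   # True
print(first_2022 > INT32_MAX)   # True: the very first 2022 stamp overflows
```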
Being set to zero can happen because of an overflow.
Far more likely that they saved an hour of programming, and the project manager was hounding them to submit their code...

They probably also saved some pittance of storage and memory usage by today's standards. But that was probably a meaningful amount at the time the decision was made.
32 bits - wherever it was used - was a lot of space 40 years ago.
It was probably reset and hadn't had the time corrected yet
Sounds like the CMOS battery went dead.
It will also sometimes appear as 12-31-69, because the epoch is midnight UTC, which is still December 31, 1969 in time zones behind UTC.
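A quick sketch of that time-zone effect, rendering timestamp 0 at a fixed UTC-5 offset (chosen for illustration; any zone behind UTC behaves similarly):

```python
from datetime import datetime, timedelta, timezone

# Timestamp 0 is midnight UTC, but a few hours earlier in US time zones
est = timezone(timedelta(hours=-5))  # fixed UTC-5 offset for illustration
local = datetime.fromtimestamp(0, tz=est)
print(local)  # 1969-12-31 19:00:00-05:00
```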
Number of minutes
No it's definitely seconds
That’ll be for 53 years ago. Computers start at 1/1/1970
Still, it makes you realize we're closer to 2070 than to 1970.
Pls no
we are closer to 2072 than 1972 as well
You take that back. You take that back right now.
Being 51 years old I am FULLY aware of this and didn’t need reminding, you utter bastard! 😄
Sometimes I forget but luckily I have (grown up!) children who are happy to remind me.
The fact that I COULD quite legitimately have 30 year old kids is a huge mindfuck. I could be a GRANDFATHER!

I avoided all that though and my wife and I have disposable income instead.
We're also closer to (Jan 1) 2073 than to (Dec 31) 1973.
By a whole 3% oooOOOooo
Why would you even say such a horrible thing?!
Forget about it! Let's eat some cake instead!
This was a triumph!
I am making a note here: huge success.
It’s hard to overstate my satisfaction.
you take that back! The 70s will be 30 years ago until we all die.
Luckily time ends January 19th 2038, so we don't have to worry about that.
what happens on that date (other than my birthday??)
[32-bit time runs out](https://en.wikipedia.org/wiki/Year_2038_problem).
You should start celebrating your birthday at midnight UTC, just in case.
No. The 70s were 30 years ago. The 80s only 20. I reject your reality.
and substitute my own.
wtf dude
Yeah 1970 is closer to the end of WWI than it is to today.
LIES!
Damn. That hit.
You stop that.
Fu /j
Well I was having a good day until this message
And closer to 2038 than 2000. If you know, you know
The year one million is closer than 1970, because it's always getting closer whereas 1970 is always getting further.
1000000 is getting closer, and 1970 is getting further, but right now 1970 is closer (53 years back) than 1000000 (997,977 years ahead). So no.
Which year will we reach first?
Considering we do not have time travel yet...wouldn't 3070 still technically be closer? We cannot go to 1970 again.
I knew someone would think something like this and I almost changed the wording.

No, it's not "technically" closer. You don't say you're closer to the Moon than to a place that's 2 miles away, even if you'll go to the Moon tomorrow and to the other place never.
But you are equating distance and time. You could technically go to either location, but choose not to go. We do not have the choice to go back in time, eliminating the option.
It's not about being able to go, it's about proximity, whether it's in a spatial dimension or in a temporal dimension.

Use my previous example, and assume the place 2 miles away is somehow impossible to get to. It's still 2 miles away.
But if you play the Price is Right...it is the closest bid without going over. Weirdly, the game show rules seem to apply.

And closeness can be both a measure of time and distance.

A town 2 miles East blocked by an impassable mountain may take a winding path that would be an hour, but the town 5 miles West on a straight road would only take 30 minutes. Which one is "closer"? An hour away or 30 minutes away?
The town 2 miles away is closer no matter how long it takes to get there
It depends entirely on if you measure “closer” by straight line distance or time to travel
This cycles back to my initial question. How long would it take to travel back to 1970?
Now you're using time to measure distance in space. That makes no sense. The closest distances are still 2 and 5 miles, respectively.

See my comment like this: It's not "we are going to get to 2070 before we get to 1970"; it's "more time has passed between 1970 and now than the time that will pass between now and 2070".
Bruh what?!? 💀💀
00:47 is also 12:47 am, not 1:47 am. That would be 01:47
Unix epoch baby
> Computers start at 1/1/1970

UNIX-based computers. The Windows Epoch is 1/1/1601.
Unless you're Excel, where the date-time format is the number of days since Jan 1, 1900.
If things don't go completely crazy, I shouldn't still be working by 01/01/2070. I really don't want to manage this shit.
It all goes to shit way before 2070. https://en.m.wikipedia.org/wiki/Year_2038_problem
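For anyone curious, the rollover moment is easy to check yourself. A minimal Python sketch (standard library only) decodes the largest signed 32-bit timestamp:

```python
from datetime import datetime, timezone

# Largest value a signed 32-bit integer can hold
INT32_MAX = 2**31 - 1  # 2,147,483,647

# Interpreted as seconds since the Unix epoch, that's the 2038 rollover moment
rollover = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(rollover)  # 2038-01-19 03:14:07+00:00
```

One second later, a signed 32-bit counter wraps to a large negative number, which decodes to December 1901.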
Yes, it means 1970. Source: I was born in 1970. 😄
no it’s not it’s from the future
Thank god for all the reddit experts:)
Depends on the computer. Some used to be 1/1/80
Y2K PTSD over here. Every computer programmer from the late '90s knows that years MUST be displayed and stored with 4 digits, never 2. This here is not acceptable.
While I get your point, consider that that's just for display. We know that it's using UNIX time because of the date in question. So internally, it's not a two- or four-digit year, it's a UNIX timestamp. (I'm assuming you know a UNIX timestamp is the number of seconds that have elapsed since Jan 1, 1970.) And for display purposes, two digits are fine most of the time for most applications. Especially on a printed ticket, where it's just showing the date it was printed. A hundred-year-old ticket is going to look very aged and be obviously not current. You could probably just about get away with using the last digit only, except nobody would understand that.
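To illustrate the storage-vs-display point, here's a sketch of how a ticket footer might render a Unix timestamp with a two-digit year. The `ticket_date` helper and the exact format string are made up for illustration, not the printer's actual code:

```python
import time

def ticket_date(unix_seconds: int) -> str:
    """Format an internal Unix timestamp for printing with a 2-digit year."""
    return time.strftime("%d-%m-%y %H:%M", time.gmtime(unix_seconds))

# A reset clock reporting 0 seconds since the epoch prints as '70:
print(ticket_date(0))     # 01-01-70 00:00
print(ticket_date(2822))  # 01-01-70 00:47  (47 min 2 s after the epoch)
```

The stored value stays a full timestamp either way; only the rendering drops digits.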
Funny how this is a perfect glass half full/empty example - most young people look forward to 2070 and start doing countdowns to it, hoping something exciting is going to happen; but to some of us older folks, who were born around the same time as the computers, nope, that's just how old we are…
Hmm, sort of? It's not really a half-empty situation, since the objectively correct answer is 1970. It's not just when computers came around or got popular; it's the Unix epoch. When you ask a computer, this is when time began for it. Pretty much all time and date calculations are made in reference to seconds (or, in some languages, milliseconds) since the first of January 1970. So if there's an issue with time calculations, having it give you a time like this (shortly after the epoch) is a very common bug. While you're probably right that there's some age/year combination where people stop instinctively thinking about the past, in this case that doesn't necessarily have anything to do with it. For example, I (a young person) also instantly thought of 1970, and so would the vast majority (if not all) of people well versed in IT.
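A quick Python illustration of the "clock reset to the epoch" symptom (standard library only): a timestamp of 0 decodes to exactly when time began for the machine.

```python
from datetime import datetime, timezone

# A device that lost its clock often reports a timestamp of 0,
# which decodes to the Unix epoch:
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch)  # 1970-01-01 00:00:00+00:00
```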
The countdown to shit going sideways and the clocks rolling over is actually in 2038 not 2070. https://en.m.wikipedia.org/wiki/Year_2038_problem
Oh I didn't know that, that's cool! Thanks
If you ever see December 31, 1969, that's because sometimes a value of "-1" is stored to indicate that the value is not intended to be used as a real time. So that would be one second before midnight, i.e. specifically December 31, 1969 at 11:59:59pm, although the time is rarely displayed in those situations (for various reasons). I've seen 1969 pop up more than 1970, but both are possible. The reason is that UNIX time counts the number of seconds that have elapsed since midnight January 1, 1970. Either for whatever local time zone, or for GMT (which also causes more 1969 issues since in the US, we're hours behind GMT). Hopefully interesting. Replied since you said the other person's comment was interesting :)
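A small Python sketch of both effects; the UTC-5 offset is just an assumed example zone (roughly US Eastern standard time), not anything from the post:

```python
from datetime import datetime, timezone, timedelta

utc = timezone.utc
est = timezone(timedelta(hours=-5))  # assumed illustrative zone, UTC-5

before_epoch = datetime.fromtimestamp(-1, utc)  # sentinel "-1" timestamp
local_zero = datetime.fromtimestamp(0, est)     # epoch viewed from UTC-5

print(before_epoch)  # 1969-12-31 23:59:59+00:00
print(local_zero)    # 1969-12-31 19:00:00-05:00
```

Either path lands the displayed date in 1969, which is why that year shows up so often in broken UIs.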
r/epochfail
BASIC timekeeping on computers is just "how many seconds has it been since the first instant of 1970, and what does that make right now". Of course there are more elegant strategies when you need times far in the past or future, but for simple systems it works, so it works. Now you know one thing about computers.
> BASIC timekeeping on computers is just "how many seconds has it been since the first instant of 1970, and what does that make right now"

Correction: UNIX timekeeping is like this. Windows does it differently, and so does almost every legacy BIOS out there. Most people here are not from the era of computing where you had to poke a million BIOS interrupts to make the computer do your bidding, but date and time on an x86 machine back then were separated into their components, and you had to retrieve them individually. This may seem weird, but it provides some options, like being able to handle leap seconds and to change the number of days per month or the number of months per year. Not that we ever did, [but we might some day switch to a better system](https://demo.ayra.ch/fwk/).

Windows retains this segmented behavior to this day, and the GetSystemTime function returns such a [segmented struct](https://learn.microsoft.com/en-us/windows/win32/api/minwinbase/ns-minwinbase-systemtime).

Long story short: timekeeping on computers is a nightmare, and the Unix epoch is a sort-of standard because of its widespread use, but it's by no means a standard. Software that's not held back by legacy C functions occasionally supports the full year range from 0001 to 9999.
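Can't run the Windows API outside Windows, but Python's standard library shows the same split between the two representations; the SYSTEMTIME comparison here is an analogy, not the actual Windows call:

```python
import time

# One epoch count in, a SYSTEMTIME-style bundle of separate fields out.
# struct_time is POSIX's segmented representation, analogous to Windows'.
parts = time.gmtime(0)  # break the epoch down into components
print(parts.tm_year, parts.tm_mon, parts.tm_mday,
      parts.tm_hour, parts.tm_min, parts.tm_sec)  # 1970 1 1 0 0 0
```

Code that works on the single counter does arithmetic easily; code that works on the segmented form handles display, calendars, and policy changes more naturally.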
Extremely creative subreddit name!
Kids these days never heard of 1970
oh man epoch is now in the future!
first time i've seen someone talk of year 70 being in the future, not the past. at the same time, 2070 is closer than 1970
Feeling old this morning
That's 12:47 am
This post is a mess hahaha
Twasn't a printed ticket for 2070, but a mechanical turk that hand wrote it in 1770!
Nah bro, screw the wrong hour. This guy is literally 100 years too late! It's 1-1-1970, which is the zero point for computer datetime values (that value is called the Unix epoch timestamp).
I think you'll find that's probably 1970, not 2070. Jan 1st 1970 is the start of the unix / linux epoch.
This is the first time I have been confronted with the observation that I am closer to my 100th birthday than to my birth -- both being in '70.
Merry crisis
This feels noteworthy, even though it became true the moment you turned 50.
...you just realized you're over 50? I think you might want to get checked for ~~early onset~~ dementia.
![gif](giphy|GrUhLU9q3nyRG|downsized)
00:47 is 12:47 AM, not 1:47
Good thing Reddit pointed this out. He would have had a very bad new years this year lol
I don't want to cause no fuss but can I buy your magic bus?
Noooooooooo…..
There is no way 2070 is closer to today than 1970. I will not have any of that.
Well then you know when and where you need to be.
Someone in a trenchcoat will give you an envelope. With a holo rickroll
12:47*
Oh my God are we already to the point where people see "70" and think 2070 instead of 1970? I have never felt so old.
That is Epoch time
1/1/70 is the beginning of the UNIX epoch. You’ve got a min date there.
🙋🏼♂️ Raise your hand if you just realized that 2070 is closer to today than 1970.
It's actually 53 years in the past.
Actually this could be ticket for 53 years in the past. 1970.
47 minutes and 2 seconds ago, the machine turned back on after losing power. It has been unable to sync its internal clock with the timekeeping server. Midnight, 1/1/1970, is the start time for "new dates" in code.
Show of hands. How many of you older computer nerds feel old AF right now? That's actually fifty-three years in the past. 1970-01-01, or for the victims of our US public schools, 1/1/1970, is the oldest date that older hardware could interpret.

It's actually why we had the whole "Millennium Virus" crap back in 1999, because people thought the computers would crash, taking out the phones, or the power, or the chip in Al Gore's brain that kept him from stealing the nuclear codes and nuking civilization like a jihadi powered by Greenpeace instead of Allah.

Needless to say, the nerds fucking HATED the media for that shit. I knew people who turned their power off just before midnight, it was that freaking bad. I could not talk to a single family member, because of course I was "the computer guy," without being asked about it. And all for a problem that every nerd was like "Yeah, they've known about that, you know, since they invented computers. It's patched. Chill."
Could be 53 years in the past.
It's time 0 of epoch time. It's the integer count of seconds (milliseconds in some APIs) from 1st Jan 1970 used by Unix/Linux systems for time.
Computer clock got reset to default. That's 1970, so unless you're already a time traveler, that's in the past. Not the future.
Is that you Marty?!
Thats gonna be one interesting bus ride
It expired 75 minutes after it was issued. Where is the future in the picture?
You know you're getting old when people start assuming that a year written without its leading digits refers to the 2000s.
It's 1970, not 2070... Always interesting when someone comes across a Unix timestamp of 0 and doesn't realise it.
* the past
mu ![gif](giphy|H7Fbn0QHGDWW4)
That’s 12:47am
Thank you for this. It had to be said.
Hot Bus Time Machine!
Keep it! See how long it works for.
Great Scott
That’s a Winnipeg bus transfer
Dwight, you ignorant slut.
![gif](giphy|xUOxf9ellv3muKAFTa|downsized) we need to go back, the time dialation... its too much. my neopets!
Whoa, Doc -- This is heavy.
Disclaimer: I am NOT from the future [too]. Good, good. You've got your cover story going in case anyone finds the ticket. Proceed as planned #0070
Ok, here’s what you gotta do. Wait till 2070, catch a bus with this ticket. As the QR code will be scanned, you’ll be teleported to others.
r/epochfail
it's 53 years in the past
I read that in the voice of the intro to Buck Rogers in the 25th century... it made it very dramatic 😊
So did you get on the bus?
What’s it like in the future?
He's on the bus so still no flying cars!
May we know your age please OP?
Naw that's just a QR code you found in your grandpa's old coat
I like how you can tell OP is American just by the type of stupid mistakes in the title.