The Unix timestamp will begin with 17 this Tuesday

susam
15 replies
2d20h

Unix timestamp 1 600 000 000 was not too long ago. That was on 2020-09-13 12:26:40 UTC. Discussed on HN back then here: https://news.ycombinator.com/item?id=24452885

My own blog post here commemorating the event: https://susam.net/maze/unix-timestamp-1600000000.html

Given that 100 000 000 seconds is approximately 3 years 2 months, we are going to see an event like this every few years.

I believe the most spectacular event is going to be Unix timestamp 2 000 000 000, which is still 9½ years away: 2033-05-18 03:33:20 UTC. Such an event occurs only once every 31 years 8 months approximately!

By the way, here's 1700000000 in Python:

  $ python3 -q
  >>> from datetime import datetime
  >>> datetime.utcfromtimestamp(1_700_000_000)
  datetime.datetime(2023, 11, 14, 22, 13, 20)
  >>>

GNU date (Linux):

  $ date -ud @1700000000
  Tue Nov 14 22:13:20 UTC 2023

BSD date (macOS, FreeBSD, OpenBSD, etc.):

  $ date -ur 1700000000
  Tue 14 Nov 2023 22:13:20 UTC
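A quick way to sanity-check the arithmetic in this comment, using a timezone-aware `datetime` (note that `utcfromtimestamp`, used above, is deprecated in recent Python versions):

```python
from datetime import datetime, timezone

# 100 million seconds expressed in Gregorian years (365.2425 days per year)
years = 100_000_000 / (365.2425 * 86_400)
print(f"{years:.2f} years")  # ~3.17, i.e. roughly 3 years 2 months

# The next big milestone mentioned above
milestone = datetime.fromtimestamp(2_000_000_000, tz=timezone.utc)
print(milestone)  # 2033-05-18 03:33:20+00:00
```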

scbrg
5 replies
2d19h

I believe the most spectacular event is going to be Unix timestamp 2 000 000 000, which is still 9½ years away: 2033-05-18 03:33:20 UTC. Such an event occurs only once every 31 years 8 months approximately!

Egads! 31 years! I spent my late '90s mudding [0] and for some reason we had a lot of save files named by their epoch timestamp. When I ended up responsible for parts of the code base, I spent a lot of time dealing with those files, and they were all in the 800- or 900-million range. At some point I was pretty much able to tell at a glance roughly what date any number in that range corresponded to, within perhaps a few weeks.

Weird environments foster weird super powers.

[0] https://en.wikipedia.org/wiki/Multi-user_dungeon

twic
4 replies
2d17h

The only reference point I have is that the millennium was very roughly around 1000000000. And that's only because of an agonising C/Prince pun in NTK [1]:

"Tonight I'm gonna party like it's (time_t) 1E9"

[1] http://www.ntk.net/2001/02/23/

basilgohar
1 replies
2d13h

I don't know whether to thank or revile you for sharing that link. I'd never heard of NTK before, and I was at once amazed by something that tickled me so, and then realized it's no longer around. Now I feel sad.

twic
0 replies
1d22h

It was so good. But then, the web was young, and everything was. You are quite right to feel sad. I know this doesn't help.

tingletech
0 replies
2d12h

I went to an EFF billion second party 2001-09-08 in golden gate park and then the next week was 9/11.

Etheryte
0 replies
2d17h

That pun is horrendous beyond belief, I love it.

reliablereason
3 replies
2d17h

I think 2222222222 is more special: Sat Jun 02 2040 03:57:02 GMT+0000

I might even get to experience 3333333333 if I am lucky. What a day, what a day, yes indeed!

jona-f
1 replies
2d8h

I would like to argue that the next and final special date is 2^32 = 4294967296, which is Sun Feb 7 06:28:16 AM UTC 2106.

xnzakg
0 replies
2d6h

You're using unsigned timestamps?

cmrdporcupine
0 replies
2d17h

I hope you have enough bits!

midasuni
3 replies
2d19h

The spectacular event is in Jan 2038, when it reaches the nice round number 2,147,483,648

Gare
2 replies
2d19h

midasuni
0 replies
2d4h

Unix time will reach it, even if 32 bit signed representations tick over.

erik_seaberg
0 replies
2d17h

I hope to arrange an extended tour of Iron Mountain for early 2038.

RedCinnabar
0 replies
1d20h

I used to follow your blog about 15 years ago, what a blast of nostalgia! I’ll add it to my RSS feed. Keep it up.

ta1243
10 replies
2d21h

I remember staying up late to see the tick over from 999,999,999 to 1 billion, thinking "I'll remember this week my whole life". Little did I realise how, 60 hours later, the whole world would remember.

petrikapu
6 replies
2d21h

please explain

jccooper
2 replies
2d21h

999999999 was on Sept 9, 2001

pphysch
1 replies
2d18h

9/9, a remarkable coincidence

If only it were year 1001 (binary 9)

shriek
0 replies
2d12h

What's even more remarkable is that "999999999, 9/9" has eleven 9s.

philshem
0 replies
2d21h

2001-09-09 01:46:40 UTC

in_a_society
0 replies
2d21h

2001-09-09T01:46:40.000+00:00

2 days later...

charred_patina
0 replies
2d21h

September 11th

_kst_
1 replies
2d9h

Timestamp 1000000000 (Sat 2001-09-08 18:46:40 PDT) triggered a bug in the bug reporting system (Remedy) we were using at the time.

The system stored timestamps as a string representing seconds since the epoch, but it assumed it would fit in 9 digits. At 1000000000, it started dropping the last digit, so it went back to Sat 1973-03-03 01:46:40 PST, advancing at 10% of real time. It was fixed fairly quickly.
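A minimal sketch of the truncation bug described above (the helper name `store_9_digits` is hypothetical, purely for illustration):

```python
# Hypothetical reconstruction: a timestamp stored as a string in a
# 9-character field silently drops its last digit once it hits 10 digits.
def store_9_digits(ts: int) -> int:
    return int(str(ts)[:9])

print(store_9_digits(999_999_999))    # 999999999 -- still fits
print(store_9_digits(1_000_000_000))  # 100000000 -- back to 1973, advancing at 1/10 real time
```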

zoky
0 replies
2d7h

Wow, that's a really stupid bug, especially given the general level of awareness of the Y2K problem at the time the system was probably being built. Whoever the hell looked at a 9-digit timestamp that started with a 9 and said "nah, we're never gonna need more than 9 digits to store that" should have their programming license revoked.

mongol
0 replies
2d13h

I didn't remember it was so close but those were the days I obsessively read Slashdot, which helped during 9/11, and which certainly covered the epoch event.

clarkmoody
7 replies
2d19h

One of my favorite bits of Vinge's A Deepness in the Sky is the use of base-10 time: ksec, Msec, etc. There is a nice log-scale table with Earth time to base-10 time conversions.

NoMoreNicksLeft
4 replies
2d18h

I also like the Emergents. Liberal progressives creating a utopia. 100% employment, palaces made out of diamond for everyone.

tekla
1 replies
2d17h

That sounds more like a "The Giver"-style dystopia

jefftk
0 replies
2d15h

Spoiling the joke, but the Emergents are very clearly dystopian and NoMoreNicksLeft knows it ;)

Definitely give the book a read! One of my favorites.

ekidd
1 replies
2d15h

The Emergent "utopia" is both horrifying and eerily believable. I've known some grad students and some tech workers who are way too close already.

But it's really strange to try to map the Emergent political structure onto any modern political axis. It's not "liberal progressive" or "traditional conservative" or "libertarian", or any other popular political ideology. It's certainly authoritarian, but uniquely so. It's almost a dystopia run by project managers and exploiting specialists.

Also a fun bit: the traders in the book base their epoch on the first moon landing, but if you pay attention, the lowest levels of software count from a different epoch.

NoMoreNicksLeft
0 replies
2d7h

They merely got confused... thousands of years from now, they assume unix epoch time is based on the moon landing. And it's only a few months off anyway. Not much is left of Earth as far as they know, to be able to properly understand that those were two different events.

The Emergents are what they are, because they are, at the most fundamental level, busybodies who want to control others. Sometime in their own history, they found an excuse (the Emergency) to do that, and they never stopped doing it even after the crisis was over. In this way they map to most other authoritarian regimes in reality, but especially to the leftist authoritarian regimes. They hate "peddlers" after all, who sell things to others at fair prices and of their own free will. Not unlike the mutterings you see all over social media concerning capitalism.

cpeterso
0 replies
2d11h

Though the software archaeologists in the book mistakenly thought the time_t epoch marked the moon landing, not just 1970. :)

blahedo
0 replies
2d18h

Yes! It is as a direct result of that book that I now know without having to look it up that a ksec is about a quarter hour and a Msec is on the order of a fortnight, which comes in handy when doing back-of-envelope estimation more often than you'd expect. (I'd already known that a Gsec was about a third of a century thanks to Tom Duff's observation.[0]) I don't see us moving to such a system anytime soon in general (tying to the circadian cycle is just too convenient) but I'm a little surprised I don't see it more often in discussions of humans in space.

[0] "How many seconds are there in a year? If I tell you there are 3.155 x 10^7, you won't even try to remember it. On the other hand, who could forget that, to within half a percent, pi seconds is a nanocentury." --Tom Duff

xavdid
6 replies
2d20h

Perfect time to fire up https://datetime.store/ and try your luck on the perfect shirt!

SLWW
1 replies
2d11h

You know how you can use a "death clock" to figure out when you should statistically expire?

I wonder if you can find a shirt that would print that

bionsystem
0 replies
2d8h

By the time you receive the shirt, the answer would have changed, right?

pests
0 replies
2d19h

This reminds me of an idea I had, the 1 BTC coffee mug, back during the first rise to $20k.

mi_lk
0 replies
2d19h

internet is really wonderful sometimes

helsinki
0 replies
2d18h

This is quite clever. I'm going to get one.

function_seven
0 replies
2d20h

That website gonna crash so hard on Tuesday…

“Boss! We’re being dee dossed!”

“No, son, it’s Tuesday”

hiAndrewQuinn
4 replies
2d20h

Instant bookmark for me. I've always loved the idea of measuring time in computers by a single integer like the timestamp does, but it always seems like such a pain to work with outside of that.

bloopernova
0 replies
2d19h

This might interest you, if you haven't already seen it: Unix timestamp clock, in hex! https://retr0.id/stuff/2038/

anyfoo
0 replies
2d20h

Because the bases are all wrong. Common number bases are 10, 16, maybe 8 if you live in the 70s, and 2.

Except for the utterly unwieldy binary, none of those bases adapt well to the bases used in representing time, which are mostly the (partially related) bases 60, 12, and, annoyingly, thirty-ish.

So you always end up doing opaque arithmetic instead of “just looking at the digits” (which you still can do in decimal for century vs years for example, because we defined centuries to be exactly that).
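The "opaque arithmetic" this comment describes is essentially a mixed-radix conversion; a small sketch of it:

```python
# Converting a span of seconds into days/hours/minutes/seconds requires
# repeated divmod by 60, 60, 24 -- no common number base lines up with
# these radices, so you can't "just look at the digits".
def to_dhms(seconds: int):
    minutes, s = divmod(seconds, 60)
    hours, m = divmod(minutes, 60)
    days, h = divmod(hours, 24)
    return days, h, m, s

print(to_dhms(100_000_000))  # (1157, 9, 46, 40)
```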

PrimeMcFly
0 replies
2d19h

I've always loved the idea of measuring time in computers by a single integer like the timestamp does

Why?

3np
0 replies
2d15h

FWIW these bash shortcuts are handy if you do this a lot:

    tss() { date -Is -u -d @$1 ; }
    tsm() { date -Ins -u -d @$( echo "scale=3; $1 / 1000" | bc) | sed -E -e 's/[0-9]{6}\+/\+/' -e 's/,/./' ; }
    tsn() { date -Ins -u -d @$( echo "scale=9; $1 / 1000000000" | bc) | sed 's/,/./' ; }

    $ tss 1700000000
    2023-11-14T22:13:20+00:00

    $ tsm 1700000000000
    2023-11-14T22:13:20.000+00:00

    $ tsn 1700000000000000000
    2023-11-14T22:13:20.000000000+00:00

Ayesh
4 replies
2d21h

... which happens roughly every three years.

asplake
2 replies
2d21h

How so?

stop50
0 replies
2d20h

One year is 31,557,600 seconds, so roughly a third of 100 million seconds. 1.7 billion seconds since the epoch is the next big rollover since 2020, and the fifth-last before 31 bits are no longer enough to hold the seconds since the epoch.

dooglius
0 replies
2d20h

See for yourself

date -d '@1600000000'

date -d '@1700000000'

jefftk
0 replies
2d15h

Or roughly every 136 years.

shizcakes
2 replies
2d20h

I went to a 1234567890 "gathering" in a hotel lobby in Boston in 2009

cryptoz
0 replies
2d19h

I remember that moment! I was out at a bar or something at the time but I was prepared and had my laptop with me haha. I was mashing the up arrow and enter to make sure I didn’t miss it.

alexpotato
0 replies
2d19h

I was doing the late shift on a trading floor at a big bank.

The head of the derivatives tech support team pointed out it was about to hit so we opened up a shell and did a "watch" command + outputting the "date" command in epoch seconds and watched it happen.

Then we went back to working.

hallman76
2 replies
2d14h

There's a lot of epoch love in the comments. For me, it's never "clicked". I assumed that after seeing a ton of timestamps I'd have a Neo-seeing-the-Matrix moment, but it just hasn't happened. Can you all easily decode them?

Is there talk anywhere of using a human-readable timestamp instead? e.g. YYYYMMddHHmmssSSSSZ

stkdump
1 replies
2d13h

Sure there is. But since it is not one continuous run of digits, there are fixed separators (-, -, T, :, :, .) between the parts. That is the JavaScript time format, which is a subset of the RFC 3339 and ISO 8601 time formats. The separators at least allow for a variable number of sub-second digits.

perilunar
0 replies
2d9h

The hyphen and colon separators are optional though, so YYYYMMDDThhmmss.ssssZ is a valid ISO8601 format.
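Both forms are easy to produce; a short Python sketch of the extended (separated) and basic (separator-free) ISO 8601 renderings of the same timestamp:

```python
from datetime import datetime, timezone

t = datetime.fromtimestamp(1_700_000_000, tz=timezone.utc)

# Extended form, with the separators mentioned above, and the
# "basic" form that ISO 8601 also permits:
extended = t.strftime("%Y-%m-%dT%H:%M:%SZ")
basic = t.strftime("%Y%m%dT%H%M%SZ")
print(extended)  # 2023-11-14T22:13:20Z
print(basic)     # 20231114T221320Z
```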

xyproto
1 replies
2d19h

In relation to UNIX time: the 20000th UNIX day is 2024-10-04 (the 4th of October).

It's a special day, since the next round UNIX day is 30000, at 2052-02-20.

https://github.com/xyproto/ud/
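These round UNIX days can be checked by adding whole days to the epoch:

```python
from datetime import date, timedelta

# UNIX day N is simply N days after 1970-01-01.
epoch = date(1970, 1, 1)
print(epoch + timedelta(days=20_000))  # 2024-10-04
print(epoch + timedelta(days=30_000))  # 2052-02-20
```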

mgdlbp
0 replies
2d11h

Reminds me of sdate for the Eternal September epoch. 10000 Sep 1993 was 2021-01-16.

http://www.df7cb.de/projects/sdate/

one commit message for the QDBs:

    From 14df411817feda9decf9dd8a6cd555d71f199730 Mon Sep 17 00:00:00 2001
    From: Christoph Berg <myon@debian.org>
    Date: Thu, 4 Jun 2020 20:05:49 +0200
    Subject: [PATCH] Fix long --covid option
    
    
     scripts/sdate.in | 2 +-
     1 file changed, 1 insertion(+), 1 deletion(-)

Sep 17 2001 (1000684800) is a special date from git format-patch. Its significance is lost to time.

perihelions
1 replies
2d20h

benatkin
0 replies
2d15h

3.17 years.

SirMaster
1 replies
2d13h

When did Unix time start being used?

Was it being used in 1970 and actually started at 0?

Or did they just pick a date to start it and if so what was the initial Unix time when it was first used?

leonidasv
0 replies
2d11h

The Unix epoch is midnight on January 1, 1970. It's important to remember that this isn't Unix's "birthday" -- rough versions of the operating system were around in the 1960s. Instead, the date was programmed into the system sometime in the early 70s only because it was convenient to do so, according to Dennis Ritchie, one of the engineers who worked on Unix at Bell Labs at its inception.

"At the time we didn't have tapes and we had a couple of file-systems running and we kept changing the origin of time," he said. "So finally we said, 'Let's pick one thing that's not going to overflow for a while.' 1970 seemed to be as good as any."

https://www.wired.com/2001/09/unix-tick-tocks-to-a-billion/

wolfi1
0 replies
2d8h

I'm more interested in such events when they coincide with the beginning of a year, month, or week, but it's a little too early to work out the math now.

withinboredom
0 replies
2d9h

Yesterday, I was digging into some stuff in the database and saw some events scheduled for 17*. My initial reaction was that it was some far-off date. Then I realized ... nope, not far away at all.

russellbeattie
0 replies
2d21h

5,148 days left until January 19, 2038.

Assuming I live that long, the next day will be my 65th birthday. Just in time for digital Armageddon.
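The "digital Armageddon" instant itself, when a signed 32-bit time_t can no longer hold the count, is easy to compute:

```python
from datetime import datetime, timezone

# 2**31 seconds after the epoch is the first value a signed 32-bit
# time_t cannot represent (the last representable second is 2**31 - 1).
y2038 = datetime.fromtimestamp(2**31, tz=timezone.utc)
print(y2038)  # 2038-01-19 03:14:08+00:00
```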

neomantra
0 replies
1d2h

I love that others get excited about this. UNIX Timeval Aficionados should try out this tf tool [1]. I used my buddy's C/Lex/Yacc one daily for 1.5 decades, then ported it to Golang + Homebrew to share the love:

[1] https://github.com/neomantra/tf

  brew tap neomantra/homebrew-tap
  brew install tf

Printing out these round ones. `tf` auto-detects 10-digit timestamps, so I started there in the `seq`.

  > for TV in $(seq -f %.f 1000000000 100000000 2000000000); do echo $TV $TV | tf -d ; done
  2001-09-08 18:46:40 1000000000
  2004-11-09 03:33:20 1100000000
  2008-01-10 13:20:00 1200000000
  2011-03-12 23:06:40 1300000000
  2014-05-13 09:53:20 1400000000
  2017-07-13 19:40:00 1500000000
  2020-09-13 05:26:40 1600000000
  2023-11-14 14:13:20 1700000000
  2027-01-15 00:00:00 1800000000
  2030-03-17 10:46:40 1900000000
  2033-05-17 20:33:20 2000000000
Some funny dates. -g detects multiple on a line, -d includes the date:

  > echo 1234567890 __ 3141592653 | tf -gd
  2009-02-13 15:31:30 __ 2069-07-20 17:37:33
Enjoy... may it save you time figuring out time!

neogodless
0 replies
2d20h

Starting Tue Nov 14 2023 22:13:20 GMT+0000 to be exact!

msavio
0 replies
2d9h

The Unix timestamp inspired me to throw a birthday party on the day I turned a billion seconds old: 31.7 years :)
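A gigasecond birthday is simple to compute; a sketch (the 1992 birthdate here is made up for illustration):

```python
from datetime import datetime, timedelta

# One gigasecond expressed in Gregorian years
print(1_000_000_000 / (365.2425 * 86_400))  # ~31.69 years

# A hypothetical gigasecond "birthday"
born = datetime(1992, 1, 1)
print(born + timedelta(seconds=1_000_000_000))  # 2023-09-09 01:46:40
```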

ksaj
0 replies
2d18h

The next one lands on a nice round hour, since it'll be at exactly 3:00:00 AM (local time):

  $ date -d '@1800000000'
  Fri Jan 15 03:00:00 AM EST 2027
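This holds in any timezone with a whole-hour UTC offset, since 1800000000 is divisible by 3600:

```python
# 1800000000 is an exact multiple of 3600, so it falls precisely on
# the hour wherever the UTC offset is a whole number of hours.
print(1_800_000_000 % 3600)   # 0
print(1_800_000_000 // 3600)  # 500000 whole hours since the epoch
```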

jrockway
0 replies
2d13h

I like the unit of 100 million seconds. Longer than a year, shorter than a decade. The era of 1.6e9 was the pandemic. What will 1.7e9 bring?

gpvos
0 replies
2d8h

When I first used Unix it started with 6. I feel old.

diego_sandoval
0 replies
2d20h

It makes more sense to celebrate when a (relatively) high-order bit flips from 0 to 1, not when the decimal representation changes.
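As a footnote to this idea, a short sketch of when the high bits of a 32-bit timestamp first flip to 1 (the bit-31 flip being the 2038 rollover discussed upthread):

```python
from datetime import datetime, timezone

# The moment bit N first becomes 1 is simply timestamp 2**N.
for bit in (29, 30, 31):
    print(bit, datetime.fromtimestamp(2**bit, tz=timezone.utc))
```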