
Let's stop counting centuries

gcp123
115 replies
1d

Author makes a good point. "1700s" is both more intuitive and more concise than "18th century". The very first episode of Alex Trebeck's Jeopardy in 1984 illustrates how confusing this can be:

https://www.youtube.com/watch?v=KDTxS9_CwZA

The "Final Jeopardy" question simply asked on what date did the 20th century begin, and all three contestants got it wrong, leading to a 3-way tie.

tempodox
42 replies
12h22m

Logic is in short supply and off-by-one errors are everywhere. Most people don't care. I think it's more doable to learn to just live with that than to reprogram mankind.

swyx
24 replies
3h59m

oooor we can slowly migrate towards sensibility as we did celsius and centimeters

jonathan_landy
22 replies
3h46m

Re temp, I’m glad we use F for daily life in the USA. The most common application I have for temp is to understand the weather and I like the 0-100 range for F as that’s the typical range for weather near me.

For scientific work I obviously prefer kelvin.

Celsius is nearly useless.

abraae
6 replies
3h34m

Celsius is nearly useless.

That's like ... your opinion man.

Personally I like knowing that water boils at exactly 100 degrees.

prerok
4 replies
3h2m

At sea level, yes :)

I do agree, though I live in Europe and C is the norm. I could never wrap my head around F.

That said, I think 0 is more important in daily life, below or above freezing. How much is that in F again?

dboreham
2 replies
2h26m

As a dweller of a cold place in the USA, F is pretty handy because "freezing" isn't terribly cold. Having 0F be "actually quite seriously cold" is useful.

ajuc
0 replies
35m

My parents care a lot about "przymrozek" (ground frost) - which is when it gets sub-zero C at night and you need to cover the plants and close the greenhouse doors and put a heater there so the plants survive. They give warnings on the radio when this happens outside of the regular winter months.

There's also a special warning for drivers if it was sub-zero, because then the water on the roads freezes and it's very hard to brake.

I'd say it's way more important a distinction than anything that F makes obvious.

User23
0 replies
2h12m

Also, conveniently, freezer temperature is 0F not 32F.

autoexec
0 replies
1h46m

We just need a new scale just for weather where 100 is 100F and 0 is 32F/0C then everyone can be happy. We'd have a lot more days with subzero temperatures though
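
A rough sketch of that hypothetical scale, assuming the two fixed points are exactly 32F -> 0 and 100F -> 100 with linear interpolation in between (Python, names made up):

    # Hypothetical "weather scale": 0 at 32 F (freezing) and 100 at 100 F,
    # linearly interpolated between (and extrapolated beyond) those points.
    def weather_scale(fahrenheit):
        return (fahrenheit - 32.0) * 100.0 / 68.0

    print(weather_scale(32))    # 0.0   (freezing)
    print(weather_scale(100))   # 100.0
    print(weather_scale(0))     # about -47, one of those extra "subzero" days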

runarberg
0 replies
2h23m

You just use one thing and you’ll learn it. When I was a kid my country changed from archaic 12 point “wind levels” to m/s. It took everybody a few weeks to adjust but it wasn’t hard. It was a bit harder for me after moving to America to adjust to Fahrenheit, but as you experience a temperature, and are told it is so many Fahrenheit, you’ll just learn it. I have no idea at what temperature water boils in F simply because I never experience that temperature (and my kettle doesn’t have a thermometer).

That said, I wish the USA would move over to the unit everyone else is using, but only for the reason that everyone else is using it; that is the only thing that makes it superior, and it would take Americans at worst a couple of months to adjust.

creeble
4 replies
1h53m

I agree. For ambient temp, F is twice as accurate in the same number of digits. It also reflects human experience better; 100F is damn hot, and 0F is damn cold.

Celsius is for chemists.

lye
2 replies
1h43m

There's very little difference between e.g. +25°C and +26°C, not sure why you would need even more accuracy in day to day life. There are decimals if you require that for some reason.

Celsius works significantly better in cold climates for reasons mentioned in another comment.

_gabe_
1 replies
32m

If that’s the case why do the Celsius thermostats I used while on vacation in Canada use 0.5C increments? The decimals are used because the change between 25C and 26C is actually pretty big :)

In my old apartment, the difference between 73F and 74F was enough to make me quite cold or hot. And that’s a difference of about 0.5C. I’m not arguing that Fahrenheit is better, but I definitely do prefer it for setting my thermostat (which is a day-to-day thing), but then again I grew up using it so that could be why I prefer it too.

ajuc
0 replies
29m

If that’s the case why do the Celsius thermostats I used while on vacation in Canada use 0.5C increments?

Probably because they were made for the US and they changed the labels? I've never seen a thermostat with 0.5 C increments in Europe.

the change between 25C and 26C is actually pretty big

I might be able to tell you whether it's 23 or 27; I certainly can't tell a 1 C difference.

ajuc
0 replies
38m

The difference between -1 C and +1 C is VASTLY more important in daily life than the difference between 26.5 and 27 C.

Farmers, drivers, people with gardens need to know if it will get subzero at night.

Nobody cares if it's 26.5 C or 26 C.

lye
3 replies
1h47m

What the hell are you talking about. If it's 0°C outside (or below that), I know that it's high time to put winter tires on because the water in the puddles will freeze and driving on summer tires becomes risky. I had to look it up, but apparently that's +32 °F. Good luck remembering that.

+10°C is "it's somewhat cold, put a jacket on". +20°C is comfortable in light clothing. +30°C is pretty hot. +40°C is really hot: wear as little clothing as society permits and stay out of direct sun.

Same with negatives, but in reverse.

Boiling water is +100°C, melting ice is very close to 0°C. I used that multiple times to adjust digital thermometers without having to look up anything.

It's the most comfortable system I can imagine. I tried living with Fahrenheit for a month just for fun, and it was absolutely not intuitive.

zdragnar
2 replies
1h36m

You'll want winter tires on well before the air temperature hits freezing for water. Forecasts aren't that predictable, and bridges (no earth heat sink underneath) will ice over before roads do.

40 F is a good time for getting winter tires on.

As someone who lives in a humid, wet area that goes from -40 at night in winter to 100+ F in summer, I also vastly prefer Fahrenheit.

The difference between 60, 70, 80 and 90 is pretty profound with humidity, and the same is true in winter. I don't think I've ever set a thermometer to freezing or boiling, ever. All of my kitchen appliances have numbers representing their power draw.

lye
1 replies
1h19m

Well, it's been working fine for me for about 15 years, let's agree to disagree here. I would still find it easier to remember to change the tires at +1°C than whatever the hell it comes down to in Fahrenheit.

I too live in a region with 80 (Celsius) degree yearly variation (sometimes more; the maximum yearly difference I've lived through is about 90 degrees IIRC: -45 in January to +43 in July), and Fahrenheit makes absolutely no sense to me in this climate.

happyraul
0 replies
22m

Winter tyres have less to do with freezing water and more to do with the way the compound in summer tyres hardens and loses elasticity, and therefore grip, at lower temperatures, below around 7 degrees Celsius.

ryukoposting
0 replies
46m

I don't see enough love for feet and inches.

A foot can be divided cleanly into 2, 3, 4, and 6. Ten is a really sucky number to base your lengths on. It only divides nicely into 2 and 5.

inglor_cz
0 replies
1h32m

"Celsius is nearly useless."

http://i.imgur.com/3ZidINK.png?1

For anyone not living in the US or Jamaica or Belize, it is Fahrenheit that is completely useless. Which is something like 7.7 billion people.

0 = water freezing temp is a hugely useful heuristic for anyone living in a moderate climate.

dtech
0 replies
2h17m

95% of the world uses Celsius without problems because they're used to it. You'd either also be fine with it or you belong to a sub-5th percentile which couldn't figure it out, take your pick.

SkeuomorphicBee
0 replies
21m

For me the best feature of Celsius, the one that makes it much better for weather, is the zero on the freezing point of water. Everything changes in life when water starts to freeze: roads get slippery, pipes burst, crops die. So it is important that such a crucial threshold is represented numerically in the scale. In other words, going from 5 to -5 in Fahrenheit is just getting 10° colder, nothing special, while going from 2 to -2 in Celsius is a huge change in your daily life.

MostlyStable
0 replies
3h12m

I agree that for weather F is better, but I don't think it's so much better as to be worth having two different temp scales, and unlike K, C is at least reasonable for weather, and it works fine for most scientific disciplines.

LocalH
0 replies
3h34m

At least conversion between Celsius degrees and Kelvin is easy and lossless

User23
0 replies
2h13m

It’s been tried. The “rational” calendar reform was something of a failure.

zepolen
5 replies
11h23m

That's what most people think and the world keeps trucking along.

It's the rare people that don't who actually change the world.

tempodox
4 replies
10h59m

You can change the world if you make it easier to meet a need enough people have. Persuading everyone they're holding it wrong is not that.

zepolen
1 replies
10h36m

Those are exactly the same thing if you convince everyone they are holding it wrong by holding it right.

lukan
0 replies
9h52m

"You can change the world if you make it easier to meet a need enough people have"

True and should not be forgotten in this debate.

But clear communication is a need many people have.

__MatrixMan__
0 replies
4h39m

Persuasion by argument, maybe not. But if you simply ask for clarification when you hear "nth century" but not when you hear "n-hundreds" then you've effectively made it easier for the speaker to meet their need one way over the other way.

Same thing for "this weekend" when not spoken during a weekend.

phkahler
3 replies
3h44m

Nobody's asking to reprogram anyone. Just stop using one of two conventions. The reason to do it is simple and obvious. I'm really baffled at the responses here advocating strongly for the current way. But I guess that's just a "people thing"

1over137
2 replies
1h35m

Asking me to stop using my preferred convention is tantamount to 'reprogramming' me.

mananaysiempre
0 replies
59m

Reprogramming mankind is unreasonable. Reprogramming you may not be.

BobaFloutist
0 replies
48m

I'm amazed you didn't even hedge by saying "telling me to"; claiming that a request to shift convention is tantamount to reprogramming is certainly a bold, provocative claim.

jodrellblank
2 replies
3h52m

What's "more logical" about "the seventeenth century" compared to "the sixteen hundreds"?

hombre_fatal
1 replies
3h25m

I’d say more sensible. It’s always weird to me to use the number 17 to talk about years that start with 16. Makes more sense to just say the 1600s.

dantyti
0 replies
3h1m

After their 16th birthday, the person is going through their 17th year.

Just like 11:45 can be said as "a quarter to 12".

ryukoposting
0 replies
53m

The publishing industry already has style guides covering large swaths of it.

Imagine, for a moment, that AP adopted the OP's "don't count centuries" guidance. An enormous share of English-language publishing outfits would conform to the new rule in all future publications. Within a couple months, a large share of written media consumption would adopt this improved way of talking about historical time frames.

The best part? Absolutely no effort on the part of the general public. It's not like the OP is inventing new words or sentence structures. There's zero cognitive overhead for anyone, except for a handful of journalists and copywriters who are already used to that kind of thing. It's part of their job.

I think a lot of people take these sorts of ideas to mean "thou shalt consciously change the way you speak." In reality, we have the systems in place to make these changes gradually, without causing trouble for anyone.

If you don't like it, nobody's trying to police your ability to say it some other way - even if that way is objectively stupid, as is the case with counting centuries.

leereeves
0 replies
6h22m

I think it's more doable to learn to just live with that than to reprogram mankind.

Why not just fix the calendar to match what people expect?

There was no time when people said "this is year 1 AD". That numbering was created retroactively hundreds of years later. So we can also add year 0 retroactively.

dgb23
0 replies
5h7m

Specifically I agree, but generally I disagree. I’m very glad we got the metric system, standards for commonly used protocols and so on.

adamomada
0 replies
40m

You just made me realize that the common saying “the eleventh hour” isn’t what anyone thinks it is

semireg
27 replies
1d

The right answer was, and still is: Jan 1, 1901

hgomersall
21 replies
22h6m

No, the first century began Jan 1, 0000. Whether that year actually existed or not is irrelevant - we shouldn't change our counting system in the years 100, 200 etc.

Izkata
19 replies
21h13m

The calendar goes from 1 BC to 1 AD, there is no year 0.

rcoveson
8 replies
20h52m

There is no year zero according to first-order pedants. Second-order pedants know that there is a year zero in both the astronomical year numbering system and in ISO 8601, so whether or not there is a year zero depends on context.

It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.
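
For the curious, a minimal sketch of the astronomical convention mentioned above (1 BC = year 0, 2 BC = year -1, and so on; Python, function names are just illustrative):

    # Astronomical year numbering (also used by ISO 8601): 1 BC is year 0,
    # 2 BC is year -1, and in general n BC is year 1 - n. AD years are unchanged.
    def bc_to_astronomical(bc_year):
        return 1 - bc_year

    def astronomical_to_bc(astro_year):
        return 1 - astro_year

    print(bc_to_astronomical(1))    # 0   -> 1 BC is year zero
    print(bc_to_astronomical(44))   # -43 -> 44 BC
    print(astronomical_to_bc(0))    # 1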

worstspotgain
2 replies
17h28m

Yes but, is there such a thing as a zeroth-order pedant, someone not pedantic about year ordinality? As a first-order meta-pedant, this would be my claim.

Moreover, I definitely find the ordinality of pedantry more interesting than the pedantry of ordinality.

nativeit
0 replies
14h19m

Thank you for your service.

jessekv
0 replies
3h17m

Interesting indeed. I suppose third-order pedantry must be "jerk".

m2f2
1 replies
5h18m

Talking about standards, let's not pick and choose.

First, let's get rid of miles and feet, then we could even discuss this.

lobsterthief
0 replies
4h14m

If only—I think most US citizens who actually work with units of measurement on a daily basis would love to switch to the metric system. Unfortunately, everyone else wants to keep our “freedom units” (and pennies)

marc_abonce
0 replies
18h56m

It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.

Or, just to add more fuel to the fire, we could use the Holocene/Human year numbering system to have a year zero and avoid any ambiguity between Gregorian and ISO dates.

https://en.wikipedia.org/wiki/Holocene_calendar

jl6
0 replies
19h54m

We are all defacto ISO adherents by virtue of our lives being so highly computer-mediated and standardized. I’m fully on board with stating that there absolutely was a year zero, and translating from legacy calendars where necessary.

hollerith
0 replies
19h51m

I vote for a year zero and for using two's complement for representing years before zero (because it makes computing durations that span zero a little easier).
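
A tiny illustration of the point, assuming astronomical numbering where 1 BC is year 0 and earlier years are negative (signed, i.e. two's complement in practice; Python):

    # With signed year numbers (1 BC = 0, 2 BC = -1, ...), a duration that
    # spans the BC/AD boundary is plain subtraction, no special cases.
    def years_between(start, end):
        return end - start

    caesar_assassinated = -43                        # 44 BC in astronomical numbering
    print(years_between(caesar_assassinated, 33))    # 76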

hgomersall
8 replies
21h6m

What does that even mean? Do we allow for the distortion due to the shift from the Julian to Gregorian calendars, such that the nth year is 11 days earlier? Of course not, because that would be stupid. Instead, we accept that the start point was arbitrary and refer to our normal counting system rather than getting hung up on the precise number of days since some arbitrary epoch.

pdonis
7 replies
21h0m

> What does that even mean?

It means just what it says. In the common calendar, the year after 1 BC (or BCE in the new notation) was 1 AD (or CE in the new notation). There was no "January 1, 0000".

hgomersall
6 replies
20h57m

As I said twice, whether that date actually existed or not is irrelevant.

pdonis
5 replies
20h53m

> whether that date actually existed or not is irrelevant.

No, it isn't, since you explicitly said to start the first century on the date that doesn't exist. What does that even mean?

lupire
3 replies
18h31m

The first day of the 1st Century is Jan 1, 1 AD.

The point is that some days got skipped over the centuries, but there's no need to make the Centuries have weird boundaries.

pdonis
2 replies
18h9m

> The first day of the 1st Century is Jan 1, 1 AD.

That's not what the poster I originally responded to is saying. He's saying the 1st Century should start on a nonexistent day.

hgomersall
0 replies
12h1m

No, I'm saying we ignore when it actually started and instead use the normal rules of counting to decide what to call the respective centuries.

antonvs
0 replies
15h5m

You can make this work by having the 1st century start on the last day of 1 BC. Think of it as an overlap if you like; it doesn't really matter.

That allows for consistent zero-indexed centuries. It doesn't have any other practical consequences that matter.

zarzavat
0 replies
2h29m

0 CE = 1 BCE

10 C = 50 F = 283.15 K

1 = 0.999…

Things can have more than one name. The existence of the year 0 CE is not in question. What’s in question is whether that’s a good name for it or not.

coldtea
0 replies
11h0m

Hence why the parent wrote "Whether that year actually existed or not is irrelevant".

They might or might not have a point, but they already addressed yours.

readthenotes1
3 replies
23h29m

How can that be if 15 of those centuries are on the Julian calendar?

whycome
0 replies
22h1m

Also, when they switched things in 1582:

https://www.britannica.com/story/ten-days-that-vanished-the-....

The most surreal part of implementing the new calendar came in October 1582, when 10 days were dropped from the calendar to bring the vernal equinox from March 11 back to March 21. The church had chosen October to avoid skipping any major Christian festivals.

pdonis
0 replies
21h2m

The century in which the switch occurred (which was different in different countries) was shorter than the others. As were the decade, year, and month in which the switch occurred.

Archelaos
0 replies
3h57m

The "original" Julian calendar was indifferent to year number systems. The Romans typically used the consular year, although Marcus Terentius Varro "introduced" the ab urbe condita (AUC) system in the 1st century BC, which was used until the Middle Ages. From the 5th to the 7th century, the anno Diocletiani (also called anno martyrum) after emperor Diocletian was used primarily in the eastern empire (Alexandria), or the anno mundi (after the creation of the world). It was Dionysius Exiguus in the 6th century, who replaced the anno Diocletiani era with the Anno Domini era. His system become popular in the West, but it took a long time until it also was adopted in the East. Its application to years before the birth of Christ is very late: we come across it first in the 15th century, but it was not widespread before the 17th century.

All these systems used the Julian system for months and days, but differed in terms of the year and (partially) in the first day of the year.

glitcher
0 replies
19h31m

Incorrect, this answer wasn't given in the form of a question ;)

jrockway
14 replies
21h23m

On the other hand "1700s art" sounds like trash compared to "18th century art".

rz2k
7 replies
20h32m

How about saying "settecento"? Maybe it introduces a new confusion in that it drops a thousand years, and maybe it would imply Italian art specifically.

lucb1e
6 replies
17h50m

Just to make sure I understood this, that would be used as "17th settecento" to mean 1700s right?

(This Xth century business always bothered and genuinely confused me to no end and everyone always dismissed my objections that it's a confusing thing to say. I'm a bit surprised, but also relieved, to see this thread exists. Yes, please, kill all off-by-one century business in favor of 1700s and 17th settecento or anything else you fancy, so long as it's 17-prefixed/-suffixed and not some off-by-anything-other-than-zero number)

rz2k
2 replies
16h3m

Think of it as the 700s, which is a weird way to refer to the 1700s, unless you are taking a cue from the common usage. That’s just how the periods are referenced by Italian art historians.

psychoslave
0 replies
8h26m

Not much different from the 60s referring to 1960 to 1969, to my mind.

hk__2
0 replies
9h18m

That’s just how the periods are referenced by Italian art historians.

And Italian people in general.

animaomnium
1 replies
16h26m

"settecento" can be read as "seven hundred" in Italian; gramps is proposing to use a more specific word as a tag for Italian art from the 1700s. Of course, 700 is not 1700, hence the "drop 1000 years". The prefix seventeen in Italian is "diciassette-" so perhaps "diciasettecento" would be more accurate for the 1700s. (settecento is shorter, though.)

Hope this clarifies. Not to miss the forest for the trees, to reiterate, the main takeaway is that it may be better to define and use a specific tag to pinpoint a sequence of events in a given period (e.g. settecento) instead of gesturing with something as arbitrary and wide as a century (18th century art).

worstspotgain
0 replies
10h28m

You're looking for millesettecento [1]. Italian doesn't do 10-99 hundreds, just 1-9 hundreds and 1-99 thousands.

[1] https://www.youtube.com/watch?v=LMIGnMs4VZA

coldtea
0 replies
11h5m

settecento means "700". Just proposed above as a way to say 18th century or 1700s, same as we sometimes remove the "2000" and just say "the 10s" for the decade starting 2010 (nobody cares for the 2011-as-start convention except people you don't want to talk to in the first place).

burkaman
3 replies
19h3m

I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless, and if possible it would be more useful to say something like "neoclassical art from the 1700s". "18th century" isn't an artistic category, but it kind of sounds like it is if you just glance at it. "Art from the 1700s" is clearly just referring to a time period.

darby_nine
1 replies
14h27m

I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless

no it won't lol, people will pay just as much through the new dating system as they would through the old.

coldtea
0 replies
11h7m

People pay as much for art because they are the rare combination of educated person with money which values the aesthetics and artifacts of an era, or as something to signal their wealth to others, or as a way to launder money.

BlarfMcFlarf
0 replies
4h0m

Agreed. The haiku is “18th century art”, as that’s when it was first invented. So “18th century art” is either a uselessly broad category or an indefensibly Eurocentric one.

cm2187
0 replies
4h9m

And 1700s already has a different meaning, i.e. early 18th century.

bandyaboot
0 replies
20h21m

If using “1700s”, I’d write it as “art of the 1700s”.

d0mine
11 replies
12h19m

There is no "0" year, 1 is the 1st year, so 100th year is still the 1st century, therefore 2nd century starts in 101 and 20th in 1901.

notfed
6 replies
12h7m

I find this decree frustrating. Someone could have just as easily said "the 'first' century starts at 1 BC" to account for this.

38
2 replies
11h59m

Or better yet just year 0, why not? Do we say the 80s start in 1981?

rdlw
1 replies
11h39m

The concept of zero was not popularized in 500s Europe, when the system was devised.

karatinversion
0 replies
9h37m

And also, the system is a direct descendant of regnal numbering, where zero wouldn’t have made sense even if invented (there is no zeroth year of Joe Biden’s term of office).

karatinversion
1 replies
9h35m

Do you also count the first decade of your life from January 1st of the year before you were born?

Thorrez
0 replies
9h48m

Then what is the last year of the first century BC? 2 BC? Now there's an off-by-2!

hypertele-Xii
2 replies
8h26m

What? 0 is the year Jesus Christ was born.

incognito124
1 replies
6h58m

No, Jesus was born in 1 AD

coldtea
0 replies
11h3m

Doesn't matter, we can just agree the first century had 99 years, and be done with it.

We have special rules for leap years, that would just be a single leap-back century.

At the scale of centuries, starting the 2nd century at 100 as opposed to 101 is just a 1% error, so we can live with it. For the kinds of things we use centuries for (not to do math, but to talk roughly about historical eras), it's inconsequential anyway.

Kwpolska
6 replies
22h30m

very first

It’s actually the second.

Trebeck's

Trebek's*

card_zero
5 replies
21h41m

Let's reform Alex Trebek's name, it's difficult.

c0balt
4 replies
20h55m

And while we are at it, Tim Apple

bitwize
3 replies
18h50m

There was an airport-novel series about a future where people's surnames are the company they work for. It was called Jennifer Government.

Some of the characters in Death Stranding, namely the main one, have a given-name, profession, employer convention -- as in Sam Porter Bridges.

nkrisc
0 replies
8h18m

And in the far future of that future surnames like Government or PepsiCo or Alcoa will be as common as Smith, Fletcher, and Miller.

monkeyfun
0 replies
18h11m

Ahhh, reminding me of Nation-States too. What a curious little website / online community.

krsdcbl
0 replies
15h46m

Death strandings naming is not too far from very common naming conventions throughout history, it's a nicely subtle touch.

Glenn Miller, Gregory Porter and Sam Smith just happen to have been more inclined to make music.

zozbot234
1 replies
2h21m

1700s means 1700–1709, i.e. roughly the first decade in the 18th century. Just like '2000s'. The OP acknowledges this issue and then just ignores it.

Viliam1234
0 replies
2h9m

I have a solution that would work in writing, but not sure how to pronounce it:

1700s means 1700–1709

1700ss means 1700–1799

To go one step further:

2000s means 2000-2009

2000ss means 2000-2099

2000sss means 2000-2999

drivers99
1 replies
22h36m

There are numerous common concise ways to write the 18th century, at the risk of needing the right context to be understood, including “C18th”, “18c.”, or even “XVIII” by itself.

anyfoo
0 replies
22h21m

These are even more impractical, so I wonder what your point is? I can come up with an even shorter way to say 18th century, by using base26 for example, so let's denote it as "cR". What has been gained?

swyx
0 replies
4h0m

that is fascinating trivia. you could do a whole Jeopardy on Jeopardy facts alone

runarberg
0 replies
21h57m

In Icelandic the 1-based counting towards is used almost everywhere. People do indeed say: “The first decade of the 19th century” to refer to the 18-aughts, and the 90s is commonly referred to as “The tenth decade”. This is also done with age ranges: people in their 20s (or 21-30 more precisely) are said to be þrítugsaldur (in the thirty age). Even the hour is sometimes counted towards (though this is rarer among young folks): “að ganga fimm” (or going 5) means 16:01-17:00.

Speaking for myself, this doesn’t become any more intuitive the more you use it; people constantly confuse decades and get insulted by age ranges (and freaked out when suddenly the clock is “going five”). People are actually starting to refer to the 90s as nían (the nine) and the 20-aughts as tían (the ten), though I don’t think it will stick. When I want to be unambiguous and non-confusing I usually add the -og-eitthvað (and something) as a suffix to a year ending with zero, so the 20th century becomes nítjánhundruð-og-eitthvað, the 1990s nítíu-og-eitthvað, and a person in their 20s (including 20) becomes tuttugu-og-eitthvað.

masswerk
0 replies
15h44m

So shouldn't this be the "0-episode"? ;-)

(0, because only after the first question, we have actually 1 episode performed. Consequently, the 1-episode is then the second one.)

darby_nine
0 replies
14h31m

Author makes a good point. "1700s" is both more intuitive and more concise than "18th century".

Yea, but a rhetorical failure. This sounds terrible and far worse than alternatives.

If we want a better system we'll need to either abandon the day or the Gregorian (Julian + drift) calendar.

dantyti
0 replies
3h6m

What about languages that don’t have an equivalent to “the Xs” for decades or centuries?

Also, 1799 is obviously more than 1700, as well as 1701 > 1700 – why should the naming convention tie itself to the lesser point? After one’s third birthday, the person is starting their fourth year and is not living in their third year.

I feel this is relevant https://xkcd.com/927/

ajuc
0 replies
40m

Depends on the language. Century being 3 syllables really makes it long in English, but it's still 5 syllables vs 5 syllables.

In Polish: [lata] tysiącsiedemsetne (6 [+2] syllables) vs osiemnasty wiek (5 syllables).

milliams
61 replies
1d

It's easy, we should have simply started counting centuries from zero. Centuries should be zero-indexed, then everything works.

We do the same with people's ages. For the entire initial year of your life you were zero years old. Likewise, from years 0-99, zero centuries had passed so we should call it the zeroth century!

At least this is how I justify to my students that zero-indexing makes sense. Everyone's fought the x-century vs x-hundreds confusion before, so they welcome the relief.

Izzard had the right idea: https://youtu.be/uVMGPMu596Y?si=1aKZ2xRavJgOmgE8&t=643

mmmmmbop
26 replies
22h33m

We do the same with people's ages.

No, we don't.

When we refer to 'the first year of life', we mean the time from birth until you turn 1.

Similarly, you'd say something like 'you're a child in the first decade of your life and slowly start to mature into a young adult by the end of the second decade', referring to 0-9 and 10-19, respectively.

Uehreka
11 replies
22h18m

No, we don't.

But practically speaking we usually do. I always hear people refer to events in their life happening “when I was 26” and never “in the 27th year of my life”. Sure you could say the latter, but practically speaking people don’t (at least in English).

mjmahone17
7 replies
21h43m

“Half one” is archaic English, and common German, for 12:30. Similarly “my 27th year” just sounds archaic to me: I wonder, if you went through a bunch of 19th century writing, whether you’d see ages more often given as “Xth year” vs “X-1 years old”.

There may be something cultural that caused such a shift, like a change in how math or reading is taught (or even that it’s nearly universally taught, which changes how we think and speak because now a sizeable chunk of the population thinks in visually written words rather than sounds).

einherjae
3 replies
19h38m

Isn’t “half one” used as a short form of “half past one” these days, i.e. 01:30? That has been a source of confusion for someone used to the Germanic way.

larusso
0 replies
12h3m

I had this exact topic with an Irish coworker who lives in Germany and has issues to convey the right time. For me as a German „half one“ is half of one so 12:30. Same for „Dreiviertel eins“ -> „threequarter one“ being 12:45 and „Viertel eins“ -> „quarter one“ being 12:15. To be fair the logic behind this is also under constant confusion as some parts of Germany rather use „viertel vor“ or „viertel nach“ -> „quarter to“ „quarter after“ and have no understanding of the three quarter business.

beAbU
0 replies
5h45m

The Irish like to say "half one" meaning "half past one". In my native timekeeping parlance "half een" means 12h30. Germanic/Dutch origin.

So whenever I talk time with the locals here I repeat the time back in numerical style to avoid confusion.

"The shop opens tomorrow at half ten".

"Thanks, store opens at nine thirty. See you then."

"No..."

OJFord
0 replies
59m

In the UK yes, I think not in AmE? At least I'm pretty sure they don't say 'quarter to' or 'quarter past', and do say 'a half after'.

(I had some confused conversation with a bus driver once. Bizarre experience to have so much language barrier between two EFL speakers, in English!)

Unbefleckt
1 replies
18h23m

Had no idea myself, my peers, my family and my community used archaic English.

stavros
0 replies
17h44m

Do you say "half ten" to refer to 9:30? If so, you're using archaic English, yep!

HPsquared
0 replies
7h55m

A lot of European languages say "I have x years" instead of "I am x years old". It emphasises the "milestone" nature, as in "I have x full years".

xg15
0 replies
6h31m

I think of the age number "practically" as the number of "birthday celebrations" I have experienced, excluding the actual day of birth. That's the same as the amount of completed years I've lived on this earth, and one less than the year I'm living in, because that year is not yet completed. (Except of course on birthdays)

But I think this also illustrates just how averse our culture is to using zero-indexing in counts: The age number absolutely is zero-indexed - a baby before the first birthday is zero years old. But no one calls it like that; instead we drop the year count entirely and fall back to the next-largest nonzero unit, i.e. we say the baby is so-and-so-many months old. And for newborns not yet a month old, we count in weeks, etc.

I think, culturally, it's not that surprising as this method of counting is older than the entire concept of "zero". But I think it shows that there is little hope of convincing a large number of non-nerd people to start counting things with zeros.

mtlmtlmtlmtl
0 replies
20h21m

That's not really indexing from 0 though. It's just rounding the amount of time you've lived down to the nearest year. You get the same number, but semantically you're saying roughly how old you are, not which year you're in. This becomes obvious when you talk to small children, who tend to insist on saying e.g "I'm 4 and a half". And talking about children in their first year, no one says they're 0. They say they're n days/weeks/months old.

SllX
0 replies
9h48m

In an indirect manner, we do mark having lived the 27th year in the following forms, we just don’t say it exactly the way you phrased it:

1. On your 26th Birthday, when you say you turned 26 what it means is that you have now lived 26 years. People generally understand this, even if they are going to be spending the next year saying they are 26.

2. It is not uncommon for people to demarcate their age on their birthday in revolutions around the Sun, as a kind of meme. “I’ve now traveled around the Sun twenty-six times.” or something like that, when reflecting on their lives on their Birthday.

The colloquial usage is our legally-defined age. A shortcut for our laws to take, the age-gating ones anyway. It hasn’t replaced our cultural understanding of what the first year of our life actually was.

furyofantares
8 replies
21h15m

On your sixth birthday we put a big 5 on your cake and call you a 5 year old all year.

Can't say I've ever had to refer to someone's first year or first decade of their life, but sure I'd do that if it came up. Meanwhile, 0-indexed age comes up all the time.

drdec
4 replies
20h14m

On your sixth birthday we put a big 5 on your cake and call you a 5 year old all year.

If you are going to be that pedantic, I would point out that one only has one birthday.

(Well, unless one's mother is extremely unlucky.)

naniwaduni
3 replies
19h52m

"Birthday" does not mean the same thing as "date of birth".

furyofantares
0 replies
19h8m

Well regardless the number we constantly use is 0-indexed.

copperx
0 replies
11h39m

yes, birthday and birth day are different things. Just like everyday and every day have different meanings, and it isn't confusing (to most people).

Detrytus
0 replies
7h23m

“Birthday” is just short for “birth day anniversary”, I guess…

layer8
1 replies
17h33m

birthday != date of birth

“Birthday” really means “anniversary of the date of birth”.

samatman
0 replies
14h46m

Spanish has us beat on this one: cumpleaños, "completed year" basically.

subroutine
0 replies
8h48m

The number we put on the cake represents the number of "years old" (i.e. the number of birthday anniversaries) not the number of birth days someone had (obviously). Zero year-olds are 0, one year-olds are 1, ...

jcelerier
2 replies
22h13m

The first year of life is the year indexed with zero, just like the first centimeter/inch in a ruler is the centimeter/inch indexed with zero

mmmmmbop
0 replies
21h57m

I agree, that was my point.

Sardtok
0 replies
21h49m

And so is the first century of the zero-indexed calendar.

kelnos
0 replies
13h47m

When we refer to 'the first year of life', we mean the time from birth until you turn 1.

Sure, but no one ever uses that phrasing after you turn one. Then it's just "when they were one", "when they were five", whatever.

So sure, maybe we can continue to say "the 1st century", but for dates 100 and later, no more.

daynthelife
0 replies
14h12m

My preference is semi-compatible with both conventions:

First = 0
Second = 1
Toward = 2
Third = 3
…

This way, the semantic meaning of the words “first” (prior to all others) and “second” (prior to all but one) are preserved, but we get sensical indexing as well.

kstrauser
16 replies
22h32m

We don't 0-index people's ages. There are a million books about "baby's first year", while they're still 0 years old.

Terretta
9 replies
20h36m

Except we do, as soon as we need the next digit.

In "figure of speech", or conventual use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.

We talk of a child in their 10th year as being age 10. Might even be younger. Try asking people if advice about a child in their "5th year of development" means you're dealing with a 5 year old. Most will say yes.

So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!

Similarly "centuries" don't have a century digit until the 100s, which would make that the 1st century and just call time spans less than that "in the first hundred years" (same syllables anyway).

It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.

kdmccormick
5 replies
19h46m

What? No. When you are 0, it is your first year. When you are 21, you have begun your 22nd year. In the US you are legal to drink in your 22nd year of life.

You are correct that nobody says "22nd year" in this context, but nobody says "21st year" either. The former is awkward but the latter is just incorrect.

Terretta
3 replies
18h43m

nobody says "21st year" either

On the contrary, enough people say it, it's a quora question:

https://www.quora.com/What-does-it-mean-to-be-in-your-twenty...

Authors love phrases like this. Which, in turn, comes from another ordinal/cardinal confusion stemming back to common law:

"A person who has completed the eighteenth year of age has reached majority; below this age, a person is a minor."

That means they completed being 17, but that's just too confusing, so people think you stop being a minor in your 18th year.

bregma
1 replies
14h54m

It's just not true. You've completed being 17 years old on your 18th birthday, when you enter your 19th year and can count 18 years under your belt.

Consider a newborn. As soon as they're squeezed out they are in their first year of life. That continues until the first anniversary of their decanting, at which point they are one year old and enter their second year of life.

There is nobody, nobody, who refers to a baby as being in their zeroth year of life. Nor would they refer to a one-year-old as still being in their first year of life as if they failed a grade and are being held back.

The pattern continues for other countable things. Breakfast is not widely considered the zeroth meal of the day. Neil Armstrong has never been considered the zeroth man on the moon nor is Buzz Aldrin the first. The gold medal in the Olympics is not awarded for coming in zeroth place.

kelnos
0 replies
13h43m

It's just not true.

No one's saying it's true! All that's being claimed is that writers will often use phrases like "became an adult in their 18th year" or "was legally allowed to drink in their 21st year".

It's completely incorrect, but some people use it that way, and ultimately everyone understands what they actually mean.

antonvs
0 replies
15h0m

The top response in your Quora link is that your 21st year "means you’re 20. You have had your 20th birthday, but not yet your 21st." That is the conventional definition.

People commonly make the mistake of thinking otherwise, but that's all it is. A mistake.

kergonath
1 replies
19h23m

In "figure of speech", or conventual use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.

That’s not the case, though. They can vote (and drink, in quite a few countries) when they are at least 18 years old, not when they are in their 18th year (who would even say that?)

People are 18 years old (meaning that 18 years passed since their date of birth) on their 18th birthday. There is no need of shoehorning 0-based indexing or anything like that.

Most will say yes.

Most people say something stupid if you ask tricky questions, I am not sure this is a very strong argument. Have you seriously heard anybody talking about a child’s “5th year of development”, except maybe a paediatrician? We do talk about things like “3rd year of school” or “2nd year of college”, but with the expected (1-indexed) meaning.

So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!

It’s really not. To have experienced a full year, you need a year to have passed, which therefore has to be the first. I think that’s a cardinal versus ordinal confusion. The first year after an event is between the event itself and its first anniversary. I am not aware of any context in which this is not true, but obviously if you have examples I am happy to learn.

It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.

Right. I know it is difficult to admit for some of us, but we are not computers and we do not work like computers (besides the fact that computers work just fine with 1-indexing). Some people would like it very much if we counted from 0, but that is not the case. It is more productive to understand how it works and why (and again cardinals and ordinals) than wishing it were different.

Terretta
0 replies
18h48m

who would even say that?

Writers.

And yes, cardinal versus ordinal is my point. The farther from the origin, the less people are likely to want them different.

afiori
0 replies
5h35m

This is an instance of the general problem that a path of length n has n+1 nodes.

mixmastamyk
0 replies
21h2m

How old is the baby now? Six months…

copperx
0 replies
11h36m

Ages are zero indexed, but people avoid saying zero by counting age in months in year 0, then switch to years in year one.

bmacho
0 replies
22h23m

If you point at year long intervals, then those will be year long intervals indeed.

Nevertheless the traditional "how old are you" system uses a number 1 less.

bitwize
0 replies
18h48m

You are, in general, n-1 years old in your nth year. Only when you complete your nth year do you turn n years old.

User23
0 replies
22h23m

We also talk about someone’s first day at work during that day.

OJFord
0 replies
54m

Yeah we do, because their 'first year' isn't their age. We do their age in (also zero-indexed) months/weeks/days.

In Indian English terms, we do 'complete' age - aiui more common in India is to one-index, i.e. you're the age of the year you're in, and to disambiguate you might hear someone say they're '35 complete', meaning they have had 35 anniversaries of their birth (36 incomplete).

drewcoo
12 replies
21h58m

we should have simply started counting centuries from zero

Latin, like Lua, is 1-indexed.

IshKebab
10 replies
20h19m

I feel like the Romans had an excuse for that mistake. Not sure about Lua.

bigstrat2003
9 replies
16h49m

I don't think it is a mistake for Lua. The convention to zero-index arrays is not sacrosanct, it's just the way older languages did it (due to implementation details) and thus how people continue to do it. But it's very counter-intuitive, and I think it's fair game for new languages to challenge assumptions that we hold because we're used to past languages.

antonvs
4 replies
14h54m

It's not counter-intuitive at all, it only seems that way because people are now used to languages with zero-based indexing. That's almost entirely because of the C language, which used pointer offset arithmetic with its arrays.

Outside of that machine context, where an array is a contiguous block of RAM that can be indexed with memory pointers, there's no particular reason to do offset indexing. 1-based indexing - "first element, second element" - works just fine and is perfectly intuitive.

Different types of indexing can make sense in different situations. Some languages even allow that. In Ada, for example, arrays can start at whatever index you define.

IshKebab
2 replies
10h44m

1-based works just fine

It really doesn't. You can make it work obviously but you end up with much less elegant code, with +1 and -1 all over the place. E.g. for accessing row i of a matrix you get [width*(i-1)+1, width*i+1) instead of the far saner [width*i, width*(i+1))

Generic code also becomes much more awkward.
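
A small illustration of that row-slice arithmetic, using a flat list as a width x height matrix (Python is 0-based; the 1-based bounds are shown in the comment for comparison):

    # Row i of a row-major flat array, zero-based: the half-open range is
    # [width*i, width*(i+1)). With 1-based indexing the same row becomes
    # [width*(i-1)+1, width*i+1), i.e. extra +1/-1 terms creep in.
    width, height = 4, 3
    flat = list(range(width * height))   # [0, 1, ..., 11]

    def row(i):
        return flat[width * i : width * (i + 1)]

    print(row(0))   # [0, 1, 2, 3]
    print(row(2))   # [8, 9, 10, 11]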

antonvs
1 replies
9h12m

Both are less elegant in different scenarios. In many business scenarios with zero-based indexes, you need i+1 everywhere because no-one talks about the e.g. the zeroth year of a company's operation.

Neither is a true one-size-fits-all solution. They're different kinds of indexes that serve different purposes. The choice of zero-based everywhere is an engineering tradeoff, nothing more.

IshKebab
0 replies
3h43m

In many business scenarios with zero-based indexes, you need i+1 everywhere because no-one talks about the e.g. the zeroth year of a company's operation.

Perhaps, but this is extremely rare compared to tasks that are far more elegant with 0-based indexing. Also the worst you can get there is a single +1 in the display code, while trying to shoe-horn algorithms into 1-based code can get much more awkward.

demurgos
0 replies
10h36m

There are reasons unrelated to pointer implementations such as the interval argument from Dijkstra's article or conversions between flat and multidimensional array indexes. There's a reason why most mathematical sequences start at zero: it leads to simpler expressions. Vec (and especially matrix) indexing should have been zero-based.

IshKebab
1 replies
10h43m

It's not sacrosanct but by the time Lua was created it was very clearly the right choice.

afiori
0 replies
5h38m

It is the best choice, but the difference is mostly in working manually with slices and offsets and array windows; for indexing and iterating they are mostly the same, with maybe a small benefit for mathy notation (the reason why Julia is 1-indexed).

worstspotgain
0 replies
8h39m

Zero-based arrays are counter-intuitive for a while, but if you deal with a lot of data, you typically realize that it's a small price to pay to make manipulation much easier in many contexts. For instance, if you have a ring buffer of size N and an unwrapped position P, the wrapped position is:

Zero-based: P % N

One-based: ((P - 1) % N) + 1

It might seem trivial, but each +/-1 is an opportunity for confusion and a bug nest. With zero-based arrays, it's often the case that the only required +/-1's are when producing and consuming human-readable one-based text.

The next stop on the zero-based epiphany train is the realization that a convenient way to store a range is a { first, first_past } tuple. The size of the range is (first_past - first). The whole-array range is { 0, size }, while a simple empty range is { 0, 0 } (zero is often the default initialization, simplifying things further.)

Both elements are indices, so they can be similarly manipulated, compared and range-checked, making many 'if' clauses easier to think about and verify. If there is a bug, it often ends up being harmless because of the arithmetic properties of this scheme.

Once you start dealing with multiple ranges, the advantages are even more obvious. Two ranges are adjacent iff (first_a == past_b || first_b == past_a). The intersection of two ranges is { max(first_a, first_b), min(past_a, past_b) }, which is nonempty iff they overlap. An array of M adjacent ranges is stored as a uniform (M+1)-tuple.

This realization has become so second-nature for me that I'm probably overlooking four or five even better examples here.
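
A minimal sketch of the two idioms described above, zero-based ring-buffer wrapping and half-open { first, first_past } ranges (Python; the helper names are illustrative):

    # Zero-based ring buffer: the wrapped position is simply P % N.
    # (One-based needs the extra -1/+1 dance: ((P - 1) % N) + 1.)
    N = 8                       # buffer size
    P = 13                      # unwrapped position
    print(P % N)                # 5

    # Half-open ranges stored as (first, first_past) tuples.
    def size(r):
        return r[1] - r[0]

    def adjacent(a, b):
        return a[0] == b[1] or b[0] == a[1]

    def intersection(a, b):
        return (max(a[0], b[0]), min(a[1], b[1]))

    def overlaps(a, b):
        lo, hi = intersection(a, b)
        return lo < hi          # nonempty intersection

    whole = (0, 10)             # the whole-array range
    empty = (0, 0)              # a simple empty range
    print(size(whole), size(empty))        # 10 0
    print(adjacent((0, 4), (4, 10)))       # True
    print(intersection((0, 6), (4, 10)))   # (4, 6)
    print(overlaps((0, 4), (4, 10)))       # False (merely adjacent)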

weinzierl
0 replies
11h50m

"it's just the way older languages did"

It's a C family (predecessors and descendants) idiosyncrasy that very unfortunately got out of hand. Most other old languages had either 1-based indexing or were agnostic. Most notably FORTRAN, which is the language for numerical calculations, is 1-based.

The seminal book Numerical Recipes was first published as 1-based for FORTRAN and Pascal and they only later added a 0-based version for C.

Personally, coming from Pascal, I think the agnostic way is best. It is not only about 0-based or 1-based, but that the type system encodes and verifies the valid range of index values, e.g. in Pascal you define an array like this:

    var
      temperature: array[-35..60] of real;

You will get an immediate compile-time error if you use

    temperature[61];

At least with Turbo Pascal you could choose if you wanted run-time checks as well.

I have a hard time wrapping my head around the fact that this feature is pretty much absent from any practically used language except Ada.

munchler
0 replies
20h47m

If people start saying “zeroth century”, it’s only going to create confusion, because “first century” will then become ambiguous.

eddieroger
0 replies
3h27m

Your metaphor is comparing apples and oranges. When we count life, it's "one year old" or "aged one year," both of which mark the completion of a milestone. Using the term "18th century" is all-encompassing of that year, which is a different use case. When one recollects over the course of someone's life, like in a memoir, it would be normal to say "in my 21st year", referring to the time between turning 20 years old and turning 21 years old.

Doctor_Fegg
28 replies
21h8m

Did the American revolution happen before, during, or after the Enlightenment?

I’ve no idea. When did the American revolution happen?

Not everyone’s cultural frame of reference is the same as yours. I can tell you when the Synod of Whitby happened, though.

mixmastamyk
24 replies
20h59m

The American and French Revolutions are a pretty big deal on the road to modern democracy, as well as being tied to 1700s Enlightenment ideals. Everyone educated should know this.

IshKebab
9 replies
20h16m

A pretty big deal in America. I don't think knowledge of the exact date of the American Revolution is a requirement for education outside America. At least no more than "17something...ish".

Dalewyn
7 replies
18h44m

For that matter, a lot of historical dates we consider important are only important to us because A) we're westerners and B) we got them drilled into us by textbooks and classes.

The remaining majority of the world (the west is a minority) sincerely couldn't care less about the American or French or Industrial Revolutions or Columbus (re)discovering America or the Hundred Years War or the Black Death or the Fall of Rome or whatever else.

Kind of like how we as westerners generally couldn't care less about Asian, African, Middle Eastern, Indian, or Polynesian histories.

The culture we grow up in and become indoctrinated by determines what is important and what is not.

And just so we're clear, this bit of ignorance is perfectly fine: Life is short, ain't nobody got time for shit that happened to people you don't even know who lived somewhere you will never see.

mixmastamyk
6 replies
17h40m

Appeal to ignorance and anti-intellectualism, congrats.

Dalewyn
4 replies
16h46m

Let me put it this way: Can you really blame someone for not knowing a historical fact that is completely irrelevant to their life, especially when they probably have more pressing concerns to learn and care about?

We only have so many hours in a day and so many days in a lifetime, while knowledge is practically infinite.

mixmastamyk
3 replies
15h15m

You could say that about anything.

I limited my initial statement to educated folks, and presumably those who would like to be one.

/history/democracy/milestones -> relatively important.

Dalewyn
2 replies
14h31m

As very broad subjects? Yes. But any particular factoid about them is practically irrelevant for most people who don't actually have anything to do with that factoid.

Hell, I would even go as far as to say the American Revolution is irrelevant even for most Americans because it has nothing of practical value. We (Americans) all know about it to varying degrees, but again that is due to growing up and being indoctrinated in it.

mixmastamyk
1 replies
8h3m

"Nothing is important to know besides my lunch time..." yawn

im3w1l
0 replies
3h11m

This is such a boring dismissal of a very interesting subject - what information is important to whom and why.

A pretty important reason for learning about the history of democracy is to learn how and why to preserve it.

bowsamic
0 replies
12h17m

Yes, those things cannot be blamed

IIAOPSW
0 replies
20h9m

"17something...ish" is enough to answer (or at least make a high confidence guess at) the original question (was the American Revolution contemporary with the enlightenment?)

bowsamic
6 replies
12h18m

No one that I know of in Europe gets taught this

kzrdude
2 replies
6h28m

Why not learn it anyway? Broader 'bildung' is a lifelong pursuit.

bowsamic
1 replies
2h3m

I’m from working class England and that’s absolutely not the culture for us. Education is considered superfluous unless it makes you money

kzrdude
0 replies
32m

Right, and I'm from a (partly) academic family. I guess it's a culture/class thing too.

dboreham
1 replies
2h17m

I grew up in Scotland. We studied American History in school. In fact we didn't study English History, except as it pertained to Scottish History, so to this day I'm hazy on Magna Carta, turbulent priests and so on.

bowsamic
0 replies
2h4m

I’m glad to be proven wrong. I’m English btw. We learnt nothing about America but we learnt everything about King Henry VIII

scbrg
0 replies
10h17m

Swede here. Both the American and the French revolutions were taught when I went to school in the ninth and tenth decades of the twentieth century. As GP points out, they're both fairly significant events that had side effects relevant even to us up here.

I would be extremely surprised if Sweden was unique among European nations in this regard.

Tainnor
6 replies
19h58m

Of course, they are important, but so are many other things - and speaking e.g. from a European POV, a lot of other events are simply much more salient and commonplace - and the same is probably even more true for other continents (would a random reasonably educated American or European person know when the Meiji restoration happened or when Latin America became independent?). You can't expect everyone to have memorised all the important dates.

mixmastamyk
5 replies
19h50m

America was a backwater at the time and therefore the best place to experiment with European enlightenment ideals. Which it did, and was a direct factor in the French Revolution. I also learned about numerous revolutions in Latin America from Mexico to Bolivar to San Martin over the early 1800s.

The events that directly affect the modern world should be covered in school. I’d say revolutions that created large modern states would be among them.

kergonath
3 replies
19h14m

I’d say revolutions that created large modern states would be among them.

The Russian revolution as well. And the Chinese one. It’s quite difficult to make sense of the late 20th century (yes, I know) without them. Or the early 21st.

mixmastamyk
2 replies
15h5m

Agreed, and it doesn't mean memorizing a date as folks here keep mentioning. The nearest decade is fine in most cases.

kergonath
0 replies
7h56m

Indeed. The order of things (what influenced what, and what directly caused what) is much more important than the exact date.

Tainnor
0 replies
6m

Do you remember the decade of the Treaty of Westphalia? Because if you live in central Europe, that's at least as important as the American Revolution.

Tainnor
0 replies
19h15m

Of course, we learn about the American Revolution in schools, but people aren't going to remember every date they were taught in schools. The founding of Rome or the Punic Wars are also hugely important for today's world, but not everyone can place them.

The reason most US Americans probably can place the American Revolution is because I assume it's so often commemorated there. In Germany, people would be much more likely to remember the years 1933, 1949 and 1989, because of how often they're referenced.

kelnos
2 replies
13h33m

It's tiresome when people seem to think it's necessary to be annoyed that others make reference to their own cultural frame of reference in their writing.

Even more tiresome when they feel the need to comment about it.

Most people here, even non-Americans, likely have at least a rough idea of when the American Revolution was. And those who don't will either just gloss over it and think no more of it, or find the answer on the internet in a shorter amount of time than it took me to type this sentence. And then there are people like you. Look at the completely useless subthread you've spawned! Look at the time I've bothered to waste typing this out! Sigh.

jcul
0 replies
11h22m

Also the author does give the year in the next paragraph. So no googling required.

I also didn't know what the date of the American revolution was, but I understood it was just an example.

if you’re like me, you’ll find the question much easier to answer given the second version of the sentence, because you remember the American revolution as starting in 1776, not in the 76th year of the 18th century.

Doctor_Fegg
0 replies
5h10m

Yes, that's entirely the point.

The article is "Here's a thing I don't understand! Let's say how silly it is by comparing it to a thing I do understand."

I mean, thanks for that. I could write an article about "Why do people insist on quoting the American revolution as a reference date when they could just say 'late 18th century'". Which would make the same point and get us precisely as far along as this article did.

Look at the time I've bothered to waste typing this out!

I feel the same way, dude. I feel exactly the same.

MarkLowenstein
14 replies
22h20m

A lot of this runaround is happening because people get hung up on the fact that the "AD" era began as AD 1. But that year is not magic--it didn't even correlate with the year of Jesus's birth or death. So let's just start the AD era a year before, and call that year "AD 0". It can even overlap with BC 1. BC 1 is the same as AD 0. Fine, we can handle that, right? Then the 00s are [0, 100), 100s are [100, 200), etc. Zero problem, and we can start calling them the 1700s etc., guilt free.
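
For what it's worth, that renumbering is essentially astronomical year numbering (1 BC becomes year 0, 2 BC becomes -1, and so on). A minimal Python sketch of the idea, with names of my own choosing:

    def to_proposed_year(year: int, bc: bool = False) -> int:
        # Under the proposal (and astronomical numbering), AD years keep their
        # number, while "1 BC" becomes 0, "2 BC" becomes -1, and so on.
        return 1 - year if bc else year

    def century_bucket(year: int) -> range:
        # Zero-based "hundreds" bucket: the 1700s are [1700, 1800).
        start = (year // 100) * 100
        return range(start, start + 100)

    assert to_proposed_year(1, bc=True) == 0
    assert century_bucket(1776) == range(1700, 1800)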

card_zero
4 replies
21h48m

This reminds me that centuries such as "the third century BC" are even harder to translate into date ranges. That one's 201 BC to 300 BC, inclusive, backward. Or you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750. [Edit: no it doesn't.]

In fact archeologists have adapted to writing "CE" and "BCE" these days, but despite that flexibility I've never seen somebody write a date range like "the 1200s BCE". But they should.

localhost8000
1 replies
20h46m

you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750.

From comparing some online answers (see links), I'd conclude that even though the numbers are ordered backward, "first"/"last"/"early"/"late" would more commonly be understood to reference the years' relative position in a timeline. That is, "2000 to about minus 1750" would be the first quarter of the second millennium BC.

https://en.wikipedia.org/wiki/1st_century_BC (the "last century BC") https://www.reddit.com/r/AskHistorians/comments/1akt4zm/this... https://www.quora.com/What-is-the-first-half-of-the-1st-cent... https://www.quora.com/What-is-meant-by-the-2nd-half-of-the-5... etc

card_zero
0 replies
20h38m

Oh you're right, I tripped up. "The last quarter of the second millennium BC" means about minus 1250 to minus 1001.

I often get excited by some discovery sounding a lot older than it actually is, for reasons like this.

arp242
1 replies
21h40m

Some people have proposed resetting year 1 to 10,000 years earlier. The current year would be 12024. This way you can have pretty much all of recorded human history in positive dates, while still remaining mostly compatible with the current system. It would certainly be convenient, but I don't expect significant uptake any time soon.

For earlier dates "n years ago" is usually easier, e.g. "The first humans migrated to Australia approximately 50,000 years ago".
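
A minimal sketch of that offset, assuming the usual +10,000 rule of the "Holocene/Human Era" proposal (the function name is mine):

    def to_human_era(gregorian_year_ad: int) -> int:
        # The proposal simply adds 10,000 to the AD year, so AD 2024 becomes 12024 HE.
        # (BC dates would first be converted to astronomical numbering, 1 BC -> 0.)
        return gregorian_year_ad + 10_000

    assert to_human_era(2024) == 12024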

abhinavk
0 replies
7h15m

Human Era. There is a Wendover Productions video about this. They also sell a calendar.

jrockway
2 replies
21h22m

I would also accept that the 1st century has one less year than future centuries. Everyone said Jan 1, 2000 was "the new millennium" and "the 21st century". It didn't bother anyone except Lua programmers, I'm pretty sure.

tetris11
1 replies
8h29m

It didn't bother anyone except Lua programmers, I'm pretty sure.

What's this reference to? Afaik, Lua uses `os.time` and `os.date` to manage time queries, which is then reliant on the OS and not Lua itself

jrockway
0 replies
8h22m

Array indexes start at 1 by default, instead of 0.

arp242
2 replies
21h35m

Things like "17th century", "1600s", or "1990s" are rarely exact dates, and almost always fuzzy. It really doesn't matter what the exact start and end day is. If you need exact dates then use exact dates.

A calendar change like this is a non-starter. A lot of disruption for no real purpose other than pleasing some pedants.

jhbadger
1 replies
17h15m

Exactly. Historians often talk about things like "the long 18th century" running from 1688 (Britain's "Glorious Revolution") to 1815 (the defeat of Napoleon) because it makes sense culturally to have periods that don't exactly fit 100-year chunks.

https://en.wikipedia.org/wiki/Long_eighteenth_century

chefandy
0 replies
15h1m

I thought the article was going to argue against chunking ideas into centuries because it's an arbitrary, artificial construct superimposed on fluid human culture. I could get behind that, generally, while acknowledging that many academic pursuits need arbitrary bins other people understand for context. I did not expect to see arguments for stamping out the ambiguities in labelling these arbitrary time chunks.

Nerdy pub trivia aside, I don't see the utility of instantly recalling the absolute timeline of the American revolution in relation to the Enlightenment. The 'why's (the relationships among the ideas) hold the answers. The 'when's just help with context. To my eye, the century-count labels suit their purpose for colloquial usage and the precise years work fine for more specific things. Not everything has to be good at everything to be useful enough for something.

pictureofabear
1 replies
19h57m

We're too deep into this now. Imagine how much code would have to be rewritten.

hypertele-Xii
0 replies
8h18m

Imagine how much code is being rewritten all the time for a variety of reasons. Code is live and must be maintained. Adding one more reason isn't much of a stretch.

hgomersall
0 replies
22h4m

This is the right answer.

wavemode
9 replies
22h47m

I do tend to say "the XX00s", since it's almost always significantly clearer than "the (XX+1)th century".

There’s no good way to refer to 2000-2009, sorry.

This isn't really an argument against the new convention, since even in the old convention there was no convenient way of doing so.

People mostly just say "the early 2000s" or explicitly reference a range of years. Very occasionally you'll hear "the aughts".

conception
6 replies
22h42m

The 2000-2009’s are the aughts!

kmoser
2 replies
22h11m

I think you mean "twenty-aughts" (to differentiate them from the nineteen-aughts, 1900-1909).

jhbadger
1 replies
17h12m

I wonder at what point we can just assume decades belong to the current century. Will "the twenties" in the US always primarily mean Prohibition, flappers, and Al Capone or will it ever mean this decade?

fragmede
0 replies
16h25m

I say give it 11 years or so for 2020s kids to start coming of age, and "twenties babies" will refer to babies born in the 2020s and not centenarians.

buzzy_hacker
1 replies
22h38m

The noughties!

hansvm
0 replies
22h35m

You ought naught to propose such noughts!

savanaly
0 replies
19h59m

You can always just say "the 2000s" for 2000-2010. If the context is such that you might possibly be talking about the far future then I guess "the 2000's" is no longer suitable but how often does that happen in everyday conversation?

greenbit
0 replies
7h32m

How about the 20-ohs?

Think of how individual years are named. Back in for example 2004, "two thousand and four" was probably the most prevalent style. But "two thousand and .." is kind of a mouthful, even if you omit the 'and' part.

Over time, people will find a shorter way. When 2050 arrives, how many people are going to call it "two thousand and fifty"? I'd almost bet money you'll hear it said "twenty fifty". Things already seem to be headed this way.

The "twenty ___" style leads to the first ten years being 20-oh-this and 20-oh-that, so there you have it, the 20-ohs.

(Yes, pretty much the same thing as 20-aughts, gotta admit)

ineedaj0b
6 replies
6h18m

"There’s no good way to refer to 2000-2009, sorry."

I believe this time is called 'the aughts', at least online. I say it in person but I might be the outlier.

shrimp_emoji
4 replies
6h15m

Fuck that. It's "the 2000s".

linearrust
1 replies
3h24m

That only works for now because we are so close to 2000. So there is little ambiguity whether "the 2000s" refer to the century or the decade. But in the future, "the 2000s" will refer to the century. Just like the 1900s refer to the 20th century rather than the decade (1900-1909).

wasmitnetzen
0 replies
2h36m

Not my problem, that can be fixed by a future generation of pedants. It'll be good enough for 50 years or so.

baobabKoodaa
1 replies
6h11m

I would have thought that extends beyond 2009

saalweachter
0 replies
5h53m

We don't need to refer to the third millennium as a whole right now, being in the first part of it. And by the time people need to refer to the 2000s (the century or millennium), they will no longer have a reason to reference the 2000s (the decade).

You can also refer to, in speech, the twenty-hundreds, without ambiguity; I suppose, if you want it more compact in text, you could always write the 20-00s.

Ylpertnodi
0 replies
2h57m

The naughties.

cvoss
6 replies
1h7m

The same off-by-one annoyance under discussion bites the author in this very article and he didn't even notice: He calls 1776 the 76th year of the 18th century. But it's not! It's the 77th year of that century!

bonzini
4 replies
1h4m

The 18th century started in 1701 and ended in 1800.

OJFord
3 replies
1h2m

I could give you 'ended in 1800' (I'd prefer 'at') but it very much started in 1700, not 1701.

bonzini
2 replies
58m

I didn't make the convention. The 18th century started on January 1, 1701 and ended on December 31, 1800.

That's because the purported year of the birth of Christ was the "first year of the lord", or AD 1, and that's when the first century started. In turn that's because at the time Latin had a word for nothing but not a word for zero, so you couldn't count years starting at zero.

That also means that those old enough to have partied on December 31, 1999 were technically partying for the beginning of the last year of the second millennium.
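
In code terms, the convention works out to an off-by-one pair of formulas; a rough sketch (my own, just to illustrate the counting):

    def ordinal_century(year_ad: int) -> int:
        # With no year zero, the 1st century is AD 1-100, so 1701-1800 is the 18th.
        return (year_ad + 99) // 100

    def year_within_century(year_ad: int) -> int:
        # 1776 is then the 76th year of the 18th century.
        return year_ad - (ordinal_century(year_ad) - 1) * 100

    assert ordinal_century(1700) == 17 and ordinal_century(1701) == 18
    assert year_within_century(1776) == 76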

OJFord
1 replies
50m

Only American Christians say 'year of our lord' after the date. If that's what it means to you and how you want to think of it, that's fine, but be aware nobody else is doing that, even though they're working in AD/BC.

bonzini
0 replies
10m

"Year of the lord" (not "our" lord, please double check what I wrote) is the literal translation of Anno Domini and it's the reason why years are counted from one instead of zero. I quoted the expression, wrote "lord" in lowercase and added "purported" to make it clear that the religious reference was only for historical reasons, I don't know what else I could have done.

I am also not American and not a native English speaker. In fact Romance languages say it in expanded form: "après Jésus-Christ", "dopo Cristo", "después de Cristo" (the abbreviation is only used in writing), so even speakers who are not religious very much know the reference even if they couldn't care less. We're stuck with it.

I agree that it's more of a "who wants to be a millionaire" quirk than something that actually matters, but this case of correcting someone was one of the few cases where it matters.

satvikpendem
0 replies
54m

You didn't read far enough because he specifically notes this. The 18th century started on January 1, 1701, therefore 1776 is indeed the 76th year of that century, not the 77th, as the year 1700 is part of the 17th century.

coldtea
6 replies
11h9m

The issue, of course, is that “counted centuries” are off by one from how we normally interact with dates—the 13th century starts in AD 1201. There’s a simple solution. Avoid saying “the 18th century”, and say “the 1700s” instead. Besides being easier to understand, it’s also slightly shorter.

The kind of person who cares and reads about "the Xth century" can also trivially understand the date range involved.

The kind of person who can't tell that the 18th century is the 1700s and the 21st century is the 2000s would get little good out of reading history anyway, unless they get the basics of counting, calendars, and so on down.

Perseids
2 replies
10h14m

I'm sorry, but that is just elitist bullshit. First, even if we accept your implicit premise, that it is a training hurdle only, there is enormous value in accessible science, literature and education. In our connected society and in a democracy everyone benefits from everybody else understanding more of our world. In software engineering we have a common understanding that accidental complexity reduces our ability to grasp systems. It's no different here.

Second, your implicit premise is likely wrong. Different people have different talents and different challenges. Concrete example: In German we say eight-and-fifty for 58. Thus 32798 becomes two-and-thirty-thousand-seven-hundred-eight-and-ninety, where you constantly switch between higher and lower valued digits. There are many people, me included, who not infrequently produce "Zahlendreher" – transposed digits – because of that, when writing those numbers down from hearing alone, e.g. 32789. But then, there are also people for whom this is so much of a non-issue that when they dictate telephone numbers they read them in groups of two: 0172 346578 becomes zero-one-seven-two-four-and-thirty-five-and-sixty-eight-and-seventy. For me this is hell, because when I listen to these numbers I need to constantly switch them around in my head with active attention. Yet others don't even think about it and value the useful grouping it does. My current thesis is that it is because of a difference between auditory and visual perception. When they hear four-and-thirty they see 34 in their head, whereas I parse the auditory information purely auditorily.

What I want you to take from my example, is that these issue might not be training problems alone. I have learned the German number spelling from birth and have worked in number intensive field and yet I continue to have these challenges. While I have not been deeply into history, I suspect that my troubles with Xth century versus x-hundreds might persist, or persist for a long time, even if I get more involved in the field.

coldtea
1 replies
9h25m

I'm sorry, but that is just elitist bullshit.

That's fine, the thought-stopping accusation of "elitism" doesn't bother me. It's a preoccupation for people preferring equality based on dumbing things down to the lowest common denominator, lest - god forbid - someone has to make an effort.

I don't think lowly and condescendingly of people like that, I think they're capable of learning and making the effort - they're just excused and encouraged not to.

People who actually have learning difficulties (because of medical conditions or other issues) or people from different cultures accustomed to other systems are obviously not the ones I'm talking about - and they don't excuse the ones without such difficulties, in countries that have used this convention for 1000+ years.

Second, your implicit premise is likely wrong. Different people have different talents and different challenges

The huge majority that confuses this doesn't do it because they have a particular challenge or because their talents lie elsewhere. They do it because they never bothered, same way they don't know other basic knowledge, from naming the primary colors to pointing to a major country on the map. They also usually squander their talents in other areas as well.

Besides, if understanding that the 18th century is the 1700s is "a challenge", then the rest of history study would be even more challenging. This is like asking to simplify basic math for people who can't be bothered to learn long division, thinking this will somehow allow them to do calculus.

card_zero
0 replies
3h16m

There are a lot of shibboleths and pointless conventions in the world. Flutists are called "flautists" because classical musicians aspire to be Italian, and if you don't pretend to be Italian too you'll embarrass yourself. Minute hands were originally long because they pointed to an outer dial of minutes, while hour hands were distinguished by being decorated, but now being slightly longer is just a stylization that means minute hand (even though "minute" means "small") and we have two pointers using the same dial for different enumerations, one without the relevant numbers and distinguished by fractional differences in its width and length. British English spellings are substantially French, and this is perpetuated as a matter of national pride.

Like the word shibboleth, these examples are all kinds of language. Even the clock hands are a sort of visual language. Nth century is another language element. The conventions make outsiders stumble, but for insiders they're familiar and shedding them would be disturbing. Over time they become detached from their origins, and more subtle and arbitrary.

In programming we have "best practise", which takes good intentions and turns them into more arbitrary conventions. These decisions are unworked again later by people saying "no that's dumb, I'm not going to do that", even if it is "how we do it" and even if learning it is a sign of cleverness. We have to be smart to learn to do dumb pointless things like all the other smart people.

Is this good? Keeps us on our toes, maybe? Or keeps us aligned with bodies of knowledge? I think it's definitely good that we have the force of reformist skeptics to erode these pointless edifices, otherwise we'd be buried in them. But new ones are clearly being built up, naturally, all the time. Is that force also good? Alright, yes, it probably is. Put together, this is a knowledge-forming process with hypothesises (I don't like using Latin plurals, personally) and criticisms, and it's never clear whether tradition or reform is on the dumb or overcomplicated side: it remains to be seen, as each case is debated (if we can be bothered).

yard2010
0 replies
8h58m

My high school history teacher taught me a trick. Just subtract 1 from the century to get the correct year. I never remember if it's subtract or add though, so I'm trying both with the 21st century and see if I'm right first.

cobbaut
0 replies
5h16m

I was expecting this as the top comment. We learned about centuries in school when I was 9 or 10 or something and nobody found this a hard issue to tackle.

UberFly
0 replies
10h36m

About 20% would understand you. About 50% would tell you the 18th century was the 1800s, and 30% would just stare at you confused.

James_K
6 replies
1d

Another ambiguity, though perhaps less important, is that "2000s" could refer to 2000-2999.

MarkLowenstein
1 replies
22h14m

Could make the convention to say "the twenty hundreds" when referring to [2000, 2100), and "the two thousands" for [2000, 3000).

kelnos
0 replies
13h21m

I actually really like that!

But then how about [2000, 2010)? During the current time period, I expect people are more likely to refer to the decade rather than the century or millennium.

whycome
0 replies
21h57m

life in the 20s is weird.

riffic
0 replies
21h0m

Alternatively: the third millennium.

lucb1e
0 replies
17h41m

Took me a while to understand what's ambiguous about that. For anyone else, what they're saying is that this often (typically, I guess) refers to 2000-2099 and not the other 900 years.

kelnos
0 replies
13h24m

Not "another"; the article addresses this, and admits to not having a good solution, aside from just writing out the actual dates as you did.

kleiba
5 replies
22h20m

Meh, too small of an issue to be bothered about it.

OmarShehata
4 replies
22h16m

small issues are easy to solve!

I think it isn't as sexy/interesting as what I thought the article was going to be about (about a different way of talking about our history, in eras maybe vs centuries or something).

this strikes me as kind of like a small PR to our language that makes an incremental improvement to a clearly confusing thing. Should be easy to merge :)

kleiba
0 replies
5h18m

Maybe not easy to solve, but not of enough importance or impact to justify spending an extended amount of time changing the current practice.

My original post got downvoted, but I stand by it because, empirically, it seems clear: the current practice may not be perfect but has obviously stood the test of time. So let's move on, there are more important things to worry about.

kergonath
0 replies
19h8m

Should be easy to merge :)

Famous last words.

Seriously, though, we should have learnt at this point: we cannot solve social issues with technology. Everybody is working on a different tree, and the social dynamics that govern the spread of idioms, and which fork you cherry-pick from or merge, are much more complex than what git is designed to handle.

bowsamic
0 replies
12h22m

small issues are easy to solve!

Citation heavily needed

AnimalMuppet
0 replies
20h42m

No, not easy. You want to change a social convention? That is not easy.

Even changing all the places it got encoded into software probably wouldn't be easy.

croemer
5 replies
16h46m

I think this is incorrect. Don't centuries start with the 00? In that case the first year of a century is 0, and the 76th year would be 75, not 76 as the author writes:

starting in 1776, not in the 76th year of the 18th century.

mason55
2 replies
16h39m

No, because there was no year zero. The first century started in the year 1 AD.

labster
1 replies
16h23m

There was an astronomical year zero, it’s mainly historians who don’t want the year zero to happen.

croemer
0 replies
15h36m

And there is a year zero in the ISO 8601:2004 system, the interchange standard for all calendar numbering systems (where year zero coincides with the Gregorian year 1 BC; see conversion table). [via Wikipedia]

globular-toast
0 replies
10h30m

I had no idea that the "21st century" started in 2001. Totally thought it started in 2000. Is that why so many things refer to 2001? Mind blown...

t_mann
4 replies
21h55m

There’s no good way to refer to 2000-2009

I like the German Nullerjahre (roughly, the nil years). Naught years or twenty-naughts works pretty well too imho.

Georgelemental
3 replies
19h13m

In the USA we say “the aughts”

kelnos
2 replies
13h26m

Speak for yourself. I refuse to use that abominable construct!

zuminator
1 replies
10h57m

I'd agree to the extent that it sounds kind of twee and affected, but what would you use in its place?

greenbit
0 replies
7h21m

The "ohs". Twenty-ohs. Rhymes with that oaty breakfast cereal.

ash
4 replies
9h3m

The article does not really resolve the ambiguity with 2000s which usually means 2000-2009, not 2000-2099.

That said, in the Finnish language people never count centuries. It's always "2000-luku" and "1900-luku", not 21st and 20th.

tetris11
2 replies
8h34m

This leaves ambiguous how to refer to decades like 1800-1809. For these you should ~~specify the wildcard digits as “the 180*s” manually~~ write out the range. Please do not write “the 181st decade”.

I think it wouldn't be wrong to say "the 1800s decade" or "the primus 2000s" or "the alpha 1900s"

ash
1 replies
8h22m

I don't find it persuasive. First of all, people are not going to start writing and saying wildcards. Second, there is an established convention to say "2000s" and mean 2000-2009.

tetris11
0 replies
1h32m

no, that part was crossed out as a joke (hence my attempt to add '~' before and after the crossed out section)

danschuller
0 replies
7h49m

It means that now but that meaning will fade away as the decades advance.

BurningFrog
4 replies
13h31m

Immigrating from a country that uses "1700s", it probably took a decade before I had internalized to subtract 1 to get the real number.

I will resent it till I die.

bowsamic
1 replies
12h23m

I find it weird when people take a long time for these little things. My wife still struggles with the German numbers (85 = fünfundachtzig) and the half thing with time (8:30 = halb neun) even though I managed to switch over to those very quickly. I think it depends on the person how hard it is

BurningFrog
0 replies
10h5m

I think this one was hard for me because centuries don't come up that often.

If it was something I saw/heard every day, I would have adapted much faster.

dinkumthinkum
0 replies
1h43m

Resent is a little strong isn’t it? I feel like today we have such a victim culture that we are oppressed by the most trivial of matters.

Too
0 replies
12h6m

Here we say something like the "nineteen-hundred-era" for the 1900s, "nineteen-hundred-ten-era" for the 1910s, "nineteen-hundred-twenty-era", etc. In writing: 1900-era, 1910-era, 1920-era. The most recent decades are referred to with only the "70-era" for the 70s. The word for age/epoch/era in our language is a lot more casual in this setting.

The 20xx vs 200x does indeed leave some room for ambiguity in writing, verbally most people say 20-hundred-era vs 20-null-null-era.

hamasho
3 replies
17h2m

It's funny that for me it feels right to associate the 20s with 2020 and the 40s with 1940, but somehow, the 30s is very foreign, and I can't think of 1930 or 2030 either.

kelnos
2 replies
13h27m

Funny, when you say "20s" I think of the 1920s (aka the "Roaring 20s" here in the US). I wonder if it's an age thing (I'm in my 40s), or perhaps a cultural or regional thing.

The 30s as 1930s seems pretty solid to me: US Great Depression, start of WWII, not to mention many of the smaller conflicts that led up to it.

greenbit
0 replies
7h19m

When you find yourself putting a sticker that says 2030 on your license plate, chances are 30s won't mean 1930s very much longer.

globular-toast
0 replies
10h32m

I definitely think of the 1920s too, although I'm only a bit younger. I don't get why anyone would refer to now as the 20s because it's still happening and hasn't necessarily been shaped yet. That and it's close enough to still refer to individual years.

What is odd is that I never feel the need to refer to the 00s. I've never said "naughties" or anything else. In the 90s, people already talked about the 80s, but for me it's all been pretty much the same since the 00s. I don't even know what I'd talk about from the 00s. The music? 2001 by Dr. Dre came out in 1999. It could have been made today. Fight Club was 1999. Could have been made today. The fashion? People have straight hair and wear t-shirts still today.

TheAceOfHearts
3 replies
3h45m

Another example of confusing numeric systems emerges from 12-hour clocks. For many people, asking them to specify which one is 12AM and which one is 12PM is likely to cause confusion. This confusion is immediately cleared up if you just adopt a 24-hour clock. This is a hill I'm willing to die on.
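
The 12-o'clock quirk is easy to see when you write the conversion out; a small sketch (purely illustrative):

    def to_24h(hour_12: int, pm: bool) -> int:
        # The confusing part: 12 AM is 00:00 (midnight) and 12 PM is 12:00 (noon).
        if hour_12 == 12:
            return 12 if pm else 0
        return hour_12 + 12 if pm else hour_12

    assert to_24h(12, pm=False) == 0   # 12 AM = midnight
    assert to_24h(12, pm=True) == 12   # 12 PM = noon
    assert to_24h(8, pm=True) == 20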

runarberg
0 replies
2h17m

You usually know it from context, and if not, 12 noon or 12 midnight is quite common.

But I do wish people would stop writing schedules in the 12-hour system. You get weird stuff like bold meaning PM, etc., to compensate for the space inefficiency of the 12-hour system.

flakes
0 replies
1h37m

A few months ago, my girlfriend and I missed a comedy show because we showed up on the wrong day. The ticket said Saturday 12:15am, which apparently meant Sunday 12:15am, as part of the Saturday lineup. Still feel stupid about that one.

cvdub
0 replies
2h8m

That’s why I always say “12 noon” when writing out scheduling instructions for something at 12PM.

MobiusHorizons
3 replies
7h22m

there's no good way to write 2000 - 2009

I have heard people use "the aughts" to refer to this time range [1]. I guess if I was trying to be specific about which century one could say "the two thousand aughts" or "the eighteen hundred aughts". But I think in that context I'd be more likely to say "in the first decade of the 1800s"

1: https://en.wikipedia.org/wiki/Aughts

navane
2 replies
6h10m

I call them the two thousand zeroes (2000s). Or, when people ask me what year my car is from, I say two thousand zero -- emphasizing the smallest digit.

PopAlongKid
1 replies
5h10m

Why not twenty hundred for 2000, just like 1900 is nineteen hundred?

Similarly, I never understood why people still say things like "two thousand ten" instead of twenty-ten for year 2010. No one ever went around saying "one thousand nine hundred ten" for 1910, did they?

navane
0 replies
3h10m

I don't think they said ten hundred for the year 1000 either. I do use the twenty-xx form for individual years though.

James_K
3 replies
1d

One point, the singular they has been in use for centuries, where this essay suggests it's a recent invention.

networked
2 replies
23h52m

The essay doesn't really say anything about when the singular "they" was invented. What it says is that it used to be low-status and unsophisticated language.

In the 1970s, fancy people would have sniffed at using “they” rather than “he” for a single person of unknown sex like this. But today, fancy people would sniff at not doing that. How did that happen?

I think “they” climbed the prestige ladder—people slowly adopted it in gradually more formal and higher-status situations until it was everywhere.
dahart
0 replies
22h21m

The essay’s narrative is overly simplified and misleading about details. Singular they has been common for centuries. The idea that it was lower status is a more recent invention. The author might be referring more to the use of "they" as a personal pronoun. Anyway, whatevs, language changes, and that part of the author's message is good.

Tao3300
0 replies
4h19m

Yeah, I'm old enough that I remember being taught not to use singular they in elementary school, but I'm young enough that he/she was given as the preferred alternative.

selimnairb
2 replies
6h46m

We should also stop talking about decades for the purposes of periodization. For example, the '80s arguably ran from the election of Reagan in 1980 to the appointment of George W. Bush by the Supreme Court in 2000. Politically, economically, and culturally, the '90s were an elaboration on the '80s (e.g. the deregulation and market reforms of Clintonism just being a whitewashed expansion of what Reagan did).

navane
0 replies
6h17m

The nineties clearly started in '89 with the fall of the Berlin wall and consequent "victory" of the cold war, and ended with the dotcom crash in 2000; the cyclical nature of economic growth wasn't bested after all.

ineedaj0b
0 replies
6h21m

The presidency of Bill Clinton was a much different time than that of Reagan in the US.

runarberg
2 replies
22h8m

There’s no good way to refer to 2000-2009, sorry.

The author is wrong here. The correct way (at least in spoken West Coast American English) is the Twenty-aughts. There is even a Wikipedia page dedicated to the term: https://en.wikipedia.org/wiki/Aughts If you want to be fancy you could spell it like the 20-aughts. I suppose there is no spelling it with only digits+s though, which may be what the author was looking for.

whycome
0 replies
21h57m

I think you actually just agreed with them.

kelnos
0 replies
13h19m

I'm not sure why, but every time I hear someone use that term, I cringe. The word just feels... off... to me for some reason. Like it's an abomination.

networked
2 replies
1d

This leaves ambiguous how to refer to decades like 1800-1809.

There is the apostrophe convention for decades. You can refer to the decade of 1800–1809 as "the '00s" when the century is clear from the context. (The Chicago Manual of Style allows it: https://english.stackexchange.com/a/299512.) If you wanted to upset people, you could try adding the century back: "the 18'00s". :-)

There is also the convention of replacing parts of a date with "X" characters or an em dash ("—") or an ellipsis ("...") in fiction, like "in the year 180X". It is less neat but unambiguous about the range when it's one "X" for a digit. (https://tvtropes.org/pmwiki/pmwiki.php/Main/YearX has an interesting collection of examples. A few give you the century, decade, and year and omit the millennium.)

Edit: It turns out the Library of Congress has adopted a date format based on ISO 8601 with "X" characters for unspecified digits: https://www.loc.gov/standards/datetime/.
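
As a rough illustration of how such "X" placeholders could be expanded into year ranges (my own sketch, not the Library of Congress implementation):

    import re

    def expand_x_year(spec: str) -> range:
        # "180X" -> 1800..1809, "18XX" -> 1800..1899; each X is one unknown digit.
        if not re.fullmatch(r"\d+X*", spec):
            raise ValueError(f"unsupported year pattern: {spec!r}")
        unknown = spec.count("X")
        low = int(spec.replace("X", "0"))
        return range(low, low + 10 ** unknown)

    assert expand_x_year("180X") == range(1800, 1810)
    assert expand_x_year("18XX") == range(1800, 1900)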

dkdbejwi383
1 replies
10h30m

How do you speak it though? The “oh ohs”? “Noughts”? “Zeroes”?

PopAlongKid
0 replies
5h8m

Aughts.

ggm
2 replies
10h58m

Pick your battles. Is this easier or harder to win on than de-gendering romance languages?

How about Americans stop with dropping the "and" in nineteen hundred AND twenty?

psychoslave
0 replies
7h57m

I'm close to finishing a linguistic project where I provide a different paradigm perspective, in which gender is a subcategory of grammatical geste, and I give five additional flections to all nouns in French that refer to living entities, which usually decline under only two genders at most.

Maybe I should start thinking about my next battle. :)

UberFly
0 replies
10h39m

Nineteen Twenty is just so much more to the point. Don't have time for those dang extra baggage words.

Isamu
2 replies
1d

Well yeah, most of the time if I want to be understood I will say “the 1700s” because it is straightforward to connect with familiar dates.

We still say “20th century” though because that’s idiomatic.

macintux
1 replies
22h25m

The author's point is that idioms change, and this one should.

Isamu
0 replies
6h47m

I think you misunderstand me. I mean ONLY the specific idiom “the 20th century” because we’ve just spent more than a century using that and it is unambiguous, it’s clear.

Use “the 1700s” and the like for all other cases.

wryoak
1 replies
23h33m

I thought this article was railing against the lumping together of entire spans of hundreds of years as being alike (i.e., we lump together 1901 and 1999 under the name "the 1900s" despite their sharing only numerical similarity), and I was interested until I learned the author's real, much less interesting intention.

endofreach
0 replies
22h47m

Many people find their own thoughts more interesting than the ones of others. Some write. Many don't.

redhippo
1 replies
16h19m

I thought the exact same thing... when I was eight. Then I just learned the mental gymnastics of hearing, say, "20th century" and associating it with "1900" and got along with it. Really, it's a bit dated, but if people can function with miles and furlongs, they can handle centuries...

kelnos
0 replies
13h25m

It's funny, I'm in my 40s and my brain still pauses longer than I'd expect is necessary to do that conversion. I think for 21st or 20th century I'm fine (since I've lived in both of them), but anything prior and it takes me a beat to figure out which date range it is.

nurtbo
1 replies
14h18m

The aughts or naughts (or aughties) are a pretty easy to understand way to refer to 2000-2009, though saying “the early aughts” is clearly more verbose than saying 2000-2003 (except that 2000-2003 looks more specific than is meant)

kelnos
0 replies
13h38m

I think that last point, "there’s no good way to refer to 2000-2009, sorry", was a bit tongue in cheek, refusing to acknowledge "the aughts", since it is a terrible, terrible, stupid way to refer to anything.

karaterobot
1 replies
17h37m

I have to say that I don't think this is really a problem.

kelnos
0 replies
13h26m

When compared to climate change, sure, not really. But it's definitely annoying sometimes.

istrice
1 replies
3h44m

There is no such thing as a word for "1700s" in most European languages.

Also in English it sounds weird, as you have to pronounce it "seventeen-hundreds" whereas the correct pronunciation is "one-thousand-seven-hundred". So 1700s is unsuitable for formal writing or speaking and doesn't map naturally to most languages of Western civilization.

But yeah, I guess the author finds it hard to subtract 1 in his mind :) I could go off about the typical US-centric arrogance that I see on this site, but I think it's already pretty funny as it is.

tyg13
0 replies
2h21m

Seventeen-hundreds doesn't sound stilted at all, to this native English speaker. I would use this even in a very formal context, and certainly no one would bat an eye. One-thousand-seven-hundreds is almost certainly incorrect in English -- I would actually find this to be a very clear marker of not speaking English very well. Are you a native speaker? I find your claims rather bold and quite frankly incorrect.

Also, it was curious to find out (from elsewhere in this thread) that Finnish does not typically use centuries. Rather, they use a construction that maps directly to 1700s (1700-luku). I would be careful about accidentally applying your own cultural bias when accusing others of the same ;)

dash2
1 replies
2h9m

I use "the 1900s" to mean 1900-1910.

I understand the difficulty, but I don't think it is too terrible for us to get used to it and we aren't gonna change the past 500 years of literature that already did this.

Also, it's ironic for a bunch of people who literally count arrays from zero to be complaining about this... :-P

fsckboy
0 replies
53m

Counting arrays from zero was C's contribution to the public consciousness (inherited from BCPL, and perhaps from ASM). People here generally hate C, so it's triggering and they are trying to forget[1]

[1] this wasn't a footnote, it was forget array index 1.

cat_multiverse
1 replies
5h38m

Great article. In my Master's and PhD, despite being in a stodgy philological field, I always opted for this for clarity and conciseness. It can be hard for people to let go because they want to sound clever.

But oh, dear writer, slightly irksome that you learned copyediting but do not use en-dashes for your date ranges!

elric
0 replies
5h29m

Saying stuff like "the seventeen hundreds" works in English, but it doesn't necessarily work in other languages. In Dutch that would be "de jaren zeventienhonderd", which sounds like crap compared to "de achttiende eeuw".

because they want to sound clever

Citation needed. I've never considered "nth century" to be a product of trying to sound clever.

cabalamat
1 replies
4h17m

You need to count from 0.

1 BC should be renamed year 0. Then the years 0-99 are the 0th century, the years 1900-1999 are the 19th century, etc.

To avoid confusion between new style and old style centuries, create a new word, "centan", meaning "100 years" and use cardinal instead of ordinal numbers, for conciseness. Then the years 1900-1999 are the 19-centan.

im3w1l
0 replies
3h32m

It's always fun to debate how to square circles: something has to give, but what? My proposed solution is to make the first "century" 99 years.

articlepan
1 replies
2h58m

Solution for 2000-2010: "the first decade of the 2000s".

I realize this is counting by decades/centuries again, but if we just do it for the first decade/century under a larger span it's easy to read.

ysofunny
0 replies
2h57m

nonetheless, typing out, and even saying "the first decade of the 2000s" takes too long

we need something shorter, like the double-ohs "00s"

yard2010
0 replies
9h1m

There are only 3 tough problems in engineering: 1) naming stuff 2) off-by-one errors

wwilim
0 replies
1d

1800-1899 is the eighteen-hundreds, 1800-1809 is the eighteen-noughties. Easy.

treve
0 replies
12h8m

This was confusing to me as a kid, especially as we entered the 21st. I also still remember learning about the Dutch golden age in elementary school, but can't remember if it was the 1600s or 16th century.

I'm running into a similar issue recently. Turns out that many people saying they are '7 months pregnant' actually mean they are in the 7th month, which starts after 26 weeks (6 months!)

transfire
0 replies
12h57m

We just need a new word to mean “0-indexed count of centuries”.

Not sure what a good word for this would be, but maybe just use what we already say — “hundreds”.

So, in the late 17th hundreds, …

the__alchemist
0 replies
4h29m

Confusing convention. I think most pre teens realize this when learning history, then move on to accept it as a quirk. I would prefer we stop propagating it, as the author says. Don't accept confusing notation when a better alternative is also in common use!

tempodox
0 replies
12h33m

Meh. I acknowledge that the author can split a hair with their bare hand while blindfolded. But to convince everyone else they would have to lift the rest of the world to their level of pedantry.

szundi
0 replies
13h1m

Resistance is futile

samatman
0 replies
15h3m

Technically, decades and centuries start in a January with one or two zeros at the end, respectively. So the 1700s and the 18th century are exactly the same interval of time.

ISO 8601-2:

Decade: A string consisting of three digits represents a decade, for example “the 1960s”. It is the ten-year time interval of those years where the three specified digits are the first three digits of the year.

Century: Two digits may be used to indicate the century which is the hundred year time interval consisting of years beginning with those two digits.
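
Read literally, those two definitions are just digit truncation; a tiny sketch of the idea for four-digit years (function names are mine, not the standard's):

    def iso_decade(year: int) -> str:
        # First three digits of a four-digit year: 1969 -> "196" (the 1960s).
        return str(year)[:3]

    def iso_century(year: int) -> str:
        # First two digits: 1776 -> "17" (the 1700s / 18th-century interval).
        return str(year)[:2]

    assert iso_decade(1969) == "196"
    assert iso_century(1776) == "17"
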
renewiltord
0 replies
18h22m

I'm going to count centuries but just call it the 0th century until 100 CE. I anticipate no problems.

psychoslave
0 replies
8h32m

A bit disappointing, as I was expecting something far more disruptive like an alternative calendar that makes century as a notion a useless tool.

I wish we had some calendar with a departure point far less anthropocentric. So instead of all the genocides of the Roman empire, each look at a calendar would be an occasion to connect to the vastness of the cosmos and the vacuity of all human endeavors in comparison to that.

pentagrama
0 replies
13h55m

I got into a fight recently with a philosophy teacher about that. I changed the dates like the OP did to be clearer in my writing, and she took it very seriously; it became a big fight about clarity vs. tradition, but really superficial and mean on both sides. Now I wish I had been* more articulate and had a good debate. I wrote it my way on the final exam and passed; she had to deal with it, I guess.

* Sorry, I don't know how to write that in past, like haber sido in Spanish, my main language.

osigurdson
0 replies
17h31m

If I lived much longer than 100 years I might care more about the precision of such language. However, as it stands, I know what people mean when they say "remember the early 2000s". I know that doesn't mean the 2250s for example - a reasonable characterization for someone living in 3102 perhaps.

nvader
0 replies
10h58m

There’s no good way to refer to 2000-2009, sorry.

"Noughty", Naughty!

layer8
0 replies
17h17m

As a kid I came across a book titled “Scientists of the 20th century”, and I was intrigued how the authors knew about future scientists.

kingkawn
0 replies
13h51m

This is pretty easy to understand if you try whatsoever

kazinator
0 replies
7h35m

No one gets confused about what “the 1700s” means.

Well, there are a few longtermists who do. Do you mean the 01700s? Or some other 1700s that have been cut to four digits for conversational convenience?

:)

When someone's salary depends on being confused, there are ways to dissuade them. But when it's their hobby, forget it.

ivanwood
0 replies
6h3m

Nobody* outside of these circles cares about these questions, or spends hours arguing about them.

* You know, the ones you weirdos call 'normies'

huma
0 replies
19h42m

Thankfully, most of us quit writing centuries in Roman numerals; it's about time we quit centuries as well :) Sadly, however, the regnal numbers continue to persist

hcfman
0 replies
1h27m

Because humanity won’t make it through another century ?

frithsun
0 replies
12h55m

I am emotionally invested in having been born in "the twentieth century" instead of "the 1900s."

dudeinjapan
0 replies
16h0m

There’s no good way to refer to 2000-2009, sorry.

In terms of music this is true.

dan-robertson
0 replies
9h20m

Terms like the ‘long 18th century’ feel like they make less sense when talking about 1700s. Though that mightn’t outweigh the confusion from the ordinals.

carrja99
0 replies
5h39m

We reset them every time a religious figure comes along anyway.

bowsamic
0 replies
12h21m

No. I like the current system and I will continue to use it even if OP somehow manages to make everyone switch over, which they won’t manage to

bee_rider
0 replies
12h59m

I’ve just taken to writing things like: 201X or 20XX. This is non-standard but I don’t care anymore, referencing events from 20 years ago is just too annoying otherwise.

In spoken conversation, I dunno, it doesn’t seem to come up all that often. And you can always just say “20 years ago” because conversations don’t stick around like writing, so the dates can be relative.

arp242
0 replies
21h44m

I've always written it like "1900s", and always considered "20th century" to be confusing. Having to mentally do c-- or c++ is confusing and annoying.

I deal with the "2000s-problem" by using "00s" to refer to the decade, which everyone seems to understand. Sometimes I also use "21st century"; I agree with the author that it's okay in that case, because no one is confused by it. For historical 00s I'd probably use "first decade of the 1700s" or something along those lines. But I'm not a historian and this hasn't really come up.

adolph
0 replies
1m

The issue, of course, is that “counted centuries” are off by one from how we normally interact with dates—the 13th century starts in AD 1201.

The author misses the mark. The author would like to skip some significant interpretive steps, chiefly when our dynamically typed language uses number words or characters in different contexts. Suggested reading is about the differences between, and use cases of, nominal, ordinal, interval, and ratio scales.

https://www.statisticshowto.com/probability-and-statistics/s...

acheron
0 replies
22h11m

Sure, then we can switch to French Revolutionary metric time.

_dain_
0 replies
4h9m

They should have names:

    - 1500s: The Columbian Century
    - 1600s: The Westphalian Century
    - 1700s: The Century of Enlightenment
    - 1800s: The Imperial Century
    - 1900s: The Century of Oil
    - 2000s: The Current Century (to be renamed in 2100)
You might think it's Eurocentric, and you'd be right. But every language gets to name them differently, according to local history.

Y_Y
0 replies
7h36m

Please do not write “the 181st decade”.

Well now I have to

Grom_PE
0 replies
20h36m

I agree. Another good point to get rid of counting centuries would be that in some languages (Russian) centuries are written in Roman numerals. It's annoying having to pause and think of conversion.