The author makes a good point: "1700s" is both more intuitive and more concise than "18th century". The very first episode of Alex Trebeck's Jeopardy in 1984 illustrates how confusing this can be:
https://www.youtube.com/watch?v=KDTxS9_CwZA
The "Final Jeopardy" question simply asked on what date did the 20th century begin, and all three contestants got it wrong, leading to a 3-way tie.
Logic is in short supply and off-by-one errors are everywhere. Most people don't care. I think it's more doable to learn to just live with that than to reprogram mankind.
Oooor we can slowly migrate towards sensibility, as we did with Celsius and centimeters.
Re temp, I’m glad we use F for daily life in the USA. The most common application I have for temperature is understanding the weather, and I like the 0-100 range of F, as that’s the typical range for weather near me.
For scientific work I obviously prefer kelvin.
Celsius is nearly useless.
That's like ... your opinion man.
Personally I like knowing that water boils at exactly 100 degrees.
At sea level, yes :)
I do agree, though I live in Europe and C is the norm. I could never wrap my head around F.
That said, I think 0 is more important in daily life, below or above freezing. How much is that in F again?
As a dweller of a cold place in the USA, F is pretty handy because "freezing" isn't terribly cold. Having 0F be "actually quite seriously cold" is useful.
My parents care a lot about "przymrozek" (ground frost), which is when it gets sub-zero C at night and you need to cover the plants, close the greenhouse doors, and put a heater there so the plants survive. Warnings are given on the radio when this happens outside the regular winter months.
There's also a special warning for drivers if it was sub-zero, because then the water on the roads freezes and it's very hard to brake.
I'd say it's a far more important distinction than anything that F makes obvious.
Also, conveniently, freezer temperature is 0F not 32F.
We just need a new scale just for weather where 100 is 100F and 0 is 32F/0C then everyone can be happy. We'd have a lot more days with subzero temperatures though
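For fun, the proposed scale is just a linear rescale pinned at those two points, which also shows where the extra subzero days come from. A toy sketch in Python (the function name is made up):

```python
# Joke weather scale: 0 on the new scale = 32F (0C), 100 = 100F.
def f_to_new(f: float) -> float:
    return (f - 32) * 100 / 68  # linear map through the two pinned points

print(f_to_new(32))   # 0.0
print(f_to_new(100))  # 100.0
print(f_to_new(0))    # ~ -47.1, hence many more "subzero" days
```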
You just use one thing and you’ll learn it. When I was a kid my country changed from archaic 12-point “wind levels” to m/s. It took everybody a few weeks to adjust, but it wasn’t hard. It was a bit harder for me after moving to America to adjust to Fahrenheit, but as you experience a temperature and are told it is so many Fahrenheit, you’ll just learn it. I have no idea at what temperature water boils in F, simply because I never experience that temperature (and my kettle doesn’t have a thermometer).
That said, I wish the USA would move over to the unit everyone else is using, but only because everyone else is using it; that is the only thing that makes it superior, and it would take Americans at worst a couple of months to adjust.
I agree. For ambient temperatures, F offers twice the resolution in the same number of digits. It also reflects human experience better: 100F is damn hot, and 0F is damn cold.
Celsius is for chemists.
There's very little difference between e.g. +25°C and +26°C; not sure why you would need even more precision in day-to-day life. There are decimals if you require that for some reason.
Celsius works significantly better in cold climates for reasons mentioned in another comment.
If that’s the case, why do the Celsius thermostats I used while on vacation in Canada have 0.5C increments? The decimals are used because the change between 25C and 26C is actually pretty big :)
In my old apartment, the difference between 73F and 74F was enough to make me quite cold or hot, and that’s a difference of about 0.5C. I’m not arguing that Fahrenheit is better, but I definitely prefer it for setting my thermostat (which is a day-to-day thing). Then again, I grew up using it, so that could be why I prefer it.
Probably because they were made for the US market and just had the labels changed? I've never seen a thermostat with 0.5 C increments in Europe.
I might be able to tell you whether it's 23 or 27; I certainly can't tell a 1 C difference.
The difference between -1 C and +1 C is VASTLY more important in daily life than the difference between 26.5 and 27 C.
Farmers, drivers, and people with gardens need to know if it will get below zero at night.
Nobody cares if it's 26.5 C or 26 C.
What the hell are you talking about? If it's 0°C outside (or below), I know that it's high time to put winter tires on, because the water in the puddles will freeze and driving on summer tires becomes risky. I had to look it up, and apparently that's +32°F. Good luck remembering that.
+10°C is "it's somewhat cold, put a jacket on". +20°C is comfortable in light clothing. +30°C is pretty hot. +40°C is really hot, put as little clothing as society permits and stay out of direct sun.
Same with negatives, but in reverse.
Boiling water is +100°C; melting ice is very close to 0°C. I've used that multiple times to adjust digital thermometers without having to look anything up.
It's the most comfortable system I can imagine. I tried living with Fahrenheit for a month just for fun, and it was absolutely not intuitive.
You'll want winter tires on well before the air temperature hits freezing for water. Forecasts aren't that predictable, and bridges (no earth heat sink underneath) will ice over before roads do.
40 F is a good time for getting winter tires on.
As someone who lives in a humid, wet area that goes from -40 at night in winter to 100+ F in summer, I also vastly prefer Fahrenheit.
The difference between 60, 70, 80 and 90 is pretty profound with humidity, and the same is true in winter. I don't think I've ever set a thermostat to freezing or boiling, ever. All of my kitchen appliances have numbers representing their power draw.
Well, it's been working fine for me for about 15 years, let's agree to disagree here. I would still find it easier to remember to change the tires at +1°C than whatever the hell it comes down to in Fahrenheit.
I too live in a region with 80 (Celsius) degree yearly variation (sometimes more; the maximum yearly difference I've lived through is about 90 degrees IIRC: -45 in January to +43 in July), and Fahrenheit makes absolutely no sense to me in this climate.
Winter tires are less to do with freezing water and more to do with the way the rubber compound in summer tires hardens and loses elasticity (and therefore grip) at lower temperatures, around 7 degrees Celsius.
I don't see enough love for feet and inches.
A foot can be divided cleanly into 2, 3, 4, and 6. Ten is a really sucky number to base your lengths on. It only divides nicely into 2 and 5.
"Celsius is nearly useless."
http://i.imgur.com/3ZidINK.png?1
For anyone not living in the US or Jamaica or Belize, it is Fahrenheit that is completely useless. That's something like 7.7 billion people.
0 = the freezing temp of water is a hugely useful heuristic for anyone living in a moderate climate.
95% of the world uses Celsius without problems because they're used to it. You'd either also be fine with it, or you belong to the sub-5% who couldn't figure it out; take your pick.
For me the best feature of Celsius, the one that makes it much better for weather, is the zero at the freezing point of water. Everything changes in life when water starts to freeze: roads get slippery, pipes burst, crops die. So it is important that such a crucial threshold is represented numerically in the scale. In other words, going from 5 to -5 in Fahrenheit is just getting 10° colder, nothing special, while going from 2 to -2 in Celsius is a huge change in your daily life.
I agree that for weather F is better, but I don't think it's so much better as to be worth having two different temp scales, and unlike K, C is at least reasonable for weather, and it works fine for most scientific disciplines.
At least conversion between Celsius and Kelvin is easy and lossless.
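For what it's worth, both conversions are simple affine maps, which is why Celsius-to-Kelvin (a pure offset) is lossless. A minimal sketch in Python (the function names are mine):

```python
# Celsius to Kelvin: a pure offset, exact by definition.
def c_to_k(c: float) -> float:
    return c + 273.15

# Celsius to Fahrenheit: a scale change plus an offset.
def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

print(c_to_k(10.0))  # 283.15
print(c_to_f(10.0))  # 50.0
```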
It’s been tried. The “rational” calendar reform was something of a failure.
That's what most people think and the world keeps trucking along.
It's the rare people that don't who actually change the world.
You can change the world if you make it easier to meet a need enough people have. Persuading everyone they're holding it wrong is not that.
Those are exactly the same thing if you convince everyone they are holding it wrong by holding it right.
e.g. Fosbury flop (https://en.wikipedia.org/wiki/Fosbury_flop)
"You can change the world if you make it easier to meet a need enough people have"
True and should not be forgotten in this debate.
But clear communication is a need many people have.
Persuasion by argument, maybe not. But if you simply ask for clarification when you hear "nth century" but not when you hear "n-hundreds" then you've effectively made it easier for the speaker to meet their need one way over the other way.
Same thing for "this weekend" when. Not spoken during a weekend.
Nobody's asking to reprogram anyone. Just stop using one of the two conventions. The reason to do it is simple and obvious. I'm really baffled at the responses here advocating strongly for the current way. But I guess that's just a "people thing".
Asking me to stop using my preferred convention is tantamount to 'reprogramming' me.
Reprogramming mankind is unreasonable. Reprogramming you may not be.
I'm amazed you didn't even hedge by saying "telling me to"; claiming that a request to shift convention is tantamount to reprogramming is certainly a bold, provocative claim.
What's "more logical" about "the seventeenth century" compared to "the sixteen hundreds"?
I’d say more sensible. It’s always weird to me to use the number 17 to talk about years that start with 16. Makes more sense to just say the 1600s.
After their 16th birthday, the person is going through their 17th year.
Just like 11:45 can be read as "a quarter to 12".
The publishing industry already has style guides for large swaths of the industry.
Imagine, for a moment, that AP adopted the OP's "don't count centuries" guidance. An enormous share of English-language publishing outfits would conform to the new rule in all future publications. Within a couple months, a large share of written media consumption would adopt this improved way of talking about historical time frames.
The best part? Absolutely no effort on the part of the general public. It's not like the OP is inventing new words or sentence structures. There's zero cognitive overhead for anyone, except for a handful of journalists and copywriters who are already used to that kind of thing. It's part of their job.
I think a lot of people take these sorts of ideas to mean "thou shalt consciously change the way you speak." In reality, we have the systems in place to make these changes gradually, without causing trouble for anyone.
If you don't like it, nobody's trying to police your ability to say it some other way - even if that way is objectively stupid, as is the case with counting centuries.
Why not just fix the calendar to match what people expect?
There was no time when people said "this is year 1 AD". That numbering was created retroactively hundreds of years later. So we can also add year 0 retroactively.
Specifically I agree, but generally I disagree. I’m very glad we got the metric system, standards for commonly used protocols and so on.
You just made me realize that the common saying “the eleventh hour” isn’t what anyone thinks it is
The right answer was, and still is: Jan 1, 1901
No, the first century began Jan 1, 0000. Whether that year actually existed or not is irrelevant - we shouldn't change our counting system in the years 100, 200 etc.
The calendar goes from 1 BC to 1 AD, there is no year 0.
There is no year zero according to first-order pedants. Second-order pedants know that there is a year zero in both the astronomical year numbering system and in ISO 8601, so whether or not there is a year zero depends on context.
It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.
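The mapping between the historical (BC/AD) and astronomical conventions is tiny; a minimal sketch in Python, assuming the standard astronomical convention (0 = 1 BC, -1 = 2 BC, and so on; the function names are made up):

```python
def to_astronomical(year: int, era: str) -> int:
    # Historical "n BC" becomes astronomical 1 - n; AD years are unchanged.
    return year if era == "AD" else 1 - year

def from_astronomical(y: int) -> str:
    return f"{y} AD" if y >= 1 else f"{1 - y} BC"

assert to_astronomical(1, "BC") == 0   # year zero exists here...
assert from_astronomical(0) == "1 BC"  # ...and has two names
```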
Yes, but is there such a thing as a zeroth-order pedant, someone not pedantic about year ordinality? As a first-order meta-pedant, this would be my claim.
Moreover, I definitely find the ordinality of pedantry more interesting than the pedantry of ordinality.
Thank you for your service.
Interesting indeed. I suppose third-order pedantry must be "jerk".
Talking about standards, let's not pick and choose.
First, let's get rid of miles and feet, then we could even discuss this.
If only. I think most US citizens who actually work with units of measurement on a daily basis would love to switch to the metric system. Unfortunately, everyone else wants to keep our “freedom units” (and pennies).
Or, just to add more fuel to the fire, we could use the Holocene/Human year numbering system to have a year zero and avoid any ambiguity between Gregorian and ISO dates.
https://en.wikipedia.org/wiki/Holocene_calendar
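The conversion is a fixed offset, if I read the linked article right: HE = CE + 10000, and for BCE years HE = 10001 - BCE (since BCE/CE has no year zero). A small sketch:

```python
def to_holocene(year: int, era: str = "CE") -> int:
    # 2024 CE -> 12024 HE; 1 BCE -> 10000 HE (no year zero in BCE/CE).
    return year + 10000 if era == "CE" else 10001 - year

print(to_holocene(2024))      # 12024
print(to_holocene(1, "BCE"))  # 10000
```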
We are all de facto ISO adherents by virtue of our lives being so highly computer-mediated and standardized. I’m fully on board with stating that there absolutely was a year zero, and with translating from legacy calendars where necessary.
I vote for a year zero and for using two's complement for representing years before zero (because it makes computing durations that span zero a little easier).
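In that representation (which is just ordinary signed integers, i.e. astronomical year numbering), durations spanning the boundary really do become plain subtraction. A quick illustration, with my own variable names:

```python
caesar_assassinated = -43  # 44 BC in astronomical numbering (1 - 44)
augustus_died = 14         # AD 14

# No special-casing the missing year zero: just subtract.
print(augustus_died - caesar_assassinated)  # 57 years
```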
What does that even mean? Do we allow for the distortion due to the shift from the Julian to the Gregorian calendar, such that the nth year is 11 days earlier? Of course not, because that would be stupid. Instead, we accept that the start point was arbitrary and refer to our normal counting system, rather than getting hung up on the precise number of days since some arbitrary epoch.
> What does that even mean?
It means just what it says. In the common calendar, the year after 1 BC (or BCE in the new notation) was 1 AD (or CE in the new notation). There was no "January 1, 0000".
As I said twice, whether that date actually existed or not is irrelevant.
> whether that date actually existed or not is irrelevant.
No, it isn't, since you explicitly said to start the first century on the date that doesn't exist. What does that even mean?
The first day of the 1st Century is Jan 1, 1 AD.
The point is that some days got skipped over the centuries, but there's no need to make the centuries have weird boundaries.
> The first day of the 1st Century is Jan 1, 1 AD.
That's not what the poster I originally responded to is saying. He's saying the 1st Century should start on a nonexistent day.
No, I'm saying we ignore when it actually started and instead use the normal rules of counting to decide what to call the respective centuries.
You can make this work by having the 1st century start on the last day of 1 BC. Think of it as an overlap if you like; it doesn't really matter.
That allows for consistent zero-indexed centuries. It doesn't have any other practical consequences that matter.
0 CE = 1 BCE
10 C = 50 F = 283.15 K
1 = 0.999…
Things can have more than one name. The existence of the year 0 CE is not in question. What’s in question is whether that’s a good name for it or not.
Hence why the parent wrote "Whether that year actually existed or not is irrelevant".
They might or might not have a point, but they already addressed yours.
Have a read of this, it’s not how you think it is. https://www.historylink.org/File/2012
How can that be if 15 of those centuries are on the Julian calendar?
Also, when they switched things in 1582:
https://www.britannica.com/story/ten-days-that-vanished-the-....
The century in which the switch occurred (which was different in different countries) was shorter than the others. As were the decade, year, and month in which the switch occurred.
The "original" Julian calendar was indifferent to year number systems. The Romans typically used the consular year, although Marcus Terentius Varro "introduced" the ab urbe condita (AUC) system in the 1st century BC, which was used until the Middle Ages. From the 5th to the 7th century, the anno Diocletiani (also called anno martyrum) after emperor Diocletian was used primarily in the eastern empire (Alexandria), or the anno mundi (after the creation of the world). It was Dionysius Exiguus in the 6th century, who replaced the anno Diocletiani era with the Anno Domini era. His system become popular in the West, but it took a long time until it also was adopted in the East. Its application to years before the birth of Christ is very late: we come across it first in the 15th century, but it was not widespread before the 17th century.
All these systems used the Julian system for months and days, but differed in terms of the year and (partialy) in the first day of the year.
Incorrect, this answer wasn't given in the form of a question ;)
On the other hand "1700s art" sounds like trash compared to "18th century art".
How about saying "settecento"? Maybe it creates new confusion by dropping a thousand years, and maybe it would imply Italian art specifically.
Just to make sure I understood this, that would be used as "17th settecento" to mean 1700s right?
(This Xth century business always bothered and genuinely confused me to no end and everyone always dismissed my objections that it's a confusing thing to say. I'm a bit surprised, but also relieved, to see this thread exists. Yes, please, kill all off-by-one century business in favor of 1700s and 17th settecento or anything else you fancy, so long as it's 17-prefixed/-suffixed and not some off-by-anything-other-than-zero number)
Think of it as the 700s, which is a weird way to refer to the 1700s, unless you are taking a cue from the common usage. That’s just how the periods are referenced by Italian art historians.
Not much different from the 60s referring to 1960 to 1969, to my mind.
And Italian people in general.
"settecento" can be read as "seven hundred" in Italian; gramps is proposing to use a more specific word as a tag for Italian art from the 1700s. Of course, 700 is not 1700, hence the "drop 1000 years". The prefix seventeen in Italian is "diciassette-" so perhaps "diciasettecento" would be more accurate for the 1700s. (settecento is shorter, though.)
Hope this clarifies. Not to miss the forest for the trees, to reiterate, the main takeaway is that it may be better to define and use a specific tag to pinpoint a sequence of events in a given period (e.g. settecento) instead of gesturing with something as arbitrary and wide as a century (18th century art).
You're looking for millesettecento [1]. Italian doesn't do 10-99 hundreds, just 1-9 hundreds and 1-99 thousands.
[1] https://www.youtube.com/watch?v=LMIGnMs4VZA
settecento means "700". Just proposed above as a way to say 18th century or 1700s, same as we sometimes remove the "2000" and just say "the 10s" for the decade starting 2010 (nobody cares for the 2011-as-start convention except people you don't want to talk to in the first place).
I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless, and if possible it would be more useful to say something like "neoclassical art from the 1700s". "18th century" isn't an artistic category, but it kind of sounds like it is if you just glance at it. "Art from the 1700s" is clearly just referring to a time period.
No it won't, lol. People will pay just as much under the new dating system as they would under the old.
People pay as much for art because they are that rare combination: an educated person with money who values the aesthetics and artifacts of an era, or sees it as something to signal their wealth to others, or as a way to launder money.
Agreed. The haiku is “18th century art”, as that’s when it was first invented. So it’s either a uselessly broad category, or an indefensibly Eurocentric one.
And 1700s already has a different meaning, i.e. early 18th century.
If using “1700s”, I’d write it as “art of the 1700s”.
There is no "0" year, 1 is the 1st year, so 100th year is still the 1st century, therefore 2nd century starts in 101 and 20th in 1901.
I find this decree frustrating. Someone could have just as easily said "the 'first' century starts at 1 BC" to account for this.
Or better yet just year 0, why not? Do we say the 80s start in 1981?
The concept of zero was not popularized in 500s Europe, when the system was devised.
And also, the system is a direct descendant of regnal numbering, where zero wouldn’t have made sense even if invented (there is no zeroth year of Joe Biden’s term of office).
Do you also count the first decade of your life from January 1st of the year before you were born?
https://en.wikipedia.org/wiki/East_Asian_age_reckoning
Then what is the last year of the first century BC? 2 BC? Now there's an off-by-2!
What? 0 is the year Jesus Christ was born.
No, Jesus was born in 1 AD.
"Most scholars, on this basis, assume a date of birth between 6 and 4 BC"
https://en.wikipedia.org/wiki/Chronology_of_Jesus
Doesn't matter; we can just agree the first century had 99 years and be done with it.
We have special rules for leap years, that would just be a single leap-back century.
At the scale of centuries, starting the 2nd century at 100 as opposed to 101 is just a 1% error, so we can live with it. For the kind of uses we put centuries to (not doing math, but talking roughly about historical eras) it's inconsequential anyway.
It’s actually the second.
Trebek's*
Let's reform Alex Trebek's name, it's difficult.
And while we are at it, Tim Apple
There was an airport novel about a future where people's surnames are the company they work for. It was called Jennifer Government.
Some of the characters in Death Stranding, notably the main one, follow a given-name, profession, employer convention, as in Sam Porter Bridges.
And in the far future of that future surnames like Government or PepsiCo or Alcoa will be as common as Smith, Fletcher, and Miller.
Ahhh, reminding me of NationStates too. What a curious little website / online community.
Death Stranding's naming is not too far from very common naming conventions throughout history; it's a nicely subtle touch.
Glenn Miller, Gregory Porter and Sam Smith just happen to have been more inclined to make music.
1700s means 1700–1709, i.e. roughly the first decade in the 18th century. Just like '2000s'. The OP acknowledges this issue and then just ignores it.
I have a solution that would work in writing, but not sure how to pronounce it:
1700s means 1700–1709
1700ss means 1700–1799
To go one step further:
2000s means 2000-2009
2000ss means 2000-2099
2000sss means 2000-2999
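The rule generalizes cleanly in code: k trailing s's widen the range to 10^k years. A toy sketch (the parse rule is the proposal above as I read it; the names are my own):

```python
import re

def parse_range(label: str) -> range:
    # "1700s" -> 1700-1709, "1700ss" -> 1700-1799, "2000sss" -> 2000-2999
    m = re.fullmatch(r"(\d+)(s+)", label)
    if not m:
        raise ValueError(f"not a year-range label: {label!r}")
    start, width = int(m.group(1)), 10 ** len(m.group(2))
    return range(start, start + width)

assert parse_range("1700s") == range(1700, 1710)
assert parse_range("1700ss") == range(1700, 1800)
assert parse_range("2000sss") == range(2000, 3000)
```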
There are numerous common concise ways to write the 18th century, at the risk of needing the right context to be understood, including “C18th”, “18c.”, or even “XVIII” by itself.
These are even more impractical, so I wonder what your point is? I could come up with an even shorter way to say 18th century, by using base 26 for example; let's denote it "cR". What has been gained?
That is fascinating trivia. You could do a whole Jeopardy category on Jeopardy facts alone.
In Icelandic, 1-based counting towards is used almost everywhere. People do indeed say “the first decade of the 19th century” to refer to the 18-aughts, and the 90s are commonly referred to as “the tenth decade”. This is also done with age ranges: people in their 20s (or 21-30, more precisely) are said to be þrítugsaldur (in the thirty age). Even the hour is sometimes counted towards (though this is more rare among young folks): “að ganga fimm” (going 5) means 16:01-17:00.
Speaking for myself, this doesn’t become any more intuitive the more you use it; people constantly confuse decades, get insulted by age ranges (and freaked out when suddenly the clock is “going five”). People are actually starting to refer to the 90s as nían (the nine) and the 20-aughts as tían (the ten), though I don’t think it will stick. When I want to be unambiguous and non-confusing I usually add -og-eitthvað (and something) as a suffix to a year ending with zero, so the 20th century becomes nítjánhundruð-og-eitthvað, the 1990s nítíu-og-eitthvað, and a person in their 20s (including 20) tuttugu-og-eitthvað.
So shouldn't this be the "0-episode"? ;-)
(0, because only after the first episode do we actually have 1 episode performed. Consequently, the 1-episode is then the second one.)
Yea, but it's a rhetorical failure. This sounds terrible and far worse than the alternatives.
If we want a better system we'll need to abandon either the day or the Gregorian (Julian + drift) calendar.
What about languages that don’t have an equivalent to “the Xs” for decades or centuries?
Also, 1799 is obviously more than 1700, as is 1701 > 1700; why should the naming convention tie itself to the lower endpoint? After one’s third birthday, the person is starting their fourth year and is not living in their third year.
I feel this is relevant https://xkcd.com/927/
Depends on the language. "Century" being 3 syllables really makes it long in English, but it's still 5 syllables vs 5 syllables.
In Polish: "[lata] tysiącsiedemsetne" ("the seventeen-hundreds [years]", 6 [+2] syllables) vs "osiemnasty wiek" ("eighteenth century", 5 syllables).