Of Course the Aughts Are Ending
Please spare us from these ninnies who keep insisting that the aughts aren't ending tonight. If we were discussing "the 201st decade," they'd have a point. We're not, so shut up. If you insisted that 2000 was the last year of the 20th century, you were being pedantic. If you insisted it was the last year of the '90s, you were just being stupid.
Your innumerate idiocy doesn't justify scolding people who can actually, you know, count. All it does is call your own intelligence into question.
The best counter in the world can't find the words "nineteen ninety" in the phrase "two thousand."
I can find twenty one "ohs" in this decade, though.
Since when did semantics overrule math? STFU Jesse, we have one year left. Now screw you guys, I'm going home.
We are not talking about math, we are talking about the name of a period of time. I'd just as soon kill you as listen to your fucktardery, shitfuck.
It's so simple. A human is one year old after he has completed one year of life. A decade is ten years old (and ends) after the tenth year has expired, not when it starts. This is the tenth year of the decade. It just started. The end.
It's nothing to do with maths. A century is defined in terms of the start of the common era (either CE or AD depending on your religious affiliation) so the twentieth century finished at the end of 2000. Decades are not defined in that manner and can describe any arbitrary period of ten years. So e.g. the 1980s are defined as all years of the form 198x and the naughties as all years of the form 200x where x is a variable from 0->9.[1]
In a nutshell, decades are not defined in the same manner as centuries.
[1] OK, so there was some maths.
"So e.g. the 1980s are defined as all years of the form 198x and the naughties as all years"
They're not "the naughties" (and even if they are, at least have the decency to write "the Naughties"). They're the '00s.
You got everything else right, so get the name right too.
The Naughties is a joke name used because most people don't know how to pronounce 00s.
Pedantic and right are often the same thing, you know.
I think it's much ado about little. What's more important is that we call this decade the Aughts.
No, they're the '00s. Just like the '90s were the '90s, the '80s were the '80s, etc.
This was pretty immature for a reason blog post.
Oh, btw, drink.
Noooooooooooooo...
The Ohs, damn it! I refuse to say... that word.
You ought to.
I see what you did there.
the double oh's?
in 007 I was bangin' three women...
busy, busy times
I recall that a contingency plan if the Moon landing didn't occur in 1969 was to say that 1970 was in "this decade"--i.e., part of the Sixties. Which tempts me to say it's total bullshit.
What we need is a commission to distinguish between the "First Decade of the 21st Century" and "the Aughts." Or maybe just a Czar.
Ever been between two people who are yelling at each other?
Jesse is 100% correct. Anyone who insists decades start with year 1 instead of year 0 is a cultish boob.
Jesse was asleep in school, and Warren was too. There never was a year '0'. It starts with 1. The first decade ends with 10. The end of the first century was December 31, 100. Not 99. The last day of the second millennium was Dec. 31, 2000. Anyone knows that. Why don't you? Putzes.
I wasn't asleep in school, but apparently you were asleep when you read my post. Otherwise you would have noticed that part that says If we were discussing "the 201st decade," they'd have a point. We're not, so shut up.
Warren and I know that there was no year zero. The fact that there was no year zero has nothing to do with the point we're making. 1900 was the last year of the nineteenth century; it was also the first year of the 1900s. Is this really so difficult to understand?
Yeah, or you could define a century to be "100 years, except for the first one". It's all arbitrary, anyway.
sums it up nicely. The Aughts aren't the equivalent to "first decade in the 21st century". It's the 00s. 00-09.
hell - you could be like my great aunt Myrna, who insists on dividing decades so they end with a 4. She's halfway done with the "five-fourteen", as she calls it.
and don't be too sure about warren not knowing that there's no year zero.
[keed keed]
Was 1990 part of the 80's? What's that you say? No?
Pedantry aside, we're talking about common usage here, so Jesse is correct. But you didn't have to call ProL stupid, Jesse. That was just uncalled for.
I'm looking forward to saying, when I'm 100, "Back in Aught Eight..." In fact, I plan to trail off exactly like that, even if I'm plugged into a supercomputer by then.
Am I the poster child for being pedantic around here? Somehow, I don't that's right.
That should be "Somehow, I don't think that's right", you pedantic jerkweed.
Strange that you felt the compulsion to point out my curious omission.
Nothing can ruin my bliss, anyway. New computer has arrived. It only has 8 gigs of RAM and a Core i7 processor, but my needs are simple.
It's like twenty times more powerful than my current system, so all is well with the world. At least, it will be when I'm home and not here.
You also got a severely badass video card, right? Right?!?
Not top end, but better than average. I can't remember the specs.
If I start gaming on the PC again (as opposed to the Xbox 360), I may do some upgrading. I'm just thrilled to enter the world of 21st century computing.
Oh, you huge schmuck. A crazy video card (without going to an SLI bridge) isn't that much more than an OK one, and makes a colossal difference.
The XBox can never match a jacked PC for how the game looks.
It's actually been quite a while since I placed the order, so I can't really remember. I know I didn't just get the standard card.
The Radeon 5770s are the sweet spot right now, about $170. If you want to blow another $80-100 on a 5870, and get a 256-bit bus, that'll work too.
I just checked--I may have to upgrade. Here's what I got: 1024MB nVidia GeForce GT220. For now, just about anything will wipe my current system on the floor. It's from 2003...the Before Time.
I think a 5770 will do for my needs. For now. Dangit.
I won't need it for gaming for a little while, so maybe the humming graphics card can be an anniversary gift. My wife never knows what to get me.
A 220 is OK, but yes, it is far surpassed by what is out now. Still, you'll probably find it more than adequate, unless you're playing Far Cry or Crysis.
Personally, I get a kick out of how little I can spend on building a system, but still have kick-ass performance. For this reason, I'll take AMD over Intel (which needs the competition anyway). I'll get 80% of the performance for roughly half the price.
I bought a Phenom II 720, which is a triple core, for $99. I also bought a Gigabyte board which can unlock the 4th core in some cases. Popped it in, flashed the BIOS and voilà! a 4th core for $0. It's even stable OC'd by 400 MHz.
I have a GeForce 9800 GT video card, which is pretty creaky, but it's fine for now. I do have my eye on the 5770s, waiting for them to drop under $150.
BTW, you probably won't notice any difference with the 8 GB of RAM, but it does future-proof you to a good degree. I'm running 4 gigs of PC1300 RAM and that's more than enough for my needs and mighty speedy.
When I bought Far Cry, my system blew out on me a few days later, only a quarter of the way through the game (lightning storm; only the power unit was salvageable). I wound up getting out the old PIII with a GeForce 2 graphics card to play it. Far Cry was scalable to an impressive extent, and didn't need shader support to run.
Tip for the youngsters who are about to get married, okay, just for the dude youngsters. Never let the old girl blow you on your birthday, Valentine's Day, Christmas or on the anniversary, or she'll take away from this weak chink in your armor the first time you let her, the idea she can get away with only blowing you when you deserve a customary present and not as an applied obligation under the standard marriage vows.
implied obligation, 'applied', lol!
Your commas are all wrong.
I will lay down my arms. I have been defeated. It turns out you can make something so just by saying it enough times.
Of Course the Aughts Are Ending
I think I prefer "aughties" myself.
So did Killing Joke.
Well, close, anyway.
awesome, Baked!
To paraphrase John F. Kennedy, ich bin ein pedantic, stupid ninny. Ich werde nicht shut up.
I am a jelly doughnut.
Good thing Kennedy wasn't in Vienna, or he would have said, "Ich bin ein Wiener."
For the pedant: Would have been true, though.
He womanized because he couldn't stop thinking with his Viennese.
Mr. Walker, if when you count items you consider the first item the zero item and count the tenth item as item nine, maybe you'd have a good point. I doubt you do either.
When I count items and the first one ends with a zero, I still consider it the first item.
You count strangely. I tend to start counting with 1. Which, oddly enough, almost never ends in zero.
Yes, I start counting with one. The first item on the list of years following the '90s is 2000. It ends with a zero.
So this is about where the decade started. You started counting with 2000. Other people started with 2001. Which means to them 2009 is not the last year of the decade. (And I agree with them.) But ultimately, what the frak do you care?
Anybody who doesn't consider 2000 to be part of the 2000s is a fool and not worth considering.
Either this is a name problem or a counting problem. You don't get both.
I start counting w/ zero, but I'm written in C++.
That's not counting, that's offsetting. You certainly don't start counting at 0 when you figure out the number of integers in
int drink[ 42 ];
You mean
INTEGER, DIMENSION( 0:41 ), INTENT( IN ) :: DRINK
42 is the length of the array, which is completely accurate. The last element in the array would be drink[41] and the first would be drink[0].
If you said int count = drink.length; you'd get 42, as you should.
But if you start counting at 0, you're only going to get up to 41. So your so-called "count" is off by one from the actual number of elements in the array.
The point is, the indices of array elements in C are not ordinals, they're offsets. So strictly speaking, drink[0] is not the "zeroth element" but rather the element with offset 0.
since when does C++ allow you to get array lengths with a member variable?
count = sizeof(drink) / sizeof(int)
So Walker has officially given up and embraced popular ignorance?
Jesse,
This pendant thinks you meant the 201st decade. Otherwise, I agree with you.
To distinguish it from the turn of the 20th century, I would vote for calling this decade the naughts (aka naughties).
Yes, I did. Fixed.
Pendant or pedant? Please excuse me for being such a necklace.
The real problem is that we don't hit a teen until 2013, so we can't really say we're entering the teens tonight. Ten years from now, no one will have trouble saying the teens are ending and we're entering the twenties, because the reason for the demarcation will be obvious.
Well, that and the fact that fewer people will half-remember the hubbub around the turn of the century and repeat an argument they absorbed but didn't understand.
Maybe the 2000s will be the longest decade ever, with thirteen years. The twenty-teens, however, will be quite short, with only seven years.
Just because eleven is called eleven instead of "eleventeen", doesn't mean it is not in the teens. And just as 2000 was not the beginning of the 20th century, 2010 is not the beginning of a new decade.
And "teen" just means "in the tens". I wonder why we don't say "fourten" or "nineten". Yet another inconsistency of the English language.
Look, can we just get on with calling this decade the "nulls" and get on with it?
If there ever was a decade in desperate need of a big fucking do-over, the nulls win.
Anyone who thinks the 10s "start" in 2011 is just cargo-culting what they remember from 2000.
If you are asked when the 201st decade starts, you will be correct to answer 1/1/2011. If you are not, you are the worst kind of pedantic: pedantic and wrong.
That's the 202nd decade.
Pedantic, and wrong.
If I take a shower on Monday at 8 am, and then go without showering until Thursday at 8 am, then it is correct to say I didn't shower for three days, even though there were only two days (Tuesday and Wednesday) when I didn't shower.
The original sin in all this, it seems to me, is that there is no Year 0 in the Gregorian calendar.
I'd rather have TWO year zeros, 0 BC and 0 AD, than no year zero at all.
But then we'd have to alter all the papyri, scrolls and coins from the ancient world that have BC dates written on them to reflect that change.
This sort of niggling will only matter in 2012: does the world end at the beginning, or at the end of the year?
I need to know, so I can decide whether to pay my American Express bill.
I am a jelly doughnut.
Powdered, or glazed?
Communist, with red filling.
If a year can begin on day zero because I say it can, then I am already living in 2010. Y'all are late to the party.
Today is the first day of the rest of your life...
...and the last of an otherwise arbitrary period begun ten years ago which may or may not coincide with your life and any events of import.
Today is Primidi, 11 Nivôse 218. Vive la Révolution!
Don't you mean: Today is the Zero day of the rest of your life?
The decade known as "The 70's" included:
1970
1971
1972
1973
1974
1975
1976
1977
1978
1979
(That's 10 years for those who can't count)
The "80's" included:
1980
1981
1982
1983
1984
1985
1986
1987
1988
1989
(Also 10 years)
The "90's" included:
1990
1991
1992
1993
1994
1995
1996
1997
1998
1999
(10 years)
The "Aughts" or whatever they're called, include:
2000
2001
2002
2003
2004
2005
2006
2007
2008
2009
2010
That's right, there are ELEVEN years in the aughts, and anyone who says otherwise is obviously an ignoramus or a shill for Small Decade. Which is it Walker?
I'm pushing hard for Long Decade now.
The Whatevers:
2000-2019
That's right, a double decker decade.
Then we can just start clean with the twenty twenties: 2020-2029
Actually, you may be onto something.
What is the qualification for being included in the naughts? Having at least two zeroes in the year. Obviously 2010 qualifies!
If that's the case, might as well throw in 2020, 2030, 2040, 2050, 2060, 2070, 2080 and 2090.
Now you're talking. But, just to round it out to an even twenty years, we'd better include 2100 as well.
sorry - I just saw the 2010 and thought about bewbs.
what are we talking about again?
Be here now.
It's all the fault of our proto-germanic linguistic forebears.
Why do 11 and 12 have unique names and not "-teen" variants... Oneteen, Twoteen. That would seem to clear this all up...
http://en.wikipedia.org/wiki/Duodecimal#Origin
It's not just English that has this problem. In Spanish, they don't use the -teen equivalent structure until 16. They have unique names for 11-15.
In French, they use unique names all the way until 16.
In Italian, they have -teen equivalents from 11-16, but then it switches for 17-19, such that it's basically teen- instead, e.g. teenseven.
So really, we're a bit better off than the strictly Latin-based languages.
I think we should go all the way, though, and campaign for either oneteen and twoteen, or something like unteen and duoteen.
In Welsh they have a rigidly logical structure. There are numbers from 0 to 10, above that the numbers are "one ten one" "one ten two" "one ten three" etc. Twenty is "two ten" thirty is "three ten" etc. It's just about the easiest counting system to learn.
It's not a proto-Germanic thing; in Spanish, 11-15 are once, doce, trece, catorce, quince, before establishing a pattern starting with 16.
What JoeM said...
from http://en.wikipedia.org/wiki/Duodecimal#Origin :
"Germanic languages have special words for 11 and 12, such as eleven and twelve in English, ... considered to come from Proto-Germanic *ainlif and *twalif (respectively one left and two left)..."
The current confusion seems partly due to the fact that 11 and 12 have unique names, which defy the easy x+teen and then decade+x form that kicks in (in English) at 13.
I'm not saying we're alone, or that the Spanish or Italians don't have similar/related problems. Clearly they do, as they are Spanish and Italian.
But, per Jesse's original post, this is getting pedantic. Let's just agree I'm right and move on.
More importantly it's getting late and well past time to start drinking.
I'm telling you, let's just call this a 13-year long decade, and have a short one for the teens. Or combine it all into a double sized decade.
Either way, agreed, it is time to leave work and go drink. Happy New Year!
Counting anything other than rotations around the axis and rotations around the sun is nothing but giving desperate writers a vehicle for stinking retrospectives.
The real problem is that we don't hit a teen until 2013, so we can't really say we're entering the teens tonight.
You dumb people. It was the Otz, next the Tweens, then the Teens.
Then nothing because by 2020 my doomsday machine should be ready.
The "aughts" started January 1st, 2000. The 21st Century CE started January 1st, 2001. These statements are only contradictory to small-minded twits who cannot recognize more than one way to count.
Ohs! And I've got a beer in my hands now. Yay.
"Raaaaaacist!!"
What's everyone talking about?
If you insisted that 2000 was the last year of the 20th century, you were being pedantic. If you insisted it was the last year of the '90s, you were just being stupid.
Yum. Yum. That's some good Jesse!
Security Show HAS TO STOP !!!
This is NOT SECURITY.
THIS IS BULLSHIT.
ANYONE:
- a Terrorist
- a suicidal maniac
- a weirdo
etc,
can attack any plane.
These security measures DO NOT WORK.
WHY IS NO ONE in the MEDIA admitting that the current SECURITY AT AIRPORTS is BULLSHIT?
Yeah!
Now go fuck yourself!
Happy new year!
Since a day is really nothing but a 24 hour period, I'll say this day started at 8:00 last night and I'm going out to light my fireworks now.
No, you shut up. Nanner nanner nanner.
Itchy Puss|12.31.09 @ 8:56PM|#
Security Show HAS TO STOP !!!
et c
Was this posted from an airport bar, by any chance?
Don't you people know you can't get an "is" from an "aught"?
Fuck a bunch of counting!