We asked a handful of "olds"—baby boomers and Gen Xers—connected to the Reason universe to give a shout-out to their favorite millennials. The results include libertarian activists, Middle Eastern radicals, children's book illustrators, independent congressional candidates, and a magic dragon.
Alexander McCobin of Students for Liberty, by David Boaz
We have lots of scholars in the libertarian movement. Not enough, but a few Nobel laureates and many distinguished authors and professors. What we don't have enough of are genuinely libertarian organizations and the institution builders who create and maintain them. Alexander McCobin, co-founder and president of Students For Liberty (SFL) since 2008, is one of those institution builders.
Since I spoke at SFL's first conference, which drew 100 students to Columbia University in 2008, McCobin and his colleagues have built Students For Liberty into the largest libertarian student organization in the world, with more than 400 leaders supporting more than 1,300 student groups representing over 50,000 students. Their 2014 conference drew 1,400.
When people ask me if the libertarian movement is growing, I tell them, "Seven years ago there was no national libertarian student group. Now there are two": SFL and Young Americans for Liberty, which grew out of Ron Paul's presidential campaign. Indeed, Politico wrote in February, "You could argue that the rapid growth of these libertarian groups has been the real story of campus politics in the Obama era."
Not content to free the United States, the global citizens of SFL have founded branches in Europe, Latin America, Asia, and Africa, where more than 1,100 students gathered at a July conference.
Try as I might with my jaded cynicism, I can't get the smile off Alexander's face, or shake his confidence that millennials are "the libertarian generation." Thanks to the combined efforts of the U.S. government and Students For Liberty, they just might be.
David Boaz is executive vice president of the Cato Institute and author of The Libertarian Mind, coming in February from Simon & Schuster.
Piff, the Millennial Magic Dragon, by Penn Jillette
Piff, the Magic Dragon, is my favorite millennial magician. Piff was born in London in 1980 and at first blush (dragons do get blood in the face) would therefore not quite be a millennial young man. But Piff is a magic dragon. And magic dragons have a longer lifespan than humans. So, in dragon years, Piff is about 27 and right in the pocket.
I saw Piff for the first time when he walked out on our stage to do our show Penn & Teller: Fool Us. He opened with a dragon sneeze of fire, and then he did a baffling card trick. There's a rule about comedy magicians—they aren't funny and they aren't baffling. Piff is different. Piff had the audience screaming in laughter, and he fooled the pants off every one of them. I got lucky and we guessed the method of his trick by the skin of our teeth. We beat him. Piff lost the battle and won the war. After we joined the audience in Piff's well-deserved standing ovation, the bookings started rolling in for him. He got so much attention from the YouTube clip of our show that he got booked in Vegas and moved here to do his act full time at a casino. I was proud to write a letter to U.S. Immigration giving my expert opinion that the United States of America didn't have enough magic dragons and we should allow him to work here. No way he'd be taking work away from any American magic dragons.
Piff is the only magic dragon member of the prestigious Magic Circle in England and one of the youngest ever to lecture there (and that's in people years). Now that he's moved here and we've become friends, he's also become part of the Penn & Teller brain trust and helps us design, create, and rehearse new tricks for our show.
I don't know if the kids are alright, but the magic dragon fledglings sure are.
Penn Jillette is half of the legendary act Penn & Teller, a best-selling author, moviemaker, and host of the podcast Penn's Sunday School.
Nick Troiano, Independent Candidate for Congress, by Krist Novoselic
Is Ron Paul really the new Nirvana? Is he grunge? On the contrary, he comes across as a country doctor and an unlikely national political figure. His prominence is a phenomenon that rose out of the changes brought by the technological revolution. Look for more candidates who capture the imagination by way of technology. The 2014 election cycle is already offering examples. The big news will be when this new breed of candidate is such a hit that one of them wins an election.
I look at the changes occurring in politics through the lens of my personal experience. Nirvana was yanked out of the musical underground into the front and center of the mainstream without the aid of the Internet. Nirvana really didn't change the music industry. We only helped change what kind of music people plugged into. People wanted something different and that cleared the way for "alternative" music.
Politics today are stuck in a rut similar to the one rock was in around 1990. Back then, music seemed like the same old stuff the media was pushing on people. Nevermind was the flip side of that and was at the right place and the right time to pull people in. Nevermind was such a phenomenal hit that you couldn't buy it for a few weeks after it broke, because the label didn't expect to sell so many copies and the inventory was simply exhausted. The push-media paradigm of TV stations and magazines foisting what they wanted onto the masses did not break Nirvana. Rather, there was a huge appetite for a new sound. Our record made number one because of that hunger.
It was the technology revolution that eventually changed the music industry. The new paradigm is to pull people in. Who is the next political Nirvana? The answer will be whoever can pull enough people in to propel them to the top of the vote count.
I want to write about a millennial who is running his first campaign for office. I am excited about his campaign because he is acting like a millennial—this cohort is supposed to be tech savvy and full of problem-solvers. Nick Troiano is a 25-year-old candidate running for Congress as an independent in Pennsylvania's 10th District. He is a political centrist, and I see this as part of his millennial problem-solver attitude (his age group is supposed to be a lot like its great-grandparents). Watch him campaign and you'll see he is not, by any means, a fringe candidate.
I think that our political system is out of whack, with the same insiders spending tons of money. The system for congressional elections makes most of us voters spectators, regardless of our political preferences. Independent Troiano is eschewing donations from the insiders in favor of a "citizen-funded" campaign. It is expensive to run for Congress, and it takes guts for a candidate to cut off donor options. Nevertheless, it is clear what Troiano wants to do—tap into the pull paradigm of social networking. If enough people send him five bucks, he can have some real money. Something is working: he is currently the highest-fundraising independent or third-party candidate in the country.
I love that he is independent. I see him as the alternative candidate. I like a lot of his platform; as an independent myself, however, what I really want to see is the two major parties go down in Pennsylvania's 10th District. It is as simple as that. Troiano is a glimpse of the future of American politics, where enough voters can amplify small financial contributions by way of technology. The elephant-and-donkey show has got to go, and here is an opportunity to make that happen, with crowdsourcing, for one seat in Congress.
This is the new kind of politics where people are pulled into campaigns. This will not be the alternative much longer. Rather, it will soon be the mainstream. As the millennials come of age, we will see more campaigns like this. I hope that Troiano shoots to the top with his first campaign for office.
Krist Novoselic was the bassist and co-founder of Nirvana. He is active in politics and chair of the electoral reform organization FairVote.
Christian Robinson, Book Illustrator, by Frank Portman
Like most of us, I first encountered the written word and saw my first actual art in picture books. That first look at the world as seen and presented by someone else is an experience that sticks with you. And fortunately I was clever enough to have scheduled my childhood to occur during a golden age of children's books, and in a social environment that forced the stuff on you with the insistence of a life insurance salesman. You couldn't escape Ezra Jack Keats and Maurice Sendak, even if you'd wanted to.
Christian Robinson is a prolific young San Francisco-based artist who is making big waves in the picture-book racket. His artwork evokes the "classic" aesthetic—it's not for nothing that he won the Ezra Jack Keats Award. But his whimsical take on the tradition subtly transforms it into something else again, something that feels quite new. Taking on history in a playful, positive spirit is his consistent theme. The gorgeous paper cut-out images of Harlem's Little Blackbird just spring off the page, bringing to life the story of the little-known (and unrecorded) Harlem Renaissance singer Florence Mills in a way that words alone could never do. He also pops up in unexpected places: the most recent MLK Google Doodle; the gentle, wry take on educational pamphlet illustration in Queer: The Ultimate LGBT Guide for Teens; and a charming video on children's understanding of music, another old trope given new life.
I risk a familiar nasty question: Why should a grownup care about children's books? Well, perhaps it's just that, growing up with the tradition, you're curious about where it's going and how it's going to end. Or maybe it's the inverse of what grabbed me as a kid, a joy in seeing a new take on familiar things I may have forgotten to notice. Whatever it is, Christian Robinson is in his tiny San Francisco room, quietly making huge things happen.
Frank Portman is a novelist and leads the band The Mr. T Experience. His newest book, King Dork Approximately, will be published in December.
Elizabeth Nolan Brown, staff editor at Reason.com, by Sharon Presley
My favorite millennial is Elizabeth Nolan Brown, a staff editor at Reason.com and a contributor to various other publications. From "Rise of the Hipster Capitalist" (a feature article in the October issue of Reason magazine) to "School District Bans Talking About Ferguson," her writing is on point, clever, and fresh. Most of all, at least to me, she writes about issues that concern women. There is a dearth of writing about such issues, one that Brown is filling admirably with articles such as "Pregnancy Crimes"; "Another Unconstitutional Abortion Restriction Struck Down in the South; Judge Asks What If Similar Laws Applied to Guns?"; "Pregnant Women Warned: Consent to Surgical Birth or Else"; and "Punishing Prostitution Clients Is Not a Feminist Solution."
Such topics are vitally important to at least half the human race, so thank you, Elizabeth.
Sharon Presley is a psychologist, author, and editor whose works include Feminist Interpretations of Ayn Rand; Exquisite Rebel: The Essays of Voltairine de Cleyre: Feminist, Anarchist, Genius; and Standing Up to Experts and Authorities: How to Avoid Being Intimidated, Manipulated, and Abused.
Ashe Schow, commentary writer at the Washington Examiner, by Glenn Reynolds
My nomination goes to columnist Ashe Schow of the Washington Examiner. Over the summer, she's devoted considerable energy to a series of columns looking into the moral panic over campus sexual assault, and into the one-sided, due-process-violating legislation that is currently before Congress.
Schow has done what pundits often fail to do: gathering new data by questioning the members of Congress supporting this legislation and pushing repeatedly when they failed to answer or were evasive. Her series of columns on this topic is must-read material for anyone interested in the subject, and an excellent example of how to do punditry for anyone just starting out.
Glenn Reynolds runs Instapundit.com and is the Beauchamp Brogan Distinguished Professor of Law at the University of Tennessee. His most recent book is The New School: How the Information Age Will Save American Education from Itself.
Middle-Eastern and Central-Asian Millennials, by Thaddeus Russell
Millennials are winning the War on Terror—not the young people operating drones from command centers in Nevada or shooting Taliban from Apache helicopters, but the millennials across the Middle East who are throwing their head scarves to the wind, shooting hip-hop videos at house parties, selling bootleg satellite dishes, sporting outlawed tattoos, and watching tons of porn.
As drone missiles rain down on Pakistani villages, artillery shells pound the Kandahar Valley, F/A-18 fighter jets drop bombs on Iraq, and military-enforced sanctions choke the Islamic Republic of Iran, a bloodless but raucous revolution led by Arab and Persian millennials is underway all over Muhammad's empire. This is a revolution in everyday life, and it is gradually but surely subverting repressive regimes across the Middle East. No one understands this more clearly than the radical Islamists who are clinging to power against the waves of young people's cultural subversion. Ayatollahs in Pakistan, Iran, Iraq, Saudi Arabia, and Egypt have issued fatwas against the "satanic instruments" of Western pop culture. But they are losing, as more young women shed their hijabs, Islamic authorities beg people to stop watching TV, and satanic verses fill the air.
All America has to do to defeat global jihad is give up, come home, and let the kids get down.
Thaddeus Russell is the author of A Renegade History of the United States and a biography of Jimmy Hoffa. He teaches history at Occidental College and is a frequent contributor to Reason and Reason.com.
Popping the Higher Education Bubble
Glenn Harlan Reynolds
The economist Herbert Stein once said that something that can't go on forever, won't. That observation, sometimes called Stein's Law, could well turn out to be the theme for the current decade. But nowhere is it truer than in higher education. American higher education is first in the world, but it can't go on forever on its current path.
Colleges are raising tuition and fees every year, at a rate of increase that far outpaces inflation and family incomes. One might think this is the kind of thing that couldn't continue forever, but that's precisely what has been happening over the past several decades. Prices have gone up, and buyers have poured in anyway, buoyed by a flood of seemingly cheap government money in the form of student loans.
As with any bubble, there are doomsayers who are mostly ignored and cheerleaders who say that this time it's different. But—as with any bubble—reality is starting to intrude.
Though people have been talking about a bubble in higher education for a while, one major indicator that the swelling is approaching its limit was found in last year's Occupy protests. While the protesters represented a diverse array of grievances, one common thread was that many had run up huge student loan debts for degrees that weren't capable of generating sufficient income to make the payments.
At an annual growth rate of 7.45 percent, tuition has vastly outstripped both the consumer price index and health care inflation (see chart). The growth in home prices during the housing bubble looks like a mere bump in the road by comparison. For many years, parents could look to increased home values to make them feel better about paying Junior's tuition—the so-called "wealth effect," in which increases in asset values make people more comfortable about spending. Or at least they could borrow tuition costs against the equity in their homes. But that equity is gone now, and tuition marches on.
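To see how fast a 7.45 percent annual rate compounds, here is a minimal sketch in Python (my own illustration; only the growth rate comes from the text, and the time horizons are assumptions):

```python
# Hedged illustration: compounding at the article's 7.45 percent annual rate.
# The horizons (10, 20, 30 years) are assumed for illustration only.
def growth_multiple(rate, years):
    """Factor by which a price grows at a fixed annual rate."""
    return (1 + rate) ** years

for years in (10, 20, 30):
    print(f"{years} years at 7.45%: prices multiply by {growth_multiple(0.0745, years):.1f}x")
```

At that pace prices roughly double every decade, which helps explain why the housing bubble's run-up looks like a mere bump in the road by comparison.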
So where does that leave us? Even students who major in programs shown to increase earnings, such as engineering, face limits to how much debt they can sanely amass. With costs exceeding $60,000 a year for many private schools, and out-of-state costs at many state schools exceeding $40,000 (and often closing in on $30,000 for in-state students), some people are graduating with debt loads of $100,000 or more. Sometimes much more.
That's dangerous. And the problem is not a small one: According to the Ohio University economist Richard Vedder, writing in the Chronicle of Higher Education, the number of student-loan debtors now actually equals the number of people with college degrees. How is this possible? "First, huge numbers of those borrowing money never graduate from college," Vedder explains. "Second, many who borrow are not in baccalaureate degree programs. Third, people take forever to pay their loans back."
Total student loan debt in America has passed the trillion-dollar mark. That's more than total credit card debt and more than total auto loan debt. Students graduating with heavy burdens of student loan debt must choose (if they can) jobs that pay enough money to cover the payments, often limiting their career choices to an extent they didn't foresee in their undergraduate days.
Even students who can earn enough to service their debts may find themselves constrained in other ways: It's hard to get a mortgage, for example, when you're already effectively paying one in the form of student loans. And unlike other debt, there's no "fresh start" available, since student loans generally aren't dischargeable under bankruptcy. The whole thing looks a bit like the debt slavery schemes used by company stores and sharecropping operators during the 19th century.
Now the whole scheme is starting to break down. In my own world of legal education, applications have plummeted over the past few years. According to the ABA Journal, there has been a 22 percent drop this academic year alone, and they're down almost half from 2007. Business schools, with declining pay and employment prospects for MBA graduates, are experiencing similar declines. Even in undergraduate admissions, colleges are losing the ability to set prices as applicants become more value-conscious. These trends have led the Moody's rating service to downgrade the outlook for the entire higher education sector to "negative."
Some in higher education are offended. College should be about improving your mind, they say, not about future salaries. But a recent study of more than 700 schools by the American Council of Trustees and Alumni found that many have virtually no core requirements. Perhaps that's why students, on average, are studying 50 percent less than they were a couple of decades ago.
When higher education was cheap enough that students could pay their own way by working part-time, "study what interests you" was reasonable advice. When the investment runs well into the six figures, students would be crazy not to worry about the return. If there's no return, it's not an investment; it's a consumption item. A six-figure consumption item is well beyond the resources of most college-age Americans; nobody would advise an 18-year-old to purchase a Ferrari on borrowed money. But if a college education is a consumption item rather than an investment, borrowing six figures to buy one amounts to the same thing.
Higher education needs to be cheaper, more flexible, and better. It's possible that technology will show the way: With the proliferation of online courses, some offered by major brand-name schools like Harvard, MIT, or Georgia Tech, there's no reason why students should have to go into massive debt. And while an online degree from MIT (when such becomes available) probably won't be worth as much as a traditional MIT sheepskin, it may well outperform degrees from many less prestigious brick-and-mortar schools.
Alternatively, universities could become leaner. Fueled by the bubble, they've piled on staff (mostly non-teaching administrators, who now outnumber faculty at many institutions), constructed buildings and palatial athletic facilities, and paid almost no attention to costs. A few science and engineering courses may have inherently high equipment costs, but there's no reason why it should cost more in constant dollars to get a degree in English literature today than it did 50 years ago. Schools that get costs under control will have a huge advantage. Those that don't will suffer.
Today's university is a knowledge industry using a 19th-century model in the 21st century. If that doesn't suggest a need for updating, what does?
Glenn Harlan Reynolds is a law professor at the University of Tennessee, and the author of The Higher Education Bubble (Encounter).
The Government Yoke
Richard Vedder
American higher education offers students relatively little education at a rapidly rising cost. It produces reams of irrelevant research, operates with breathtaking inefficiency, and treats its customers arrogantly. This sorry state of affairs has worsened markedly over the last half century, propelled largely by increased government involvement.
Start with higher education outcomes. There is a growing body of evidence that the "value added" of college in terms of knowledge acquisition and enhancement of critical thinking skills is embarrassingly small. The 2011 American Civil Literacy Report, using a test that the Intercollegiate Studies Institute gave to thousands of students, showed that seniors on average know little more than freshmen. In their careful 2011 study Academically Adrift, sociologists Richard Arum and Josipa Roksa demonstrated that the writing and critical thinking of seniors are only slightly more advanced than those of freshmen.
Students do less yet are rewarded more. Several recent surveys have concluded that undergraduates study less than their parents did (fewer than 30 hours a week on all academic chores, including class attendance, paper writing, etc.), but get higher grades: above a "B" average for all students, compared with a "C+" to "B-" average 50 years ago. Results are similarly dreary on the research side: Mark Bauerlein of Emory University has demonstrated that the vast majority of published work in his discipline, English, is rarely cited, even by other scholars in the same field.
This is bad, but what makes it scandalous is the rapidly growing amount of money spent to obtain these poor results. Tuition fees have skyrocketed at both state and so-called "private" schools. When I started at Northwestern University in 1958, the tuition was $795 a year. Now it is $43,380. In inflation-adjusted terms, the sticker price has quadrupled. At the rather typical mid-quality state university where I have taught for more than 47 years, Ohio University, tuition has risen from $450 in 1965 to $10,216 today, tripling in real terms and growing far faster than incomes over that period.
This academic arms race is largely financed by taxpayers. In 1970, the federal government's student financial assistance programs totaled a bit over $1 billion. Last year it was $173.8 billion. Ostensibly, this increased funding provides improved access by allowing those of modest incomes to attend college. But the proportion of recent college graduates coming from the bottom quartile of the income distribution has actually fallen sharply since 1970—from 12 percent to a bit over 7 percent. Former Secretary of Education Bill Bennett was right in 1987 when he hypothesized that higher federal student loans would merely lead to higher tuition fees. Colleges, not students, capture the money.
What have the schools done with the funds? Mostly, they have hired lots of staff. The enrollment-adjusted number of "non-instructional professional" personnel has roughly doubled since the mid-1970s. Meanwhile, professors have been given reduced teaching loads—the proportion of instructors who teach four hours or less per week has more than doubled since 2000.
Administrators also pay themselves handsomely. It's not just football coaches who make million-dollar salaries these days; some university presidents do as well, with fringe benefits that can include mansions, chauffeured cars, club memberships, copious first-class or private jet travel, even payment of income taxes.
Colleges and universities, echoed and abetted by politicians like President Barack Obama and education-cheerleader groups like the $1.4 billion Lumina Foundation, promote the notion that nearly every American should have a post-secondary education and that we need to regain world leadership in the percentage of young adults who have bachelor's degrees. Yet labor market data show that a large portion of those with bachelor's degrees have jobs that do not require a college education. (A forthcoming Center for College Affordability and Productivity study puts the portion at around 48 percent.) There are more than 115,000 janitors, for example, with bachelor's degrees.
The feds encourage students to borrow money for college, run up huge debts, and then, increasingly, get low-paying jobs upon graduation. The default rate on student loans is above 12 percent, exorbitantly high by commercial lending standards. The more students borrow, the easier it is for colleges to raise tuition fees to pay for the comfortable lifestyles of administrators and professors.
Colleges have no skin in this game. They often use federal monies to lure students who they know have a high probability of not graduating, then suffer no consequences when those students default.
Even so-called "private" schools are corrupted by government funding. Princeton University receives vastly more government subsidies per student than the nearby state school, the College of New Jersey. Tax deductions and exemptions (e.g., on capital gains from endowment income) give private schools a privileged status. Federal research dollars with generous overhead allowances add to the fact that prestigious private schools, with a few exceptions such as Hillsdale College, are heavily beholden to the government.
This brief tour d'horizon skips some areas of dysfunction, such as the vast underutilization of campus facilities, taxpayer-subsidized country club–style recreational centers, and more. So what is the solution?
As long as governments, particularly the one in Washington, heavily subsidize higher education, it is likely that reform efforts will be futile. For starters, we need to get the feds out of the student financial assistance business, and start privatizing state universities. Schools such as the Universities of Virginia, Michigan, and Colorado, where state subsidies amount to very small portions of the budget, would be good places to start looking for reform models. Because start we must.
Richard Vedder directs the Center for College Affordability and Productivity, teaches at Ohio University, and is an adjunct scholar at the American Enterprise Institute.
Poisoned Feeder Schools
Lisa Snell
Since 1970, K-12 education spending in the United States has tripled in inflation-adjusted dollars. Yet American 17-year-olds score no higher in reading and math on the National Assessment of Educational Progress than they did 40 years ago. The United States ranks second behind Switzerland in K-12 education spending out of 28 Organization for Economic Co-operation and Development (OECD) countries, yet sits exactly in the middle of 49 developed and newly developed countries when it comes to gains made by 4th and 8th graders since 1995, according to a 2012 study by the Harvard-based journal Education Next.
This crisis in public-school productivity has followed American students into higher education. Colleges spend more than $3 billion annually on remedial education, with more than 1.7 million students a year needing help getting up to speed in basic reading, writing, and math. Of the more than 50 percent of community college students who start in remedial courses, fewer than 1 in 10 ever earn a two-year degree. Less than one-third of students who start in remedial classes at four-year colleges ever earn a bachelor's degree.
If we ever expect to get real value for both the $185 billion in post-secondary student aid each year (49 percent of which came from the federal government in 2012) and the additional $114 billion spent on government and private student loans in 2012, we must first improve the productivity of K-12 education.
Public school choice and competition may be the key. There are now 29 school voucher and tax credit programs and more than 6,000 charter schools in the United States, enrolling nearly 3 million students. School choice has led to the rapid growth of niche schools that focus on college readiness and college credit completion during high school. There are more than 200 Early College high schools that allow students to enroll in college courses during their junior and senior years, with many kids completing a two-year associate's degree even before they graduate from high school. These students save parents and taxpayers the first two years of college tuition.
Many choice schools use college readiness as one of their most important performance metrics. For example, the Alliance for College Ready Public Schools, a group of high-performing charters in Los Angeles, explicitly measures how its students perform on the California State University early assessment exam, which is used to determine which high school grads will need placement in remedial education. The goal is not just to enroll students in college but to reduce the number of kids who will need extra help once they get there.
The United States spends more than almost any other country in the world on elementary and secondary education, and then we spend money all over again trying to get those students up to college standards after they have already arrived on campus. If we want to improve the quality of higher education, we must improve the performance of K-12 education through school choice.
Lisa Snell is the director of education at the Reason Foundation.
The Decline of Liberal Arts
Naomi Schaefer Riley
Last fall, Florida Gov. Rick Scott released plans for revamping his state's higher education system. Among other things, Scott proposed charging lower tuition rates to students who major in the more potentially lucrative STEM fields—Science, Technology, Engineering, and Mathematics. The chairman of his higher education task force explained the comparative price increase of humanities degrees this way: "The purpose would not be to exterminate programs or keep students from pursuing them.…But you better really want to do it, because you may have to pay more."
It's not clear that such measures would do much to stem the tide of students pursuing useless degrees. After all, the market already sends strong signals to students that engineering is worthwhile. But kids are idealistic and impractical. In her new book Mission Adulthood, about the struggles of Generation Y, journalist Hannah Seligson profiles a woman who graduated from a no-name liberal arts college with a degree in "Peace & Conflict Studies, International Studies, and Women's Studies." She paid $40,000 a year for the privilege, and has been struggling ever since to land a full-time job in her career of choice—abortion counselor.
Even if students recognized before graduation that positions in their chosen field might not even cover their loan payments, they might not have the preparation to choose more potentially lucrative majors. According to a 2011 study by ACT Inc., less than half of U.S. high-school graduates who took the ACT test were prepared for college-level math, and fewer than a third of tested high-school graduates were ready for college-level science. And it's no secret that humanities and social science courses tend to be easier than their math and hard science counterparts.
It is certainly possible to receive a strong education in the humanities and social sciences, but the vast majority of American colleges are not providing one. According to the American Council of Trustees and Alumni, fewer than two in five colleges require a single literature course, fewer than one in five require American history or government, less than 14 percent have an intermediate-level foreign language requirement, and less than 5 percent require basic economics.
Employers do value the skills that one can gain from a strong liberal arts education. A recent op-ed in The Wall Street Journal quoted a successful Silicon Valley tech entrepreneur as saying, "English majors are exactly the people I'm looking for." Prospective bosses are searching for people who can write well, but those are pretty hard to come by. Which is not surprising, given how many professors are too busy writing obscure articles in dreadful academic lingo to help students who are struggling with basic sentence construction.
All this will probably get worse before it gets better. In the past year, one social science professor at a public research university told me her department instructed her to stop assigning papers and just give multiple-choice exams, in order to reduce the workload of grad students. Another was told by an administrator to stop correcting grammar in his students' essays, because the kids don't like all that criticism, and besides, it's not an English class.
There's no need to "exterminate" programs in the social sciences and humanities. But if they continue down their current absurd path, few will mourn the thinning of their ranks.
Naomi Schaefer Riley is the author of The Faculty Lounges… And Other Reasons Why You Won't Get the College Education You Paid For (Ivan R. Dee).
Humanities Under Siege
Nick Gillespie
Universities existed long before the federal government began bankrolling the higher education bubble. In the West, higher education has been around for more than 1,000 years; in America, the first colleges opened their doors long before the Founding Fathers were even born.
Even after the bubble bursts, and the government money begins to dry up, the university will endure. In the words of the founder of Faber College, the fictional school in Animal House, "Knowledge is Good." The market for people who are inquisitive and interested in learning will always be bullish.
The real existential threat to higher ed comes from folks who conceive of college as a sort of high-end vocational-tech program. Right-leaning critics such as Naomi Schaefer Riley, Richard Vedder, and Charles Murray complain about feel-good majors that don't help fill the nation's need for STEM-related graduates. Left-leaning commentators such as Richard Arum, Josipa Roksa, and Christopher Newfield fear that college is becoming more expensive for students even as it teaches them little or nothing of value.
These sorts of critiques are wrong for two reasons. First, they assume that education, especially college, should somehow be related to employment. While that has always been an expectation—most of America's colonial colleges started as seminaries—it long ago stopped being the rule. In a 2011 Pew Research survey, 74 percent of college graduates called the experience "very useful" for their "knowledge and intellectual growth" and 69 percent said it facilitated their "personal growth and maturity." A relatively puny 55 percent said college was very useful as "preparation for a job or career."
As the proud possessor of no fewer than four English degrees (a B.A., two M.A.s, and a Ph.D.) who paid my own way through every stage, I think these graduates have it exactly right. You should be going to college to have your mind blown by new ideas (read: whole fields of knowledge that you didn't know existed until you got to college), to discover your intellectual passions, and to figure out what sorts of experiences you might want to pursue over the next 70 or so years. And let me suggest that it's precisely the broad field of inquiry that takes the most abuse for being totally impractical—the humanities—that students should seek out most. Understanding history, literature, art, philosophy, and the like won't make you a better citizen, or a more responsible employee, or a happier camper, but those disciplines will give you the tools to figure out who you are and what you want to be if and when you grow up.
Second—and far more wrongheadedly—most critics of the contemporary university err in talking about the place as if it exists only or mostly to serve students, especially undergraduates. What actually sets institutions of higher learning apart from high schools, barbers' colleges, online academies, and various universities-in-name-only is that they are centers of knowledge production. That is, they revolve around faculty scholars who are actively expanding, revising, and remaking the received wisdom in their given fields. Active researchers, whether in astronomy or zoology or cultural studies or good old American literature, are the folks that make college worth a damn.
Yet these are the very people under siege. A 2010 study spearheaded by the University of Arkansas' Jay P. Greene for the Goldwater Institute documented "administrative bloat" at colleges and universities. Between 1993 and 2007, wrote Greene et al., student enrollment at leading research universities rose by 15 percent while the number of administrators per 100 students jumped 39 percent. The number of tenure-track faculty per 100 students barely kept pace with student growth, rising just 18 percent. At Arizona State, the number of administrators per 100 students grew 94 percent while faculty slots actually shrank by 2 percent.
The best universities will be able to get by without Pell grants or guaranteed student loans or even public-sector research dollars. They will even figure out ways to keep themselves accessible to motivated students who, like me, didn't come from money. But none of them will survive the notion that they exist mostly to serve 18- to 21-year-old kids who need high-paying jobs rather than to limn the outer edges of intellectual possibility.
Nick Gillespie is editor in chief of reason.com and Reason TV.
School Is for Signaling
Zac Gochenour
Why are people with higher-paying, more desirable jobs more likely to have a college degree? The answer has important implications for the future of higher education.
A popular answer among labor economists and the public is that education develops human capital. Students learn skills, develop propitious habits, become acculturated, improve critical thinking, or are otherwise transformed by college. This human capital means increased productivity, which is valued by employers, who in turn offer better compensation to the employees. Certain majors pay more because they teach more valuable skills; certain schools lead to better paying jobs because they literally provide a better education.
The issue of causation is more complicated, though, because traits like intelligence, conscientiousness, compliance, patience, and conformity influence both academic and professional success. The types of workers who have the ability to do well in school can often do well in work. So even though workers with college degrees earn more, they would have earned more even if they did not go to college. Schools and majors that churn out highly paid alumni are the ones that highly productive workers choose.
Whether or not employers value the specific skills learned at school, they might value the degree because of what it tells them about these traits. To some extent they can judge these traits directly, such as with test scores or in an interview, but for what they cannot observe they rely on signals such as the school's prestige and the applicant's major, grades, awards, and other activities. Employers pay more for signals that indicate high productivity, such as a tough major with good grades from a top school.
The relative importance of these distinct factors—human capital, ability, and signaling—will determine how the market for education and labor will be changed by technology and politics.
If higher earnings for college graduates are mostly due to the human capital they develop at school, then higher education is destined for obsolescence. Subsidies will not be enough to keep universities afloat and competing with the cornucopia of knowledge made available by modern technology. As effective alternatives become cheaper compared to university education, the traditional college apparatus will at least need a major overhaul.
If the earnings correlation is mostly due to ability and signaling, however, no upheaval need occur. Educational alternatives such as online learning might teach the same skills as a traditional university, but participating in them can't help but signal an unwillingness to conform to important societal norms. On average, good employees are known to prepare for college, go to college, and finish college. Employers will not be willing to pay the same premium for alternative educations if those educations do not signal the same traits as a traditional one. If public spending on education declines, society benefits: Fewer resources are spent to do the same sorting. But so long as spending on higher education enjoys enduring popularity, there is no reason for the train to stop, or even slow down.
Zachary Gochenour is a Ph.D. candidate in economics at George Mason University and a researcher for Bryan Caplan's forthcoming book, The Case Against Education.
Tear Down the Clock Tower!
Michael Gibson
What is the oldest and most common architectural feature on college campuses? The clock tower. Why is this so, and what does it mean?
The widespread adoption of clock tower technology had profound consequences in the Middle Ages. It standardized time within a small geographic area. Bells chimed the hour, creating discrete, uniform time chunks within its earshot. This was the only way to reduce disagreement and disputes about time spent on any activity, as everyone began to operate on the same hour-length chunks. On nascent university campuses, it created a way for students, tutors, and administrators to coordinate lectures and meetings.
So dominant was the clock tower on campus that time became a proxy measurement for knowledge and skill acquisition. Students take credits measured in hours per week. Their abilities are tested at the end of a season with two-hour exams. A degree is a four-year course of study.
Future technology will undermine the notion that centralized, synchronized time should structure the learning process. The more precisely we can measure skill and knowledge acquisition, the less dependent we are on time as a proxy measurement. Education should not be based on how much time a student spent in the vicinity of a clock tower. It should be based on what is timeless.
Many cultural patterns that seemed normal in the past are absurd today. Behaviors like smoking on airplanes or using lead paint in the home now seem beyond the pale. After the discovery of penicillin, dying from an infected wound became unthinkable. Likewise, technology products such as mobile phones the size of a shoebox or mainframes the size of an SUV have all been rendered silly. Like these totems of the past, the clock tower, too, will crumble and soon seem ridiculous.
College students currently pay more than quadruple in real terms what their parents paid. In exchange they get a vague credential, poor job prospects (of late), and crushing debt. The essence of this system is an insane and perpetual Hunger Games–style competition where teenagers fight each other to gain access to dorm rooms within earshot of the highest-status clock towers.
Many experts will tell students how to pick a college. Fewer ask students to question whether they should go at all.
We think the future will be very different. In 2011, Peter Thiel started the 20 Under 20 Thiel Fellowship because he wanted to help young entrepreneurs and visionaries get started on their ideas as soon as possible, before they accumulated any debt and before they found themselves tracked into an uninspired career. During the two-year program, each fellow receives $100,000 and mentorship from our network of innovators, engineers, scientists, thought leaders, and business development experts.
Our first and second classes have started and sold companies, secured a book deal, raised a million dollars in venture capital, won international awards, and sparked a do-it-yourself education movement. We just closed the application season for our third class of fellows.
We are not telling all college students that they should drop out. Some universities may still offer community, great real estate, a fervent dating pool, and—in a few academic departments—valuable skill acquisition. But students should keep in mind that the "safe path" is quickly becoming the riskiest. Students who have entrusted their future solely to college administrators are marching to someone else's tune. They have forgotten why the tower chimes ring.
Michael Gibson is an associate at Thiel Capital and is VP of Grants for the Thiel Foundation, where he helps run the 20 Under 20 Thiel Fellowship.
The Competition Goes Online
Alex Tabarrok
Online education is a game changer for traditional universities. Almost a third of college students are now taking at least one online course (up from single digits less than 10 years ago). And traditional universities should take note that the competition is about to get much fiercer: online education radically lowers the cost of educational startups.
In October 2012 my George Mason colleague Tyler Cowen and I launched MRUniversity (short for Marginal Revolution University, named after our blog), with a full-length academic course on development economics. Today we have many thousands of registered students from all over the world, and are launching new courses. Inspired by Salman Khan and the YouTube tutorials he offers through his Khan Academy, we wanted to see what two individuals who enjoy teaching could accomplish without major financial backing.
Our experience has shown us many advantages of online lectures: They allow the best professors to teach many more students. They offer large time savings, since students can repeat sections of the lecture without slowing down other students or consuming instruction time. They lower costs, since students no longer need to drive to and from campus. Lectures can be shorter, and consumed at any time, which means online universities are effectively open 24 hours a day.
A number of online course providers now exist, including Coursera, Udacity, and the Harvard-MIT project edX. These "massive open online courses," or MOOCs, have sought and received millions of dollars of venture capital and/or university funding.
The biggest hurdle for MOOCs is credentialing, which usually requires a business model and a partnership with a name-brand institution. The challenges here are not technical but bureaucratic. We have, however, seen universities move into this space at a pace unheard of for such normally slow-moving institutions. Antioch University, for example, is offering credit for some Coursera offerings and San Jose State University has recently partnered with Udacity to offer online mathematics classes for credit. Most worryingly for the traditionals, the San Jose online price will be less than half the price of an on-campus course.
It's not just the MOOCs that are competing with traditional universities. Online lectures are behind the astonishing rise of for-profit universities. Enrollment at the for-profits was less than 500,000 students nationwide in 2000, but that number doubled to 1 million by 2005, and doubled again to 2 million by 2010. Why? The for-profits pioneered online education, a product that appealed to non-traditional students who could not afford or did not want the five-year "edu-vacation" produced by more traditional universities.
The education establishment tends to dismiss the MOOCs and the for-profits as low-quality. There are legitimate complaints about for-profits, mostly stemming from the fact that the final customer is often not the student but the federal government, which supplies the bulk of loan money that students spend there. But it's inarguable that the sector has demonstrated a strong demand for online education. Low-quality entrants that disrupt established industries often improve their offerings over time. MP3s didn't sound as good as CDs, and audiophiles objected. But compressed digital sound is improving, and has already overtaken physical CDs in income generated for the recording industry.
Traditional universities should not assume that the quality of education at the MOOCs or the for-profits is or will remain low. In a 2011 interview with The Next Web, business guru Clayton Christensen, author of The Innovator's Dilemma, reports that the University of Phoenix invests $200 million annually on research to improve the quality of teaching. "That's $200 million every year just on making their teaching better," Christensen says. "Do you know how much money Harvard spends every year to make its teaching better? Zero."
Take those figures with a grain of salt, but the lesson about quality improvement is sound. When each teacher teaches only tens or hundreds of students, no teacher will invest millions of dollars in improving education. But when a single online course can reach tens or hundreds of thousands of students, it pays to invest in quality.
The real competitors to traditional universities may end up being neither the MOOCs nor the for-profits but entirely new educational startups. Online education replaces labor with capital, in the form of software. Online pioneer Marc Andreessen has argued that software is eating the world—if so, education is the appetizer. As software replaces labor we will see greater possibilities for productivity improvements in education. Incentives to invest in such improvements will expand with the size of the market.
What will a future "course" look like? One model is a super-textbook: lectures, exercises, quizzes, and grading all available on a tablet. The textbook's artificial intelligence routines could guide students to lectures and exercises designed specifically to address that student's deficits, and could call on human intelligence—tutors—on an as-needed basis. ("Click here to connect with a tutor; $5 for the first five minutes and 50 cents a minute thereafter.") A textbook of this kind would draw on content experts but also actors, animators, graphic designers, and experts in pedagogy.
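As a toy illustration, here is a minimal sketch of the tutor-pricing rule quoted above ($5 for the first five minutes, 50 cents a minute thereafter); the function name and the sample session lengths are my own assumptions, not part of any real product:

```python
# Hedged sketch of the quoted on-demand tutoring rate:
# $5 covers the first five minutes; each additional minute costs $0.50.
def tutor_cost(minutes):
    """Cost in dollars of a live tutoring session of the given length."""
    if minutes <= 0:
        return 0.0
    if minutes <= 5:
        return 5.0
    return 5.0 + 0.5 * (minutes - 5)

for m in (3, 5, 20):
    print(f"{m}-minute session: ${tutor_cost(m):.2f}")
```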
Universities are not the natural producers of educational software. Instead think of video games. The process of developing the mega-hit video game Halo 3 provides a useful model: The development team at Bungie Studios analyzed 3,000 hours of Halo play by 600 gamers in order to suss out everything from preferences for weaponry to places where players are most likely to get killed. Then they did the same with their competitors. Wired sums it up this way: "It might seem like an awfully clinical approach to creating an epic space-war adventure. But Bungie's designers aren't just making a game: They're trying to divine the golden mean of fun."
A game like Halo must be difficult enough so that players feel challenged but not so difficult that they give up in frustration. Educators want a course with exactly the same characteristics. Game developers invest millions of dollars in producing and testing high-quality video games because the size of the market justifies such investment. Online technologies are bringing the same economics to the world of education. In the new world of online education, we will finally have courses that are as well designed and as awesome as video games.
Alex Tabarrok is a professor of economics at George Mason University and the co-founder with Tyler Cowen of MRUniversity.com. He writes regularly at the blog MarginalRevolution.com.
Yesterday, the Court issued its long-awaited opinion in District of Columbia v. Heller, ruling 5-4 in favor of an individual right to own guns. reason assembled a panel of eight leading civil libertarians to help make sense of what the Court said, what it means, and what's likely to come next.
***
Alan Gura: Yesterday's decision is a huge victory for liberty. First, we saved the Second Amendment. That much should be obvious from the opinion. Yesterday, federal courts in 47 states were telling Americans they had no Second Amendment rights. The score is now 50-0, plus the capital, in the other direction. For budding lawyers, "individual right" is now the correct answer on the Multistate Bar Exam. The movement to end private firearm ownership in America is dead and buried. Yes, we've got some work to do to make sure it stays that way. It will.
The case is "narrow but broad." Narrow, in the sense that our objective was merely to secure the individual nature of Second Amendment rights, and demonstrate—with a judgment—that the right has substance. Broad, in the sense that this simple principle can now be applied in other contexts. This is not just about flat-out gun bans in Washington, D.C. homes. All regulations that touch upon Second Amendment rights will get a well-deserved constitutional look. Instant background checks and felon-in-possession laws will survive. Laws meant to harass gun possession, while at best advancing only a hypothetical public benefit, will not. The Second Amendment is now a normal part of the Bill of Rights. It's not realistic to expect one Second Amendment case to answer all right to arms questions for all time, just as we have no one decision telling us what a Fourth Amendment "reasonable search" in all circumstances. We may not win every case. We'll win a good amount of them. The next step is obviously 14th Amendment incorporation. I'm looking forward to leading that fight. Learn more at www.chicagoguncase.com.
Libertarians can be impatient. Would anyone prefer the quick certainty of Kelo? Or McConnell v. FEC? It may be a tough slog to restore the Takings Clause and free political speech. Restoring the Second Amendment will take time, too. Today, with the right to keep and bear arms, we start from a position of strength.
Alan Gura argued District of Columbia v. Heller before the Supreme Court. He is a partner at Gura & Possessky.
***
Glenn Reynolds: My first thought on Heller is that many gun-rights supporters never thought they'd live to see a Supreme Court opinion to the effect that "The Second Amendment protects an individual right to possess a firearm unconnected with service in a militia, and to use that arm for traditionally lawful purposes, such as self-defense within the home." Bob Levy, who brought the case against the advice of many gun-rights supporters, should feel very good about that.
My second thought is that this is a gift to the Obama campaign. While this won't take the gun issue off the table, it also won't energize the gun-rights crowd (which cost Al Gore the election in 2000 when he failed to carry Tennessee, largely because of his support for gun control) the way a contrary opinion would have. Obama's record of strong support for sweeping gun control would hurt him much more if gun owners felt more vulnerable.
My third thought is that whether this has much impact on the real world depends on how the next several cases proceed. In the 1990s the Supreme Court announced a major shift in Commerce Clause doctrine that offered the hope of paring back federal power considerably. But right-leaning public interest law groups didn't take up the challenge and bring carefully selected cases to advance the principle, leading it to be characterized by some (including me) as a constitutional revolution where nobody showed up. Gun-rights advocates are already talking about follow-on challenges in places like Chicago or Morton Grove. How well those are brought will have a lot to do with whether the Heller opinion is a milestone, or just a speedbump.
Glenn Reynolds is a law professor at the University of Tennessee. He blogs at Instapundit.com.
***
Randy Barnett: Justice Scalia's historic opinion will be studied for years to come, not only for its conclusion but for its method. It is the clearest, most careful interpretation of the meaning of the Constitution ever to be adopted by a majority of the Supreme Court. Its analysis of the "original public meaning" of the Second Amendment stands in sharp contrast with Justice Stevens' inquiry into "original intent" or purpose and with Justice Breyer's willingness to balance an enumerated constitutional right against what some consider a pressing need to prohibit its exercise. The differing methods of interpretation employed by the majority and the dissent also demonstrate why appointments to the Supreme Court are so important. In the future, we should be vetting Supreme Court nominees to see if they understand how Justice Scalia reasoned in Heller and if they are committed to doing the same. Now if we can only get a majority of the Supreme Court to reconsider its previous decisions—or "precedents"—that are inconsistent with the original public meaning of the text.
Randy Barnett is the Carmack Waterhouse Professor of Legal Theory at Georgetown University Law Center and author of Restoring the Lost Constitution: The Presumption of Liberty.
***
Brian Doherty: The Heller decision was exciting for fans of American liberty—even the dangerous and disreputable end of that liberty where weapon possession and use rights abide, an end that many good-hearted people think is just ugly and awful and appeals to the worst aspects of human nature.
Scalia's opinion did a thorough job of filleting, layer by layer, the lame and unsupportable "collective right" beliefs about the Second Amendment—including lots of sadly necessary exegesis on how the word "keep" means that people have a right to, yes, keep arms in their homes.
But Heller represents no happy ending to our legal and public policy duels over guns. Scalia's opinion does admit that we have, to some degree, a constitutionally protected right to defend ourselves and our property with weapons.
But the opinion also stresses that right is still regulatable in many, many ways. It leaves plenty of room (which you can be sure will be filled rapidly) for future court challenges and public policy fights to define the degree to which the government, at any level, can restrict or regulate the sale, possession, and use of weapons. It may well turn out that anything less severe than D.C.'s total ban will withstand scrutiny even under the newly revived Second Amendment.
The "eternal vigilance is the price of liberty" part: four members of the Supreme Court think that it's A-OK for the government to completely bar citizens from using guns for the protection of their lives and homes. That can't make sleeping at night any easier. That said, the Heller victory was a sweet one for the recognition that there are limits to what democracy can do to individual rights, and is worth celebrating for that.
Brian Doherty is a senior editor at reason and the author of This Is Burning Man and Radicals for Capitalism. He is currently writing a book about District of Columbia v. Heller.
***
Sanford Levinson: The majority obviously found that the Second Amendment does protect an individual right to bear arms, and it applied this right in the easiest possible case, i.e., a functionally absolute prohibition on handgun possession.
What cannot be determined from the opinion is what the future impact of Heller will be, beyond further litigation. I am reminded of a cartoon in the New Yorker several years ago, of a conversation at a suburban cocktail party where a woman says to a well-dressed man, who is carrying a rifle slung over his shoulder, "I've never met a Second Amendment lawyer before." I suspect that there will be more such lawyers in the next few years, but this says nothing about the prospects of winning such cases. For all of the rhetorical bluster of Scalia's opinion, it not only focuses on the extreme nature of the D.C. ordinance, but also goes out of its way in effect to legitimize a plethora of existing federal legislation regarding guns. And, of course, there is no way of knowing who will be appointing the all-important "inferior" federal judges, beginning in January 2009, who will play a far more important role than the Supreme Court in deciding the operational meaning of the Second Amendment.
Finally, Scalia should take a certain pleasure that Justice Stevens, by confining the entirety of his opinion to an "originalist" analysis of the Second Amendment (one that obviously came to a completely different conclusion), seemed to concede the overarching importance of original meaning. Neither justice was willing to pay any attention to the "dynamic" aspect of the Second Amendment. Scalia was presumably unwilling to cite Chief Justice Taney's opinion in Dred Scott, but it's the strongest single piece of evidence for the proposition that by the mid-19th century an individual right to bear arms (at least if you were an American citizen) had become the conventional wisdom.
Sanford Levinson holds the W. St. John Garwood and W. St. John Garwood, Jr. Centennial Chair in Law at the University of Texas Law School. His most recent book is Our Undemocratic Constitution.
***
Jacob Sullum: The most important aspect of D.C. v. Heller, of course, is the Supreme Court's recognition that the Second Amendment protects an individual right to arms. From that premise it almost inevitably follows that the District of Columbia's gun law—which, as the Court noted, "bans handgun possession in the home" and "requires that any lawful firearm in the home be disassembled or bound by a trigger lock at all times, rendering it inoperable"—is unconstitutional. If such a law does not violate the right to armed self-defense, it's hard to imagine what law would. That's why the Court did not bother to specify what level of scrutiny is appropriate for purported violations of the Second Amendment. It concluded that the D.C. law is invalid "under any of the standards of scrutiny the Court has applied to enumerated constitutional rights."
By the same token, however, this decision does not give a clear sense of the line between constitutional and unconstitutional forms of gun control. The Court indicates that laws regulating the sale of firearms and prohibiting concealed carry, gun ownership by "felons and the mentally ill," possession of "dangerous and unusual weapons" (as opposed to weapons in common use for lawful purposes), and possession of firearms in "sensitive places" such as schools and government buildings are consistent with the Second Amendment. But it is not clear whether a law against openly carrying guns would pass muster, or what kinds of guns count as "dangerous and unusual," or how onerous licensing and registration requirements can be before they run afoul of the Second Amendment.
On that last point, the Court says licensing and registration are not necessarily unconstitutional, but it sounds like it would look askance at conditions attached to them.
"Assuming that [plaintiff Dick] Heller is not disqualified from the exercise of Second Amendment rights," the Court says, "the District must permit him to register his handgun and must issue him a license to carry it in the home." (Emphasis added.) It's harder to predict which weapons will end up being covered by the Second Amendment, except that they will include handguns but evidently not machine guns or bazookas.
Finally, the majority opinion does not address whether the Second Amendment, either directly or via the 14th Amendment, applies to the states as well as to a federal domain like the District of Columbia. But it's hard to imagine why it wouldn't, now that the Court has clearly acknowledged the right to armed self-defense as a fundamental aspect of liberty protected by the Constitution.
Jacob Sullum is a senior editor at reason and a nationally syndicated columnist.
***
Dave Kopel: Heller is a tremendous victory for human rights and for libertarian ideals. Today's majority opinion provides everything that the lawyers closely involved in the case, myself included, had hoped for. Of course I would have preferred a decision that went much further in declaring various types of gun control to be unconstitutional. But Rome was not built in a day, and neither is constitutional doctrine.
For most of our nation's history, the U.S. Supreme Court did nothing to protect the First Amendment; it was not until the 1930s that a majority of the Court took the first steps toward protecting freedom of the press. It would have been preposterous to be disappointed that the Court of, say, 1936 would not declare a ban on flag-burning unconstitutional. It took decades for the Supreme Court to build a robust First Amendment doctrine strong enough to protect even the free speech rights of people as loathsome as flag-burners or American Nazis.
Likewise, the Equal Protection Clause of the Fourteenth Amendment was, for all practical purposes, judicially nullified from its enactment until the 1930s. When the Court in that decade started taking Equal Protection seriously, it began with the easiest cases—such as Missouri's banning blacks from attending the University of Missouri Law School while not even providing a "separate but equal" law school for them. It was three decades later that, having constructed a solid foundation of Equal Protection cases, the Court took on the most incendiary racial issue of all and struck down the many state laws banning interracial marriage.
So too with the Second Amendment. From the Early Republic until the present, the Court has issued many opinions recognizing the Second Amendment as an individual right. Yet most of those recognitions came in dicta. After the 1939 case of United States v. Miller, the Court stood idle while lower federal courts did the dirty work of nullifying the Second Amendment by over-reading Miller to claim that only National Guardsmen are protected by the amendment.
Today, that ugly chapter in the Court's history is finished. Heller is the first step on what will be a long journey. Today, the Court struck down the most freakish and extreme gun control law in the nation; only in D.C. was home self-defense with rifles and shotguns outlawed. Heller can be the beginning of a virtuous circle in which the political branches strengthen Second Amendment rights (as in the 40 states that now allow all law-abiding, competent adults to obtain concealed handgun carry permits) and the courts become increasingly willing to declare unconstitutional the ever-rarer laws that seriously infringe the right to keep and bear arms.
As the political center of gravity moves step by step in a pro-rights direction, gun control laws which today might seem (to most judges) to be constitutional will be viewed with increasing skepticism. The progress that the pro-Second Amendment movement has made in the last 15 years has been outstanding. As long as gun owners and other pro-Second Amendment citizens stay politically active, the next 15, 30, and 45 years can produce much more progress, and the role of the judiciary in protecting Second Amendment rights will continue to grow.
Dave Kopel is Research Director at the Independence Institute, in Golden, Colorado. He was one of three lawyers at the counsel table who assisted Alan Gura at the oral argument on March 18. His brief for the International Law Enforcement Educators and Trainers Association was cited four times in the Court's opinions.
***
Joyce Lee Malcolm: What a great day for individual rights. The majority of the Supreme Court retrieved the original intent of the Second Amendment: to protect individuals' right and ability to defend themselves. For thirty years, those convinced that ordinary people can't be trusted with guns have dominated the discussion. In order to ban civilian ownership of weapons, the original meaning of the Second Amendment had to be reinterpreted, and unfortunately its awkward language—well understood at the time—made that not too difficult. Generations of law students have been taught that the Second Amendment merely protected the right of states to have a militia, a right already incorporated into the body of the Constitution. The nearly complete control over the militia by the federal government was not altered in any way by the amendment, but never mind. The linguistic efforts to deny an individual right were quite inventive: "the people" in this amendment alone meant a group, not individuals; "bear arms" implied an exclusively military context; that awkward word "keep" was to be erased by linking it with "bear" in order to make it exclusively military; and so on. And it all nearly worked. But not quite.
Thanks to the scholarly efforts of many people, the overwhelming evidence for an individual right to keep and have weapons for self-defense was uncovered and published. It was that evidence that the justices relied upon.
My only disappointment with an otherwise great decision was the narrowness of the majority. Four justices ignored the evidence in order to preserve gun control measures meant to deny individuals the right to be armed. In the process, they were prepared to erase a basic right and uphold the stringent and ineffective D.C. gun ban, a law that went so far as to forbid reassembling a gun in the home in the event of a break-in.
Still, it was a great day for every American, one that will ensure a safer America than any number of gun bans ever could.
Joyce Lee Malcolm is a professor of legal history at George Mason University School of Law. She is the author of To Keep and Bear Arms: The Origins of an Anglo-American Right.
Unifying the insurgency
W. James Antle III
1. Did you support the invasion of Iraq?
I opposed the Iraq war, though regrettably I did hedge in the month before the invasion.
2. Have you changed your position?
Yes—even as an opponent of the war, I was too trusting of the hawks' arguments concerning weapons of mass destruction and Saddam's propensity for anti-American terrorism. My basic views on the inadvisability of democratic nation-building, however, remain unchanged.
3. What should the U.S. do in Iraq now?
The only thing unifying the insurgency is the continued American presence. An orderly withdrawal, while no panacea, would cause the insurgency to divide against itself and allow Iraq's ethno-religious factions to chart their own course.
W. James Antle III is a senior writer for The American Conservative.
Scrap the current constitution
Ronald Bailey
1. Did you support the invasion of Iraq?
Yes.
2. Have you changed your position?
Not yet—but the history of the last three years in Iraq has greatly deepened my appreciation of the federal government's ability to screw up anything. It might have been different, as I outlined in my August 2005 column "Iraq 2007."
3. What should the U.S. do in Iraq now?
Train sufficient Iraqi forces to secure the country's borders, scrap the current constitution, encourage a transition to a loose confederation of democratically self-governing Sunni, Kurdish, and Shi'a regions, and then get out. I offered a similar plan in my January 2005 column "Free Kurdistan!"
Ronald Bailey is Reason's science correspondent.
Let the Shiite Crescent bloom
Tim Cavanaugh
1. Did you support the invasion of Iraq?
No, which is not the same as saying I don't believe the United States is engaged in a serious, multi-dimensional struggle against radical Islam, or that intelligent use of violence is an important tool in that fight. (I don't know whether we still need to draw this distinction, but I haven't forgotten being called a traitor and a fifth columnist for suggesting the invasion of Iraq was ill-advised.)
2. Have you changed your position?
Somewhat, but like Rhett Butler, I won't join a cause until it's truly lost. The occupation has gone better than any prudent person had a right to expect, and the failure of so many hawks to understand this shows how unserious they were all along. What made the invasion a mistake was not any particular fact on the ground in Iraq, but the three-year, day-by-day demonstration it has given our country's enemies (Iran most notably) of the precise limits of American power and resolve. The lessons our real enemies learn from this will come back to haunt us; that's why some guy way back in the 1900s said something about speaking softly and carrying a big stick.
3. What should the U.S. do in Iraq now?
Let the Shiite Crescent bloom: We've already spent thousands of lives and half a trillion dollars inadvertently nurturing it, so we might at least get the incremental benefit of having a deadly rival to the Salafists who are even more determined than the Shiites to destroy our civilization. (Just so, we should have let the Iranians finish kicking Saddam's ass in the 1980s.) That means accepting the current Mullahfied Iraqi government and leaving it to brutalize the Sunnis at will. This might offend our sense of decency, but if we stay in the country, the same historical forces that drove the British, the Turks, and all other visitors to back the Sunnis over the Shiites will drive us down the same road.
Tim Cavanaugh is Reason's web editor.
Declare victory
Brian Doherty
1. Did you support the invasion of Iraq?
No.
2. Have you changed your position?
No.
3. What should the U.S. do in Iraq now?
With a democratically elected Iraqi government in place and Saddam's (nonexistent) threat of WMDs gone, declare victory and pull out troops with all dispatch.
Brian Doherty is a senior editor of Reason magazine and author of This Is Burning Man (Little, Brown).
Consider a division
David Friedman
1. Did you support the invasion of Iraq?
No. I thought it was probably a mistake, although I did not have enough information to be certain.
2. Have you changed your position?
I am now more confident it was a mistake.
3. What should the U.S. do in Iraq now?
The U.S. should leave, if possible in a way that permits the Iraqis to make arrangements that do not result in large numbers of people being killed. That might involve a de jure, or at least de facto, division of the country.
I am not an expert on Iraq, and it is hard to know for certain whose account of the situation to believe. But those are my best guesses.
David Friedman is a professor of law at Santa Clara University and the author of many books, including The Machinery of Freedom: A Guide to Radical Capitalism.
Remember a pressing engagement
Wendy McElroy
1. Did you support the invasion of Iraq?
I opposed the invasion on both principled and practical grounds.
2. Have you changed your position?
My opposition has deepened as the war has exceeded my worst fears in its duration, blatant economic motives, political incompetence, and military brutality.
3. What should the U.S. do in Iraq now?
Get out right now. Declare victory, declare defeat, remember a pressing engagement back home… it doesn't matter what reason is given. Get out immediately.
Wendy McElroy is the editor of ifeminists.net, a weekly columnist for FOX News, and the author of several books on anarchism and on feminism. She maintains a daily blog at www.wendymcelroy.com.
Recruits, inspiration, targets
John Mueller
1. Did you support the invasion of Iraq?
I opposed it, as my 2003 commentary in Reason indicates. One comment from that piece about the link between a war in Iraq and terrorism: "it seems likely that an attack will supply them with new recruits, inspire them to even more effort, and provide them with inviting new targets in the foreign military and civilian forces that occupy a defeated, chaotic Iraq."
2. Have you changed your position?
Hardly. The main issue now is whether the war has become the greatest debacle in American foreign policy history or only the second greatest, after Vietnam.
3. What should the U.S. do in Iraq now?
A considerable number, maybe a vast majority, of the insurgents are fighting because the country has been invaded and occupied. Since the American presence is the cause of their participation in the insurgency, their incentive to fight would accordingly vanish if the United States pulled out. It's an experiment well worth conducting.
John Mueller is a professor of political science at Ohio State University. His most recent book is The Remnants of War. His next, nearing completion, is currently titled Devils and Duct Tape: Terrorism and the Dynamics of Threat Exaggeration.
No Monday-morning quarterbacking
Charles Murray
1. Did you support the invasion of Iraq?
Yes.
2. Have you changed your position?
No. I'm as critical a Monday-morning quarterback as anyone else, but I think the administration's rationale for invading Iraq was correct, and an American president who had not invaded, given the information he had for making the decision, would have been irresponsible.
3. What should the U.S. do in Iraq now?
Damned if I know.
Charles Murray is the W.H. Brady Scholar at the American Enterprise Institute.
Six months or sooner
William A. Niskanen
1. Did you support the invasion of Iraq?
No.
2. Have you changed your position?
No.
3. What should the U.S. do in Iraq now?
Announce that all U.S. troops will leave Iraq within six months of a request from the Iraqi government. Leave earlier if necessary to avoid involvement in an Iraqi civil war.
William A. Niskanen is chairman of the Cato Institute.
Fashion a tolerable compromise
Tom G. Palmer
1. Did you support the invasion of Iraq?
No, I opposed it. I listened to the case, but I was not convinced by the administration's arguments or evidence.
2. Have you changed your position?
No, I have not; I think it was a mistake. But it is also a mistake that was made, and so the question is not only what should have been done but what to do now.
3. What should the U.S. do in Iraq now?
The U.S. government should be clear that the U.S. will withdraw all of its troops after a reasonable period of time—enough to allow the Iraqi government to fashion a tolerable domestic political compromise among those who are willing to tolerate each other, and to defeat enough of the terrorists to give the process some chance of success.
Tom G. Palmer is a senior fellow of the Cato Institute and a founder of the Lamp of Liberty, an Arabic-language libertarian website.
No, no, go
Charles V. Peña
1. Did you support the invasion of Iraq?
No.
2. Have you changed your position?
No.
3. What should the U.S. do in Iraq now?
Exit promptly; no later than the end of the year.
Charles V. Peña is a senior fellow at the Coalition for a Realistic Foreign Policy and the author of Winning the Un-War: A New Strategy for the War on Terrorism.
Maintain a sustainable troop level
Jonathan Rauch
1. Did you support the invasion of Iraq?
On the fence. Leaned in favor but uneasily.
2. Have you changed your position?
Retrospectively, the invasion was a mistake. If I knew then what I know now, I'd have opposed it. But then, if he knew then what we know now, Bush would not have proposed it.
3. What should the U.S. do in Iraq now?
I'm with Nick Gillespie. Ebbing public support makes the operation unsustainable at current troop levels; pulling out entirely could cause a (heightened) civil war. Little choice but to reduce forces in a phased withdrawal. I think something like 40,000 or 50,000 troops could be sustained indefinitely (at least if there's apparent political progress in Iraq), which buys some time.
Jonathan Rauch is a senior writer for National Journal and a guest scholar at the Brookings Institution.
Win
Glenn Reynolds
1. Did you support the invasion of Iraq?
Yes.
2. Have you changed your position?
No. Sanctions were failing and Saddam was a threat, making any other action in the region impossible.
3. What should the U.S. do in Iraq now?
Win.
Glenn Reynolds runs the blog Instapundit and is the author of An Army of Davids: How Markets and Technology Empower Ordinary People to Beat Big Media, Big Government, and Other Goliaths.
Fascists and fascism
Louis Rossetto
1. Did you support the invasion of Iraq?
Yes, both the one that didn't happen in 1991 and the one that did in 2003. But Iraq is not the war; it is a battle. The war is The Long War against Islamic fascism.
2. Have you changed your position?
If anything, I believe even more strongly in actively combating Islamic fascism throughout the Global Village. Every day is Groundhog Day for the anti-war movement, which is stuck re-protesting Vietnam—while we are confronted by a uniquely 21st century challenge: a networked fascist movement of super-empowered individuals trying to undo 50,000 years of social evolution. Waiting to get hit by an NBC weapon is not an option. Dhimmitude for me or my children is not peace. Righteous forward defense is a necessity.
3. What should the U.S. do in Iraq now?
The U.S. should persevere militarily until we defeat the fascists in Iraq, as we did in Afghanistan, as we must everywhere. America's biggest failure has not been on the battlefield—where we are relentlessly reducing our enemies—but in waging the media war against the Islamists and their fellow travelers on the Left, and in rallying the American people, who are confused, and perhaps angered, that once again we are being called upon to save the world.
Louis Rossetto is the founder of Wired magazine.
Three-state solution
Jacob Sullum
1. Did you support the invasion of Iraq?
No.
2. Have you changed your position?
No.
3. What should the U.S. do in Iraq now?
Get out as quickly as possible without leaving behind utter chaos. The best hope for stability may be to let Iraq split into three countries, or three autonomous sections with a weak central government.
Jacob Sullum is a senior editor of Reason and a syndicated columnist.
Retire the Rumsfeld-Cheney gang
Jon Basil Utley
1. Did you support the invasion of Iraq?
No, I strongly opposed it. My column "Seven (revised Eight) Lies about Iraq" was a key piece for four years.
2. Have you changed your position?
I did not change. Everything happened as we feared, with all options now being bad.
3. What should the U.S. do in Iraq now?
Retire the Rumsfeld-Cheney Gang and their staffs; those who were wrong now mainly want to justify past mistakes. Use knowledgeable State and Defense Middle East experts to implement a new policy. Work with the Europeans for a common front and policy. Renounce permanent military bases in Iraq. Demand dismantlement of settlements on the West Bank and work with peace parties in Israel for an Israeli-Palestinian peace agreement. This would regain for America our lost legitimacy.
Jon Basil Utley, a longtime commentator at Voice of America who writes regularly about foreign policy, is associate publisher of The American Conservative.
A little smugger
Jesse Walker
1. Did you support the invasion of Iraq?
No.
2. Have you changed your position?
No, but I've probably gotten a little smugger about it.
3. What should the U.S. do in Iraq now?
Get out in the least damaging manner possible. That will probably entail splitting the country in three.
Managing Editor Jesse Walker is the author of Rebels on the Air: An Alternative History of Radio in America (NYU Press).
Don't put me in charge
Matt Welch
1. Did you support the invasion of Iraq?
No. Nor did I oppose it.
2. Have you changed your position?
Slightly. What kept me from opposing the war outright was:
1) I thought it very likely the Saddam Hussein regime had WMDs, and that the West would never have a mechanism for real weapons inspections without the credible threat of military intervention;
2) I thought there was far more international/legal justification for bombing Iraq than there ever was for bombing Kosovo (an action which, at the time, I supported);
3) Saddam's totalitarian reign was one whose end I would not weep for.
In short, I supported the bluff (though not, of course, the exact way it was made), and then hoped we wouldn't call it. Which isn't very intellectually defensible, but there you go.
What has changed about my position (as opposed to the changes in presumed facts) is that I'm even more worried than I was in spring 2003 (which was a lot) that cranky interventionism (or Jacksonian Wilsonianism, as I don't like to call it) is a terrible approach to foreign policy, because it overextends our resources, enlarges the target on our back, feeds the anti-American pathology that comes when we and only we flex the only Power that matters, and leads to the corruption that inevitably accompanies an expansion of power. Which is to say, the stuff I thought might go bad has gone far worse than I feared.
All that said, I have reserved space in my brain for the possibility that, in the long view, this will have turned out to be a daring and revolutionarily pro-democratic (if hugely flawed) act.
3. What should the U.S. do in Iraq now?
I have no earthly idea. Maybe the most sensible thing to do is the most radical—separate the warring parties both physically and geographically. That is to say, stop trying to prop up the arguably untenable fiction that a multi-ethnic, multi-sectarian, post-colonial, mid-FUBAR country can become the 21st century Switzerland, and instead hasten the business of dividing the pie up into three (or whatever) countries. This would probably set a terrible precedent, but it's hard to see how it can be much worse than the one we've established now.
Matt Welch, a former assistant editor of Reason, is assistant editorial page editor at the Los Angeles Times and proprietor of mattwelch.com.
Non-invasive individuals
Robert Anton Wilson
1. Did you support the invasion of Iraq?
No. I loathe invasions and occupations and all violence against non-invasive individuals.
2. Have you changed your position?
Yes. I oppose the invasion even more vehemently, since Bush has used it as an excuse to destroy the last few tattered remnants of the Bill of Rights.
3. What should the U.S. do in Iraq now?
Stop killing people, bring the troops home, and repair the Katrina damage. (But they never listen to me.)
Robert Anton Wilson is the coauthor of Illuminatus! His latest book is Email to the Universe.
Buttress, counterbalance, prevent
Michael Young
1. Did you support the invasion of Iraq?
Yes, I did. Saddam's fall was too appealing a prospect not to.
2. Have you changed your position?
No. I regret that the U.S. mismanaged the aftermath, breaking the momentum to turn Iraq into a stable, acceptably pluralist system. This will have negative repercussions for democracy in the region. But I believe an American withdrawal today would be disastrous for the Iraqis.
3. What should the U.S. do in Iraq now?
It should maintain its military presence, even if that means modifying it in such a way as to avoid the semblance of military occupation. It should plan to stick around for the long term, regardless of domestic pressures. And it should oversee a genuine, consensual process of national dialogue and stabilization in Iraq, not a self-defeating handing over of power to security forces that are, in reality, cover for sectarian militias. This continued American presence is essential—to buttress democratic forces elsewhere in the region, to counterbalance Iran's growing power, and to prevent the outbreak of civil war in Iraq.
Reason contributing editor Michael Young edits the opinion page of the Beirut Daily Star.
As Campaign 2004 entered its home stretch, we asked a variety of policy wonks, journalists, thinkers, and other public figures in the reason universe to reveal for whom they are voting this fall, for whom they pulled the lever last time around, their most embarrassing presidential vote, and their favorite president of all time. Their answers, as of late August, follow.
—The Editors
Contributing Editor Bagge is best known as author of the alternative comic book Hate.
2004 vote: If it looks like my home state could go either way by Election Day, I'll vote for John Kerry. Otherwise I'll vote for the Libertarian Party's candidate, Michael Badnarik. That's been my M.O. every election year, since the Democratic candidate usually strikes me as the lesser of two evils (if not by much).
2000 vote: Harry Browne.
Most embarrassing vote: Every time I've voted for a major-party candidate I've felt embarrassed. I vaguely recall voting in '88 for Michael Dukakis, whose only positive attribute was that his last name wasn't Bush (as is the case with John F. Kerry).
Favorite president: George Washington, for actually refusing to assume as much power as he could have gotten away with. I can't think offhand of another president about whom that could be said.
Bailey is Reason's science correspondent.
2004 vote: I'm undecided between Republican George W. Bush and Libertarian Michael Badnarik. Bush has been a great disappointment. But Kerry will be even worse—raising taxes, overregulating, and socializing more of medicine. What to do?
2000 vote: George W. Bush. I couldn't possibly have voted for Gore since he dislikes me personally. Besides, I was presciently worried (you can ask my wife) about a popular vote/electoral vote mismatch.
Most embarrassing vote: George McGovern, 1972. I was 18 and thought I was a socialist.
Favorite president: George Washington. The man spurned being made king and stepped peacefully down from office.
Barlow is a songwriter for the Grateful Dead and other bands, the co-founder and vice chair of the Electronic Frontier Foundation, and a Berkman Fellow at Harvard Law School.
2004 vote: I'm voting for John Kerry, though with little enthusiasm. This is only because I would prefer almost anything to another four years of George W. Bush. I don't believe the Constitution, the economy, or the environment can endure another Bush administration without sustaining almost irreparable damage.
2000 vote: John Hagelin of the Natural Law Party. I discovered, in the voting booth, that a friend of mine was his vice presidential candidate. I couldn't bring myself to vote for Bush, Gore, or Nader and had intended to cast no presidential vote.
Most embarrassing vote: I'm embarrassed for my country that in my entire voting life, there has never been a major-party candidate whom I felt I could vote for. All of my presidential votes, whether for George Wallace, Dick Gregory, or John Hagelin, have been protest votes.
Favorite president: Jefferson, who defined, in his works and in his person, just about everything I love about America.
Bovard is author of The Bush Betrayal (Palgrave Macmillan) and seven other books.
2004 vote: I will probably vote for Badnarik, the Libertarian Party candidate. Both of the major-party candidates brazenly flaunt their contempt for the U.S. Constitution. Regardless of who wins in November, the U.S. likely will have a lousy president for the next four years.
2000 vote: I abstained.
Most embarrassing vote: I voted for Gerald Ford in 1976. He was not that embarrassing, compared to Jimmy Carter. And compared to George W. Bush, Ford was verbally graceful.
Favorite president: It might be a coin toss between Washington and Jefferson.
Washington set a magnificent example of self-restraint, protecting the new nation from both his own power lust and unnecessary wars (despite foolish popular demands). Jefferson masterfully reined in the federal government from the tyrannical Alien and Sedition Act persecutions that John Adams launched.
Brand is the founder of the Whole Earth Catalog and the Long Now Foundation. He is the author of, among other books, The Media Lab (Viking) and How Buildings Learn (Penguin).
2004 vote: Kerry. He's knowledgeable enough and appears to do well in crunches. He has the skills and connections to begin to undo the damage of the Bush years. He's highly ambitious, which is fine with me. And he personally killed a man who was trying to kill him and his crew. You could also say he personally attacked a government that was trying to kill his generation. Those actions take sand. Too bad they won't come up in the debates.
2000 vote: Al Gore.
Most embarrassing vote: Lyndon Johnson.
Favorite president: Theodore Roosevelt for a fine blend of intellect and zzzzzest, tied with Bill Clinton for the same reasons.
Carey stars in Drew Carey's Green Screen Show, beginning October 7 on the WB.
2004 vote: Quit pretending that it matters, would you? Can you vote for all the nefarious cabals that really run the world? No. So fuck it.
2000 vote: I voted Libertarian, for all the good it did me.
Most embarrassing vote: Is it considered embarrassing to cast a vote out of principle for someone you know doesn't have a snowball's chance of winning? Oh, OK. Then they're all embarrassing.
Favorite president: Andrew Jackson, because he's what a lap dance costs (and because, ironically, he opposed having a National Bank).
Cavanaugh is Reason's Web editor.
2004 vote: Michael Badnarik, because he's Not Bush either.
2000 vote: Ralph Nader.
Most embarrassing vote: Dukakis in 1988. I thought he looked cool in that tank!
Favorite president: If we can't count John Hanson, then Warren G. Harding; would that they could all achieve so little.
Chapman is a columnist and editorial writer for the Chicago Tribune.
2004 vote: I haven't decided between John Kerry and Michael Badnarik. I have only the dimmest hopes for a Kerry presidency, but I think Bush has to be held accountable for Iraq, the worst foreign policy blunder since Vietnam, and the accelerated growth of the federal government.
2000 vote: Harry Browne, in keeping with my usual (though not automatic) practice of voting for the Libertarian presidential nominee.
Most embarrassing vote: Richard Nixon, in my first election, 1972, an experience that helped estrange me permanently from the Republican Party.
Favorite president: Thomas Jefferson, who took great and justified pride that as president, he eliminated internal taxes and avoided war, and who peacefully doubled the size of the young nation.
Senior Editor Doherty is author of This Is Burning Man (Little, Brown).
2004 vote: I am a principled nonvoter. If I were forced to vote at gunpoint, I'd pick the Libertarian Party's Michael Badnarik, whose views on the proper role of government most closely resemble mine.
2000 vote: I did not vote. Those who vote have no right to complain.
Most embarrassing vote: I've been saved the embarrassment of ever having to feel any sense of responsibility, of even the smallest size, for the actions of any politician.
Favorite president: In their roles as president, I can't be an enthusiastic fan of any of them, but for his role in crafting the Constitution, a document that held some (unrealized) promise to limit government powers, James Madison.
Epstein is a professor of law at the University of Chicago and author, most recently, of Skepticism and Freedom: A Modern Case for Classical Liberalism (University of Chicago).
2004 vote: I don't know who the Libertarian candidate is this time, but you can put me down as voting for him; anyone but the Big Two. As far as I can tell, the debate thus far has borne no relation to the important issues facing the nation…except Vietnam. It's just two members of the same statist party fighting over whose friends will get favors.
2000 vote: I can't remember.
Most embarrassing vote: Since I don't remember who I vote for from one election to the next, it's hard to say. I suppose Richard Nixon in '72, though that doesn't mean I'd want to have voted for George McGovern either.
Favorite president: I'm certainly a Calvin Coolidge fan; he made some mistakes, but he was a small-government guy.
Freund is a senior editor at Reason.
2004 vote: I'm still thinking about it.
2000 vote: Harry Browne.
Most embarrassing vote: Andre Marrou.
Favorite president: I have no favorite president.
Contributing Editor Garvin, author of Everybody Had His Own Gringo: The CIA and the Contras (Brassey's), writes about television for The Miami Herald.
2004 vote: I live in Florida. My votes are randomly assigned based on the interaction of our voting machines, the Miami-Dade Election Commission, and passing UFOs.
2000 vote: See above.
Most embarrassing vote: My presidential record is solid. However, I once cast a write-in ballot for Dynasty's Joan Collins for Congress. It was an immature act, insulting to America's democratic institutions, and I regret it. Upon reflection, China Beach's Dana Delany would have been a more deserving choice.
Favorite president: William Henry Harrison caught pneumonia while delivering his inaugural address, lay in bed barely conscious for six weeks, and then died, his presidency having done hardly any damage to the country.
George is a New York Post columnist, West Indian Catholic stand-up comic, and recovering Republican flunky.
2004 vote: Living in a maximum blue state allows me to vote my conscience. Rather than vote for an entitlement-expanding, tariff-imposing, deficit-increasing, big-government Johnson Republican or an entitlement-expanding, tariff-imposing, tax-increasing, big-government Nixon Democrat, I will vote for Libertarian Party candidate Michael Badnarik.
2000 vote: Harry Browne. Bush's last-second evasiveness on his DUI arrest was too reminiscent of the slippery tongue of the guy about to leave office. Bob Jones University didn't help either.
Most embarrassing vote: Never actually did it, but had I been a citizen the year I turned 18 (1980), I would have (gulp!) voted for Jimmy Carter. It's the secret shame I have carried around for decades.
Favorite president: Abraham Lincoln, for proving that it's best to keep the band together despite many years of creative differences. (Had the Beatles followed his example, the world would have been a much better place.)
Gillespie is editor of Reason and of the new anthology Choice: The Best of Reason (BenBella Books).
2004 vote: Probably no one but maybe Badnarik, if only to register dissent from the Crest and Colgate parties.
2000 vote: Harry Browne, I think, but possibly no one.
Most embarrassing vote: In 1984, the first time I could vote for president and the only time I've voted for a major-party candidate, I cast a ballot for Walter Mondale. I empathize with complete losers, and a guy whose only memorable campaign line—"Where's the beef?"—came from a Wendy's commercial (not even a McDonald's spot!) was a loser of historic proportions.
Favorite president: Richard Nixon, who has done more in my lifetime than any other U.S. pol to discredit the idea that government should wield massive and unexamined power over citizens.
Contributing Editor Godwin is legal director of Public Knowledge.
2004 vote: Kerry. Let's put it this way: After four years of Bush, the Republican Party has become an example of a political machine out of control. Everything they used to decry—reckless foreign intervention, fiscal irresponsibility, deficit spending—they now represent. You don't have to love Kerry or the Democrats to think it's time for a change. Worst case, at least we'd get divided government for four years.
2000 vote: Gore, only because as a Texan I already had strong reservations about George W. Bush.
Most embarrassing vote: Not a one.
Favorite president: I have a lot of fondness for Theodore Roosevelt: In him you had a strong, articulate president who never thought he lost manhood points by being pro-environment. I also like Eisenhower; he presided over a strong American response to a very polarized East-West world.
Hentoff, a nationally syndicated columnist, writes regularly for both the Village Voice and The Washington Times. An expanded paperback edition of his book The War on the Bill of Rights and the Gathering Resistance (Seven Stories Press) will be released this fall.
2004 vote: I'm not voting for anyone at the top of the ticket. I can't vote for Bush, who supports Ashcroft's various "revisions" to the Bill of Rights, since our liberties are what we're supposed to be fighting for. As for Kerry, I think he's an empty suit: How much time did he give his years in the Senate in his convention speech, about 40 seconds?
2000 vote: I voted for Nader last time. But he wants to pull the troops out of Iraq, which would lead to a Hobbesian state of nature; it would be disastrous. He's also become part of the bash-Israel crowd, and to get on ballots he's been cooperating with Lenora Fulani, who has been accused of harboring anti-Semitic biases.
Most embarrassing vote: Well, I didn't mind voting for Nader in 2000, because Gore had a whole series of empty suits during that campaign, and I didn't think much of Bush either. I can't think of any votes I'm particularly embarrassed about.
Favorite president: FDR. He could have done much more to help the victims of the Holocaust, but he did act decisively (if trickily) to take us into the war, which was essential. Otherwise we'd all be speaking German. And as Cass Sunstein has pointed out, FDR was the one who laid out a "second bill of rights," with economic freedoms like a right to decent housing.
Higgs is a senior fellow in political economy at the Independent Institute and author, most recently, of Against Leviathan (Independent Institute).
2004 vote: I never vote. I don't wish to soil my hands.
2000 vote: Had I been forced to cast a ballot for president in the 2000 election, I might have died of septicemic disgust.
Most embarrassing vote: I voted only once in a presidential election, in 1976, and I did so on that occasion only so that I could irritate my left-liberal colleagues at the University of Washington by telling them that I had voted for "that idiot" Gerald Ford.
Favorite president: Grover Cleveland, because he, more so than any of the others, acted in accordance with his oath to preserve and protect the Constitution, despite great pressures to act otherwise.
Jillette is the larger, louder half of the comedy/magic team Penn & Teller and star of Showtime's Penn & Teller: Bullshit!
2004 vote: I'm undecided (always the stupidest position). I might do the moral thing and not vote at all, or do the sensible thing and vote Libertarian (Badnarik, right?), or I might make 100 bucks from my buddy Tony and vote for Bush. (I told Tony that Bush and Kerry were exactly the same, and he bet me 100 bucks that I didn't believe that enough to really truly vote for Bush.) But if you want to be pragmatic, I'm in Nevada, so who cares?
2000 vote: Harry Browne!
Most embarrassing vote: I must have voted Republicrat at least once, but voting is secret—the Founding Fathers didn't want us to be embarrassed by our evil pasts.
Favorite president: Teller (he's president of Buggs and Rudy Discount Productions [Penn & Teller's company]), because he can lie without saying a word.
Kopel is research director of the Independence Institute in Golden, Colorado.
2004 vote: George Bush. This will be the first election in which I have ever voted for a Republican for president. We're in a war in which the survival of civilization is at stake, and Bush is the only candidate who realizes the gravity of the danger we face and who is determined to win World War IV.
2000 vote: Ralph Nader.
Most embarrassing vote: Harry Browne, 1996.
Favorite president: George Washington, for leading the nation through extremely perilous times, and for setting the highest standard of personal conduct and patriotic leadership.
Contributing Editor McClaughry, a senior policy adviser in the early Reagan White House, is president of the Ethan Allen Institute in Vermont.
2004 vote: George W. Bush. Unlike his opponents, he at least understands that only America can defeat militant Islam by a combination of military force and the ideology of freedom. At home, his recent advocacy for "a new era of ownership" promises the only way out of statist stagnation.
2000 vote: Bush.
Most embarrassing vote: Nixon, 1972.
Favorite president: Jefferson. He respected the Constitution, shrank government, slashed taxes, paid cash for Louisiana, hammered the Algerine pirates, reined in the judiciary, and began to extinguish the national debt.
Contributing Editor McCloskey teaches economics, history, English, communications, and anything else they tell her to teach at the University of Illinois at Chicago.
2004 vote: Can you believe this? The last time I voted for anyone but a Libertarian was for George McGovern in 1972, against the war. This time, another Democrat gets it—though as an economist I know it's irrational to vote at all.
2000 vote: For whomever the Libertarian Party candidate was. You expect me to remember?
Most embarrassing vote: Lyndon Johnson in 1964, my very first vote. Because Goldwater was scary. Did I know scary?
Favorite president: Speaking of scary, the office is. So the least effective: Warren Harding.
McElroy is a Fox News columnist, the editor of ifeminists.net, and a fellow at the Independent Institute.
2004 vote: I'm voting for No One for at least three reasons: 1) As a Canadian, I am spared the insulting process of punching a ballot to express which power glutton should prevail; 2) as an anarchist, I refuse to legitimize the process that puts anyone in a position of unjust power over people's lives; and 3) as a practical matter of value returned for effort, the time is better spent enjoying family or working.
2000 vote: No One. I admit to being so anti-Clinton and so appalled by the prospect of political correctness continuing that I uttered out loud "anyone but Gore"—which, in practical terms, meant Bush. Those famous last words are right up there with Socrates saying, "I drank what?"
Most embarrassing vote: I have never voted in a political proceeding. But when I first became a libertarian while living in California, I did support the Libertarian Party candidate. This would be more embarrassing if I had not learned from my mistake. The lesson: It is not the particular man in power that I oppose but the power itself, which is unjust. As a matter of logic, if nothing else, I cannot oppose the office as illegitimate while waving a straw hat and yelling, "Elect my man to it!"
Favorite president: Thomas Jefferson in his first term because he did less harm than any other president.
Murray is W.H. Brady Scholar at the American Enterprise Institute and author, most recently, of Human Accomplishment (HarperCollins).
2004 vote: Reluctantly—very reluctantly—George Bush. I find the Democrats so extremely obnoxious that I have to vote against them, and I can't do that voting Libertarian.
2000 vote: Harry Browne.
Most embarrassing vote: For Bob Dole in 1996, because I found Bill Clinton so extremely obnoxious that I had to vote against him. Probably my 2004 vote will be my second most embarrassing ballot.
Favorite president: Gotta be George Washington. He was acutely conscious that everything he did would be a precedent, and just about every choice he made was right.
O'Rourke is H.L. Mencken Research Fellow at the Cato Institute and author, most recently, of Peace Kills (Atlantic Monthly Press).
2004 vote: George W. Bush, because I don't want Johnnie Cochran on the Supreme Court.
2000 vote: George W. Bush. (I always vote Republican because Republicans have fewer ideas. Although, in the case of George W., not fewer enough.)
Most embarrassing vote: A 1968 write-in for "Chairman Meow," my girlfriend's cat. It seemed very funny at the time. As I mentioned, this was 1968.
Favorite president: Calvin Coolidge—why say more?
Paglia is a professor of humanities and media studies at the University of the Arts in Philadelphia.
2004 vote: John Kerry. In the hope that he will restore our alliances and reduce rabid anti-Americanism in this era of terrorism when international good will and cooperation are crucial.
2000 vote: Ralph Nader. Because I detest the arrogant, corrupt superstructure of the Democratic Party, with which I remain stubbornly registered.
Most embarrassing vote: Bill Clinton the second time around. Because he did not honorably resign when the Lewinsky scandal broke and instead tied up the country and paralyzed the government for two years, leading directly to our blindsiding by 9/11.
Favorite president: John F. Kennedy. Not that he accomplished much. But he was the first candidate I campaigned for as an adolescent, and I still admire his articulateness and vigor. The Kennedys gave the White House sophistication and style.
Pinker is Johnstone Professor of Psychology at Harvard and author of The Blank Slate (Penguin), How the Mind Works (W.W. Norton), and The Language Instinct (HarperCollins).
2004 vote: Kerry. The reason is reason: Bush uses too little of it. In the war on terror, his administration stints on loose-nuke surveillance while confiscating nail clippers and issuing color-coded duct tape advisories. His restrictions on stem cell research are incoherent, his dismissal of possible climate change inexcusable.
2000 vote: Gore, with misgivings.
Most embarrassing vote: I left Canada shortly after turning 18 and became a U.S. citizen only recently, so I haven't voted enough to be too embarrassed yet.
Favorite president: James Madison, for articulating the basis for democracy in terms of the nature of human nature.
Contributing Editor Pitney is a professor of government at Claremont McKenna College and author of The Art of Political Warfare (University of Oklahoma Press).
2004 vote: I'm voting for Bush. He cut taxes. Kerry would raise them.
2000 vote: Bush. Former reason Editor Virginia Postrel put it well: "Bush is a mixed bag. But I think Al Gore is the devil."
Most embarrassing vote: I'm comfortable with my presidential votes in general elections. In 1980, though, I hesitated to back Reagan in the primaries because I didn't think he would win in November. Oops.
Favorite president: Abraham Lincoln preserved the Union and wrote some of the most beautiful prose that ever came from an American pen.
Poole is the founder, former president, and current director of transportation studies at the Reason Foundation. He served on the Bush-Cheney transportation policy task force during the 2000 campaign and on the administration's transition team.
2004 vote: Bush, reluctantly, despite his troubling expansions of the federal government and threats to civil liberties. The alternative is simply worse. We elect not an individual but a consulting team, and we'll have a far better team in place with Bush than with Kerry.
2000 vote: Bush.
Most embarrassing vote: Richard Nixon in 1968. What a disaster! But at least he kept his promise to eliminate the draft.
Favorite president: Thomas Jefferson, despite his flaws, for his intellect, limited government philosophy, and strong support for separation of church and state.
Rauch is a correspondent for The Atlantic and a columnist for National Journal.
2004 vote: I'll recuse myself from this…I never tell my vote. A journalist thing.
2000 vote: See above.
Most embarrassing vote: See above.
Favorite president: Lincoln. All the usual reasons. Favorite prez of past 40 years: Bush 41. Beats Reagan and everybody else hands down.
Rennie is editor-in-chief of Scientific American.
2004 vote: John Kerry. Anybody who has seen Scientific American's editorials during the last few years knows we're deeply unhappy with the de facto anti-scientism of the current administration. Science shouldn't trump all else in setting policy, but it would be a nice change of pace for a White House to put science ahead of ideology again. Of course, I'm keeping my expectations low.
2000 vote: Remember that guy? The one that everybody said claimed to have invented the Internet, except he hadn't said that at all? He seemed good.
Most embarrassing vote: Back in college in 1980, flushed with youthful sanctimony, I voted for John Anderson. The voting booth is a bad place to be an idealist. But at least when I threw my vote away on a third-party candidate, it was irrelevant.
Favorite president: John Quincy Adams showed that it was possible for the son of a president to rise to that same office in a highly disputed election without being remembered as a dangerous embarrassment.
Reynolds, a professor of law at the University of Tennessee, publishes InstaPundit.com.
2004 vote: Most likely George Bush, and for one reason: the war. I'm having trouble trusting Kerry on that.
2000 vote: Harry Browne.
Most embarrassing vote: Dukakis, '88.
Favorite president: Calvin Coolidge, who knew his limitations.
Along with his partner Jane Metcalfe, Rossetto started Wired.
2004 vote: Bush may be wrong about everything else, but he is right about the issue that matters most for my children's future: stopping Islamic fascism. And Manchurian candidate Kerry and the Copperheads, er, Democrats, are just a joke, preferring to act as though this probably generation-spanning war is about politics, not the survival of the West.
2000 vote: I am proud to say that I have never voted for president in my life, despite having eight opportunities to do so, and really resent being forced to do so now.
Most embarrassing vote: This one—but the alternative of not voting and allowing a billionaire currency speculator like George Soros to pick the next U.S. president is too dire to contemplate.
Favorite president: Chauncey Gardiner, from Being There. He was even less verbose than my next favorite president, Calvin Coolidge.
Sanchez is Reason's assistant editor.
2004 vote: Kerry would get my vote if I didn't live in the District of Columbia, but the prospect of raising his total from 94.0001 percent to 94.0002 percent isn't quite enough to lure me to the polls. Badnarik is embarrassing, and Bush is so egregiously dishonest and destructive that even electing a mannequin like Kerry is an acceptable price of ousting him.
2000 vote: I didn't vote, though (to my shame, in retrospect) I was optimistic about G.W. Bush after the convention speeches, where all that focus on Social Security reform, educational choice, and "humble" foreign policy led me to think he might do some net good.
Most embarrassing vote: I've managed to spare myself that particular breed of embarrassment by not voting. Blessed are the apathetic, for they get the better even of their political blunders.
Favorite president: Grover Cleveland, who vetoed a popular agricultural assistance bill with the phrase, "Though the people support the government, the government should not support the people."
Shafer writes the Press Box column for Slate.
2004 vote: Who is the Libertarian candidate this year? That's who. Because I'm a yellow dog Libertarian.
2000 vote: Who was the Libertarian candidate that year? That's who.
Most embarrassing vote: I've never been embarrassed in the slightest by my presidential ballot.
Favorite president: Richard Nixon, because he's the gift that keeps giving.
Shermer is publisher of Skeptic magazine, a monthly columnist for Scientific American, author of The Science of Good and Evil (Henry Holt), and a bicycling enthusiast.
2004 vote: John Kerry. I'm a libertarian, but in 2000 I voted my conscience under the assumption that it probably didn't matter who won between Bush and Gore (Tweedledee and Tweedledum when compared to Browne), and I was wrong. It did matter. The world situation is too precarious and too dangerous to flip a coin, the Libertarian candidate cannot win, Bush's foreign policy is making the world more dangerous and more precarious rather than less, and Kerry has a good chance to win and an even better chance to improve our situation. Most important, he's a serious cyclist who wears the yellow "LiveStrong" bracelet in support of Lance Armstrong's cancer foundation and Tour de France win.
2000 vote: Harry Browne, because like the Naderites on the other end of the spectrum I voted my conscience.
Most embarrassing vote: Richard Nixon, 1972, my first presidential vote cast, just out of high school. My poli-sci profs the next several years of college regaled us with daily updates about Watergate. Ooops…
Favorite president: Thomas Jefferson, because 1) he was a champion of liberty, 2) he applied scientific thinking to the political, economic, and social spheres, and 3) when he dined alone at the White House there was more intelligence in that room than when John F. Kennedy hosted a dinner there for a roomful of Nobel laureates.
Sirius, former editor-in-chief of Mondo 2000, edits NeoFiles at life-enhancement.com/NeoFiles and is author of, most recently, Counterculture Through the Ages: From Abraham to Acid House (Random House).
2004 vote: I can't bring myself to say it. I'm voting for the only guys who stand a chance of replacing the complete insanity of the Bush administration and returning us to the ordinary consensus madness we had come to know so well. You know, John and John.
2000 vote: Even though I ran as a write-in candidate myself, I wound up voting for Nader because I thought he gave such rousing and impressive speeches. I wouldn't actually want him to be president though. He's way too puritanical. Did anybody notice that he joined in on the Janet Jackson nipple crisis? Instead of objecting from a religious point of view, he objects from the view that corporate media are "spewing filth" into our environment. The health fascists are everywhere.
Most embarrassing vote: Ralph Nader in 2000. First, some other people actually voted for me; my insincerity is justifiable only in a dadaist context, which I therefore proclaim. Second, it encouraged Nader, who is now clearly addicted to the run.
Favorite president: It's difficult to rate the quality of an 18th-century president's decisions at this distance, but I choose Thomas Jefferson for eloquently elucidating many of the ideas and attitudes of the Age of Reason.
Smith is chairman of the Federal Election Commission.
2004 vote: That's one an election commissioner had better not answer; we're not supposed to engage in partisan activities.
2000 vote: I don't want to answer that one either.
Most embarrassing vote: I'm too smart to cast embarrassing ballots.
Favorite president: Warren G. Harding; he's a vastly underrated president and a man of great ordinary decency.
Smith, the 2002 Nobel Prize winner in economics, was born and raised in Kansas and inspired to learn by a farmer-teacher in a one-room country schoolhouse.
2004 vote: I am not voting for Kerry. Yet to be decided is whether I will vote for Bush or for neither.
2000 vote: Bush.
Most embarrassing vote: Many of the presidents were embarrassments, but not for me as a voter, because I always think of myself as voting against the other one. This policy led me to vote "for" Humphrey, Nixon, and others, but I saw myself as choosing negatively.
Favorite president: Eisenhower, for his sense that the affairs of state could always await another round of golf, for his not wanting to "get bogged down in a land war in Asia," and for his concern about the military-industrial complex, which generalizes to other complexes like the Treasury-investment-banking complex.
Sullivan, a senior editor at The New Republic, blogs at andrewsullivan.com.
2004 vote: I can't vote because I'm not a citizen. So I can only "support" candidates, and I'm not supporting anyone in this election.
2000 choice: Bush.
Most embarrassing choice: I'm unembarrassed by all my choices.
Favorite president: Lincoln, of course. He saved the Union.
Senior Editor Sullum is a syndicated columnist and author of Saying Yes: In Defense of Drug Use (Tarcher Penguin).
2004 vote: The thought of choosing between Bush and Kerry, or casting another pathetic protest vote for the Libertarian candidate, is so depressing that I probably won't be motivated to visit my local polling place this year. I'd like to see Bush lose, but without Kerry winning. Much as I disliked him when he was in office, Clinton is looking better and better to me in retrospect.
2000 vote: Harry Browne.
Most embarrassing vote: It's a tossup between Mondale in 1984 and Dukakis in 1988. Dukakis is more embarrassing because it was more recent (and because he's Dukakis), but Mondale is more embarrassing because I voted for him twice—in New York's open primary (when I almost went for Gary Hart) as well as the general election.
Favorite president: Thomas Jefferson for his writings rather than his performance in office, Grover Cleveland or Calvin Coolidge for taking seriously their oaths to uphold the Constitution and its limits on federal power.
Taylor writes Reason Express, a weekly news commentary for Reason Online.
2004 vote: George W. Bush, pathetic bastard that he is—and has made me. The only thing that I am certain that John Kerry would do is raise taxes. Plus I figure the potential confusion surrounding a new national security team may well get people killed. See? Pathetic.
2000 choice: Bush, in Maryland. A gimme.
Most embarrassing vote: Bob Dole in 1996. Knew that was going nowhere.
Favorite president: Ronald Reagan, because he beat back the flawed economic theory that was destroying America and the world.
Volokh teaches and writes, mostly about constitutional law and cyberspace law, at UCLA School of Law. He blogs at volokh.com.
2004 vote: George W. Bush. I almost always vote for the party, not the man, because the administration, its legislative agenda, and its judicial appointments generally reflect the overall shape of the party. I tend to think that Republicans' views on the war against terrorists, economic policy, taxes, and many though not all civil liberties questions—such as self-defense rights, school choice, color blindness, and the freedom of speech (at least as to political and religious speech)—are more sound than the Democrats' views. I certainly find plenty to disagree with the Republicans about, even on those topics, but if I waited for a party with which I agreed on everything or even almost everything, I'd be waiting a long time.
2000 choice: George W. Bush.
Most embarrassing choice: Can't think of any.
Favorite president: George Washington. As best I can tell, he did a crucial job better than anyone else could have done, and I don't know him well enough to have learned about all his warts.
Managing Editor Walker is author of Rebels on the Air: An Alternative History of Radio in America (NYU Press).
2004 vote: I'm not sure yet, but I'm increasingly inclined to write in Elmer Fudd.
2000 vote: Harry Browne.
Most embarrassing vote: Dukakis. Bush Sr.'s ACLU-baiting campaign was appalling, and I wasn't yet ready to start throwing my vote away on third-party candidates and frivolous write-ins. So I threw it away on a Democrat instead.
Favorite president: It would have to be one of those practically powerless presidents who served under the Articles of Confederation—maybe the anti-federalist Richard Henry Lee, president of the Continental Congress from 1784 to 1785, who helped launch the American Revolution, tried to ban the importation of slaves, fought to include a Bill of Rights in the Constitution, and sang the goofiest song in 1776.
Wanniski midwifed supply-side economics, was the first Marxist to work as an editor of the Wall Street Journal editorial page, and is author of The Way the World Works (Gateway).
2004 vote: Bush does not deserve to be re-elected, and Kerry does not deserve to be elected—Bush because of Iraq and Kerry because his economics are dreadful. I'm leaning toward Kerry because I prefer recession to imperialist war, but Bush might tempt me back by firing Cheney, Rumsfeld, and company.
2000 vote: George W. Bush.
Most embarrassing vote: I only voted for Dole in '96 because Jack Kemp was on the ticket. I should have voted Perot in protest.
Favorite president: Abraham Lincoln, because he saved the Union. Otherwise, there would today be no USA.
Contributing Editor Welch is a columnist for Canada's National Post. He blogs at mattwelch.com.
2004 vote: John Kerry, because I think the foreign policy approach of Bush and his administration would make the world, on balance, less safe and less free, while expanding the target on Uncle Sam's back, compared to a presidency run by a flip-floppy Democrat with a Republican Congress. I think Bush needs to be fired.
2000 vote: Ralph Nader, despite covering his campaign.
Most embarrassing vote: Dukakis. I'm hoping not to feel the same way this November.
Favorite president: Lincoln, for freeing slaves, preserving the Union, and serving as a ghostly conscience to haunt the Republicans' "Southern Strategy."
Wilson helped found the Guns and Dope Party (gunsanddope.com), which urges everybody to vote for himself. His best-known book is Illuminatus! (Dell), co-written with Robert Shea.
2004 vote: I'm voting for myself because I don't believe anybody else can represent me as well as I can represent myself.
2000 vote: Nobody. I sat in my tent and sulked, like Achilles.
Most embarrassing vote: In 1964 I voted for LBJ because I thought Goldwater would escalate the war in Vietnam.
Favorite president: John Adams, because he didn't trust anybody in politics, including himself.
Contributing Editor Young is a columnist for the Boston Globe.
2004 vote: That's a little private, don't you think? Whichever way I answer, half my friends won't talk to me anymore. (Such are the perils of having a bipartisan social life.) Of course, I could cast a write-in vote for Xena, Warrior Princess…but that would probably piss off my friends who are Buffy fans.
2000 vote: See above.
Most embarrassing vote: Well, talk about private!
Favorite president: George Washington—can't offend anyone with that.
Despite the nod to constitutional restraints on police power, some of the tools Bush and other politicians have proposed—national I.D. cards, roving wiretaps, indefinite detainment of suspects, lower burdens of proof, intensified surveillance, and much more—raise serious questions about the government's commitment to civil liberties. In late September, Reason asked a panel of experts to discuss which civil liberties they thought were most at risk in what has been called America's first 21st century war.
Jerry Berman
My first fear is that the government will expand intelligence investigations and break down the wall that Congress established after Watergate to prohibit the kind of surveillance that was conducted against domestic dissent during the civil rights and Vietnam War eras.
Under the Foreign Intelligence Surveillance Act, the intelligence community already has vast authority to conduct intelligence and criminal investigations of terrorist activities. Our intelligence community can already conduct secret electronic surveillance, engage in secret searches, and plant bugs against terrorists. The act covers foreign powers and terrorist enterprises. It applies overseas and in the United States and allows wiretapping, bugs, and surveillance directed at Americans, as well as foreigners, who may be engaged in terrorist activities on behalf of enterprises like bin Laden's network. The standard for this surveillance is less restrictive than what we require for domestic criminal surveillance.
The attorney general wants to change the primary purpose of those taps from intelligence, which is why they get a lower standard, to any purpose, meaning they could be used for criminal investigations. This would vastly expand federal authority to use surveillance without probable cause.
Federal agents still need to make the case that the expanded powers for which they are asking are necessary. To date, there has been no evidence, either from classified briefings or any public source, that restrictions in the law hindered their efforts to know about bin Laden and his associates. It wasn't a restriction breakdown. It was an analysis breakdown.
Jerry Berman is the executive director of the Center for Democracy and Technology, a high-tech civil liberties and Internet policy organization. He has worked on all the intelligence guidelines passed since Watergate.
Clint Bolick
Detention without trial. Racial profiling. Electronic surveillance. National identification cards. The growing list of proposals in the wake of the September 11 attacks makes it tough to single one out as the most threatening to our civil liberties. Much depends on the contours of the power, how it's exercised, and whether it has an end point.
More worrisome is the notion that our civil liberties are subject to cancellation in times of crisis. Our Constitution seeks to protect rights the Framers deemed inalienable. It faces its gravest tests in times of crisis.
We have traveled this road often before, and we should draw upon the lessons of history. In his dissenting opinion in Korematsu v. United States, which upheld Japanese internment camps, U.S. Supreme Court Justice Robert H. Jackson wrote that "a judicial construction…that will sustain this order is a far more subtle blow to liberty than the promulgation of the order itself….[O]nce a judicial opinion rationalizes the Constitution to show that [it] sanctions such an order…the Court for all time has validated the principle of racial discrimination….The principle then lies about like a loaded weapon ready for the hand of any authority that can bring forward a plausible claim of an urgent need." What other principles are we prepared to sacrifice?
With determination, we will get past this crisis. The question is: Will we do so with our constitutional system—the cornerstone of our free society—fully intact? If not, the terrorists will have extracted even more grievous costs than those already apparent.
Clint Bolick is vice president and director of state chapter development for the Institute for Justice, a Washington, D.C.-based public interest law firm.
Jim Bovard
The blind glorification of government that currently prevails puts almost all liberties at grave risk. Most of the media and most of the politicians are stampeding behind the notion that the greatest danger is any limit on federal power. The Justice Department's wish list of remedies invokes the danger of terrorism to seek sweeping new powers to be used against all classes of alleged criminals.
The determination of some members of the Bush administration to use the terrorist attacks to wage wars against a laundry list of "rogue nations" could mean that aggressive military action continues indefinitely—along with the pretext to suppress Americans' freedom of speech and movement. And if there is another successful terrorist attack that kills many Americans, the pressure for severe crackdowns will probably be irresistible—regardless of how badly government agencies screwed up in failing to prevent the attack. At least for the time being, people have lost any interest in government's batting average—either for actually protecting citizens or for abusing power.
The best hope for the survival and defense of liberty is that enough Americans will recall the history lessons that public schools never teach.
Jim Bovard is the author, most recently, of Feeling Your Pain: The Explosion & Abuse of Government Power in the Clinton-Gore Years (St. Martin's Press).
Alexander Cockburn
For a foretaste of the sort of assault on civil liberties we might expect, we need go no further than the actions last year of George W. Bush's nominee to run the newly minted Office of Homeland Security (OHS), Gov. Tom Ridge of Pennsylvania.
Last July, in preparation for anti-World Trade Organization demonstrations scheduled at the same time as the Republican National Convention, Philadelphia saw coordinated law enforcement involving the Federal Bureau of Investigation and local and state police, with covert surveillance, infiltration, and disruption of legitimate groups, snooping on e-mail, and phone taps. Protest leaders were arrested early on, under absurd pretexts and, in the case of John Sellers (of the Ruckus Society) and Kate Sorensen (ACT-UP), held on $1,000,000 bail. Jailed protesters were brutally handled, denied access to medical care and attorneys, and thrown into cells with dangerous inmates.
When all was over, the courts threw out 95 percent of the charges brought against protesters by the Philadelphia police. Ridge presided over an utterly disgraceful and violent denial of freedom of assembly, free speech, and due process.
The only comfort we can expect is that the FBI, the Central Intelligence Agency, the Federal Emergency Management Agency, the Pentagon, and the Coast Guard will see the OHS as a bureaucratic threat to their turf and move swiftly to neutralize it. I have no doubt that these seasoned bureaucratic fighters will soon be leaking information to discredit Ridge and the OHS.
Alexander Cockburn is a syndicated columnist and co-editor of the radical newsletter CounterPunch (www.counterpunch.org).
Jim Harper
Goaded by leftist privacy advocates, Congress has been toying with the idea of regulating the private sector in the name of privacy. As we now see, governments themselves are the most powerful, constantly lurking threats. The War on Terrorism will strike hard at the Fourth Amendment right to be free from unreasonable searches and seizures.
In the shelter of this already-too-weak freedom, Americans protect what remains of their privacy from government. Because no one yet knows what national security lapses allowed the heinous September 11 attacks to occur, laws expanding warrantless wiretapping amount to a ritual wartime shedding of civil liberties. The charge of opportunism sticks to some proposals, which came forward while fires still burned at the Pentagon and World Trade Center. They involve powers law enforcement could not win from a rational, deliberative Congress.
The laws balancing surveillance and privacy could be updated—rationally and in light of new technology, our tradition of privacy from government, and the Fourth Amendment. But when the last terrorist has died or renounced violence, the privacy protections taken hastily from American citizens in the coming days are not likely to be restored. This truly just war against terrorism may take a generation. Mopping up the carnage to privacy may take many decades longer than that.
Jim Harper is editor of Privacilla.org, a Web-based privacy-policy think tank.
Nat Hentoff
We may lose the requirement that a magistrate issue police a warrant before any kind of search. Also, federal law enforcement officials want to allow an investigator to search a suspect's home without immediately notifying the target of the search. In J. Edgar Hoover's day, that was called a "black bag job." It is totally unconstitutional, and it should be criminally illegal.
Then there's the First Amendment. In the Gulf War, the government managed to tie the hands of the press totally. In other words, practically no information came out except from the government. By contrast, in the much more dangerous Vietnam War, there was a great deal of free reporting and as a result of that, people began to understand what was going on and there were political changes. As Justice Hugo Black said in the Pentagon Papers case, it's especially when you're in a war situation and people are going to die that the press has to tell people what's going on.
And that's just a few freedoms under attack. Most Americans have only a rudimentary understanding of their own rights, because those rights are taught so badly in the schools. Therefore, they don't care much what other people's rights are. Because of the terror of the attacks, about 70 percent of the public is willing to give away their rights in the interests of security. All in all, it's a very bleak picture.
Nat Hentoff is a syndicated columnist and author of numerous books on civil liberties.
Robert Higgs
The American people may well be witnessing the death of their right to privacy, not with their usual whimper but with their ill-considered, too-hasty approval after the New York and Pentagon bangs.
The government has declared war on "terrorism," but because terrorists assume many guises and operate in many places, the only way to ensure that no terrorist escapes notice is to watch everyone, everywhere. Lacking the patience and the wit to focus its surveillance on only the most likely suspects, the government will regard all of us as potential terrorists or as their potential providers, unwitting perhaps, of aid and comfort. Our communications by ordinary mail, telephone, fax, and e-mail will be scrutinized or at constant risk of scrutiny; our homes and places of business will be searched or at constant risk of search; our personal contacts, financial affairs, and travel by airliner, train, and ship will be closely monitored and restricted. To borrow the lyrics of a once-popular song: Every step we take, every move we make, they'll be watching us.
Of course, once people have been subjected to such thoroughgoing government surveillance, all relations between the government and the public are transformed. Whether the rulers be revolutionary despots or democratically elected officials, every citizen knows that "they" know all about him and his affairs, and hence no one dares to step out of line. In such a situation, the socio-political system will gravitate ineluctably toward totalitarianism.
Robert Higgs, author of Crisis and Leviathan: Critical Episodes in the Growth of American Government (Oxford University Press), is a senior fellow in political economy at the Independent Institute and editor of the institute's quarterly journal, The Independent Review.
David Kopel
In the short run, Fourth and Fifth Amendment liberties face the greatest perils. While Attorney General John Ashcroft has performed very well on many fronts, in his relationship with the FBI he is, unfortunately, following in the footsteps of the attorneys general from the Clinton and the first Bush administrations. He's doing nothing to restrain the FBI's ever-escalating demands for the creation of a surveillance state.
Over the longer course of war, almost every part of the Bill of Rights will come under assault. Centralizers will attempt to further undermine the Tenth Amendment. Already, the gun prohibition lobbies are claiming that gun shows must be destroyed in order to fight terrorism. Note, by the way, that the McCain-Lieberman gun show bill before Congress is not just about background checks on some gun show sales. The McCain-Lieberman bill has so many sweeping restrictions, the violation of any one of which is a felony, that it would be legally reckless for anyone to operate any gun show if McCain-Lieberman were law. The issue really isn't background checks, but whether gun shows should continue to exist.
Besides threats to particular liberties, friends of traditional American values—namely freedom, privacy, and justice—should keep their eyes on two transcendent issues during wartime. First, the effort to replace our checks and balances and our system of federalism with unreviewable central executive power. Second, the tendency of people to suppress their own willingness to think freely, and to lash out at those who do not similarly self-suppress.
David Kopel is research director of the Independence Institute, in Golden, Colorado.
Declan McCullagh
So far, politicians are targeting privacy and anonymity. National I.D. cards are being taken far too seriously. Oracle CEO Larry Ellison told a San Francisco TV station: "We need a national ID card with our photograph and thumbprint digitized and embedded in the ID card." Perhaps eyeing a lucrative support contract, Ellison volunteered to provide the software "absolutely free."
The likely result: Cops arresting any American who refuses to show a surveillance-enabled I.D. on demand. (Previously the concept arose, in less dramatic form, when anti-immigration zealots in Congress unsuccessfully tried to encode fingerprints, Social Security numbers, and other personal information into everyone's driver's license.)
That's not the only retread of a bad idea from the 1990s. In a floor speech, Sen. Judd Gregg (R-N.H.) called for a global ban on privacy-protective encryption products that don't have "back doors" that would allow government surveillance. Gregg plans to introduce a bill in October. No word yet on how those crypto-mavens in the U.S. Congress plan to convince Osama bin Laden and company to use FBI-certified spyware.
Another item on the FBI wish list is about to come true: The Senate has already approved a bill—by a 97–0 vote—that lets police conduct some forms of Internet surveillance without a court order.
The outlook isn't entirely bleak. Liberty has found some surprising, if uncertain, champions in the Congress. We should be heartened that members of both major parties are starting to ask for more time to weigh these proposals. The Bush folks had said, generously, that three days would be more than enough.
Declan McCullagh is the Washington bureau chief of Wired News (wired.com) and a contributor to wartimeliberty.com.
Roger Pilon
As the Declaration of Independence says, the main business of government is to secure rights, but legitimate government can't do it by any means. It can't violate rights in the name of securing them.
That frames the issue. Between those boundaries—and given a world of uncertainty—the devil is in the details. Governments too restrained leave rights exposed. By contrast, societies that trade liberty for security, as Ben Franklin noted, often end up with neither.
Thus, the government's war against terrorism implicates two kinds of rights—the rights governments are instituted to secure, and those they must respect in the process. At this writing, it appears that the terrorist attacks of September 11 resulted from a massive government failure to protect rights of the first kind. Predictably, friends of government are now saying that an undue regard for rights of the second kind led to that failure. That may be true, but it may also be special pleading. Were agencies prohibited from talking to each other in the name of privacy? Or did they simply fail to coordinate efforts? Those are the kinds of questions that need answering.
At this juncture, therefore, it's difficult to say which civil liberties are most at risk. Certainly, as September 11 demonstrated, the liberties we created government to secure are at risk. But can we better secure them and remain free? Yes, if we act smartly. Above all, whether with surveillance or searches or due process, judicial oversight must be preserved—for citizens and non-citizens. It's the final safeguard for liberty.
Roger Pilon is vice president for legal affairs and director of the Center for Constitutional Studies at the Cato Institute.
Glenn Reynolds
No liberty is in danger so long as we are willing to preserve it. But I think the biggest danger is to the presumption of innocence. Defensive precautions against terrorism involve, unavoidably, a presumption of guilt. That's tolerable at airport security checks, but the trick will be to keep this approach from infecting all of law enforcement—even in areas that have nothing to do with terrorism.
To judge by the proposed legislation before Congress as I write, that infection is already happening at the Justice Department.
Glenn Reynolds is a professor of law at the University of Tennessee, and writes for the Web site InstaPundit.com.
Harvey A. Silverglate
There are three areas in which civil liberties are most vulnerable. One is in the area of domestic surveillance. One change that has a good chance of occurring is the monitoring of individuals on the basis of something other than probable cause. I also expect that there will be an attempt to give the Department of Justice the authority to monitor individuals even without a court order under circumstances that Justice, in its wisdom, deems appropriate. I think the DOJ will take advantage of the current crisis to not only get authority to monitor different kinds of communications modalities, such as the Internet, but also to lower the relatively strict standards that govern wiretapping.
The probable increase in racial and ethnic profiling of tan-skinned people and the maintenance of a different set of standards for them as opposed to "Americans" is also very worrisome. When an unwise rule is applied only to a disfavored or minority group, there is no substantial social pressure to get that bad rule repealed. It's only when that rule is applied to all of us that really horrendous rules get repealed.
I am also concerned about moving into an atmosphere where there will be coercive displays of patriotism required of citizens in order for them to avoid various kinds of discrimination—or even violence and threats of violence.
Harvey A. Silverglate is co-author of The Shadow University: The Betrayal of Liberty on American College Campuses (Free Press).
Nadine Strossen
My greatest fear is that too many members of the public will embrace the government's call to give up some freedom in return for greater safety, only to find that they have lost freedom without gaining safety. That is the lesson of every recent actual or metaphorical war.
In the Gulf War, we saw unprecedented restrictions on freedom of the press. Subsequent studies showed that the restrictions were not justified by any actual security concerns, but rather were designed largely to shield our government from criticism. Still, efforts are now under way to impose even greater restrictions on media coverage in the current crisis.
In the wake of the Oklahoma City bombing, our government has been exercising pervasive surveillance over millions of innocent communications, through dragnet systems including record-breaking numbers of wiretaps, the Carnivore program for intercepting all communications that pass through certain Internet service providers, and the Echelon global electronic surveillance system. Nonetheless, the Justice Department continues to seek even more expansive surveillance powers, which would sweep in even more innocent communications, with even less judicial oversight.
In the War on Drugs, many rights-curtailing measures fall disproportionately on members of racial minorities who fit stereotypical "profiles." Likewise, in the stepped-up "War on Terrorism," the most endangered rights will probably be those of ethnic and religious minorities who are targeted not because of individualized suspicion, but only because of stereotypes. The most beleaguered are likely to be Arabic or Islamic individuals who are not American citizens.
Nadine Strossen, a professor at New York Law School, is president of the American Civil Liberties Union.
Paul M. Weyrich
The Fourth Amendment to the Constitution is the most at risk. The government will challenge our right to be shielded from unreasonable search and seizure by trying to obtain the keys to our encrypted communications. That way, at will and without a warrant, they will be able to read our e-mail and other documents.
Ah, you say, I have nothing to hide, so why do I care? In this era of political correctness, where the ordinary practices of today become the crimes of tomorrow, it is dangerous to have that view. You may want to trust the government to have access to your innermost views. But what do you suppose will happen when you determine that the government has become too repressive and must be replaced? What do you suppose will happen to you when government officials find out your views before you have had a chance to act upon them?
Paul M. Weyrich is president of the Washington, D.C.-based Free Congress Research and Education Foundation.
The Framers of our Constitution had a fear of standing armies, and of governments backed by them, that one legal scholar calls "almost hysterical." A standing army of professionals, they were sure, would eventually do one of two things: agitate for foreign military adventures to keep itself employed, or turn against its civilian masters to create a military dictatorship. To these two political threats they added a third, moral danger: that citizens used to relying on professionals for the defense of their liberties would come to take their freedom lightly.
The Framers' solution was the militia, an armed body that included all citizens qualified to vote. Whites without property were also eligible for the militia, provided they were not felons, and so were some blacks. The Framers saw this broad-based military institution as a vital protection against tyranny. Politicians and professional military officers might betray the people, but the militia could not because it was the people. And although militiamen might lack the skills and training of full-time, professional soldiers, those defects would be offset by their vastly greater numbers and morale. Politicians might call out the militia to enforce the laws, but always with the risk that if the laws were unjust the militia might decide to sit things out, or even side with the opposition. Think of it as armed jury nullification.
The militia system also had an important moral component. By serving in the militia, a citizen said he was prepared to stand up for his rights, even at the cost of his life. Militia service brought together people from disparate social backgrounds and reminded them of their shared citizenship. It also bred a familiarity with military matters that helped to dispel the mystique of professional soldiers, an otherwise potent political tool of the establishment.
Unfortunately, the militia system foundered on the twin rocks of public apathy and elite dissatisfaction. Of the two, the latter was more decisive. The militia system was designed to make foreign military adventures difficult, and it did. As recently as 1912, when the federal government tried to send state militia units into Mexico, the attorney general opined that such an order was unconstitutional: Militias could be called into federal service only in cases of invasion or insurrection, not in the service of quasi-imperial ambitions abroad. Earlier efforts to invade Canada had encountered similar difficulties. This problem, coupled with a jealousy from professional military men that dated back to the Revolutionary War, led to the replacement of the militia system with the National Guard, a federally controlled force far more amenable to superpower demands.
Although the National Guard is sometimes referred to as the modern-day militia, it is in fact a federal force, subject to the control of the president in almost the same fashion as regular troops. The patina of state control that remains is almost entirely cosmetic. As Yale law professor Akhil Amar writes: "Nowadays it is quite common to speak loosely of the National Guard as 'the state militia,' but 200 years ago any band of paid, semiprofessional part-time volunteers, like today's Guard, would have been called 'a select corps,' or 'select militia'—and viewed in many quarters as little better than a standing army. In 1789, when used without any qualifying adjective, 'the militia' referred to all citizens capable of bearing arms." The statute books continue to reflect this distinction in vestigial form: Title 10, Section 311 of the U.S. Code declares that all military-age males, and some females, are members of "the unorganized militia of the United States."
All of this may come as a surprise to the average American, who, thanks to media stereotyping, probably associates the term militia with tax-protesting, bogus-lien-issuing wackos, and who may even think that the National Guard is the militia that the Constitution talks about. But the classical notion of the militia and the virtues of an armed citizenry have attracted the interest of modern academics. During the last few years, such well-known constitutional scholars as Amar, Robert J. Cottrol of George Washington University, Brannon Denning of Yale Law School, William Van Alstyne of Duke Law School, and Alan Hirsch of Hartwick College have been sympathetically revisiting the old vision of the militia. Some of them have even concluded that it might be a good idea to consider bringing the classical militia back in some form.
To this number can now be added former senator and former Democratic presidential candidate Gary Hart, a founder of the liberal defense-intellectual establishment. Hart's latest book, The Minuteman: Restoring an Army of the People, makes a strong case for bringing back something very much like the classical militia, or perhaps the 20th-century Swiss version, in this post-Cold War era. Hart's argument deserves far more attention than it will probably receive if defense and foreign affairs elites have their way.
Hart opens by noting that our current military posture could be described as "Eisenhower's Nightmare": a military-industrial complex so politically and economically powerful that it has taken on a life of its own. It is Eisenhower's nightmare because the military-industrial complex that Eisenhower warned about was a creature of the Cold War, but its present-day version has survived the end of that struggle almost intact. Despite the oddity of a huge military establishment with no plausible superpower foes, this fact is rarely remarked upon. That is because almost everyone in a position to care, from members of Congress fighting military base closings to flak-jacketed journalists addicted to covering war zones, has an investment in keeping the money flowing. As Hart says, "this great machine grinds grimly, ineluctably onward, searching for villains, whether stone-throwing tribesmen or desert quacks, to justify its existence." (Compare this with what Framing-era writer Joel Barlow said about standing armies: "Thus money is required to levy armies, and armies to levy money; and foreign wars are introduced as the pretended occupation for both.")
Not only does this huge military establishment represent an enormous waste of money, Hart argues, it is also politically disastrous. As he notes, in language echoing the Framers: "A permanent standing military seeks causes for its continued existence and resources to maintain itself. A citizen army–an army of the people–participates in the debate as to why it exists, what threat it must repel, and how and where it may be used. For a democratic republic, there is a world of difference between these two institutions."
Hart takes the reader on an extended tour of modern warfare, laying out his thesis that warfare in the 21st century will be smaller in scale and more chaotic than has been typical in the 20th: more Bosnias, fewer Normandy invasions. His view on this subject is shared by many well-regarded military thinkers, which is not to say that it is necessarily right. War often manages to confound the experts' predictions. But technology seems to be pointing in the direction Hart suggests, with the implements of destruction becoming less expensive and more powerful, and with modern communications depriving huge integrated armies of the organizational advantages they once possessed. Hart's predictions may be wrong, but they are certainly plausible.
Hart's prescription represents a radical change. In short, he proposes a much smaller professional army, backed up by much larger reserve forces and, ultimately, by an entire population with military training. The professional army would provide a rapid-response force; the citizen force would provide the real muscle in any large war. The system might look very much like that used by the Swiss and outlined in Stephen Halbrook's Target Switzerland (see "Neither Nationalist nor Socialist," October). All citizens would undergo military training, followed by a short period of active duty and a lifetime of reserve status. And though Hart doesn't say this, if they follow the Swiss (and Israeli) approach, they will surely keep weapons and other equipment close at hand in their homes, making most "gun control" strategies toothless.
Hart's proposal is sure to provoke controversy, if not excoriations, within the military. Yet what is most interesting about his book is not his military doctrine but his political analysis. He devotes more space to the role of the military in a democratic republic than he does to matters of force structure, and his ideas are very close to those of the Framers. Though Hart does not view standing armies with quite the same degree of fear, he sees them as a genuine threat to freedom.
Like the Framers, Hart sees the danger as twofold: instrumental and moral. Again like the Framers, he regards the moral danger as more significant. Hart quotes military reformer John McAuley Palmer: "Standing armies threaten government by the people, not because they consciously seek to pervert liberty, but because they relieve the people themselves of the duty of self defense. A people accustomed to let a special class defend them must sooner or later become unfit for liberty."
On this point, Hart is clearly right. A citizenry accustomed to letting others protect it is unlikely to shoulder the other necessary responsibilities of freedom. Thus, leaving aside Seven Days in May scenarios of military takeovers, a large professional military poses a threat that should not be ignored. And with the danger of military coups now receiving attention from military professionals—and even sparking articles in mainline military journals—that possibility should not be dismissed out of hand.
Hart's book should prompt much discussion among libertarians. On the one hand, universal training and military service seem too much like conscription, which libertarians traditionally oppose. On the other hand, most libertarians seem to agree with the Framers that an armed citizenry is a potent defense against tyranny, and a system of the sort that Hart proposes would certainly make the nation immune to military coups.
A rogue president could, of course, call everyone up and thus put everyone under military discipline, but that is a power Bill Clinton already enjoys under the Constitution. Article 2, Section 2 makes the president commander-in-chief of the militia when it is called into national service; as noted above, by statute the militia of the United States is every able-bodied male (and some females) between 17 and 45. Under existing law, most people would be disorganized, disarmed, and in shock as they were told what to do by military professionals. Under a system like the one Hart envisions, by contrast, those people would be armed, trained, organized, and not at all in awe of the military profession. Which approach is more likely to preserve liberty?
The Framers thought the answer was obvious. They also didn't think that militia service was the same thing as conscription, any more than jury duty constituted conscription. Thus, I would suggest that whether Hart's suggestion is favored by libertarians should turn on whether his approach, as implemented, resembles the classical militia more than the contemporary National Guard. The key question on this point would seem to be whether the populace is armed all the time, not just in active service. Telling people to show up at the federal building "or else" may be conscription, but telling those same people to show up at the federal building "and bring lots of guns and ammunition" isn't likely to promote tyranny.
It's possible, of course, that things would work out the other way: that rather than harmonizing the military with society, an approach like Hart's would militarize society. This seems quite unlikely to me, but I claim no monopoly on wisdom. At the very least, however, Hart's proposal deserves discussion. When compared with our current system, which he correctly describes as a "nightmare" in terms of its financial cost and threat to liberty, an alternative based on the classical militia may have much to offer.
Hart's book is well-written and thoroughly anchored in both military and political realities. An America that followed his recommendations would probably be less apt to become involved in ill-considered foreign wars, more resistant to tyranny, and effectively impossible to invade. This prospect is reason enough to begin the national debate that Hart calls for. Whether that debate takes place will be an interesting test of whether Eisenhower's nightmare is coming to an end.
Glenn Harlan Reynolds (reynolds@libra.law.utk.edu) is a professor of law at the University of Tennessee.