
Is That Poll That Found College Students Don't Value Free Speech Really 'Junk Science'? Not So Fast

Or how writing about survey methodology can go wrong fast

"It should never have appeared in the press," blared an article from The Guardian's Lois Beckett last week—seeming to refer to the findings of a recent Brookings Institution survey of college students that had been reported on widely, including here at Reason.

According to the lead researcher in that study, John Villasenor of the University of California, Los Angeles (UCLA), "The survey results establish with data what has been clear anecdotally to anyone who has been observing campus dynamics in recent years: Freedom of expression is deeply imperiled on U.S. campuses." Among other things, the poll found that 44 percent of students at four-year universities don't know the First Amendment protects so-called hate speech from censorship.

Beckett's suggestion—amplified by a number of others, including Daniel Drezner at The Washington Post—is that these numbers should not be trusted. The Brookings poll "was not administered to a randomly selected group of college students nationwide, what statisticians call a 'probability sample,'" she wrote. "Instead, it was given to an opt-in online panel of people who identified as current college students."

So what? Should you disregard the findings? Do we consider the survey "debunked"? My answer is no. The meat of Beckett's critique—that the poll didn't draw a random sample of college students—is grounded in real science. Random sampling is the gold-standard methodology in survey research, and this study did not use it. But it would not be accurate to say there's a consensus in the field that no other kinds of polls are ever valid or worth citing, something the Beckett piece strongly implies.

These days, lots of well-respected outfits are doing sophisticated work outside the confines of traditional probability polls. All surveys can be done well or poorly, and some online surveys really are garbage. Nonetheless, it's a stretch to claim that any poll that uses an opt-in panel is necessarily junk, and it's far from clear from the facts presented in the Guardian story that the study in question should be included among the bad ones.

That's the short answer, and perhaps I should leave it there. But for those who might be interested in the nitty-gritty details behind this dispute, I'll go a little further below the fold.

The charges being leveled here are threefold: that the methodology section appended to the study is overly vague; that because the poll was done via an opt-in nonprobability web panel, its findings are not trustworthy; and that Villasenor estimated a "margin of error" for the poll, a no-no for this type of research.

The first complaint is fair enough. The author says that he "requested that UCLA contract with a vendor for the data collection," omitting whom the school contracted with and how the survey was actually conducted. Disclosing that sort of thing is an important part of research transparency, so it's understandable that its absence raised questions. But of course, this doesn't necessarily mean the results are bad, only that there's good reason to seek more clarity. Catherine Rampell did that, and laid out her findings, in this piece for The Washington Post.

The second complaint is the serious one. And here it helps to have a sense of the history of online polling efforts.

For years, most good survey researchers eschewed nonprobability polling on the grounds that drawing a random sample (i.e., one where everyone has an equal chance of being interviewed) is how you know that the opinions of the relatively small number of people you actually hear from are reflective of the opinions of the population as a whole.
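To see why researchers put so much stock in that logic, here is a minimal simulation sketch (in Python, using invented numbers purely for illustration, not anything from the Brookings data) of how a random sample recovers a population's views while an opt-in sample skewed toward one group does not:

    import random

    random.seed(42)

    # Hypothetical population of 1,000,000 students: 60% hold opinion A, 40% don't.
    population = [1] * 600_000 + [0] * 400_000

    # Simple random sample: every member has an equal chance of being interviewed.
    srs = random.sample(population, 1_000)
    print(sum(srs) / len(srs))  # lands within a couple of points of 0.60

    # Opt-in sample: suppose opinion-A holders are three times as likely to volunteer.
    opt_in = [x for x in population if random.random() < (0.003 if x else 0.001)]
    print(sum(opt_in) / len(opt_in))  # drifts toward roughly 0.82, despite a larger n

The second estimate is off not because the sample is too small but because who shows up is correlated with what they think, which is exactly the worry with opt-in panels and exactly what weighting tries to repair.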

"I think most of us who have been part of the survey research community for more than 10 years were trained with exactly that notion—that the center of truth starts with random sampling," says Mark Blumenthal, the Pollster.com founder who now heads election polling at SurveyMonkey, a company that (in addition to the peer-to-peer web surveys you may be familiar with) offers an online nonprobability panel like the one the Brookings survey relied on. "But many of us have evolved in our thinking​."

Probability sampling is indeed the ideal. Unfortunately, as I've written about fairly extensively, getting a truly random group of people to answer your questions is difficult to the point of being almost impossible in an age when so many refuse to pick up their phones and answer their doors. Even the very best polling companies have seen response rates plummet into the single digits, meaning their raw numbers have to be adjusted ("weighted," in pollster parlance) more aggressively to try to approximate a representative sample. And it's becoming more and more expensive over time.
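For those curious what "weighting" looks like mechanically, here is a minimal sketch of its simplest form, post-stratification on a single variable, using hypothetical figures rather than anything from the survey at issue:

    # Hypothetical survey: 1,000 respondents came back 70% women / 30% men,
    # but the target population is 55% women / 45% men.
    sample_share = {"women": 0.70, "men": 0.30}
    population_share = {"women": 0.55, "men": 0.45}

    # Each respondent's weight is the population share divided by the sample share.
    weights = {g: population_share[g] / sample_share[g] for g in sample_share}
    print(weights)  # women ~0.79, men 1.50: men's answers count for more, women's for less

    # Effect on a hypothetical estimate of "agrees with statement X":
    agree_rate = {"women": 0.50, "men": 0.38}
    unweighted = sum(sample_share[g] * agree_rate[g] for g in agree_rate)    # 0.464
    weighted = sum(population_share[g] * agree_rate[g] for g in agree_rate)  # 0.446

Real polls weight on many variables at once (age, education, region, party identification, and so on), which is where the "more aggressively" part, and most of the controversy, comes in.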

Under these conditions, researchers began to look for creative alternative ways to estimate public opinion—silver standards, one might say, to random sampling's gold. One of the main methods that has emerged is the online nonprobability panel–based poll.

"All forms of surveys today—whether they start with a probability sample or not—the completed sample is not truly random, and there has to be some sort of correction," Blumenthal says. "We believe we can offer something of similar quality, at a very different price point [compared to traditional probability sampling], and with more speed."

A number of companies are now working in this sphere. The British firm YouGov, probably the best in class when it comes to this type of online panel research, partners with such outlets as The New York Times, CBS News, and The Economist to conduct surveys. In 2013, the American Association of Public Opinion Research (AAPOR) cautiously endorsed the use of nonprobability polling and began building out a framework to govern it. Sandy Berry of Rand Survey Research Group, the company UCLA brought in to oversee Villasenor's study, tells me the methodology they used "is consistent with" AAPOR best practices.

None of which means there are no legitimate critics of nonprobability survey research still out there. Cliff Zukin, a former AAPOR president quoted in the Guardian story, says nonprobability research should be reserved for internal decision-making purposes. If a number is going to be released publicly, he believes it should employ the best methodology available—and for now, that means probability sampling.

"I might not feel the same way in two years," Zukin says. "There's a lot of research being done, and [nonprobability surveys have] gotten much better than they used to be, but there's still a gap​."

How big of a gap? It's extremely hard to say. In 2016, Pew Research Center released a landmark report on the state of nonprobability survey research. Interestingly, it found that Pew's own probability-based panel (where researchers go to great lengths and enormous expense to try to reach a truly random sample of people) did not outperform all of the nonprobability polls on all of the metrics. "While the differences between probability and nonprobability samples may be clear conceptually," the authors concluded, "the practical reality is more complicated."

Among other things, that report confirmed what researchers have long understood: that web-only research is, for obvious reasons, weakest at estimating the opinions of groups that aren't as active online, such as the elderly, Latinos, and very low-income populations. On the flip side, tech-savvy college kids (the population the Brookings study was looking at) are arguably particularly well-suited to being reached this way.

And that brings us to the third complaint.

It's accurate to point out, as Beckett does, that it doesn't make sense to report a margin of sampling error for a survey that wasn't drawn from a random sample. I emailed AAPOR Vice President David Dutwin, who confirmed that "AAPOR does not advocate using margin of error in non-probability samples." He hastened to add, however, that "some form of error estimation has to be used to assess statistical inference [and] at this time there is no universally accepted measure of error that survey researchers use to apply to non-probability samples."
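For reference, the figure being disputed comes from the textbook sampling-error formula, which is defined only under random selection; a quick sketch (using a hypothetical sample size, not the study's actual one) shows both the arithmetic and why it can't be read as the real uncertainty of an opt-in panel:

    import math

    # Classic 95% margin of error for a simple random sample:
    #   MOE = z * sqrt(p * (1 - p) / n), with z ~= 1.96 and p = 0.5 as the worst case.
    n, p, z = 1_500, 0.5, 1.96            # n is hypothetical, for illustration only
    moe = z * math.sqrt(p * (1 - p) / n)
    print(f"+/- {moe * 100:.1f} points")  # roughly +/- 2.5 percentage points

    # This number quantifies random *sampling* error only. When respondents opt in,
    # selection error isn't captured at all, so the figure is a best-case floor,
    # not a bound on the poll's total error.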

So Villasenor departed from standard practice by using the term "margin of error," which is customarily reserved for probability surveys, to describe the uncertainty in a nonprobability survey. Rand SRG's Berry concedes this was a mistake on his part*. But is that enough to render the poll "junk science"?

Personally, I think not.

*UPDATE: Villasenor emailed me wishing to have noted that his original writeup said the margin of error can be estimated only "to the extent" the weighted demographics of his interviews are "probabilistically representative." He writes: "Stating the margin of error that would apply in the case of a theoretically perfect sample, accompanied by the appropriate caveat, which I gave, provides more information than staying silent on the issue. It provides information on the limiting case."

Photo Credit: Kit / Grads Absorb the News


  • Sevo||

    Tip o' the hat to Lily Bulero, I think:
    https://www.youtube.com/watch?v=BtWrljX9HRA&t=5s

  • AlmightyJB||

    Excellent

  • Sevo||

    More importantly, he's talking about NOW.

  • Procyon Rotor||

    Hihn, you've got to retire this sock. Everyone knows it's you, and I'm seriously worried about what it's doing to your already fragile mental state. Take better care of yourself, old man.

  • Palin's Buttplug||

    They're right.

    It is amazing how the conservatives on this board accuse the libertarians/liberals of being "socks".

    Just adjust your MAGA! hat so you can see clearly.

  • Finrod||

    You're a fucking idiot.

  • MarkLastname||

    It's a sock, moron. He even uses the same bolded narration of his own thoughts.

  • Finrod||

    Whine some more, idiot.

  • Finrod||

    Yes, because whining like an infant makes you soooo mature.

  • Bra Ket||

    The man in the video literally named names regarding conservative laws of the past AND left-wing campus rules of the present. He doesn't need you spin doctoring for him. Or restating the obvious references to the conservative side. Nor will you prove any point with anyone by pretending he never mentioned the present-day left-wing examples.

  • DarrenM||

    who STILL demand a theocracy?

    You've lost it (if you ever had it to begin with).

  • Finrod||

    Please, whine some more about things that happened 700 years ago. It makes you look oh so relevant in the 21st Century.

  • BestUsedCarSales||

    I think it was Diane Reynolds posting that actually.

  • Sevo||

    "I think it was Diane Reynolds posting that actually."
    (Paul), I apologize, and extend my thanks again.

  • Diane Reynolds (Paul.)||

    don't trust anything from Diane Reynolds.

  • Lily Bulero||

    Not me, but it looks good.

  • Fist of Etiquette||

    Just because they're screaming truth to power with their fingers in their ears, doesn't mean they're against speech. They deeply value correct speech.

  • MG58||

    You're missing a major point. The extent to which non-probability sampling methods are sometimes good depends on fairly sophisticated weighting methods and sampling procedures. That's what SurveyMonkey and YouGov do for their political polling. It requires detailed knowledge of the population of interest, which will be very tough for the population of college students—what is the proportion of community college attendees in the Southwest who are 18-year-old Democratic women who are moderately interested in politics? That's the kind of information these pollsters use for weighting, and it's just not available for a niche population like that.

    At any rate, the only discussion of weighting by Villasenor is to fix the vast oversample of women. He doesn't even know what vendor was used, by all indications. He had to use RAND as middleman because they have an IRB. Chances are the sample was chosen on a very ad hoc basis, sending invites to all self-identified college students who signed up for some online survey firm's panel and then using all who accepted the invite until they reached the number that was paid for. The survey could just happen to produce correct estimates, but the true margin of error is probably about 20 points in both directions because it's almost certain the members of the panel are poor representatives of the larger college student population.

  • Diane Reynolds (Paul.)||

    I agree and I don't even feel a need to defend the study. I'm hoping someone does a more rigorous study in the future, and as I said below, I hope that freedom of speech hasn't lost its cache.

  • DenverJ||

    Sometimes, you have to clear the cache.

  • Entropy Drehmaschine Void||

    You have quite the cachet, DenverJ.

    (slow clap)

  • Alan Vanneman||

    I agree as well.

    "Stating the margin of error that would apply in the case of a theoretically perfect sample, accompanied by the appropriate caveat, which I gave, provides more information than staying silent on the issue. It provides information on the limiting case."

    Excuse me, but why? Stating the lowest possible margin of error, when all we know about the real margin of error (which doesn't even meet the definition of "margin of error") is that it's larger, possibly "infinite"--in other words, that the study is useless--is actively misleading. What was it that Wittgenstein dude used to say? Oh, yeah: "Whereof one cannot speak, thereof one must be silent."

  • BestUsedCarSales||

    Any chance you could link to a study discussing the math of nonprobabilistic surveying?

  • GILMORE™||

    The Brookings poll "was not administered to a randomly selected group of college students nationwide, what statisticians call a 'probability sample,'" she wrote. "Instead, it was given to an opt-in online panel of people who identified as current college students."

    Gee, where have i seen that method used before....hmmm

    amazing how journalists suddenly re-discover basic understanding of survey methods when the conclusions are less than favorable for their narratives.

  • GILMORE™||

    as a side note:

    when i was in consulting, part of my job was as a middleman for enormous amounts of primary survey research of various types. I used a dozen vendors who did everything from phone bank/panel/random face-to-face/door-to-door/mail-ins, etc. the works. and every method had its associated degrees of reliability/implied error, qualitative utility, etc.

    The value i provided clients was mostly in hearing what they wanted to accomplish/questions they wanted to answer, and then suggesting the means to get useful data towards that end. Often, involving at least '2 passes' using complementary methods... one to get a rough range of general population opinion framing, the other to dial down to specific groups that mattered the most.

    to that latter point - its sort of the thing about polling/survey work in general that is overlooked. *most people's opinions don't really matter*;

    or - considered another way - many people don't really have an opinion in the first place. They don't even think about it until they're asked, and their answer isn't really revealing anything true, its just how they feel at that moment.

    by contrast, a small number of people care a lot. and their opinions often shape those of 'non-carers'.

    if your survey can't distinguish the strength with which any given opinion is held, its a shitty survey.

    what irritates me about journalists is that they don't realize all polling is pretty shit; even the 'good stuff'. (cont)

  • GILMORE™||

    (cont') .... what i mean by that... even when the poll/survey is random/properly weighted...

    ...they don't really think too hard about what exactly is trying to be assessed, and whether just "asking Yea/Nay" questions is the right way to assess that. Sure, 'its a method', but just because a method is methodical and scientifically applied doesn't make it the right method for the question.

    basically, all you will learn in any question asking exercise is some version of stated-preference. unless you also compare that to some other data that approaches the subject from a different angle - looking at behavior, for instance, and/or opinions about related, complementary ideas... its just 'some data' which may or may not have any particular significance.

    in the question of whether students value free speech... a survey is all well and good, but i'd be far more interested in seeing how differently composed groups (large, medium, small) behaved when asked to debate a topic -

    - would the range of offered opinions *narrow* the larger the group became? would people self-censor ideas when exposed to larger peer-pressure?

    i think plenty of psych-study data already shows this to be the case; the question would be, does the current population hew more-strongly to consensus than previous ones? are they faster/more likely to self-censor?

    tl;dr: the rigor of survey methodology still doesn't make surveying much more significant; it just makes them slightly less-shitty

  • AlmightyJB||

    "their answer isn't really revealing anything true, its just how they feel at that moment."

    That's about as deep as most people are ever willing to go on most all topics.

  • GILMORE™||

    aka 'stated preference'

    you can usually validate it/qualify it with other data.

    Just because people are shallow, and their response indicate "how they see themselves", and not "who they really are/or what they really believe"...

    ....doesn't mean the 'how they see themselves'-part is completely useless. Its just a data point that can be used to compare to other similarly-gathered data.

  • BestUsedCarSales||

    I understand your argument and it makes sense. Do you have any papers that go in depth into the math of these analyses? Or give an overview of these types of statistical methods?

  • GILMORE™||

    Not really. I had a shelf full of books on designing questionnaires, and a bunch of templates for SPSS. but i can't really think of any single paper/book that provides a soup-to-nuts overview on "how to match research objectives to research-method"

    my argument isn't really about statistical analysis in any case. its more about how surveys - as a research method - aren't, by themselves, all that interesting or significant unless combined with some other forms of validating-data.

    a lot of the work i did in that firm was related to consumer new product development, or new category market segmentation, stuff like that, and the most common analytical techniques used were multivariate/conjoint analysis

    in my case, the questions people were usually trying to answer was "what do [X group of] people want from [y product category/segment]"; and the work would go from identifying/quantifying market opportunity, to helping design the product-features to address it.

    but sometimes it was more straight opinion-gathering. more like, corporations "periodically sniffing at their consumers" and trying to understand how they think about all kinds of stuff. I personally hated that stuff (i preferred working for the business strategy people, not the marketing/advertising folks) and its why I got out of the market research business

  • GILMORE™||

    why do you assume that 44% is entirely left-wing?

  • GILMORE™||

    the poll found that 44 percent of students at four-year universities don't know the First Amendment protects so-called hate speech from censorship.


    The authoritarian right may be even worse

    that statement seems to assume the 44% is representative of something not-inclusive of "the authoritarian right"

  • GILMORE™||

    it is Brookings.

    you think a liberal think tank is too hard on liberals? or that they are incapable of running a trustworthy survey?

    I'm not sure how any of this helps substantiate your suspicion that an "authoritarian right" has less knowledge of or support for the 1st Amendment.

    if you want better and more-consistent research on that topic, there is some here. There might be better cherries for you to pick there, as they found:

    In 2017, less than 16 percent of self-described Liberals agreed that the First Amendment goes too far in the freedoms it guarantees (15.8 percent) compared to about one-quarter of both Moderates (26.3 percent) and Conservatives (24.6 percent)....

    if there's any take-away, its that surveys can get very different answers depending on how the questions are phrased. (there is nothing at all about "hate speech" in the above)

  • GILMORE™||

    well, i apologize for trying to make sense of your random, unsubstantiated comment. cheers.

  • GILMORE™||

    your snippet says

    I was actually directing you at their entire body of research (of which that report is only 1 annual example), not the excerpted portion, which was simply trying to help show that results vary depending on how questions are framed.

    ta.

  • GILMORE™||

    Michael, I've always appreciated your input, but i'm afraid you've misunderstood me entirely here. I was simply trying to help you find statistical evidence for your claim about the 'Authoritarian Right' and their attitude toward the 1st amendment.

    the newseum institute is one of the only people who has done consistent research about popular opinions re: the first amdnt, as their preamble in the linked research pointed out:

    The Newseum Institute has supported an annual national survey of American attitudes about the First Amendment since 1997. The State of the First Amendment: 2017 is the 20th survey in this series. This year's annual survey repeats some of the questions that have been asked since 1997 and includes some new questions as well. This report summarizes the findings from the 2017 survey and, where appropriate, depicts how attitudes have changed over time.

    If there is anywhere you might find some evidence for your assertion, it is surely in their body of research, which can be found here

    Hope you are well, and please give my regards to Mrs Hihn, or the Hihn children, or whomever still speaks to you. cheers.

  • Sevo||

    GILMORE™|9.30.17 @ 10:13PM|#
    "Michael, I've always appreciated your input,"
    You'd be pretty much alone there, but have at it.

  • GILMORE™||

    oh, sevo. your facetiousness-meter needs re-calibrating.

  • Sevo||

    OK, and sarc would have done it too.

  • GILMORE™||

    if you look at the data @ , they did split out Democrat students from Republican students

    the #s showed that it was actually democrats who split more-heavily on the side of 'not believing hate speech was included'

    "hate speech is protected by 1A" - Dems, 39% yes; Reps 44% yes;
    "hate speech is not protected by the 1A" - Dems 41% yes; Reps 39% yes;

    i don't see how you'd gather that any 'authoritarian right' would be hidden somewhere in there, unreflected in the data..

  • Sevo||

    Mike, your mental faculties are no longer up to maintaining the fiction.
    Fuck off, Mike. Just go. You no longer have anything to offer and you are embarrassing yourself, assuming you are still capable of embarrassment.
    Go or STFU.

  • Sevo||

    "Assuming Michael is not succumbing to some sort of dementia (which would make me feel really bad), this is great stuff."
    I'm afraid this assumption is no longer valid; Mike is beginning to be both a caricature of himself, and starting new socks and then forgetting he did so in the span of half an hour.
    I'm an old fart; I hope when I've lost the ability to keep track of my claims, I quit making those claims. Mike seems beyond that concern; he no longer even knows.
    Pathetic.

  • Finrod||

    Whine some more, fuckhead.

  • Sevo||

    "The authoritarian right may be even worse ... if their Kaepernick hysteria in Reason's commentariat is representative."

    Pretty sure you can find a brush much finer than that even in the smallest hardware store.
    If you look for it, that is.

  • Sevo||

    David Nolan|9.30.17 @ 5:45PM|#
    "Unless it's crackers to slip a rozzer, the dropsy in snide?"
    Do you speak English? I'd prefer that.

    "This is Reason, right? The Reason commentariat is hardly a small brush. At Reason."
    It's a bunch of small brushes, which is the reason I suggest you try a finer one while characterizing it.

  • Sevo||

    "Then you don't understand your own analogy. And, apparently, cannot disagree honestly, so I shall ignore you. But feel free to pick your fights elsewhere."

    Preferring not to deal with twits, I'll return the favor.

  • DenverJ||

    ^nerd fight

  • Sevo||

    "^nerd fight"
    Nope, no fight at all. Nolan died, uh, X years ago (lemme check: 7, check Wiki).
    This is some self-satisfied twit adopting the name of a libertarian, and you might well ask why. Perhaps in the hopes that posted stupidity might be accepted with more sympathy than it might be if it were posted by, oh, MNG or some other twit who has decided his/her rep has already colored the acceptance of the posts.
    The lefty imbecile commie kid has tried this several times, as has turd. The lefty imbecile Tony has, at least, been honest enough to post his pathetic crap under the same handle.
    I'm not going to 'fight' with some dishonest twit who, in an attempt to market his/her crap, adopted a handle taken from an honest libertarian.

  • Sevo||

    DenverJ|9.30.17 @ 9:27PM|#
    "^nerd fight"

    See above and below; Hihnfection under another name.

  • Finrod||

    Says the fuckwit spewing bullshit all over this thread.

  • Sidd Finch v2.01||

    probability-based panel ... did not outperform all of the nonprobability polls on all of the metrics

    [thinking emojis]

  • BestUsedCarSales||

    So it did as well or better on every one?

  • Sidd Finch v2.01||

    No, it did worse on some, which is what you'd expect. That's why error bars exist on even perfect samples.

  • Telcontar the Wanderer||

    Sooo... Could we, like, have somebody just do the fancy-pants right-and-proper kind of survey, then?

    Inquiring minds want to know what un-inquiring minds don't want to know.

  • Lily Bulero||

    Another Millennial survey?

  • Lily Bulero||

  • Lily Bulero||

    Oops, try this.

  • Kefka||

    The poll is garbage. Online surveys are terrible at telling us things about the world. This is chapter 1 or week 1 statistics.

  • GILMORE™||

    Online surveys are terrible at telling us things about the world.

    nonsense. they tell you all sorts of stuff about the types of people who take online surveys.

  • DarrenM||

    One problem with online surveys is that people that realize online surveys are garbage probably won't participate anyway.

  • Diane Reynolds (Paul.)||

    We should be on our knees praying that it's junk science, because if its not...

  • Lily Bulero||

    "on our knees"? Hate speech! Theocracy!

  • DenverJ||

    No, he's disrespecting the national anthem, which hundreds of millions of Americans died to protect.

  • AlmightyJB||

    To me, the larger sin is creating a distraction before a game that they are being paid millions of dollars to win.

  • Lily Bulero||

    If their employers allow it, then it's protected free speech vis a vis the govt - but not vis a vis the viewer, who can (and probably should) switch the channel, or go out and get some fresh air, or maybe watch paint dry - anything more interesting.

  • MarkLastname||

    Point missed. Lily appears to be pointing out that viewers are free to not watch the distraction. The players may be free to protest - but everyone else is free to change the channel in response.

  • PlaystoomuchHALO||

    You can easily factor in error rates for such a poll. The objection really isn't valid.

  • AlmightyJB||

    Yeah everyone should be virulently defending Kaepernick who never protested as a starter when HIS games were on the line, only as a benchwarmer when being an attention whore didn't affect his record as a starting QB. And who wore a Castro shirt to a presser (freakin' prick). Oh and also this

    Such a freakin' hero.

  • AlmightyJB||

    Ok we'll try the link again

  • Lily Bulero||

    Wait, you're telling me that there are reasons to question his commitment to Americanism? How were we supposed to know that, he gave no indication of that before!

  • MarkLastname||

    So, what in the link was untrue then?

  • Finrod||

    Go die in a fire, fuckwit.

  • Sevo||

    The reason I busted the twit with the Nolan handle has to do with his/her claim that the reason commentariat was united in opposing Kaep's bullshit.
    Nope. I for one give him a pass on that. He's welcome to do whatever he pleases. But then he is also welcome to accept the consequences of doing so if his employer (not me) objects.
    BUT, I'm going to posit here:
    If Tom Brady kneeled during the anthem, he'd still have a job.
    Kaepernick would also have a job if he weren't such a pathetic excuse for a QB.

  • Sevo||

    Sevo|10.1.17 @ 12:19AM|#
    "The reason I busted the twit with the Nolan handle..."
    Oh, FFS! I hadn't checked back up-thread and there by 10PM, Mike the demented asshole has already claimed ownership:

    "David Nolan|9.30.17 @ 10:10PM|#
    Bully commits aggression ... on a libertarian website. Who explains that to him (or her)?."

    Mike, fuck off. You are not helping your brand here, proving your mental faculties are not what they used to be. I'm an atheist, but: ''You have sat too long here for any good you have been doing. In the name of God, go!''
    Fuck off, Mike.

  • Sevo||

    Mike, fuck off. No one was fooled by your pathetic sock; no one.
    Quit making a public fool of yourself.
    Just fuck off.

  • Sevo||

    Mike, do you even know that link goes right back here? IOWs, it does nothing to support your claim?
    Do you know that Mike? Or does it come as a surprise?
    Do you have a wife? Does she know you are making an ass of yourself in public on the web?

  • Finrod||

    Eat shit and die, bloody idiot.

  • Lily Bulero||

    Speaking of NAP, I think I'll take one.

  • Lily Bulero||

  • BestUsedCarSales||

    I'm also awake late at night.

  • Telcontar the Wanderer||

    I'm not.

  • Finrod||

    Whiny fuckwit idiot just keeps on whining.

  • Liberty =><= Equality||

    It's even worse than that. Kap would have had a job if he hadn't opted out of his contract!

  • MarkLastname||

    So, if someone donates money to help poor whites in addition to giving money to neo Nazi groups, that makes him a good guy?

    Kaepernick gave money to a group that supports domestic terrorism. He supports socialist dictators too apparently. And yet you, the one true libertarian, just love the guy.

  • Finrod||

    Go fuck yourself.

  • DarrenM||

    When you are a marginal player, you do not do or say anything to make yourself even less marketable.

  • Finrod||

    Why are you continually embarrassing yourself here?

  • Longtobefree||

    It is junk science because it puts the left in a bad light.
    Everyone knows that all colleges fully respect free speech. First because they say so, and second because they are so careful in preventing people from speaking.
    There is no need for surveys, just ask Hillary or Nancy, and know that they have our best interests at heart, and would never let us go wrong. (well, maybe that one time they let us elect Trump; but that's all)

  • Lily Bulero||

    PS, I forgot, what's the difference between "taking a knee" and plain old kneeling?

  • Liberty =><= Equality||

    The person posting on this thread as "David Nolan" is not actually David Nolan, the LP founder, who died in 2010.

    It is incredible that Reason is allowing him to commit identity theft here. Though he is toeing the party line so maybe they're OK with it.

  • Finrod||

    Go die already, lying shithead.

  • Jerryskids||

    I see OJ has been let out - I'll bet a fat dollar the first question a Fox reporter asks him is what he thinks about the NFL players taking a knee during the national anthem. And if OJ stabs the fucker in the face I would call it justifiable homicide.

  • Longtobefree||

    Right. Clearly OJ would be exercising his free speech rights.
    Oh, wait. He is a felon, he has no rights. Long live free America.

  • David Nolan Hihn||

    Thanks for the entertainment.

  • Finrod||

    Go die, fucko.

  • Lester224||

    Instead of debating the survey which wasn't done correctly, why don't those arguing about it just create another survey, do it the right way and see what the results are?

  • Uncle Jay||

    RE: Is That Poll That Found College Students Don't Value Free Speech Really 'Junk Science'? Not So Fast

    It's not a matter of junk science.
    It's about throwing billions of dollars down a rat hole called public education so students graduating from high school don't know what free speech truly means.
    But at least they know how to put a condom on a cucumber.

  • Mike d||

    I wonder if young people "these days" are more opposed to the first amendment than previous generations, or are they just more honest about it.

    In other words, older people would give platitudes about how awesome and free our country is, and how proud they are that people can say whatever they want, but then these same people would turn around and argue to ban free speech; such as flag burning, banning gay (or straight) internet porn, among other things they find offensive.

    Meanwhile, young people are more likely to just straight up say "you know what, screw the first amendment. If you say racist or non-PC things, I want to shut you up." You know what. At least they're honest about it. I think I have more respect for someone like that, someone who at least openly comes out as the enemy of freedom.

  • Mike d||

    Also, on a side note: I have another theory on whats behind all the speech codes and what not. The standard libertarian view of free speech is that a) you have the legal right to say whatever you want, but b) not free from the consequences of what you say. Ie: a private land owner (such as a mall that doesn't like the shirt you're wearing) has the right to ban you. You can rant whatever nonsense on the public street, standing on your soapbox, but you can't enter private property and do that.

    I think to a large extent, a lot of college students (including at public universities) just don't see their campus as an extension of the street. They see it more as a private space in which they are a consumer, no different than a mall. So even if they share the idea behind the 1st amendment; if a college expels a student for saying something offensive, somehow they don't see the college as an extension of the government. I think maybe that's where the disconnect is.
