Science

The Case of the AI-Generated Giant Rat Penis

How did an obviously fabricated article end up in a peer-reviewed journal?

Ronald Bailey | From the June 2024 issue

(Photo: An AI-generated illustration of a rat with a giant penis; Frontiers in Cell and Developmental Biology. Illustration: Joanna Andreasson/DALL-E4)

An illustration featuring a rat with a cross section of a giant penis set off a firestorm of criticism about the use of generative artificial intelligence based on large language models (LLMs) in scientific publishing. The bizarre illustration was embellished with nonsense labels, including one fortuitously designated as "dck." The article on rat testes stem cells had undergone peer review and editorial vetting before being published in February by Frontiers in Cell and Developmental Biology.

"Mayday," blared longtime AI researcher and critic Gary Marcus on X (formerly known as Twitter). Vexed by AI's ability to abet the "exponential enshittification of science," he added, "the sudden pollution of science with LLM-generated content, known to yield plausible-sounding but sometimes difficult to detect errors ('hallucinations') is serious, and its impact will be lasting."

A February 2024 article in Nature asked, "Is ChatGPT making scientists hyper-productive?" Well, maybe, but Imperial College London computer scientist and academic integrity expert Thomas Lancaster cautioned that some researchers laboring under "publish or perish" will surreptitiously use AI tools to churn out low-value research.

A 2023 Nature survey of 1,600 scientists found that almost 30 percent had used generative AI tools to write manuscripts. A majority cited advantages, including faster ways to process data and run computations and, more generally, savings in scientists' time and money. More than 30 percent thought AI will help generate new hypotheses and make new discoveries. On the other hand, a majority worried that AI tools will lead to greater reliance on pattern recognition without causal understanding, entrench bias in data, make fraud easier, and lead to irreproducible research.

A September 2023 editorial in Nature warned, "The coming deluge of AI-powered information must not be allowed to fuel the flood of untrustworthy science." The editorial added, "If we lose trust in primary scientific literature, we have lost the basis of humanity's corpus of common shared knowledge."

Nevertheless, I suspect AI-generated articles are proliferating. Some can be easily identified through their sloppy and flagrant unacknowledged use of LLMs. A recent article on liver surgery contained the telltale phrase: "I'm very sorry, but I don't have access to real-time information or patient-specific data, as I am an AI language model." Another, on lithium battery technology, opens with the standard helpful AI locution: "Certainly, here is a possible introduction for your topic." And one more, on European blended-fuel policies, includes "as of my knowledge cutoff in 2021." More canny users will scrub such AI traces before submitting their manuscripts.
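
To see how crude that kind of screening can be, here is a minimal sketch in Python that simply flags known chatbot boilerplate in a manuscript; the phrase list and function are illustrative assumptions, not any journal's actual screening tool.

# Illustrative scan for telltale AI boilerplate in manuscript text.
# The phrase list is a small assumed sample, not an official checklist.
import re

TELLTALE_PHRASES = [
    r"as an AI language model",
    r"I don't have access to real-time information",
    r"Certainly, here is a possible introduction",
    r"as of my knowledge cutoff in \d{4}",
]

def flag_ai_boilerplate(text):
    """Return the telltale patterns found in the text, ignoring case."""
    return [p for p in TELLTALE_PHRASES if re.search(p, text, re.IGNORECASE)]

print(flag_ai_boilerplate("Certainly, here is a possible introduction for your topic."))
# -> ['Certainly, here is a possible introduction']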

Then there are the "tortured phrases" that strongly suggest a paper has been substantially written using LLMs. A recent conference paper on statistical methods for detecting hate speech on social media produced several, including "Head Component Analysis" rather than "Principal Component Analysis," "gullible Bayes" instead of "naive Bayes," and "irregular backwoods" in place of "random forest."
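
The same mechanical check works for tortured phrases, by mapping known substitutions back to the standard terms they displace; the mapping below is a small assumed sample, for illustration only.

# Illustrative lookup of "tortured phrases" and the standard terms they likely replace.
TORTURED_PHRASES = {
    "head component analysis": "principal component analysis",
    "gullible bayes": "naive Bayes",
    "irregular backwoods": "random forest",
}

def find_tortured_phrases(text):
    """Map each tortured phrase found in the text to its likely original term."""
    lowered = text.lower()
    return {phrase: term for phrase, term in TORTURED_PHRASES.items() if phrase in lowered}

print(find_tortured_phrases("We applied Head Component Analysis and an irregular backwoods classifier."))
# -> {'head component analysis': 'principal component analysis', 'irregular backwoods': 'random forest'}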

Researchers and scientific publishers fully recognize they must accommodate the generative AI tools that are rapidly being integrated into scientific research and academic writing. A recent article in The BMJ reported that 87 of the top 100 scientific journals now provide authors with guidelines for the use of generative AI. For example, Nature and Science require that authors explicitly acknowledge and explain the use of generative AI in their research and articles. Both forbid peer reviewers from using AI to evaluate manuscripts. In addition, writers can't cite AI as an author, and both journals generally do not permit images generated by AI—so no rat penis illustrations.

Meanwhile, owing to concerns raised about its AI-generated illustrations, the rat penis article has been retracted on the grounds that the "article does not meet the standards of editorial and scientific rigor for Frontiers in Cell and Developmental Biology."


Ronald Bailey is science correspondent at Reason.

Topics: Science, Artificial Intelligence, Academia, Research, Animals, Biology, Peer Review

Comments (87)


  1. Jerry B.   1 year ago

    The world needs more giant rat penises.

    1. SQRLSY One   1 year ago

      Donald Trump, Ron DeSatan, and Sidney Powell aren't enough?

      1. Jerry B.   1 year ago

        I was thinking more Bidens, father and son.

        1. Mother's Lament   1 year ago

          Shillsy is a Bidenista, prepare for a 200 line shitpost.

          1. Á àß äẞç ãþÇđ âÞ¢Đæ ǎB€Ðëf ảhf   1 year ago

            Looks like he's used up his quota of ALL CAPS today.

            1. Earth-based Human Skeptic   1 year ago

              SQRLSY will run out of CAPS when the NFL runs out of points.

    2. Vernon Depner   1 year ago

      Giant Rat Penis would be a good band name.

    3. TheReEncogitationer   1 year ago

      That little vermin doesn't look too grateful. I'd say that's a waste of processing power.

  2. Spinach Chin   1 year ago

    Seems here the problem is peer review, not AI

    1. Davy C   1 year ago

      There's problems all the way down. The peer reviewers, the publishers, the people who submitted this for publication in the first place.

      AI is a problem so far as it makes this sort of thing easy to do. You wouldn't accidentally draw a rat like that if you had to draw it by hand.

      1. Ersatz   1 year ago

        anyone who peer reviewed this should never be asked to peer review again

        1. gah87   1 year ago

          And forced to take Drawing Animals 101 at the Art Institutes.

    2. Á àß äẞç ãþÇđ âÞ¢Đæ ǎB€Ðëf ảhf   1 year ago

      It began with subsidies, of course.

      Vexed by AI's ability to abet the "exponential enshittification of science,"

      The shittification of science began when governments subsidized college research; it's one thing to pay for, say, military research, but another thing entirely to subsidize colleges in general. You get more of what you subsidize, and it's always more diluted and watered down from its natural state. Thus more marginal students who gravitate to marginal professors in marginal fields, culminating in the woke crap of today, which will only get worse tomorrow.

      It's not just student loans. Before that, it was cheap state college tuition (like the $100/quarter I paid). But it really began with research grants, the NIH, all that federal bureaucracy dishing out more and more money. The bureaucrats want to dish out as much as possible. The marginal researchers want as much as possible, because no one else is stupid enough to fund studies of Peruvian hookers.

      1. ducksalad   1 year ago

        Speaking as a lazy college professor... there’s the supply issue you mention, but there’s also an artificially created demand issue that could easily be shut down.

        Colleges make continuously asking for federal research money and publishing papers a condition of continued employment. Even the little known backwater I work at has a standard that assistant professors should spend at least 40% of their time asking and publishing, regardless of whether they have any real need for the money, and regardless of whether they have any new stuff to write about.

        And then the MBAs ask them to report productivity on a stupid spreadsheet. The Excel spreadsheet awards 10 points for the phony giant rat penis paper; the paper discovering the transistor or DNA would’ve gotten the same 10 points. It awards the same points for asking for $500K whether it’s to study Peruvian hookers or to finalize a new cancer treatment.

        Incidentally, we fire far more people for failing to make their research points than for bad teaching.

        Since a lot of this “work” has net negative value, we could benefit just by telling professors not to do it and to take the day off. Better yet, you might ask them to pay more attention to their teaching.

        Easyish fix: state legislatures could designate maybe one or two colleges in the state to do research. At all the others, just flat prohibit using research to evaluate faculty, insist that the sole evaluation be based on how many students they produce that pass objective measures like licensing, exams, or getting jobs.

        1. Á àß äẞç ãþÇđ âÞ¢Đæ ǎB€Ðëf ảhf   1 year ago

          The problem with that easy fix is that governments have a negative incentive to reduce budgets. Bureaucrats everywhere can only measure their success by their budget and how many employees they supervise. Private industry bureaucrats are constrained by market competition and the reality of bankruptcy. Government bureaucrats are constrained only by how much money they can wheedle out of politicians, and that's a really low bar.

          1. ducksalad   1 year ago

            Generally true. But considering how despised professors are at the current moment, it shouldn’t be impossible to get some reforms passed. Don't even have to reduce the bureaucrats' budgets or the legislators' pork, just redirect it to something marginally less wasteful.

            1. Á àß äẞç ãþÇđ âÞ¢Đæ ǎB€Ðëf ảhf   1 year ago

              I don't think many here despise professors in general, just the woke ones.

              1. Stuck in California   1 year ago

                just the woke ones.

                Which describes professors in general.

                Seriously, it seems to me like they've purged anything not on the ideological reservation, not unlike how even liberals got ousted from major newspapers for not being woke and progressive.

    3. Ajsloss   1 year ago

      Skynet has no peers.

    4. Minadin   1 year ago

      Yep, the 'peer reviewers' are morons.

      1. JoeB   1 year ago

        The same ones who have given the thumbs-up to crappy anthropogenic climate-change research and approvals for "gender-affirming" treatments for kids. The state of the ivory tower is similar to the Augean stables. Where is our Heracles?

        1. gah87   1 year ago

          HR-mandated toxic masculinity seminar?

  3. Longtobefree   1 year ago

    "on X (formerly known as Twitter)"

    This is deadnaming, and is considered a mortal sin.

    1. American Mongrel   1 year ago

      But X is ambiguous and X.com and other alternatives don't properly convey our petty hostility.

    2. JesseAz   1 year ago

      I think at this point it's used to disparage post-Musk-takeover Twitter by people remembering how great censored Twitter was.

      1. Minadin   1 year ago

        My opinion is that it's a global troll by Musk to see how many times he can get journolists to repeat the phrase 'formerly Twitter'.

        So far, he's winning.

  4. Longtobefree   1 year ago

    "If we lose trust in primary scientific literature, we have lost the basis of humanity's corpus of common shared knowledge."

    That ship sailed with the COVID response.
    We don't trust your research, we don't trust your papers, we don't trust your conclusions, we don't trust your "recommendations".

    1. Ajsloss   1 year ago

      Facts changed! They were doing the best they could with the information they had wanted! Any of you would have done the same!

    2. Social Justice is neither   1 year ago

      You held on until COVID? That shit was dead when the climate cultists gained the slightest bit of power.

      1. Stuck in California   1 year ago

        The moment they got the media to call any skeptic a "Denier", equating them with holocaust deniers...

        Not that there weren't plenty of idiocy moments before that, but science saying it is "settled" isn't science.

  5. Nobartium   1 year ago

    If we lose trust in primary scientific literature, we have lost the basis of humanity's corpus of common shared knowledge

    Religion still exists, so this is bullshit.

    Even if it didn't, then good. The death of ScIeNcE! is overdue.

    1. Earth-based Human Skeptic   1 year ago

      Are you anxious to kill just Fauci science, or also Newton/Darwin science?

      1. ducksalad   1 year ago

        We could do some kind of compromise, e.g. kill anything done by a scientist born after 12/31/1899.

    2. MrMxyzptlk   1 year ago

      Religion is shared superstition. Not knowledge.

      1. gah87   1 year ago

        Except indigenous knowledge: to question the veracity of the nexus of religion, shared superstition and intersectional social justice is blasphemy.

  6. Don't look at me!   1 year ago

    MORE TESTING NEEDED?

  7. n00bdragon   1 year ago

    I find it hilarious that the "tortured phrases" problem arises specifically from anti-plagiarism software. This kind of stuff was rolling out in force during the last few years of high school for me and all through college, and I'm a bit dismayed that on the way to take a shot at AI as the villain of the day Mr. Bailey missed an equal villain in the anti-plagiarism software it's being used to foil. Even when I was in school two decades ago it was quite difficult to submit an unedited paper past the anti-plagiarism software of the time. Even without any kind of actual plagiarism, the odds that someone, somewhere, at some point, used a turn of phrase similar to one sentence in your 3-15 page paper were staggeringly high. I imagine it's only gotten worse as the number of papers submitted has grown. Paper submission is turning into an academic /r9k/.

    1. American Mongrel   1 year ago

      Tortured phrases like:
      "Basis of the corpus of humanity's common shared knowledge"

      Like what is this fool trying to say we'd lose?

      I'm pretty sure he's trying to say science wouldn't happen without peer reviewed science journals. But what a retarded, word count minded, thesaurus-y way to make that obviously false statement.

      1. JesseAz   1 year ago

        Are you saying chem jeff was the training source for AI? Lots of words that say nothing.

      2. BYODB   1 year ago

        Peer review is actually fairly central to the scientific process. If other people can't reproduce your results, it's generally considered to be a bad thing and a pretty sure sign that what you published is garbage nonsense.

        1. ducksalad   1 year ago

          True, but it's one way.

          Can't pass peer review ----> Almost certainly garbage.
          Passes peer review ----> Doesn't prove anything, e.g. the rat penis paper.

          Proposed revision to peer review: Reviewers are still anonymous, *unless* they get negligent and let something like the rat penis paper get through. In that case the journal should publish the reviewers' names, photographs*, institution, and e-mail address along with the retraction.

          ------
          *Photoshopped, can't decide whether it should be the feet/inches scale and orange jumpsuit, or a dunce cap. Maybe both.

          1. BYODB   1 year ago

            I can agree there. The fact that there is essentially no downside to publishing garbage papers is a problem.

            I mean, the publication should take a hit to their reputation and that should tank them, but somehow that never seems to happen no matter how many bunk papers they shovel out the door.

            Probably because there is no attempt to replicate the vast majority of published papers by anyone, let alone the publication itself. There really isn't any incentive to do so unless it's a paper with major implications. Even then, most papers (in actual science fields, anyway) would be difficult or expensive to reproduce in the first place unless you have highly expensive and ultra specialized equipment.

        2. American Mongrel   1 year ago

          Yeah, science didn't happen at all before 1950 or so when peer review magazines became a thing.

        3. American Mongrel   1 year ago

          The further you are from pure science, the more you need peer review.

          Same as it applies to usefulness. Even if it is pseudo or quasi science.

          Is it reproducible? Is it useful?

        4. American Mongrel   1 year ago

          For example, AI algos don't need to be published for peer review because they stand or don't stand on their own, and are valuable or not based on whether or not they work.

          Real science is privately funded and closely guarded. If it isn't private, it's military and even more closely guarded.

    2. BYODB   1 year ago

      This is especially true given that many disciplines have specific style guidelines that must be followed, making it even more probable that certain phrases will crop up in their writing.

    3. ducksalad   1 year ago

      Sadly, awful grammar is now the best proof of innocence when it comes to plagiarism.

      Good grammar and style - "reasonable suspicion", better run some checks.

      Good grammar and style, combined with lite, trivial content - "probable cause", call them in for questioning.

      Beautiful writing in support of utter nonsense - e-mail asking them whether they'd rather take the F for incompetence or dishonesty.

  8. Á àß äẞç ãþÇđ âÞ¢Đæ ǎB€Ðëf ảhf   1 year ago

    Anyone got recommendations for playing around with image generators? I tried half a dozen for simple things, and was not impressed, and most wanted me to sign up for a paying account before they'd do more than the simplest (and screw those up). I understand they can't provide free image generators, but from what I saw, none were worth exploring further. I wouldn't mind paying, say, $5 or $10 for a one-month trial, but not on a subscription which they'd make hard to cancel, and not without some free experimentation.

    1. Don't look at me!   1 year ago

      Everyone wants to use them to make porn.

      1. Vernon Depner   1 year ago

        Yes, we're on the verge of a Golden Age of porn.

        1. Stuck in California   1 year ago

          You mean a time where the (now AI) models don't have freakish duck lips and the worst boob jobs on earth?

          It may usher in skynet, but at least there'll be some benefit to humanity in the meantime.

          1. Vernon Depner   1 year ago

            Soon, it will produce whatever kind of lips or boobs you want, in full motion video. It won't be long.

    2. Mother's Lament   1 year ago

      Civitai.com

      It's kind of an AI image generation developers' hub. Its online generator used to be completely free, but unfortunately last month it went to a semi-pay model. However, you can still get some free generation.

      Civitai has a large number of Stable Diffusion models, and a huge number of LoRA, embeddings, samplers and VAEs.
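
      For anyone who would rather tinker locally, a minimal sketch with the Hugging Face diffusers library might look like the following; the base model ID, LoRA path, prompt, and settings are just illustrative assumptions, not a recommendation of any particular checkpoint.

      # Minimal local text-to-image sketch using Hugging Face "diffusers".
      # Model ID, LoRA path, and prompt below are placeholders.
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5",  # assumed base model; swap in a downloaded checkpoint
          torch_dtype=torch.float16,
      )
      pipe = pipe.to("cuda")  # needs an NVIDIA GPU; use "cpu" (slowly) otherwise

      # Optionally layer a LoRA on top of the base model (path is hypothetical).
      # pipe.load_lora_weights("./my_style_lora.safetensors")

      image = pipe(
          "a laboratory rat, scientific illustration, accurate anatomy",
          num_inference_steps=30,
          guidance_scale=7.5,
      ).images[0]
      image.save("rat.png")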

      1. Rick James   1 year ago

        Does it do porn pictures where the woman has only 5 fingers?

        1. Á àß äẞç ãþÇđ âÞ¢Đæ ǎB€Ðëf ảhf   1 year ago

          What should it be, 4? 8? I'm not into amputee porn.

          1. Stuck in California   1 year ago

            A hand with no forefinger has only four fingers.

        2. Mother's Lament   1 year ago

          Depends on the model and LoRA you use. Jack-of-all-trades models like what you get with Bing or DALL-E usually have problems drawing hands, but if you're using a Stable Diffusion generator you can pick a model that specializes in anatomy.

        3. MrMxyzptlk   1 year ago

          AI has a problem with fingers for some reason.

  9. Earth-based Human Skeptic   1 year ago

    OK, how is this AI-retard science any different from Fauci/NIH/CDC-retard science? Both suggest bizarrely inflated manifestations of reality that, if believed, would make people respond in, um, fear and panic.

    I guess the AI shit is cheaper.

    1. Ajsloss   1 year ago

      So you're telling me that rat dick isn't where Misek came from?

      1. Á àß äẞç ãþÇđ âÞ¢Đæ ǎB€Ðëf ảhf   1 year ago

        There's the old joke that you can't get pregnant from butt fucking, to which the rejoinder is "where do you think lawyers come from?"

        We can now extend that to Misek.

        1. Mother's Lament   1 year ago

          Pretty sure Misek was hatched and pupated in a puddle at the garbage dump.

          1. Rev Arthur L kuckland   1 year ago

            Here I thought he was cloned in brazil

      2. Earth-based Human Skeptic   1 year ago

        Interesting hypothesis. And it makes me wonder about raccoon-dog dick and Fauci.

        1. Minadin   1 year ago

          Fauci is a raccoon-dog dick, wonder no more.

          Invest in woodchippers.

    2. Eeyore   1 year ago

      I find the rat penis more believable and reliable than "the science".

  10. Minadin   1 year ago

    Does anyone recall the other day when Reason writer Matt Petti used an AI-generated image for his article?

    https://reason.com/2024/05/01/feds-worried-about-anarchists-gluing-the-locks-to-a-government-facility/

    1. Rick James   1 year ago

      And the article, too!

      1. mad.casual   1 year ago

        The editing changed a lot recently.

        Seems like Liz is the last human that works at Reason or just the best trained AI. Either way, whaddyagonna do about a center-left-libertarian AI that has a sorta-love/mostly-hate relationship with NYC and takes time off from work to spend time with her kid, surf, and hit the bong?

  11. Rick James   1 year ago

    Well, maybe, but Imperial College London computer scientist and academic integrity expert Thomas Lancaster cautioned that some researchers laboring under “publish or perish” will surreptitiously use AI tools to churn out low-value research.

    Would that ‘low value research’ be in any measure similar to the low-value research that Reason dismissed as “C’mon people, anyone can be fooled occasionally if you like, work hard at it” type articles?

  12. Rick James   1 year ago

    FYI, Chris Cuomo (who I didn't even know was still like... a thing, but I guess he's got some show on the internet or something) is steadfastly reporting that COVID vaccines might not be 100% safe and effective.

    That's like someone from Hitler's inner circle intrepidly reporting about how the German government might have mistreated the Jews in the run-up to and including WWII, five years after the war ended.

  13. AT   1 year ago

    Nevertheless, I suspect AI-generated articles are proliferating.

    I suspect that's most of the internet, at this point. That and pornography.

    Might be time to just disconnect.

    1. KARtikeya   1 year ago

      Fuck off you goddamn white trash waste of life.

      I really hope Biden’s able to get anti-American scum like you out of our military.

      1. AT   1 year ago

        Language.

        What have I ever said to give you the impression that I'm white???

        Man alive you like to make assumptions. That's your bigotry and prejudices at work. Were I a leftist, you'd absolutely owe me an apology.

        Luckily, I'm not a leftist - because I have a functioning brain - and I don't make hay about such things.

        1. KARtikeya   1 year ago

          Are you white?

          1. Minadin   1 year ago

            Are you not?
            Why does it matter on the internet?

            1. Ajsloss   1 year ago

              Why does it matter on the internet?

              Because skin color is the most important thing?

          2. JoeB   1 year ago

            That's the title of a great new show on Hulu. It's a de facto sequel to "Are you hot?", but instead of bending over and spreading them for Lorenzo Lamas, you answer 10 questions from Ibram X Kendi. If you're white, they offer you free gender modification. Mandatory, though.

  14. Eeyore   1 year ago

    I plan to go ask AI to generate a CRISPR sequence that will make that picture a reality. You will see, that picture is REAL.

  15. Longtobefree   1 year ago

    I think AI is telling us we are going to get fucked.

  16. American Mongrel   1 year ago

    " the coming deluge must not be allowed to fuel the flood"

    This has to be an elaborate meta prank where they use a bad chatbot to write an oped in nature against the use of chatbots in academic journals.

    And this was a highlight that Bailey found quotable along with "basis of humanity's corpus of common shared" tripe...

    Definitely not written by an English major.

    I'm going to read that whole editorial.

  17. American Mongrel   1 year ago

    Funny, the gist of the article is that corporations aren't making their data and code available for peer review and that AI needs to be regulated.

    That doesn't jibe with Reason's stance at all.

    And there is no authorship attributed.

  18. Use the Schwartz   1 year ago

    AI wrote the article about AI in the AI issue of Reason, and AI are in the comments.

    The future is – not as cool as I expected…

    1. JoeB   1 year ago

      It’s actually hilarious when you think about it. Bots can do the hard work of writing and commenting on articles while we watch porn. Well-played, Mr. Bailey.

  19. mad.casual   1 year ago

    “If we lose trust in primary scientific literature, we have lost the basis of humanity’s corpus of common shared knowledge.”

    Pure. Unadulterated. Newspeak. Layers deep.

    Fuck Ron Bailey.
    Fuck Reason Magazine.
    Fuck The Media.

  20. VinniUSMC   1 year ago

    How? Because there is currently no such thing as AI. LLMs are not intelligent.

  21. YuckFou   1 year ago

    Who's vetting the AIs?

    If Marge is processing your mortgage application using an AI that she doesn't understand, do you think she's qualified to spot a bias? And do you think she's going to go against the AI in your favor?

    Who is vetting the AI?
