Surveillance

Patrick Eddington: How to 'Tyranny-Proof' America

The former CIA analyst and Cato scholar discusses Palantir, Trump's new national database, and the sordid history of federal law enforcement on Just Asking Questions.



Do you ever feel like you're being watched? Just asking questions. 

We're told modern surveillance tech will track criminals, illegal aliens, and terrorists while protecting the privacy of innocent Americans. You've got nothing to worry about if you've got nothing bad to hide. 

Today's guest says that's not true. His latest book, The Triumph of Fear, documents the history of the modern surveillance state and the ways in which it's been leveraged since its inception to target not just terrorists and criminals, but political dissidents.

Patrick Eddington was a CIA analyst from 1988 to 1996, but resigned and wrote Gassed in the Gulf, a book alleging that the agency helped cover up the existence of Gulf War syndrome, caused by exposure to chemical weapons. 

He joins Just Asking Questions today to talk about the power and reach of the modern surveillance state, the growing influence of the AI-powered data firm Palantir—cofounded by Peter Thiel—in the Trump administration, and what can be done to "tyranny-proof" America. 

Mentioned in this episode:

"Stopping Waste, Fraud, and Abuse by Eliminating Information Silos," The White House

Palantir contract modification with ICE

"The Scouring of the Shire," an open letter by a Palantir ex-employee

"Palantir Is Not a Data Company," by Palantir

"American Big Brother," by the Cato Institute

"The Triumph of Fear: Domestic Surveillance and Political Repression From McKinley to Eisenhower," by Patrick Eddington

Alex Karp, CEO of Palantir, address to the Economic Club of Chicago on May 22, 2025

"Why This Palantir Cofounder Left California for Texas," The Reason Interview With Nick Gillespie

"Purpose-Based Access Controls at Palantir (Part 1)," by Palantir

Davos 2023: A conversation with Palantir's Alex Karp

 

Chapters:

0:00—Introduction

2:20—President Donald Trump's executive order "eliminating information silos" is paving the way for a national, unified surveillance database

3:58—Did the Department of Government Efficiency have a "hidden motive"?

12:08—Why the surveillance bureaucracy keeps expanding with little resistance

14:04—Ex-employees have signed an open letter against Palantir. What does it mean?

25:34—What does Palantir actually do?

27:55—Could Palantir actually protect civil liberties?

29:02—What could happen if Palantir's tools fall into the wrong hands?

37:52—Why creating a centralized database is a civil liberties nightmare

42:53—Palantir's CEO Alex Karp on why they defend the West

47:00—Why Eddington wants to take federal law enforcement out of the executive branch

50:32 - Why federal law enforcement has always been politicized

55:17 - The lessons of COINTELPRO's surveillance of activists

55:17 - What was "total information awareness"?

1:10:28 - What is a question Patrick Eddington thinks more people should be asking?


Transcript:

This is an AI-generated transcript. Check against the original before quoting.

Zach Weissmueller: Do you ever feel like you're being watched? Just Asking Questions. The ostensible purpose of DOGE was to cut waste, fraud, and abuse in government. But what if there was also another purpose hidden in plain sight—to make it easier to track your every move?

Of course, our government would never outright say that. We're told modern surveillance tech will track criminals, illegal aliens, and terrorists while protecting the privacy of innocent Americans. You've got nothing to worry about if you've got nothing bad to hide.

Today's guest says that's not quite true. 

His latest book, The Triumph of Fear, documents the history of the modern surveillance state and the ways in which it's been leveraged since its inception to target not just terrorists and criminals, but political dissidents. Patrick G. Eddington was a CIA analyst from 1988 to 1996 but resigned and wrote another book, Gassed in the Gulf, alleging the agency helped cover up the existence of Gulf War Syndrome caused by exposure to chemical weapons.

He joins us today to talk about the power and reach of the modern surveillance state, the increasingly prominent role that AI-powered data management firm Palantir, co-founded by Peter Thiel, is playing in the Trump administration, and to offer some ideas about what should be done to tyranny-proof America. 

Patrick, thank you for coming on the show.

Patrick Eddington: It's my pleasure.

Zach Weissmueller: Trump issued a curious executive order, and I'm gonna read a little bit about that. The aim seems to be to create a kind of unified national database. It says here that the goal, the purpose, is removing unnecessary barriers to federal employees accessing government data and promoting inter-agency data sharing.

The name of this executive order was Stopping Waste, Fraud, and Abuse by Eliminating Information Silos. It goes on to say about these information silos, "agency heads should take all necessary steps to ensure federal officials have full and prompt access to all unclassified agency records, data, software systems, and information technology systems. This includes authorizing and facilitating both the intra- and inter-agency sharing and consolidation of unclassified agency records."

It says that, "this is to ensure the federal government has unfettered access to comprehensive data from all state programs." So it also is kind of going into the states and saying that the federal government should have unfettered access to state programs that receive federal funding. So they're trying to put all this disparate data in one place. What kind of problems do you foresee with this approach?

Patrick Eddington: "One ring to rule them all, one ring to find them, one ring to bring them all and in the darkness bind them." I'm a big Tolkien guy, in addition to being a big Star Wars guy, so I just can't help but bring that kind of thing into it.

Yeah, look, I do think it's really fascinating. The Inspector General Act was passed in 1978—that's nearly five decades ago, right—and we have literally years' worth, decades' worth of recommendations from these different inspectors general as well as the Government Accountability Office on where the waste, fraud, and abuse has been in the government.

It just seemed to me, and I think to a lot of other observers, that this entire effort by Mr. Musk at the behest of the administration was really designed to try to gather as much data as possible. And, you know, on the surface, that EO sounds kind of cool. It's like, "Yeah, sure, there ought to be—you know, they should be able to—the left hand should know what the right hand is doing and all the rest of that." And they also say unclassified data. And it's like, "Well, what's wrong with that?"

Well, you can take an awful lot of unclassified data—data from automated license plate readers, data from cellphone tracking-type stuff, geolocation data, all the rest of that—and you can get an excellent idea of how people live their lives. If you're able to get purchase history, if you can buy that, essentially, buy that kind of data, it can give you enormous insights into what people are doing.

Now of course, the regime tries to sell this entire concept, essentially, as being about finding illegals and making sure that they're not on the public dole and so on and so forth. And as a theoretical matter, does that sound like a common-sense kind of policy thing? Sure. But I think it's fairly obvious at this point that it goes well beyond that and that they have a desire to basically try to gather as much data as they possibly can on anybody that they're interested in.

And I think that it's really also very telling that when Ed Martin, who was temporarily the acting U.S. attorney for the District of Columbia—when it became clear that he had even less than a snowball's chance in hell of being permanently confirmed to the position—that they moved him over to DOJ to basically engage in this so-called anti-weaponization program or whatever. And they were gonna be calling out regime critics and shaming them and all the rest of that. Which is not an appropriate role for government in any way, shape, or form. It never has been.

But if you want to call people out, even if you don't plan to prosecute them, having as much data as you can to muddy the waters on folks is helpful, right? If that's what your fundamental intent is.

Zach Weissmueller: Yeah, you know, as pertains to DOGE, I mean, coming from a libertarian perspective, there were a lot of things to like about it on the surface. It obviously, to a degree, embarrasses the government over wasteful spending. That's a good thing, and so is the transparency.

At the end of the day, it didn't cut much money. And, you know, Elon Musk kind of washed his hands of it and walked away. But yeah, I have always wondered about the sort of getting-into-the-systems aspect of it, and if there were two layers operating at the same time. It sounds like you believe that there's evidence that that is the case. That there was more, that there was the kind of surface story and then something going on underneath the surface.

Patrick Eddington: Yeah, and I do think there's been a fair amount of press reporting on this, and not just from legacy media outlets by any stretch of the imagination. I mean, Wired, ProPublica, some other outlets, and I think even Reason has done some work on this particular topic as well.

And again, I want to come back to this issue of unclassified or otherwise publicly available data, because this is, in a lot of respects, how the Peter Thiels of the world and some others have been making just gobs of money at the end of the day. You collect all this data, you package it up, and then you offer it up to federal law enforcement or the intelligence community or whatever, and it becomes—and it is, has been—a very lucrative business for him.

And the thing about this kind of data brokerage, if you will, is that—and I'll just use the FBI because they're my favorite target—if you were going to have a bad day with federal law enforcement, at least in the pre-Trump era for the most part, it was probably going to be with the FBI. Because unless you were a fairly frequent international traveler or all the rest of that, you were going to be more likely to encounter an FBI agent than almost any other kind of federal law-enforcement official.

And in September of 2008, in one of his last major official acts as attorney general, Michael Mukasey modified the attorney general guidelines for FBI domestic investigations to create an entirely new category of de facto investigative tool. It was called an assessment. It is called an assessment.

And the thing about assessments is they can open one on you, me, they can open one on Reason, they can open one on Cato, and they don't need a criminal predicate to do it. It can just be for an official purpose or an authorized purpose. Well, guess who gets to decide what that authorized purpose is? Well, that's the DOJ and FBI. It's a great gig if you can get it. It's like the ultimate make-work program for federal agents.

And the thing about assessments is, because you don't need a criminal predicate, you can literally go and get this kind of commercially available data and use it. You can also search classified databases like the FISA Section 702 database. You can run confidential informants against you, me, Reason, Cato, et cetera. And you can also conduct physical surveillance of individuals or entire groups of people that are affiliated with a given organization.

And again, none of that requires going before a judge. So it's an incredibly powerful, surprisingly invasive tool that has been extensively used by the FBI. And we've been in court for over 5 years now—close to 5 years now—trying to shake loose essentially a bunch of these so-called assessments.

And we started out wanting them all. And the FBI came back in their official court affidavit and said, "Ok, well, that would take almost 2,000 years." For real, that's what they said. So we said, "Ok, look, we're going to be super reasonable here. We're going to just ask for all of the closed sensitive investigative matter—or SIM—assessments." And those particular kinds of assessments are the ones that implicate First Amendment–protected activities.

So whether you're an academic, a journalist, somebody in a public policy organization like ours—our two organizations—we just want those. And we were relatively narrowly scoped in terms of timeframe and all the rest of that. It worked out to be about 1,100 or so—just slightly under 1,100—of these assessments.

And they still came back in yet another court affidavit and said, "Ok, well, that's going to be about 219 years." So we're still going back and forth, trying to get essentially a federal judge to force them to cough this up.

And I want to make the point that this is a bureaucratic mentality. This goes way beyond whether it happened under Obama, under Trump 1, under Biden, under Trump 2. This is the professional bureaucracy, most of which is still left. I mean, this regime shaved off the entire top of the FBI. And I don't just mean the political appointees down to the assistant attorney general level. I mean, they went after the SES—or the Senior Executive Service—folks as well.

And from what I can tell, they also maybe have gone down to the GS-15 supervisory special agent level. And those are the folks—or special agents in charge, I should say—who actually run FBI field offices. But even with that, there's still the kind of core culture that is there, that's resident at the FBI, that continues to drive this kind of thing. So just absolute, complete, obstinate resistance to making public any of this stuff.

And again, we really did scope it down to say, "Ok, give us First Amendment–implicated investigations," essentially. And they're still just fighting us tooth and nail.

Zach Weissmueller: Yeah. I mean, what worries me particularly at this moment about that is the asymmetry, where the bureaucracy will grind things to a halt and say, "Unless you're getting radical life-extension technology, you're never going to live to see these documents. But on the other hand, we're going to make the means of collecting all this data much more efficient, and we're going to start injecting AI into the system—so in the collection—and we're really gonna put this into overdrive." But nothing's gonna change on the other side of the equation in terms of making it easier to access the information that you're using against, well, who knows who?

And I guess that's where Palantir comes into the conversation. You mentioned The Lord of the Rings in your opening statement there. And creepy enough, Palantir is named after the Palantíri stones in The Lord of the Rings, which are basically these all-seeing stones that can see into, I guess, the past and future and then construct what is gonna…

So, I mean, the fact that you would name your company after that is kind of interesting. But there have been some employees that have come out in recent years. They signed an open letter that I'm going to pull up on screen here. It's called The Scouring of the Shire: A Letter from Concerned Palantir Alumni to the Tech Workers of Silicon Valley, and it mentions the legend of the Palantíri at the beginning here.

It says that: 

The myth of the powerful seeing stones warned of great dangers when wielded by those without wisdom or a moral compass, as they could be used to distort truth and present selective visions of reality. Similarly, Palantir Technologies' platform grants immense power to its users, helping control the data, decisions, and outcomes that determine the future of governments, businesses, and institutions—and by extension, all of us.

A code of conduct was crafted to uphold democracy, preserve the spirit of free scientific inquiry, and ensure responsible AI development within Palantir. 

And these former employees are saying that the principles have now been violated and are rapidly being dismantled at Palantir Technologies and across Silicon Valley. 

They've grown hostile to diversity, equity, and inclusion. They employ inflammatory language, sow confusion, invite controversy, going as far as to threaten critics with violence.

They mention, again, threats related to biometric data collection on immigrant children, journalists being targeted, science programs being defunded, and key allies like Ukraine sidelined.

Trump's administration has sought to greatly expand executive powers while alluding to monarchy. Big Tech, including Palantir, is increasingly complicit, normalizing authoritarianism under the guise of a revolution led by oligarchs.

We must resist this trend.

I wanted to read that list of their grievances just so everyone can get a sense of it. You know, it does sound like a kind of left-leaning progressive group of people, but that doesn't mean that there's not some validity to some of their concerns. Before we get into—we're going to explain exactly what it is Palantir does—give us the overview reacting to that letter. Do you share any of those concerns about this technology?

Patrick Eddington: Well, I mean, I think throwing the whole DEI thing in there just muddies the water at the end of the day, right? I mean, this is about the ability to collect data on anybody—any organization—all the rest of that. And making those tools available essentially for a premium, I'm sure, to any law enforcement organization.

And it's not just federal, right? I mean, this is stuff that they, I'm sure, hope to be able to sell to state and locals as well. State and locals—especially locals—are not going to have quite the same line-item budget capacity to afford the premium packages and all the rest of that.

But then when you start talking about things like state fusion centers—and the Homeland Security and Governmental Affairs Committee… yeah, sure. So the Senate Homeland Security and Governmental Affairs Committee is responsible essentially for conducting oversight of the Department of Homeland Security and also kind of looking more broadly at how a lot of these kinds of things are used—like Palantir's technology, other technologies, artificial intelligence, facial recognition, automated license plate readers—not just the technologies, but also the structures.

And it was when the late Sen. Tom Coburn of Oklahoma was the chairman of that committee that the committee conducted a look-see at all these fusion centers—these state-level fusion centers that were created in the wake of the 9/11 attacks. Again, as a mechanism of putting federal agents in the same building with state and local law enforcement people on a state-by-state basis, ostensibly for counterterrorism purposes.

Now, of course, they've morphed way beyond that. They're into drug war stuff and all the rest of that kind of nonsense, as you would expect them to ultimately be, unfortunately. But what Sen. Coburn and his staff found—and this was 13 years ago now—was that these fusion centers are essentially just Fourth Amendment violation factories. That's really what they boiled down to.

The Missouri fusion center got caught essentially going after people who had, like, Ron Paul bumper stickers on their cars—when Mr. Paul was actually running for president all those years ago, right? And I think for people like me, and I'm sure you and a lot of other folks who work at Reason and elsewhere across the liberty movement, that's exactly the kind of thing that terrifies us and that we want to find a way to shut down, if at all possible.

So when I see—and I think it's great that they did the letter, I think it's great that they were sending up a five–red star cluster alarm about all the rest of this—I get concerned when anybody on the left or the right tries to put this in purely partisan political terms, or issues a statement that is inevitably going to be interpreted that way because it uses charged language or language that pushes people's buttons to a certain degree.

The very act of collecting this data, regardless of who they're going after, is itself the thing that we need to be concerned about.

And I should say that this isn't just about Palantir, right? I mean, they're just one example here of the kind of capacity that's out there. I mean, Google has enormous reach in terms of the kind of data that they can provide. And that's one of the reasons why I use my—I have a Gmail account, but I use it incredibly sparingly because they scan all that stuff, right? And at the end of the day, all that kind of stuff can be—and often is—made available to law enforcement and other entities for a price.

And that's the reason that one of the few truly useful bipartisan things that's actually happened in the last 5 years or so is this partnership between Rep. Warren Davidson, Republican of Ohio, and Sen. Ron Wyden, Democrat of Oregon, with a bill called the Fourth Amendment Is Not for Sale Act. That bill actually passed the House of Representatives in the last session, and unfortunately, the Senate didn't take it up.

And the core of the bill is really very simple. It essentially says that, "If you, a law enforcement organization or agent, want to get this data, you need to get a warrant for it." That's it. The entire idea is to close that loophole, because this ability to buy the data right now—as long as the budget line item is available to do it—is what is giving them the ability to, in large measure, circumvent the Fourth Amendment.

And then when you combine it with, in the case of the FBI at least, you combine it with that assessment authority—where you don't actually have to have a criminal predicate to open an investigation on somebody—you can just keep that running for years.

I mean, I've got a lot of the stuff that we have managed to get released. It's not a huge quantity of material, but some of the stuff we have been able to get released on assessments shows that one particular organization, for example—the Muslim Justice League up in Massachusetts—the FBI had an assessment opened on them for years. I mean literally years. And this is an organization composed, I think, entirely of American citizens—Arab Americans, Muslim Americans—with absolutely no evidence of a connection to terrorism, anything like that. And it just keeps running.

But it doesn't just happen to that kind of a group. I mean, Concerned Women for America has been around for 40-plus years. They're a staunchly pro-life organization. And in July of 2015—literally almost exactly 10 years ago—at least one Washington field office FBI agent opened an assessment, a so-called "charity assessment."

I'd never seen that before, and I've been doing this for a while. Opened the so-called charity assessment on Concerned Women for America. There was no predicate. Nobody had come to the FBI—no former or current employee had come and said, "Hey, there's embezzlement going on here," or "other kinds of violations of federal law," or "electioneering," or whatever. No, none of that.

That gets to my biggest hobby horse, probably, which is bureaucratic incentives, right? How does somebody who's a GS-12—a General Schedule 12 in the federal employee rating and pay system—if they're an FBI agent, how do they get from GS-12 to GS-13?

"Well, how many assessments did you open, agent? How many preliminary investigations did you open? How many full-field investigations? How many enterprise investigations? How many sources are you running?"

It becomes a numbers game. And the more you incentivize people to do this, the more you're going to be likely to have massive rights violations. And of course, in the current context, with respect to the Trump regime's attempt to essentially kick every immigrant out of the country—every illegal immigrant out of the country—they've managed to deport at least 70 U.S. citizens, according to the GAO, in the process.

And that's another reason we need to be skeptical about these large datasets. They're not necessarily accurate.

Zach Weissmueller: Yeah. So what I am hearing you say there is that this kind of relates to an argument that I've heard the founders of Palantir make. Which is that really, they are offering a technology layer. They're not the ones going out and collecting the data. They're not the ones using the data.

And really what you need to look at if you want to protect civil liberties are these issues that you're raising—like requiring a warrant or changing the bureaucratic incentives so that you don't get rewarded just based on how much data you collect against an organization or an individual. And it makes a lot of sense to me.

I wanted to make sure that our audience understands what this technology actually does and how it might interface with these current incentives and the current state of the law. So Palantir—you mentioned ICE. It was reported by The New York Times that they just got a $30 million contract with ICE. And what Palantir says—they had a response to The New York Times article and put up a bunch of slides—and they emphasized, "We are not a data company."

And here they have it kind of visualized—what they do. You see kind of the fragmented data, they connect it, and then they organize it. And I actually pulled a clip from their YouTube channel that, in a condensed manner, explains in a straightforward way what it is they do. I'm going to play that real quick, and then we can talk about how agencies might actually use this technology.

(Palantir Clip):

Palantir Foundry is a software platform that allows organizations to bring their data together and then enables their users to conduct sophisticated analytics and operations on top of the unified data.

Access to the platform does not necessarily mean access to all of the data it contains. First, an operational decision-maker proposes a purpose. If this purpose is approved as legitimate by information governance officials, the purpose gets a secure, access-controlled space in the Foundry platform. No users or data can enter this space yet.

Then, data that has been approved as necessary and proportionate for this work is added to the space. Finally, users can apply for access to the space if they've been tasked to work on this purpose.

At each step of the process, a rich history of why any given approval has been made is recorded.
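The workflow in that clip amounts to a purpose-based access-control pattern: a purpose is proposed and approved, data is scoped to it, users are granted access against it, and every approval is logged. Below is a minimal, illustrative Python sketch of that pattern; the class and field names are invented for this example and are not Palantir's actual code or API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Hypothetical sketch of purpose-based access control. None of these names
# come from Palantir; they only illustrate the workflow described in the clip.

@dataclass
class AuditEntry:
    timestamp: str
    actor: str
    action: str
    reason: str

@dataclass
class PurposeSpace:
    name: str
    justification: str
    approved: bool = False
    datasets: List[str] = field(default_factory=list)
    members: List[str] = field(default_factory=list)
    audit_log: List[AuditEntry] = field(default_factory=list)

    def _log(self, actor: str, action: str, reason: str) -> None:
        # Every approval decision is recorded with who made it and why.
        self.audit_log.append(
            AuditEntry(datetime.now(timezone.utc).isoformat(), actor, action, reason)
        )

    def approve(self, governance_official: str, reason: str) -> None:
        # Step 1: information governance signs off on the proposed purpose.
        self.approved = True
        self._log(governance_official, f"approved purpose '{self.name}'", reason)

    def add_dataset(self, governance_official: str, dataset: str, reason: str) -> None:
        # Step 2: only data judged necessary and proportionate enters the space.
        if not self.approved:
            raise PermissionError("purpose not yet approved")
        self.datasets.append(dataset)
        self._log(governance_official, f"added dataset '{dataset}'", reason)

    def grant_access(self, approver: str, user: str, reason: str) -> None:
        # Step 3: users tasked with this purpose apply for access to the space.
        if not self.approved:
            raise PermissionError("purpose not yet approved")
        self.members.append(user)
        self._log(approver, f"granted access to '{user}'", reason)

    def can_read(self, user: str, dataset: str) -> bool:
        # Access to the platform does not mean access to all data it contains.
        return self.approved and user in self.members and dataset in self.datasets


# Example: a review purpose is proposed, approved, scoped, and audited.
space = PurposeSpace("benefits-fraud-review", "review flagged benefit claims")
space.approve("gov_official_1", "purpose deemed legitimate and narrowly scoped")
space.add_dataset("gov_official_1", "claims_2024", "necessary for the review")
space.grant_access("gov_official_2", "analyst_a", "tasked to work this purpose")

print(space.can_read("analyst_a", "claims_2024"))   # True
print(space.can_read("analyst_b", "claims_2024"))   # False: not a member
print(space.can_read("analyst_a", "tax_records"))   # False: data not in scope
for entry in space.audit_log:
    print(entry.timestamp, entry.actor, entry.action, "-", entry.reason)
```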

Zach Weissmueller: So what they look to be saying there is that not only are they kind of segregating the data so that there can be this big pool of data, but they can then track who is accessing the data. So, in a sense, they're kind of saying, "We are watching the watchers, because we can track very easily who has access to what data"—perhaps preventing a Snowden-like situation where an outside contractor is getting access to data they shouldn't see.

Nick Gillespie spoke with Joe Lonsdale, who's one of the co-founders of Palantir, and he made the case to Nick that this technology could actually, in a sense, be civil liberties–protecting. I want to play that clip as well and then ask you a question about how it might actually be applied.

So here's Joe Lonsdale.

(Gillespie Lonsdale Clip):

Joe Lonsdale: So the whole core of Palantir was basically a civil liberties engine from the start. It's what data are you allowed to see in what context? And how do we bring that together to show you only what you're allowed to see to let you get your job done, right?

The problem is, I think a lot of these guys—maybe they think they're Jack Bauer in the show 24. I don't know if that's a dated reference now, but—someone who's in charge of catching the bad guys, and so they're going to break the rules. And they're going to break the rule to find the bad guy.

And we don't want you to be able to break the rules if you're not supposed to break them, but we want you to get the bad guys anyway. That's the whole point of this. And so it's actually a really hard data problem—what are you legally allowed to see, and what's the policy? 

We don't set the policy, but we would make it so it was very transparent.

Zach Weissmueller: So, taking all that into account, what do you think the implications of this technology are, since the federal government is clearly doubling down on it?

Patrick Eddington: Yeah, so the key phrase from Lonsdale's commentary was, "We don't set the policy."

And I think that's really what kind of matters here. I mean, they are creating—they have created—these tools, and this Foundry platform is just one of several different kinds of packages that they offer. And it makes it sound like, "Hey, we're just providing the tool here, and what somebody else does with it, you know, they shouldn't do this, they shouldn't do that."

But at the end of the day, I think all we have to do is go back and look at the FISA Section 702 program, which—for the benefit of those listening or watching—that program started its life as the unconstitutional Stellar Wind mass electronic surveillance program, literally just hours after the 9/11 attacks. When the then-head of the National Security Agency, Gen. Michael Hayden, told his people to go ahead, essentially, and flip the switch and begin tracking every last bit of communication to, from, and between the United States and Afghanistan.

That was a direct, absolutely direct violation of the Foreign Intelligence Surveillance Act, which was supposed to be the exclusive means for collecting foreign intelligence–related data. It didn't hurt Hayden's career, as some folks may know. He went on to become the director of the CIA, and I think he was the deputy ODNI—I'd have to go back and check his bio again to be sure on those facts. But in any event, nobody went to jail for that.

And to me, I always keep that in mind, because once The New York Times—once Jim Risen and Eric Lichtblau, who were then at The New York Times—exposed the Stellar Wind program in December 2005, that put Congress on a path for two and a half years trying to take that illegal mass surveillance program and somehow make it compliant with the Fourth Amendment.

I don't believe to this day that it is compliant with the Fourth Amendment.

I think a lot of constitutional scholars would agree with me on that. Folks who are of a more conservative, hawkish bent might take a different view. But I think it's pretty clear that the Founders, when they wrote the Fourth Amendment, were thinking of individualized, particularized suspicion dealing with a specific individual or perhaps a conspiracy. Otherwise, you need probable cause.

And when we have a company like Palantir that is in a position to offer these kinds of advanced tools for folks to go in—whether it's the FBI, ICE, whatever—it's always imperative to remember that we're still dealing with a garbage-in, garbage-out kind of technology.

So if the data that's ingested on the front end is error-filled, you're going to get bad outcomes. I don't care how sophisticated the tool is if the data that's plugged into the machine is not accurate—and this is critical, of course, as it pertains to individuals. Not just their rights, but their identity.

I mean, are you actually going after somebody who is not here in lawful status? Or are you going after someone who's actually been born in this country, and you've got a name confusion? I mean, this is the kind of thing that's happened to Arab and Muslim Americans in the post-9/11 era.

When you talk about the TSA's watch list… Secretary Noem over at DHS says they shut down the Quiet Skies program, which The Boston Globe did a whole series of scandalous stories on several years ago about the number of innocent Americans being swept up in that program. But the sister program, Silent Partner—which is the international travel aspect of it—is still running, so far as we know.

So whenever you have a company like Palantir creating this kind of capability, and you have a garbage-in, garbage-out situation, and you have bureaucratic incentives inside these federal agencies and departments—and then you have the president of the United States breathing down their neck, right? And the president's key advisers—Stephen Miller, Tom Homan—all these folks breathing down their neck to "Find the illegals, get the illegals, get them out of here," you're gonna get what we've gotten.

Which is masked people with badges and guns—maybe badges; don't know for sure in some of these cases—but they're armed and armored, and they're sweeping people off the streets of this country. And in several cases, they've absolutely been American citizens.

So that alone tells you that no matter what technology they're using on the back end to make these raids possible, it ain't working. It's not remotely working.

And I want to make it absolutely clear to anybody who's listening or watching: I want illegal gangbangers out of this country. I want illegal aliens who are murderers, rapists, thieves—I want them the hell out of here.

What I don't want is a system that obliterates hardworking families who made a bad choice—broke the law, no question about it. They crossed the border, that's a violation. But that speaks much more, I think, to the insanity of our immigration system, such as it is.

My colleagues Alex Nowrasteh and David Bier are much more versed in this than I am. But as just somebody who is concerned about the constitutional rights aspect of all of this—and the due process aspect of this… 

This kind of technology can really help to create the kind of real-world, real-time dystopia that we're seeing right now.

Zach Weissmueller: That's the dark version of what is unfolding and could be a future dystopia. Is there a version where this technology could be leveraged to actually focus the government on more of these kinds of targets that we all want to see targeted? Whether it's violent criminals, violent illegal immigrants in the United States?

Palantir has been credited—they've never officially stepped forward and taken the credit for this—but there's been reporting that they were instrumental in targeting bin Laden's compound, locating it in Pakistan. There's been reporting that they've really given a decisive battlefield advantage to Israel. Ukraine is using their technology.

You're describing a very dysfunctional structure right now. Could you imagine a structure where technology like this could actually be leveraged for the kinds of purposes that we, as civil liberties–loving Americans, would want to see?

Patrick Eddington: Well, given the fact that I generally don't trust anybody in government, it's kind of a loaded question.

I think I'd make it broader, though. I don't trust anybody with power. I don't believe that most people who are put into a position of authority—I don't care whether it's in the government, in the private sector, labor unions—can handle it; there's a really good history of humans over the course of the last 5,000 or 6,000 years of recorded history not being able to handle power responsibly.

Which is why I believe that the more decentralized a system is, the less likely you are to get that kind of abuse—you're not going to eliminate it. The only way you do that is by literally changing humans. Maybe in another million years of evolution we'll get there, I don't know. But I always worry about concentrated power.

I think it's almost always at its worst, though, in government—because government in this country doesn't quite have a monopoly on violence or the ability to dispense violence, thanks to the Second Amendment. Which, to be blunt, this administration has done not nearly enough to advance when it comes to guns…. And I'm a fanatic—an absolute zealot—about the Second Amendment, to be clear.

Zach Weissmueller: Are you trying to get on their radar as such? What are you doing today?

What you're saying, though, implies that just the very nature of, A) creating the centralized federal database that we talked about at the beginning, and then B) pairing that with technology that is just brilliant at sifting through and picking out patterns from that database—there's not much room for optimism when you combine those two with what we know about the nature of government power.

Patrick Eddington: Right. When you combine it with the fact that data entry is still a heavily human thing—especially when you're dealing with a very people-centric kind of thing like immigration, right? But it can go well beyond that.

The kind of data that can get ingested into a system—if it is wrong to begin with—it can have an enormous impact in terms of how things are steered by government officials.

One of the things I talk about in my latest book, The Triumph of Fear, is the way that this kind of data and the mentality behind it can really drive people to extremes.

This is not a partisan thing. Presidents of both parties have been responsible for these kinds of rights violations. Now, I think in his first six months in Trump 2.0, Mr. Trump is on track to exceed every single one of his predecessors, practically, in terms of trashing the Constitution, trashing due process, and all the rest of that kind of thing.

But the point I always make is that a lot of his predecessors set precedents that have basically made a lot of this possible. And it's not just the executive branch.

I think it's critical to understand the failure of Congress—or Congress actually being a cheerleader. The House Un-American Activities Committee was originally created in 1938. It was not finally abolished until 1975. That committee destroyed the professional and personal lives of thousands of Americans. Its Senate colleague, the Senate Internal Security Subcommittee, did the same thing.

Congress played a role in creating this modern surveillance state—this culture of political repression that is enabled by surveillance.

Then the courts—when the courts engage in actions like the so-called state secrets privilege from the Reynolds case—there's not a word in the Constitution… I haven't found it, maybe somebody else can point it out to me, but there's not a word in the Constitution that allows the president of the United States to classify a single piece of paper. It's not there.

Article I, Section 5, Clause 3 gives Congress the ability to classify or otherwise make secret its information, but not the executive branch. That's all court-made or otherwise statutorily enacted stuff that has subsequently been blessed—not just by the federal district or appellate level courts, but by the Supreme Court itself.

This is where you get this insane concept of "deference to the executive." Again, you can keyword search the Constitution. You will not find the phrase "deference to the executive," to the best of my knowledge anyway. You won't find it in there. But that's what's happened. And that's why we've created this monster.

And of course, my colleague Gene Healy, our senior vice president, has literally written the book on this. It's called The Cult of the Presidency. And that's why we've gotten so far away from the vision of the Founders in terms of what a national, functional republic should look like.

And all of this data, this digital era, and now this AI-enhanced digital era that we're living through is making, at least in a governmental context, the confluence of this stuff absolutely terrifying, at least as far as I'm concerned.

Zach Weissmueller: Yeah, and given that we're in that AI-enhanced digital era, I want to ask one more question that I think is one of the harder objections to overcome when we think about what should be done about technology like Palantir's or other firms that are doing similar things.

It was raised at the Economic Club of Chicago in May of this year. Alex Karp, the Palantir CEO, made a spirited defense of what the company is doing in its partnership with the U.S. government. I want to play that for one moment.

(Alex Karp Clip):

Alex Karp: I believe that Western institutions and the West are a superior way to live, and we fought for that. For me, that means meritocracy, free markets. That has often been very unpopular for almost a decade and a half in Silicon Valley—the idea that you would build your company around helping the U.S. government was viewed as kind of stupid. Which in Silicon Valley is worse than being bad.

That's changed a lot, partly because people realized it was wrong and, quite frankly, because if somebody makes a lot of money on something, then it must be right. So we've changed the world by humiliating people and getting rich. The most effective way for social change is to humiliate your enemy and make them poor. That's how social change actually happens.

Zach Weissmueller: So what I'm hearing there is that whether you like it or not, this technology is going to exist. Capitalism is what drives forward social change. And at the end of the day, you have to pick a side. Palantir has picked the U.S. and the West, which explains why their tech is also being used by Israel and Ukraine.

There's a definite logic to that argument. What do you think?

Patrick Eddington: He was honest about the money part. Obviously, there's been a lot of money made there. 

I think the idea that capitalism by itself in some way drives social change is really a canard. 

I mean, it's done plenty in terms of the basic structure and helping lift overall living standards and all the rest of that. But in terms of the preservation of basic rights, that's something that…

The best government in that respect, as far as I'm concerned, is the government that's the most restrained. And when we are allowing a situation to happen now where these kinds of technologies are being put into the hands of people in government—and here's the thing:

Let's assume for the sake of argument that we avoid a civil war. I'm not convinced we will, but let's just assume for a moment that we actually do avoid a civil war, that Trump actually does not try to run for president in 2028, and all the rest of that. Let's assume that things kind of, sort of, more or less, go back to some kind of—not quite equilibrium politically—but something more familiar to us, that would kind of pass as normal.

Mike Turner of Ohio, who I'm not a huge fan of—Congressman Mike Turner—because he was the head of the House Intelligence Committee, a guy who's a real surveillance hawk, all the rest of that. Responsible essentially, in my judgment, for being, how should I put this, not entirely truthful about how the FISA Section 702 program was being used, or in my view, misused.

But the one truthful thing that I know he's said lately is that people on the Republican side should really not be cheering a lot of this stuff that's going on right now, because if Democrats manage to retake the White House and both chambers of Congress—which is a very real possibility—they could turn around and use those tools.

And by the way, that's exactly what my own work in this book basically shows. It just goes back and forth.

I've been selling this idea: that the way that we get out of this constant red team, blue team crap when it comes to weaponization of the Department of Justice, for example, is that we take the entire law enforcement function out of the executive branch and put it where it belongs—which is under the federal judiciary. To me, that's how you solve the problem.

Now, it's interesting that a couple of Democrats have kind of begun to warm to this idea, at least in a limited sense. Sen. Cory Booker of New Jersey and Rep. Jamie Raskin of Maryland introduced a bill back in May 2025 called the Marshals Act. And this was done in response to all these threats to federal judges—a lot of which, of course, have emanated directly from Mr. Trump himself, but also from a lot of his surrogates as well.

What the Raskin–Booker bill would do is take the U.S. Marshals Service out of the Department of Justice and lodge it within the federal judiciary. That's a great start. But it doesn't go far enough.

Because as far as I'm concerned, as long as you leave the law enforcement function resident in a political branch—and that's exactly what the presidency and the Congress are, they are the political branches—as long as that law enforcement function rests there, it's going to be ripe for abuse.

To me, I think that's the 200-plus–year failed experiment that shows us that we need to do something different.

Zach Weissmueller: So I'm understanding under this change, for instance, the head of the FBI would be nominated by a federal judge and confirmed by Congress?

Patrick Eddington: No. I don't think you'd actually have to completely change the nomination process. I'd open it up. I would say that the president could nominate who he wants to be attorney general. But I also think that members of the House and Senate Judiciary Committees should be able to make their own nominees. And then the Senate would still be responsible for sorting it out and figuring out who would ultimately get the vote.

I think you get a better outcome that way. One of the other changes that I would make is to say that if you're going to be attorney general, you've got to serve for 15 years. That's going to get you a different kind of person—someone, I think, who's more dedicated to the concept of the law, and following the Constitution and constitutional norms and standards and prior case law, than the kind of nominee that you get now.

And there are a lot of other changes that I have in mind. My Cato colleague Mike Fox has a competing idea. In some ways, it's more practical than the one I'm offering. What I'm suggesting would actually take a constitutional amendment.

But Mike has suggested that we could just radically downsize DOJ and repeal a lot of the laws that it's currently charged with going out and dealing with. And there hasn't been that kind of a review of federal statutes—at least in a holistic way—in like 100 years. So we're kind of overdue for that anyway.

So I mean, I'd take his idea if I couldn't get mine, because it would get us to a better place. I don't think it's ideal, but we need something like that, because the idea that people are going to change that much in terms of their partisan stripes or whatever, I don't buy it. That's just not what history shows us.

Zach Weissmueller: Yeah, I mean, what really comes through in your book The Triumph of Fear is that this law enforcement apparatus was politicized from the very beginning—these tools that were supposed to be used to fight crime were used for political repression.

I mean, is that why that happened over and over again—the whole political nature of it?

Patrick Eddington: Yeah. I mean, I think all you have to do is kind of go back and look at Theodore Roosevelt's time in office to really get a sense of where things went off the rails.

The Secret Service was really the only federal investigative arm at the time that Theodore Roosevelt took over after President McKinley's assassination. And you may recall that when he was a political figure in New York state, Theodore Roosevelt really made his reputation as an anti-corruption crusader and all the rest of that. And so when he became president, that didn't change.

He was convinced that lots of people in both parties were corrupt, they were on the take, etc., etc. And so a lot of these Secret Service agents were dedicated or tasked to go out and investigate various members of Congress—House and Senate. Both sides of the aisle. Roosevelt was very nonsectarian, if you will, in his misuse of those Secret Service agents.

It finally was a scandal involving the misuse of a Secret Service agent in late 1907 to investigate an extramarital affair by a naval officer—that surfaced in the press. That finally gave members of Congress the political ammunition and cover they needed to go after Roosevelt. They'd wanted to do it for a long time, but they were afraid that if they tried to circumscribe what the Secret Service could do, that Roosevelt would say, "You see, I told you so. They're a bunch of corrupt swine. We need to take them down," and so on and so forth.

So once they actually went after somebody for their personal conduct—their personal private conduct—it gave them the opening they needed. And so in early 1908, in the Sundry Civil Appropriations bill, they put language in there that expressly prohibited the Secret Service from doing anything other than protecting the president and going after counterfeiters.

Teddy Roosevelt was enraged by it, but he didn't veto the bill. He had a different plan. He went to his then–attorney general, Charles Bonaparte—who, interestingly enough, was a distant relative of Napoleon Bonaparte—and he told Bonaparte, "I want you to create a cadre of agents to go and do this exact same thing. Take it out of general funds and go forth."

And so, on July 26, 1908, that is exactly what Charles Bonaparte did. That's how the Bureau of Investigation, which we now call the Federal Bureau of Investigation, came into being.

And a lot of the Secret Service agents who had been involved in those political investigations—they moved over. Literally, they moved to the DOJ. And they would be involved in some of the first politically motivated investigations. Some of them targeted journalists; others targeted socialists.

This all happens within the first 9 months of the BOI coming into existence. So yeah, this stuff goes right back to the beginning.

Zach Weissmueller: Incredible to realize that our first federal law enforcement agency was born out of a politicized dispute like that. We now think we're very familiar with this issue of the weaponization of government because of Trump's battles with the deep state. But it really does go back to the very beginning.

And you helped put together, at the Cato Institute, this timeline. Let me pull that up. It's called The American Big Brother Project. It's got a timeline of different instances of federal law enforcement being weaponized, and it's broken down into different eras.

At the beginning, we've got that story you just recounted. I was wondering if we could just pick out one or two of these from some of the other eras that really stand out to you as just really bad examples of federal law enforcement being abused for political purposes in these ways.

You've got the World War II era, the Cold War era, and then post-9/11. Maybe you could pick one from the Cold War and one from post-9/11 that jumps out at you. I could pull it up here.

Patrick Eddington: Oh, I don't think there's any question that among the most infamous episodes in our entire country's history was Dwight Eisenhower green-lighting the FBI's counterintelligence program, or COINTELPRO. That happened in August of 1956. It's something that I talk about in some detail in the book.

And of course, it started out as something designed to target the Communist Party of the United States—which is ironic, because 2 years earlier, Eisenhower had signed the Communist Control Act, which was the first piece of legislation to ever actually ban a political party in the United States. A completely infamous affair, to say the least.

But yeah, COINTELPRO started with CPUSA, but then it of course branched out. They went after Dr. King, Dr. Martin Luther King Jr. They went after the organization that he founded, the Southern Christian Leadership Conference, and it just took off from there.

We're talking about hundreds of thousands of human beings who were targeted in the course of this, and organizations that were targeted in the course of this. And it wasn't just COINTELPRO. There was COMINFIL—the Communist Infiltration Program. There was the Ghetto Informant Program, which was a real disaster. Talk about bureaucratic incentives to go out and pad your numbers of informants. That was a complete…

Even the FBI internally realized that they'd completely gone off the rails there. So that's a good one.

There are so many post-9/11 programs that have just been off the rails, but I think the one that's by far been the most consequential on a day-to-day basis—literally for all of us—is the Stellar Wind Program. I mean, what ultimately became, as I think I referenced earlier in our chat, the FISA Title VII, Section 702 electronic mass surveillance program.

The entire concept started out as a counterintelligence and counterterrorism kind of program. As a former serving CIA officer, I will say I'm all for counterterrorism programs when they are grounded. When they can actually show results. And counterintelligence programs are absolutely essential. There's just no doubt about that.

But when you start to add all this other stuff in—and it's the drug war stuff that drives me crazy. I mean, like adding fentanyl to the list of things that FISA Section 702 can be used for, and human trafficking, and all the rest of this.

And I'm not saying these aren't problems that need to be dealt with. But the more that you securitize a program, the more likely it's going to be misused for the wrong reasons and the wrong purposes.

And there's just an enormous amount of pressure inside the bureaucracy that flows ultimately from the office of the president. There's no doubt that pressure from George W. Bush helped to ensure that the Stellar Wind Program not only was started, but that it remained secret.

I mean, this was all handled outside of the Foreign Intelligence Surveillance Court structure—completely, totally off-books, absolutely illegal, absolute violation of FISA. And it wasn't until they were called on it publicly and The New York Times exposed it that we began to get some kind of corrective action there.

Too many people in the House and Senate were just willing to look the other way. Nancy Pelosi was a great example of that, and the late Sen. Jay Rockefeller as well. Former Sen. Rockefeller, I should say. They got briefed on Stellar Wind.

Now David Addington, who was Vice President Cheney's point man on this and one of the biggest cheerleaders for Stellar Wind—when they came to the Hill to brief Pelosi, they said, "You can't have staff in there," and all the rest of that.

And she wrote basically a CYA memo after it was over, saying she raised her objections and so on and so forth. But that was just lame. I mean, she was the senior Democrat in the House, and all she had to say was, "I need your business cards." And then after they passed all the business cards to her, she said, "Thanks. I just need to make sure I've got your names spelled correctly for the impeachment resolutions that I'm going to offer after I leave this meeting."

That would have ended the secrecy of that program right then. They would have had no choice at that point but to play ball. But she and Rockefeller just—they rolled over. And that's not oversight. That's enabling.

And that's another reason why a lot of these programs happen. I mean, you get some people that are real hawks, like the late Congressman Porter Goss, he was an ex-CIA guy like me—super hardline, pro-surveillance, all the rest of that. He was a big defender of all this kind of stuff.

But Pelosi? She had no excuse. And Rockefeller? They had no excuses. They knew what their job was, and they just didn't do it.

Zach Weissmueller: I wanted to ask you about another post-9/11 program, which was called Total Information Awareness. Because in my research for this, it seemed like this was a real turning point, and just changed the entire paradigm of how the government thought about data collection.

After 9/11, this office called the Information Awareness Office, which was established by DARPA, kind of the technology arm of the Defense Department, embarked on this program: Total Information Awareness. What was this?

Patrick Eddington: Well, it's what Palantir is now or at least that's what they were literally trying to do: come up with the ability to ingest data from pretty much any source and then run that against their de facto target list.

That's really, at the end of the day, kind of a simple, straightforward explanation for it.

And again, this leaked. And it led to a major dust-up. And ultimately they wound up basically having to put the idea on the shelf. But a lot of these really bad surveillance-related ideas—even if they get knocked down—they're kind of like a vampire, right?

You knock them down, you think you've knocked them down for good, and then one way or the other, they just seem to come back.

That's why I chose the title that I did for the book—The Triumph of Fear—because all this stuff, at the end of the day, is fear-based. That's how we get a lot of this surveillance mentality.

It starts at the turn of the 20th century. McKinley is assassinated. They go after anarchists everywhere they can find them. The police commissioner in New York says, "We're just going to find these people and get them out of here. End of story." Forget due process. That's just completely out the window.

It starts with anarchists and socialists and all the rest of that. And then it morphs over time—then it becomes communists. This becomes the big thing from the 1930s onward, with the interregnum of looking for Nazis and other Axis sympathizers and agents.

But after World War II, of course, it became: it's all commies, all the time. Everybody was a suspected communist.

And so the mentality behind this—by politicians and then by people in the governmental bureaucracy—is what helps to sustain this stuff. And that's exactly why not having the right kind of people in Congress, combined with having this coercive capability lodged in the executive branch, has always been the ultimate threat.

I was 12, 13, maybe a little younger than that when the Watergate hearings took place. But one of the things I distinctly remember from that period was when former White House Counsel John Dean was in front of the committee, and he said that he tried to get Nixon to understand that there was a "cancer on the presidency."

He was referring, of course, to the Watergate cover-up and the entire dirty tricks campaign—a COINTELPRO-style program, really—that the Committee to Re-Elect the President ran. But Dean got it wrong. It wasn't the cover-up that was the cancer on the presidency. It was the presidency being the cancer on the Constitution.

That's exactly how I look at it. That's what I've come to believe.

Our senior vice president at Cato, Gene Healy, and I—we kind of joke that we're the two avowed Anti-Federalists working there. And that is exactly how I describe myself.

I'm not a so-called libertarian. For one thing, there are so many strains of libertarianism—you have to say, "I'm a Hayekian libertarian," or "I'm a Rothbardian," or "I'm a Randian," or whatever. And I'm none of those things.

I am literally a 21st-century Anti-Federalist. I think the wrong people won at the Constitutional Convention.

Zach Weissmueller: And that's to say that that battle was lost so long ago?

Patrick Eddington: I'm still fighting it. I'll fight it till I die.

Zach Weissmueller: Well, yeah, that brings me to what I wanted to wrap this up with, which was—you've talked about some of the higher-level solutions, like moving federal law enforcement into the judiciary. What about at the individual level?

One of the things that happened after the Snowden revelations—you mentioned Total Information Awareness kind of getting shoved aside—but really, when you think about it, what Snowden exposed was that the paradigm persisted. The idea that, yes, we're going to know everything, we're going to hoover it all up—as much as we can.

His memoir was appropriately named Permanent Record, because that's what they're trying to create.

One of the things that came in the wake of those revelations is what Jay Rosen termed the "Snowden effect," which is: beyond any policy changes, the mere awareness of what was going on changed people's behavior. It changed the way journalists wrote about things. It changed the way that technology companies like Apple promoted their products—integrating encryption and privacy-protecting tools into the technology itself.

What role do those kinds of countermeasures play as we enter this brave new world of AI-assisted mass surveillance?

Patrick Eddington: You heard Lonsdale and others reference the incentives, right? There's a lot of money to be made here.

I think the key thing is helping organizations like the Signal Foundation flourish—because, Mr. Hegseth's incompetent use of the Signal app notwithstanding, it is absolutely the best thing out there. And we know that because Professor Matt Green—one of the most prominent computer scientists in the world—routinely has his students try to crack Signal's core code. And to my knowledge, they haven't managed to do it yet.

We also know from what Snowden released—and this is data that's over a decade old now—that the NSA was just pulling its hair out over Signal. So okay—good to go. That means you've got at least one halfway decent channel for protecting your communications.

For my most sensitive communications—text-based or something like that—that is my go-to app.

If I need to send something of a relatively sensitive nature, particularly if it's political, I use Proton, formerly ProtonMail. They now have a whole suite of tools and applications. It's not just mail: calendar, storage, all that kind of thing. It's all encrypted. And of course, they operate in Switzerland, which has some of the best privacy laws on the planet.

So those are my go-to's. And those are really good ways of fighting back.

And I also just think, on a personal level, you have to be mindful of the technology—the so-called Internet of Things. There's a lot of convenience that can come with it. I don't have a Ring camera; my wife and I do not have that particular kind of system. But those kinds of systems can be very valuable to you as a homeowner in terms of alerting you to a potential intruder and all that kind of stuff.

But they also capture an awful lot of data and activity that has nothing to do with somebody trying to get into your house. Which is why cops love it. That's why cops have set up these "share your Ring data with us" programs and so on and so forth. They try to encourage it.

I discourage that. The cops don't need to have that unless there's an actual crime that's been committed. That's a different animal.

Being mindful of that kind of stuff—and if you're a gun owner, like I am—I always tell my friends: Don't buy a gun safe that's Wi-Fi–connected. Don't do that. Don't make it easier for the government to potentially figure out how many weapons you have and how many weapon lockers you have, and don't make it easy for some scumbag hacker to break into your home and get your weapons at the same time.

Because in the legal environment we have now, I'm not 100 percent sure that if you had a safe that was connected to the internet, somebody wouldn't try to make a liability case against you, arguing that you didn't really properly secure your weapons.

I don't know. I'm not an attorney. But that's one of the questions I have. Which is why my gun safes are not in any way connected to the internet.

So it's about mindfulness, about how pervasive this technology is, and how easy it is for government entities, law enforcement especially, to get their hands on it.

Do everything that you personally can to minimize it.

And then just stay politically active. And I mean that in the broadest sense. Be an advocate for maximum possible privacy. Be an advocate for strict adherence to the Fourth Amendment. 

No probable cause? No warrant? No exceptions. That's my mantra. That's exactly how it should work. That's what we need to get back to.

If you don't fight for your rights, at the end of the day, somebody will try to take them away from you.

So just be vigilant.

Zach Weissmueller: You know, that's great practical advice. I appreciate that.

I want to put to you the final question of the show that we ask all of our guests, which is: What is a question that you think more people should be asking?

Patrick Eddington: Yeah, that's a really good one.

I think the question that everybody should be asking themselves is this: What can I do to help ensure that the republic survives?

That's the number one question.

And maybe you're not into the No Kings thing. That's a very left-leaning kind of activity, and I'm not being critical of it. I think what they did a couple of weekends ago was great. That's a good thing.

Living in a republic is not a spectator sport. It's not.

You don't have to be a political junkie like me. You don't have to have been in Washington for 35, going on 40 years. You don't have to do any of that. But if you've got a little bit of money that you can send to support organizations like the Cato Institute or Reason or other organizations that have a liberty-centric focus—that are interested in trying to maximize individual liberty, and thus individual happiness for everybody in this society—chip in a few bucks.

You'd be surprised how much it helps. You get enough people willing to chip in some money—I don't have to tell you this—and it can make a big difference in terms of giving us the ability to get the message out about why a more liberty-centric society is a better society. It's a healthier society, psychologically, physically, all those kinds of things.

And just don't give up.

At the end of the day, this country is a miracle. And I think a lot of us, in this very, very difficult and dark time, lose sight of that.

It's a miracle that this country managed to win its independence—with a little help from our French friends. We beat back a reconquest attempt between 1812 and 1815. We survived our bloodiest war to date, the Civil War. We won two world wars and the Cold War.

We can get through this. But we have to remain true to those values that got us through all those other things.

And if we do that, we'll come out on the other side of this.

Zach Weissmueller: What a wonderful, important message to end on. I really appreciate it.

Patrick Eddington. His book is The Triumph of Fear. He's got two other books that you can get on Amazon as well.

Thank you so much for coming on the show.

Patrick Eddington: My pleasure.