Our AI Future – Sexbots, Toilet Drones, and Robocops?

Episode 316 of the Cyberlaw Podcast

The Volokh Conspiracy

Our interview guest, Peter Singer, continues to write (with August Cole) what he calls "useful fiction" – thrillers that explore the real-world implications of emerging technologies. His latest is Burn-In: A Novel of the Real Robotic Revolution, to be released May 26, 2020.  The thoroughly researched (and footnoted!) book is a painless way to understand the social and economic changes new AI and robotic technologies will make possible and their impact on actual human beings. The interview ranges widely over these policy implications, plus a few plot spoilers.

In the News Roundup, David Kris covers the latest Congressional FISA Follies, leading me into a rant on the utter irresponsibility of subjecting national security authorities to regular expiration – and equally regular ransom demands from the least responsible elements of Congress. Speaking of FISA, it turns out that the December Pensacola shootings were hatched by al-Qaeda's Yemen franchise. Why are we only learning this in May? Because the evidence comes from an iPhone whose security Apple refused to find a way around. The FBI's self-help solution worked in the end, but not until the trail had gone cold.

US-China decoupling is in overdrive this week. Nick Weaver talks about the move by the Trump Administration to achieve semiconductor self-sufficiency – and probably-not-coincidental announcements that TSMC will build a chip factory in Arizona and that the Commerce Department has drafted a new export rule aimed at making it much harder for TSMC to build chips for Huawei. In response, China is preparing a list of unreliable US suppliers of technology. I wonder whether putting companies on that list for diversifying their supply chain out of China will have the long-term effect of making companies more reluctant to open new supply relationships with Chinese companies.

David and I note that recent US accusations of Chinese and Iranian cyber intrusions on COVID-19 research may be more than just the usual imprecations.

And Nick explains why so many US professors are going to jail for undisclosed China ties. The key word is "undisclosed."

Mark MacCarthy previews France's (and Germany's and the EU's and the UK's) increasingly tough sanctions for US social media firms that fail to remove "hate speech" and other bad content within 24 hours (or sometimes one hour). More and more, it seems, Section 230 immunity is just a local US ordinance.

Mark and Nick review the latest trial balloon from Europe's technocrats: How about a Chinese firewall for Europe? So ask apparently respectable policy thinkers working for the European Parliament.

David and Nick find themselves agreeing with the latest release from DHS's CISA pouring cold water on online voting.

In quick hits, David notes the Trump administration's now routine extension of the "telecom national security" Executive Order, Nick brings us This Week in NSO Bashing, I touch on a ransomware and doxing threat that has tripped up a celebrity law firm, and Nick and I muse on why cell phone contact tracing seems about to jump the shark.

We close with a surprising catfishing story that leads us into a discussion of the relative hotness of recent NSA directors and whether it's true that being dual-hatted makes you irresistible to women.

Download the 316th Episode (mp3).

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.



  1. I never thought I would see the topic of sex bots brought up on this site. To me they are mostly interesting for a) the technical challenge involved in making them and b) the reasons people come up with not to have them. We already have arrests and prosecutions in the free world over private citizens owning sex dolls.

    1. Realistic and affordable sexbots — which we are only a few years away from — will put an end to the feminist movement overnight.

      Ever since the ’60s, women have been trading sex for love, and now they will have to go back to trading nurturing for love, just like in the 1950s, because no woman could ever compete with a sexbot in terms of sexual athletics.

      It will be interesting…

      1. Assuming, of course, you would be allowed to buy one. I think in Texas and a few other states it is still a misdemeanor to own a dildo.

        1. To sell, not possess.

  2. On the topic of online voting. I have never understood how we can commit trillions of dollars of banking and commerce to online transactions 24/7/365 with negligible fraud but online voting is “risky.”

    There are so many reasons that claim is wrong. Properly designed and implemented, online voting would be far more secure than anything we have in place now. It would also be far more accessible. There are countless thousands of competent software engineers who would know how to do it. (And yes, you can do it without effing blockchain technology, though that is of course one of several options.)

    Imagine a voting record that was anonymous but that anyone anywhere could access without constraint and without cost. The same system could still allow traceability of votes, where each and every voter could verify that their vote counted the way they wanted it to, the exact same way you verify your bank balance is what it is supposed to be. Any identity theft that occurred would be quickly detected and could be easily rectified by the same agency. All of that is easily possible with existing tech in open source software.

    But no — we can’t go there. The workings of newfangled computer systems intertube thing is just too doggone complicated so we will just go with the seemingly easy answer that just happens to make us look wise. It really is sad how we hobble ourselves.

    1. For online voting to be as fraud free as online banking there would need to be an effective incentive for each voter to police their own vote the way that they police their bank accounts. If someone hacks your bank account you take action because of the cost to you. There needs to be something similar for votes.

    2. Part of the issue of on-line voting is the assurance of an anonymous vote — which isn’t possible if you tie the voter to the ballot.

      1. That is simply not true.

        Please look up how public key cryptography works. The Wikipedia article is a good place to start, but this isn’t even considered cutting-edge tech anymore.

        1. “Could” work — which doesn’t mean “will.”

    3. “I have never understood how we can commit trillions of dollars of banking and commerce to online transactions 24/7/365 with negligible fraud but online voting is “risky.””

      Short answer: The government is in charge of one of them, but not the other.

      1. This is just a typical ideology-based chimera. The government runs many services, both in person and on web sites, with very good privacy protection.

        Have you seen Donald Trump’s tax returns? Or, for that matter, anyone’s that have been leaked illegally? Do you suppose that is because legions of hackers just aren’t interested, or because they can’t?

          1. The IRS has good digital security because it has to protect the government’s revenue streams.

          The military and intelligence services have good digital security.

          The rest of the government, not so much.

  3. “Why are we only learning this in May? Because the evidence comes from an iPhone whose security Apple refused to find a way around. The FBI’s self-help solution worked in the end, but not until the trail had gone cold.”

    Not sure how to read this sentence.

    Are the writers being critical of Apple?

    Because that would go against their normal stance.

  4. “Speaking of FISA, it turns out that the December Pensacola shootings were hatched by al-Qaeda’s Yemen franchise. Why are we only learning this in May? Because the evidence comes from an iPhone whose security Apple refused to find a way around. The FBI’s self-help solution worked in the end, but not until the trail had gone cold.”

    This is familiar. Because reason.com, this site, published a debunking of this just a day later: https://reason.com/2020/05/19/justice-department-attempts-to-blame-encryption-for-terrorist-attack-feds-failed-to-see-coming/
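The anonymous-but-verifiable record debated in the voting thread above can be illustrated with a minimal hash-commitment sketch. This is an illustration of the commitment idea only, not a full voting protocol (a real system would also need eligibility proofs, coercion resistance, and verifiable tallying); all function names here are hypothetical.

```python
import hashlib
import secrets

def commit(vote: str) -> tuple[str, str]:
    """Voter commits to a vote under a secret nonce; only the
    commitment digest is published on the public bulletin board."""
    nonce = secrets.token_hex(16)
    digest = hashlib.sha256(f"{nonce}:{vote}".encode()).hexdigest()
    return digest, nonce  # digest is public; the nonce stays with the voter

def verify(board: list[str], vote: str, nonce: str) -> bool:
    """Any voter can later recompute their commitment and confirm it
    appears on the board, without the board revealing who cast what."""
    digest = hashlib.sha256(f"{nonce}:{vote}".encode()).hexdigest()
    return digest in board

# Cast a vote, publish only the commitment, then verify it later.
board: list[str] = []
digest, nonce = commit("Candidate A")
board.append(digest)
assert verify(board, "Candidate A", nonce)      # my vote is on the board
assert not verify(board, "Candidate B", nonce)  # and it wasn't altered
```

Because the published board holds only SHA-256 digests, onlookers cannot link a digest to a voter or a vote, yet each voter, holding their own nonce, can check that their ballot was recorded unchanged.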
