Are You Bioconservative, Transhumanist, or Something in Between? Take the Test and Find Out.
The Science of Aging (SAGE) Crossroads offers a Techno Tolerance test. Among the questions it asks: Would you upload your consciousness? Would you take treatments that completely stop aging? The test is modeled on the World's Smallest Political Quiz. It will not surprise frequent reason readers that I scored as a perfect Transhumanist-Biotech. Take the test here.
Disclosure: I debated bioconservative Francis Fukuyama five years ago at a SAGE Crossroads sponsored forum on the question: "Early death, disease, disability: pro or con?" Okay, the actual topic was: "What are the possibilities and the pitfalls in aging research in the future?" I also get a perfect Libertarian score on the World's Smallest Political Quiz.
I'll wear a foil beanie to get on the web of the future. I'd even shave my head for a better connection. But no cranial plugs. That's just icky.
I scored "TransHumanist-Biotech". My only "no" was on having my consciousness uploaded into a computer. Virtual sex for eternity? eeyeww...
There were a couple of questions I would have liked to see a "maybe" on.
Just answer Yes to everything for your perfect score.
I am slightly hesitant about giving full rights to AI. I understand the motivations, but more info is needed, and seeing as we don't have AI right now, those questions can't be answered.
I'm comfortable with being a transhumanist.
As long as they don't start calling us "trannies."
I'm shocked...reason editors scoring as true believers on various tests. Hardly representative of the larger libertarian movement.
🙂
I also have to say that this quiz rests on some pretty bold assumptions:
"These questions should unearth your inner feelings about technological advancement and its role in our future. While answering the questions, assume that cost is not a barrier and that the technology is widely available. Also, if a question asks you about living a great long time, assume that your years will be spent happily on the shore of your favorite beach rather than in a hospital."
Neither of these seems an overly safe assumption. I understand they're trying to limit the test to certain things, but still...the devil's in the details.
I went to the Star Wars exhibit at the Franklin Institute in Philly last weekend. Among all the Star Wars models and costumes, they had several interactive stations, one of which was supposed to test your tolerance for cybernetic enhancement. I was shocked to see that 70% of the people at the exhibit (a Star Wars exhibit, no less) were against the idea.
As you might expect, I scored transhumanist, but I guess that idea still creeps out a lot of people.
I am slightly hesitant about giving full rights to AI. I understand the motivations, but more info is needed, and seeing as we don't have AI right now, those questions can't be answered.
I agree. I also think there's a big difference between a discrete intelligence like Data, or one confined to a single machine, and an intelligence that is basically the internet. Anyone who's read I, Robot knows there's a significant amount of wiggle room in how a machine applies logic to the concept of humanity.
In other words, I trust a biological organism's concept of responsibility more than a machine's. Humans are highly interdependent on each other in ways that an artificial intelligence probably will not be. Given the problems we have with politicians, who are at least marginally human, a fully autonomous AI concerns me.
Of course, I'm not against trying to create one and dealing with the ethical ramifications later.
I scored slightly into the upper right quadrant, whatever it was called. I scored better on the bio than the electronics stuff. However, it wasn't because I was opposed to it, just because I personally don't want it. I'm smart enough without enhancements. 🙂
Also, anyone who answered yes to the last two questions is an idiot (so I'm calling out all who got a perfect score). The future relies on time moving forward. That's it. We don't need to master or embrace anything. The future will show up regardless. Those questions don't test anything other than knowledge of how the universe works. They were factual questions, not opinions.
It's quite possible that my problem with almost all online quizzes is that I answer questions literally.
I scored as a transhumanist as well, but not a "perfect" one. I said no on two questions:
No, for the same reasons I wouldn't take a Star Trek-style transporter. The new copy of my personality is indistinguishable from the real me to an outside observer. The new copy has perfect copies of all my memories and truly thinks it's me. But the real me is still outside the computer (or dead). (Thought experiment: imagine you made two copies of your personality, running on two different computers. Which one does your consciousness inhabit?) If the "perfected" system somehow includes a proof that my original consciousness would be the one in the computer, then I'd go for it. But I think providing such a proof is logically impossible.
I'm assuming this question means "would you get a chip after a few muckety-mucks get them?" Hell no. Being an early adopter means you get the buggy versions, and a BSOD can be fatal when it's in your brain. Plus, it might go the way of the Betamax or the TI-99/4A -- and once support runs out, you're screwed. I'll get one, certainly, but I'll wait until the format wars die down.
Hi Jake: What's wrong with cloning your consciousness? 🙂 Also, what if uploading is the only shot for continued "existence" of some sort?
With regard to buggy interfaces, I get your point, but presumably one would always be able to upgrade to better ones as they come along.
One thing I don't like about this test is that it sometimes asks whether you would do something yourself, and sometimes asks you if you would allow other people to do things. I'm naturally suspicious of things like nanotechnology on a personal level, and I probably wouldn't get a chip in my brain. But that doesn't mean I think it should be banned, and if other people want to take the first-generation plunge then they can go ahead.
I'm assuming this question means "would you get a chip after a few muckety-mucks get them?" Hell no. Being an early adopter means you get the buggy versions, and a BSOD can be fatal when it's in your brain. Plus, it might go the way of the Betamax or the TI-99/4A -- and once support runs out, you're screwed. I'll get one, certainly, but I'll wait until the format wars die down.
I generally assumed that all the tech in question was stable and had an acceptable failure rate. We are supposed to pretend it's widely available after all.
But if not, there's no way I'm beta testing anything in my brain.
Ron,
With regard to buggy interfaces, I get your point, but presumably one would always be able to upgrade to better ones as they come along.
It's hard to upgrade after the total system failure caused by some hacker.
I wouldn't accept any implants unless:
1. The software was open source
2. I had taken the time to do a line-by-line audit myself.
I would probably want to audit the hardware too, but I don't really have that skill set, so I would have to rely on others.
Stretch
Windows ME was widely available and not beta.
I can't speak for Jake, but for my part I don't have a problem with cloned consciousness. It might be better than nothing. But I don't personally consider that immortality in the sense most people usually mean it. If I could theoretically hold a conversation with "myself," then it's not immortality. I answered "no" to that question only because of the immortality part. I might upload my personality, depending on the circumstances, but there'd have to be a hell of a lot of evidence to convince me that it's me that would live forever.
3. A way to upload consciousness into a computer has been perfected. This, effectively, allows for immortality of the individual (though does nothing to stop the death of the body). Would you get uploaded?
No, for the same reasons I wouldn't take a Star Trek-style transporter. The new copy of my personality is indistinguishable from the real me to an outside observer. The new copy has perfect copies of all my memories and truly thinks it's me. But the real me is still outside the computer (or dead). (Thought experiment: imagine you made two copies of your personality, running on two different computers. Which one does your consciousness inhabit?) If the "perfected" system somehow includes a proof that my original consciousness would be the one in the computer, then I'd go for it. But I think providing such a proof is logically impossible.
David Brin explored the possibilities of temporarily "cloning" your consciousness in "Kiln People".
robc,
Yes, you certainly have a point. Let me rephrase: I assumed the various technologies actually worked properly.
Ron, what Jake is saying is that if your consciousness gets "uploaded" into a computer, it is a copy. Presumably, your feeling of self does not flow into that computer; it stays with your body. When your body dies, you die too. The copy lives on, but you do not.
Unless they could figure out how to transfer your sense of self, the only benefit to this would be that information lives on, because you will still die.
Wouldn't the answer to the uploading question potentially depend on a number of post-uploading factors?
Ron Bailey,
With regard to buggy interfaces, I get your point, but presumably one would always be able to upgrade to better ones as they come along.
Not if version 1.0 leads to, say, death or brain damage.
Same for the Star Trek transporter. The transporter destroys your body on the atomic level. It kills you. Then it sends your pattern to a location and rebuilds you completely, using matter from that location. So what ends up at the final location is a copy of you. Identical, but not you as you were killed during deconstruction. Unless your sense of self is somehow transferred as well, you die the first time you get transported.
But if not, there's no way I'm beta testing anything in my brain.
I wonder how many people there are who would completely freak at the thought of inserting anything electronic in their brain, but who wouldn't have any problem ingesting something chemical they bought on a street corner.
I think I confused the test a bit. I want to live better, but not longer. As a question of perspective, I think technology should be available and studied, even if I don't want to use it personally.
The only questions I answered 'no' to were the pre-natal testing questions. I have concern about the potential for eugenics, but also have doubts about the capacity of humans to judge which genetic trends are "advantageous" in the long run.
Many genetic anomalies that are considered pathological have occasional and important benefits; sickle-cell anemia is the most obvious example (the trait confers resistance to malaria). But also imagine all the people who have a genetic predisposition to mental illness, and then consider that many geniuses and captains of industry have been close to, if not full-blown, mentally "defective" in one or more areas, and that these deviations from the norm may in fact have contributed to their productivity and success.
Additionally, with the ability to augment capacities and fix many conditions "post-natally" if they do become problematic for the person at issue (as cybernetic and bioengineering supplements become more advanced), there is much less need to deny a person the chance at life simply because of the presence of a possibly debilitating condition.
NWD -
Which also brings up the question of choice, which I think is critical as advances multiply in these fields. Important issues start cropping up regarding the ethics of using these advances on children, for example, or mandates "for public safety" (such as vaccines presently).
Ron -
Grylliade and Episiarch are correct as to my objection. Whatever wakes up in the computer would be Jake 2.0, a perfect copy of Jake 1.0, but a copy nonetheless. And while I'm not against the cloning of personalities in that way (with the knowledge that it's just a copy), I don't get anything out of it, immortality-wise. Worse, if the process wipes out my meatspace brain, Jake 1.0 is dead sooner than would otherwise be the case. As Jake 1.0, I'm against that.
Also, Jake 2.0 would probably be interested in gaining control over Jake 1.0's assets, since Jake 2.0 believes himself to be Jake 1.0. As Jake 1.0, on the other hand, I would rather have my assets go to my kids than to this 2.0 bozo.
If, on the other hand, my family members said, "You're so great, we can't possibly go on without you," and they understood that Jake 2.0 would be a copy, not the actual me (but to them, effectively indistinguishable), I would submit to having my memories uploaded so as to comfort them after I die in a freak replicator accident. But as Jake 1.0, I won't be there to know if it's consoling or just creepy.
(Note that, since he'll have memories of this discussion, Jake 2.0 would know he's merely a copy. Nevertheless, I know myself well enough to suspect that Jake 2.0 might well conclude that the distinction between "original" and "copy" is overblown or even meaningless.)
When your body dies, you die too. The copy lives on, but you do not.
If it's a good copy, it has a sense of self too. When one copy dies and the other doesn't, have "you" died?
they understood that Jake 2.0 would be a copy, not the actual me (but to them, effectively indistinguishable)
In what sense is an indistinguishable copy not the real you? Can't there be two instances of the real you, just like there are two instances of that contract I just xeroxed? If "you" are just a pattern of information, then what's the diff?
RC, we all have a sense of "self". If I replicate you using a transporter, down to the atomic level, and you are standing face to face with your copy, do you feel that sense of self in both bodies?
My guess: no. Therefore, if I were to create a copy of you and then tell the original (you) that I was going to kill you because there can be only one, you'd get upset, right? I mean, you will be killed. The copy will live on and be identical, but the version in which you feel the sense of self will die, so you will die.
I don't know if that clarified; maybe Jake can do better.
Because there's no mechanism by which my consciousness can "jump the gap" from body 1 to body 2 (though in this case body 2 is a computer instead of a pile of meat). My consciousness -- my sense of "self" -- is shackled to Jake 1.0, no matter how realistic the other Jake might be. It's identical to me, but it's not me. It has all my memories, it knows all my secrets, but I'm not the one "driving" over there. I'm still only controlling my old meat body, and Jake 2.0 is driving the 2.0 body. Just like if you have an identical twin, and you die, your consciousness does not "jump" to his body... even if that identical twin was created via transporter to be identical to you, clear down to the quantum level.
Or, alternately, "What Episiarch said."
Dumb questions on that test. "The future relies on...". Time-- nothing else.
"Humans... ...have the same rights as humans." Uh-- yes? By definition...
I also get a perfect Libertarian score on the World's Smallest Political Quiz
Hmmm, Ron must be a Paleo instead of a Cosmo then. There are many variations of the WSPQ, but most are Rothbard rather than Crane in their underlying libertarianism.
Just as food for thought, there's really no way of knowing that "you" right now aren't actually a copy of "you" ten seconds ago, which personality died through a reorganization of neurons, other cells, etc., and that your bran's only tricking you into thinking that you've been a single consciousness throughout your entire life.
That is, your brain, not your breakfast cereal.
My consciousness -- my sense of "self" -- is shackled to Jake 1.0, no matter how realistic the other Jake might be.
But why doesn't Jake 2.0 have just as much of a sense of self as Jake 1.0? If Jake 2.0 "dies" but Jake 1.0 does not, is Jake dead? If not, then why is Jake dead if Jake 1.0 "dies" and Jake 2.0 does not?
It's identical to me, but it's not me.
Is this a distinction without a difference? Does it matter whether you have the copy of the contract that came out of the printer, or the one that came off the copier?
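For what it's worth, this disagreement maps loosely onto a distinction most programming languages make between value equality and object identity. A minimal Python sketch of the analogy (the jake_1/jake_2 names are purely illustrative, not anything from the thread):

```python
import copy

# Two "Jakes": a perfect copy is equal in value, but it is not the same object.
jake_1 = {"memories": ["born 1972"], "secrets": ["all of them"]}
jake_2 = copy.deepcopy(jake_1)   # an independent, bit-for-bit copy

print(jake_1 == jake_2)  # True  -- indistinguishable to an outside observer
print(jake_1 is jake_2)  # False -- two distinct objects nonetheless

# From the moment of copying, the instances diverge independently.
jake_2["memories"].append("woke up inside a computer")
print(jake_1 == jake_2)  # False
```

Whether `==` or `is` is the right notion of "the real you" is, of course, exactly what the thread is arguing about.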
Scored a Techno-Progressive, right on the border of Trans-Humanist Biotech.
Give me time to get used to the idea folks. I'll be there soon enough.
Therefore, if I were to create a copy of you and then tell the original (you) that I was going to kill you because there can be only one, you'd get upset, right? I mean, you will be killed.
Probably, at an emotional level, but that is a function of the degree to which I don't believe that R.C. 2.0 is just as valid an instance of R.C. as R.C. 1.0. If I believe that the two really are equally valid instances, then I might not really care about the termination of one copy or the other (assuming it is instantaneous, painless, etc.).
If I believe that the two really are equally valid instances, then I might not really care about the termination of one copy or the other (assuming it is instantaneous, painless, etc.).
You wouldn't care about being killed, just because another copy will exist? Remember, you will cease to exist, in the sense of your consciousness. Just because another copy of you, with its own consciousness, will survive, doesn't mean that you don't cease to be.
Identical twins are a perfect analogy (as perfect as analogies come, anyway) to the Jake 2.0 problem. Identical twins start as one fertilized egg that splits into two separate people. From that point on, they have different experiences (even placement in the womb matters) and a different sense of self, even if they were perfect copies of each other originally. Ditto for Jake. Jake 2.0 may have a sense of self, but it is a different sense of self than 1.0 has. See the two-Rikers episode of Star Trek: if the original Riker dies, he doesn't say, "Oh well, that's okay, my copy is out there, so I continue on." His twin is his twin, not himself.
I can't wait until I'm rich enough to hire someone to type what I dictate into web forums for me.
Imagine that on a resume: "I typed (with correct spelling and grammar) for some rich guy who sat around reading blogs all day and dictating things for me to type into them."
The quality of my posts would greatly improve. Some recently graduated English lit major or something would work fine.
Jake 2.0 has exactly as much sense of self as Jake 1.0. However, from Jake 1.0's point of view (and as Jake 1.0, that's the one I have to deal with), Jake 2.0 is a distinct individual from Jake 1.0. If you copy a contract and then burn the original, the original is gone, no matter how good the copy.
I'm not so much concerned with "is it important for other people to differentiate between Jakes," and here's why:
Imagine that my "selfness" -- my consciousness -- can be represented by a line stretching from 1972 to the present. Each other person in the world has a similar line; it generally starts when they're born, and ends when they die.
In 2008, my brain is downloaded into a computer. Jake 2.0 has a line that's been copied from the 1.0 version. It's identical in every way to Jake 1.0's line, and there are now two lines stretching on into the future as time progresses. (Jake 2.0 has a line reaching back to 1972, even though he just came into being in 2008.)
In 2009, Jake 1.0 is hit by a bus and killed. Since Jake 1.0 has died, line 1.0 ends. Line 2.0 is still going strong. However, as I'm the guy who was living on that 1.0 line, subjectively, my life is over. I can't "jump" to the 2.0 line now that I'm dead, because there's no connection between me and it. My consciousness ends in 2009 and I cease to exist.
In 2010, there's only line 2.0. People who want to talk to "Jake" can interact with Jake 2.0, and Jake 2.0 will be just as personable or assholish as Jake 1.0 would have been, but Jake 1.0 will never know it, because Jake 1.0 is dead.
So my concern isn't about how good the copy is, or how many copies of myself are preserved for posterity. My only concern is the length of the line attached to Jake 1.0. Copying myself doesn't change the length of that line, and so it cannot possibly convey immortality.
(The contract analogy is suspect, to me, because contracts aren't -- so far as we know -- self-aware.)
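Jake's "line" model can also be sketched in code. A hypothetical toy model (the Line class, its fields, and the dates are invented for illustration): copying shares the history back to 1972 but creates a second, independent line, and terminating one line never lengthens the other.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Line:
    """A toy model of one consciousness-line: a start year, a history, an end year."""
    start: int
    history: List[str] = field(default_factory=list)
    end: Optional[int] = None

    def copy_at(self, year: int) -> "Line":
        # The copy inherits the full history back to `start`,
        # but it is a brand-new Line object.
        return Line(self.start, self.history + [f"copied in {year}"])

line_1 = Line(1972)              # Jake 1.0
line_2 = line_1.copy_at(2008)    # Jake 2.0's line reaches back to 1972

line_1.end = 2009                # Jake 1.0 is hit by a bus
print(line_1.end, line_2.end)    # 2009 None -- copying never extended line 1.0
```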
Precisely. And that copy won't be you; it'll be a guy who looks just like you, and has your memories, and is operating under the delusion -- the most thorough and complete delusion imaginable -- that he is you. But he's not you.
Jake, this is the classic response to your problem:
Say that you want to have your brain interfaced with a computer network. The procedure begins with one part of your brain being replaced by a functionally identical but "interfaceable" component. After this first procedure is done, you won't notice any difference in everyday mental life. So you keep coming back week after week, replacing parts of your organic brain with synthetic components. After a while you'll start noticing the phenomenal difference of being interfaced with a computer network (whatever that feels like), but you'll still retain that subtle feeling of you-ness (after all, the original state vectors held in your organic brain were copied to nearly identical representations in a synthetic medium). Finally, you'll have completely swapped organic brain matter for a man-made brain, but without the phenomenal experience of 'dying'.
It's like a modified version of what your body already does: "you" in your current "self" retain no atom identical to any that composed your "self" when you were born.
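This gradual-replacement argument has a rough software analogue too, sketched below (the component names are invented): mutating an object in place, one part at a time, preserves its identity throughout, while "uploading" builds a new object no matter how faithful the contents.

```python
# In-place replacement: the object's identity survives every swap.
brain = ["cortex", "hippocampus", "amygdala", "brainstem"]
before = id(brain)

for i, part in enumerate(brain):
    brain[i] = "synthetic " + part   # one "procedure" per component

print(id(brain) == before)   # True  -- same object the whole way through

# Uploading, by contrast, constructs a new object.
upload = list(brain)
print(id(upload) == before)  # False -- a copy, however perfect
```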
Leif,
If one were to assume that one's sense of self comes from a particular electric/magnetic field in a particular configuration in a particular area of space-time (your brain), then swapping out parts is fine--you are maintaining that field in its current form. But creating copies doesn't do that; it creates all-new fields in a new location.
You are very correct that we swap out molecules and cells and still retain the feeling of self, so why not larger components?
But replication is a whole different issue.
Sorry for the typos; I'm in a bit of a rush.
Leif -
This poses an even thornier problem. What if, during that brain-replacement procedure, the removed organic squishy bits were kept in a jar full of nutrients. Once your brain was entirely replaced, the squishy bits are stuck back together, and your original, biological brain is once again functioning as normal.
If you attach a speaker to the brain in the jar, it will claim to be the original. The cyborg in the surgery will claim to be the original as well. However -- and there's no way for any outsider to tell the difference -- the consciousness of the original can only go one direction; either it's residing in the jar or it's residing in the cyborg.
Or, perhaps, the original stopped existing altogether and both the jar and the cyborg contain perfectly-deluded imposters. I certainly can't claim to know what would happen, but I do know that whatever the "me" I perceive really consists of -- even if it's just a chemically-induced illusion -- I want to keep that specific iteration of me going for as long as I possibly can.
It's hard to upgrade after the total system failure caused by some hacker
I hate to say this, but the chance of your biological framework encountering a total, irrecoverable system failure within 120 years is currently about 100%.
TallDave,
I hate to say this, but the chance of your biological framework encountering a total, irrecoverable system failure within 120 years is currently about 100%.
I'm well aware of this, but I would prefer the cause not be some Brazilian teen wanting to use my brain as a Counter-Strike server.
If you attach a speaker to the brain in the jar, it will claim to be the original. The cyborg in the surgery will claim to be the original as well.
The correct answer is both are the original, and neither.
You're not the same person you were 10 years ago, 10 months ago, or ten seconds ago. Identity is fleeting.
The really thorny problem that arises out of this is property rights. What if you make a million copies of yourself? Do they all have claims on your assets?
I think eventually the law will say the individual best able to prove physical continuity has claim.
You wouldn't care about being killed, just because another copy will exist? Remember, you will cease to exist, in the sense of your consciousness. Just because another copy of you, with its own consciousness, will survive, doesn't mean that you don't cease to be.
You cease to be and are replaced by a slightly modified version of yourself every instant.
I am of the opinion that what people perceive as a "soul", but is really self-awareness, is created by the chemical/electrical/magnetic mess in our brain, in toto. You can replace small parts (cells, organic molecules, tumors) without overly affecting the feeling of self-awareness.
Therefore, if you were to start swapping out organic parts for mechanical ones, the in toto mess would stay where it was, in your head. Even if you eventually replaced almost all or all of it with cybernetics, the mess would have been maintained without major disruption, would remain in your head, and would remain as a constant sense of self-awareness.
That's my theory, anyway. Disrupt the mess too much, you die. So reassembling the organic parts wouldn't do shit; it's too late.
but I do know that whatever the "me" I perceive really consists of -- even if it's just a chemically-induced illusion -- I want to keep that specific iteration of me going for as long as I possibly can.
Yeah, humans are programmed with this compulsion for self-preservation. It obviously has a lot of evolutionary value, but it's essentially meaningless in an absolute sense. We accept the illusion of continuous identity because it's close enough.
So yes, if you upload your consciousness and discard your physical form, you're dead, even if the copy is perfect.
But remember, you voluntarily terminate your consciousness every night anyway. Why? Because you also have a strong compulsion to sleep.
Here's a good book to conquer if you're interested in how the "subjective experience" might arise out of brain-stuff:
http://www.amazon.com/Being-No-One-Self-Model-Subjectivity/dp/0262633086/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1206386795&sr=8-1&tag=reasonmagazinea-20
If you copy a contract and then burn the original, the original is gone, no matter how good the copy.
But the contract still exists. What happens to a particular instance of the contract is irrelevant to the continued existence of the contract itself.
My only concern is the length of the line attached to Jake 1.0. Copying myself doesn't change the length of that line, and so it cannot possibly convey immortality.
You're assuming your conclusion here. Your line branches when you create a perfect copy of yourself; continuity is maintained.
Brandybuck: I don't think any version of the World's Smallest Political Quiz can distinguish between Paleo and Cosmo libertarians. As I'm for gay marriage and legal prostitution, and have no interest whatsoever in any legal discrimination with regard to anyone's religion, race, gender, sexual preference, or immigration status, I don't think I'm any sort of Paleo.
RC, if we have original RC (A) and copy RC (B), you are A, correct?
If I shoot you in the head, you die. From your perspective as A, you have ceased to exist.
B continues to exist, but that doesn't do A any good because A is dead and has no consciousness.
RC Dean,
Look at my twin example. At the split point they become two separate people, not just a duplicate of the original. No continuity is maintained if Twin A then dies.
Once the copy is made, whether biologically by cells splitting or mechanically by uploading a brain, a new person is being created. It may have the same initial conditions, but it is a different person.
From your perspective as A
But from my perspective as B, a copy has been erased, and I continue.
Look at my twin example. At the split point they become two separate people, not just a duplicate of the original. No continuity is maintained if Twin A then dies.
The reason there is no continuity in the twin example is because there's no consciousness or personality when the split occurs at the zygote stage.
What about the Paratwa babies?
Won't anyone think of the Paratwa babies?
While I am a biotech transhumanist, the quiz seems somewhat biased.
So a "perfect Libertarian score," was garnered by someone who supported a war of aggression (what purer form of socialism exists?) in Iraq, supports a worldwide dictatorship to fight global warming, supports its imposing new taxes, and supports using those taxes to subsidize pseudo-private oil companies in scams like ethanol production.
It seems either the World's Smallest Political Quiz could stand an overhaul, or "perfect libertarianism" means "agreeing with Hillary Clinton."
Regarding the duplicate self paradox.
The paradoxical twin will be DOOOMED!
Even with its ass on backwards.
The upload one is tough. I've considered the same thought experiment Stretch offers: if you're duplicated exactly, which one is the real you? Both, when t=0, but after that, dunno. If one dies, do "I" die? Shit. Of course, the other way to look at it is: if I was uploaded (or at least perfectly duplicated materially), is that really all that different from when I go to bed in a few minutes and wake up tomorrow morning?
Ken MacLeod's sci-fi addressing these types of questions is pretty top notch.