Joe Vukov | Navigating the Technological Divide
For some, technology offers humanity a path into perfection. For others it is the means of our downfall. Somewhere in between there is Joe Vukov.

Description
For some, technology offers humanity a path into perfection. For others it is the means of our downfall. Somewhere in between there is Joe Vukov. In the episode, Joe helps to explain the pitfalls of both extremes—on one side, the transhumanists (who embrace technology as a way to become more human) and on the other, the neoLuddites (who shun certain kinds of technology)—and begins to clear a path somewhere in the middle.
- Originally aired on August 31, 2023
- With Jim Stump
Transcript
Vukov:
Christ said, “Blessed are the meek, blessed are the poor.” Not blessed are the 130-year-olds with 130 IQ who have razor-sharp memories. I think Christians need to be concerned about what it is we’re elevating as who are the most blessed and the pinnacle of humanity. And a lot of Christian teaching subverts that by either holding up the poor as being especially blessed, or even a step back from that by saying all humans have equal human dignity. So it isn’t something like living a long, smart, efficient life that gives us dignity, but rather being human, being made by God that gives us dignity.
I’m Joe Vukov, associate professor of philosophy at Loyola University Chicago, where I’m also associate director of the Hank Center for the Catholic Intellectual Heritage.
Stump:
Welcome to Language of God, I’m your host, Jim Stump. For as long as humans have been around, we’ve been finding ways to enhance ourselves with tools. The tendency might even be part of what makes us human, but as our technologies become more advanced, the questions around ethics of human enhancement become harder.
What’s the difference between using technology for enhancement versus helping to relieve suffering? Who gets access to technological advancements? What might be the long-term consequences of merging technology in the human body? There are some groups of people who respond to those questions by avoiding technology as much as possible.
There are others who think that technology could help us become better humans, perhaps even make us more human, but neither extreme is clear cut. And who better than a philosopher to help us wade into these nuances?
Joe Vukov’s recent book is called The Perils of Perfection: On the Limits and Possibilities of Human Enhancement. In the episode, we discussed the book and attempt to find a path down the middle between these extremes.
Whether you tend to be wary of technological intervention or enthusiastic about it, the conversation provides some helpful guidelines with the ultimate goal of finding the good life, not through technology, but through Christ. Let’s get to the conversation.
Interview Part One
Stump:
Well, Joe Vukov, welcome to the podcast. I’m glad to be talking to you.
Vukov:
Thanks for having me, Jim. I’ve been really looking forward to this.
Stump:
All right, so you and I met at a workshop last year and discovered that our backgrounds are both in philosophy and we’re both Christians, and that led to a couple of interesting conversations, but they’re fairly brief and I don’t really feel like I know much about you. So let’s remedy that a little bit upfront here. Who are you? Where’d you come from? How did you end up becoming a philosopher?
Vukov:
Yeah, I was raised in Southern Minnesota, Rochester, Minnesota, home of the Mayo Clinic, and like most people raised in Rochester, everyone around me was health sciences, so my whole family, most of my friend group. That’s kind of what people did in Southern Minnesota, especially in Rochester. Because of that, I grew up with a deep love for the sciences especially.
So actually I was set on studying science in college all the way through high school. This sounds like more of an accomplishment than it is, but I literally took every single science class on the books at my small rural high school. Again, small high school, so it’s not as impressive as it sounds. But I was completely set on doing sciences until my high school English teacher won me over to the humanities.
And by a series of twists and turns, I eventually wound up getting a PhD in philosophy, but also eventually kind of rediscovered this love of both the sciences and the health sciences specifically. So while I wrote a really theoretically oriented kind of dissertation, a lot of my work since then has been in bioethics and the ethics of neuroscience, and most recently I’ve been thinking a lot about the sciences of human enhancement and how a philosopher might have some perspective on those kinds of new technologies as well. So that’s kind of the 50,000-foot view of my academic development.
Stump:
Yeah. What about your faith background? Do I remember correctly that you converted to Catholicism as an adult?
Vukov:
Yeah, that’s right.
Stump:
What happened there?
Vukov:
So I was raised as a very serious evangelical and I am really grateful for my childhood. I received great instruction from my parents, from my church community, but like a lot of people when going off to college lost faith more or less during my college years. And then again, it’s a long story, but by a series of twists and turns, eventually found my way into theism and then into Christianity and eventually found a home in the Catholic Church. So I ended up joining the Catholic Church in 2016. No, not 2016, it’s been a little bit longer than that. 2011, so quite a while back now and been a Catholic ever since then.
Stump:
As you were growing up and saying you had these interests in the sciences, but you were also evangelical, how do you compare though the way that science and your faith interacted as a kid growing up into your teens, perhaps even college years versus now as an adult and the change to Catholicism?
Vukov:
Yeah, I mean, I think growing up I loved both of them. I didn’t sense a lot of conflict between them. I think that I was, again, a very serious Christian and very serious about studying the sciences. As I kept going in my educational and religious formation, I think there were some areas of tension that I started seeing. Not necessarily explicit tensions, but I just started feeling like, is the life of faith and an intellectual life really as compatible as I thought it was?
And human journeys are multifaceted. So that wasn’t the one thing that led me to leave my faith for a little while. But I think that was one of the things, sort of trying to reconcile this love of things intellectual, love of philosophy, love of science, along with my love of Christ, and figuring out how those two things could fit comfortably together.
That’s one of the things that really brought me into the Catholic Church: the Catholic Church, I think, does a really good job of talking about the relationship between faith and reason, science and religion. St. John Paul II has an encyclical called Fides et Ratio, or Faith and Reason, which does a really marvelous job of this.
So that really kind of was one of the things that drew me into the Catholic Church. I actually don’t think it’s distinctive to Catholics. I think Christians and theists and religious folks across the board can have a really healthy relationship between science and religion, but it was that sort of thing that was one of the things that really caught my attention with the Catholic Church and that eventually led me there.
Stump:
Good. So now you teach at Loyola in Chicago. What kind of courses do you typically teach?
Vukov:
Kind of a smattering. We are a large institution, so we have a lot of students taking Philosophy 101, so I teach that fairly regularly, and I’m getting ready to teach it here. We’re recording this in the middle of August, so I’m gearing up to start one of those sections in just a couple of weeks. I also teach some specific courses in what’s sometimes called Neuroethics or the Ethics of Neuroscience, and also in the Ethics of Enhancement.
So the sorts of issues we’re talking about today, a lot of them actually came up in conversations with students as much as in my reading. Then I also teach a course that’s one of my favorites. I co-teach with a biologist, so he keeps me honest about whenever I try to dabble in the sciences to make sure that I’m actually doing good dabbling. And we teach a course called Philosophy and Biology For the Future.
And in that class what we do is we read science fiction novels and then we cover the ethics, the religion, the philosophy, and the science in those novels. So we’ll read a novel, and then my colleague in biology will talk about the science background for it, and I’ll talk about the philosophical, religious, and ethical background for it. It’s a really neat class and one of my favorite ones to teach.
Stump:
Are most of your students Catholic at Loyola?
Vukov:
I think by demographic we’re a majority, but not a huge majority. The last time I saw statistics, the school is about 57% Catholic. We also have a fairly large Muslim and Hindu student body, and there are also several Protestant, evangelical, and non-denominational Christians around as well. And of course, as is an increasing factor on any college campus, we have an increasing number of nones, not N-U-N-S, but N-O-N-E-S.
Stump:
So I’m always interested in talking to professors like you who are at big institutions like that to get a feel for what are the science and faith questions that students have these days? Do you find any recurring themes or the sorts of things that students at Loyola who predominantly come from a faith tradition of some kind but come to a school like that and are engaging in the sciences?
Vukov:
Yeah, I mean, I think one thing is this idea that the sciences are very cutting edge and up to date, while religion is sort of passe and not modern anymore. I think that’s probably the most common thing I see. In the classes I teach, part of it is because of the content I teach, so it just doesn’t come up as much, but sort of these historical questions about evolution generally or human evolution specifically, again, I’m not teaching those as much.
They don’t come up as much, but I see it more in my students as this sort of general sympathy towards the modern scientific way of doing things, set against what they’ll sometimes think of as an outdated way of doing things in Christianity specifically or religion generally. So it’s more that sentiment, which is something I obviously disagree with, and I think there’s a lot of work to be done to show students, and I think it’s not just my students.
I think it’s a sentiment in a lot of different corners of society. And I know you agree with this, Jim, that of course it doesn’t have to be that way, that it’s not musty religion talking and losing out to modern science, but that there can be a much better dialogue going on between those two.
Stump:
Good. Well, let’s enact some of that dialogue here, particularly about the topic that you’ve just written a book on, which is, I’m holding up to the microphone here if you can hear it, The Perils of Perfection: On the Limits and Possibilities of Human Enhancement.
So first of all, what attracted you to writing about this topic? How did you get interested in human enhancement, or the big fancy word we’ll use, transhumanism?
Vukov:
Yeah. So I kind of got into it by way of some specific issue. So as I said, I do most of my strict academic research, so my publishing journal articles kind of research, in the ethics of neuroscience, and a lot of that scholarship focuses on specific new neural technologies that are coming out that allow us to do things we haven’t been able to do before.
Stump:
We keep hearing of Elon Musk and the Neuralink, right?
Vukov:
Exactly. No, yeah. So I mean he is making waves and making all sorts of business for people that write on the sort of topics that I do. Yeah, I mean that’s a perfect example of generally Musk’s Neuralink is part of a family of devices called BCIs or Brain Computer Interfaces, which is a mouthful, but is exactly what it sounds like. It’s interfaces that attempt to make some sort of connection between brains and computers. And in turn, plausibly could allow you to do all sorts of different things.
One of the applications that Musk has been talking about, and a lot of the people working on BCIs is not just sci-fi out there, Terminator type stuff, but it’s allowing people to have better use of prosthetics. So you might think of how can I include a chip that allows me to use a prosthetic device in a more natural way?
So there’s a lot of interesting research going on there. And of course, that comes with all sorts of ethical questions about when is it okay to start doing these new neural technologies? When is it not okay? What are the sorts of privacy concerns? What are the sorts of monetary concerns? What are the sorts of social concerns, ethical concerns, religious concerns?
So I was doing a lot of research on some of those specific questions and then started thinking about, well, how do these specific kinds of new neural devices hook up more generally to questions about what it means to be a human? How do my convictions as a Christian, generally, and as a Catholic, specifically, hook up with that vision of what it means to be a human?
And then as I started asking those questions, I started broadening my vision to think about things like genetic engineering and, as you mentioned, the transhumanists, who are this group of scholars and armchair intellectuals. Some of them are real intellectuals, some of them are armchair intellectuals. It’s hard to define them, but they’re very into enhancement, is maybe one way to put the transhumanists. They really like it.
Stump:
I was just going to ask what the overlap here is between some of these technologies you’re talking about and transhumanism. Do you have a quick and easy definition for our audience of transhumanism?
Vukov:
Yeah. So for the transhumanists, the trans part at the beginning there comes from a similar root to transcend. So the idea is to move beyond the sort of existence that humans have been living up till now and to transcend it in various ways. One of the ways that gets probably the most attention in social media and general media is life extension technology.
And that’s kind of been one of the defining features of transhumanists: humans have a lifespan. I forget the Psalm reference, but there’s a Psalm that talks about humans having a lifespan of 70 or 80 years, if we’re fortunate. The transhumanists say, “Well, enough of that, we can get it farther out than that.”
And that’s kind of the attitude toward lots of different things, whether it’s lifespan, whether it’s IQ, whether it’s memory capacity, whether it’s strength. The transhumanists are united in this idea that we don’t need to be satisfied with that kind of human existence and that we can transcend it in some way.
Stump:
So that’s where the little plus comes from that you see sometimes with H-plus or humanity plus.
Vukov:
Exactly.
Stump:
It’s not just making us more of what we ought to be or we’re intended to be. There’s lots of assumptions packed into that already, but going beyond, becoming something different, right?
Vukov:
That’s right. And yeah, there’s another group that you might hear about sometimes called the post-humanists, which are fairly similar to the transhumanists. They maybe don’t get quite as much mileage and popular attention, and they’re very similar in some ways, but they have these quibbles about, is what we’re striving to become something post-human as in something entirely different altogether, or is it the H-plus?
So the transhumanists are more the H-plus crew who think we want to transcend our existence and make it something more, and the post-humanists think we actually are aiming to try and become something other than human altogether. But at the end of the day, they’re both excited about and talk about the same sorts of technologies and the same sorts of ways to get there. They just have a little different understanding of where we’re going.
Stump:
Okay. So related to that, I saw in the book really only one little line about ChatGPT. I assume most of this was written before ChatGPT really broke onto the scene late last year, but I wonder how artificial intelligence is changing the transhumanism conversation at all. These two different groups, the post-Human versus the transhuman, is there a different role for artificial intelligence in any of that?
Vukov:
Yeah, there’s a good question, and actually it’s one I’m thinking about a lot after The Perils of Perfection came out because like you said, there’s just a small part that talks about ChatGPT, and this was right when all of it was coming out. They actually asked me to do a follow-up book. So I am currently rushing to try and get that book out the door, which will be a kind of follow-up to it that focuses specifically on questions about artificial intelligence.
And it’s a complicated question, in that you can ask both about what artificial intelligence is to begin with, because there’s a lot loaded just into that term. There’s intelligence. Is it really an intelligence? Does that even quite make sense, or is it something else? Is it a fake or a faux or a pseudo-intelligence?
So there’s those kinds of questions. And then there’s also the questions about how can AI be integrated with or supplemented to the kinds of other technologies that transhumanists have been talking about a long time to make ourselves even better?
Stump:
So let’s get to your book here a little more specifically then. We will perhaps talk again when the new one comes out about artificial intelligence, but transhumanism. So let’s start with the title, the Perils of Perfection and relate that perhaps more explicitly to some of the goals of transhumanism. And let me also toss in a potential objection here that Jesus said, “Be perfect,” right? “Be perfect, even as your father who is in the heavens is perfect.”
What is it about this attempt to be perfect that we ought to be cautious of with regard to transhumanists here and maybe throw in at least a kind of response to Jesus and how he might hear this whole conversation?
Vukov:
Yeah, Jesus did seem to think that perfection was a good thing, and he’s an important person to answer to about that. As a teaser trailer: I’m not going to disagree with Jesus on that one. I think it’s a good thing that we should be aiming for as Christians. I think it turns on what you mean by perfection.
And the concern that I address in the book is with the transhumanist idea of perfection. And Jim, it can be so easy to throw the transhumanists under the bus, in a way, because they tend to be Silicon Valley style billionaires, at least the ones who get a lot of news play. And it can be easy to think it’s sort of this niche thing for silly billionaires with too much money who don’t know what to do with it. So why not try and extend our lives… But I really do think the sentiment behind transhumanism is one that almost all of us participate in.
So I think sometimes it’s helpful to think about the transhumanist mindset rather than the actual Elon Musks of the world out there who are maybe actual transhumanists. And I think the transhumanist mindset is one all of us fall into fairly regularly. And the problem with it is this idea that perfection amounts to a longer life, a higher IQ, better integration with technology, a more efficient life, and all these sorts of things that the transhumanists are aiming for.
And as I list those off, those sound like things that a lot of us who aren’t even transhumanists think about when we think about what it means to attain perfection. We can get into this more, but I think there’s all sorts of reasons to be worried about aiming for that kind of perfection, all sorts of things that Christian and non-Christian alike should be worried about.
Where I think that Jesus is correcting for this is that the kind of perfection he’s talking about is the perfection of his life. And what is that life? It’s not long-lived. He died when he was 33 years old, in a horrible way. It’s not a glamorous life. It’s a life of persecution and a life of suffering in many ways. It’s a life born to a poor family in the backwoods of Nazareth.
I mean, it’s not what we think about as perfection. The transhumanists, but I think most of us, I’m lumping all of us in there, think of that other kind of perfection first, and Christ is telling us, “Well, that kind of perfection, you shouldn’t be aiming for. You should be aiming for the sort of perfection that I’m going to show you in my life.”
And as soon as we flip what we mean by perfect, then of course I’m okay with perfection. It’s just that I’m a philosopher, right? We’ve got to be careful about our terminology. If we get that straight, then yeah, I can be a big fan of perfection as well.
Stump:
For sure. Okay, so one of the things I really appreciate about your book is that you’re not just throwing transhumanism under the bus, that you’re showing some of the messy middle, where, as with most debates, there are obviously two extremes. On the one side there’s complete transhumanism or humanity plus. On the other side, you talk about the Luddites, or at least the Neo-Luddites, and I wonder if you might give a snapshot of each of those more extreme positions and why each of them may have a kernel of truth but probably needs a little more nuance, and then we’ll get to your middle position after a while.
Vukov:
Okay, great. Yeah, so the Luddites take their name and I use the term…
Stump:
Who are these people? Who are Luddites anyway? Where does that come from?
Vukov:
Yeah, so it’s a group of actual people in the past who… I’m going to get my dates wrong here. I’m not a historian, so I’m not even going to try. But they were a group in England who, at the rise of the industrial revolution, would go out and smash technology at night. They were losing their jobs and they didn’t like that they were losing their jobs. So they would literally, not metaphorically, these are not people who were writing angry tracts to the London Times, they were literally going out and smashing this technology.
It was a fairly short-lived movement, but since then, the term Luddite has sort of been broadened to include anyone who has this reaction to technology, whether they’re actually smashing it or not. It’s the person who thinks that modern technology is no good and is depriving us of something. And honestly, my sympathies lie more with the Luddite than the transhumanist in many ways, because there’s one thing the Luddite does very well. And I’m using the term broadly here, not for the actual technology smashers, but for people who have this sort of mindset towards human enhancement, things like Neuralink or genetic engineering or integrating artificial intelligence into everything, like we’re seeing right now.
The Luddite in all of us has this reaction that says, “Is the thing we’re pursuing here actually a deeply human thing? And let’s slow down. Hold on for a minute, and is this deeply human?” And the Luddite says or tends to say, “Well, no, it’s not. The technology here is not adding on to anything that is going to make my human life more human or more valuable, and in fact might even detract from it.” Modern technology may help us do things more efficiently, but may erode social relationships and family relationships. And of course, we’ve seen all that in social media and modern technology that a lot of those predictions are absolutely right. So that’s the Luddite side of things. And I think just the tone in which I’m talking about them, I’m pretty sympathetic to a lot of those arguments. Where I think that they go wrong or can go wrong is in sort of getting too caught up in the technology and thinking that it’s the technology that’s robbing us of our humanity and not thinking more deeply about how technology might actually further our humanity or how modern technology or forms of human enhancement might be compatible with really robust and flourishing human life.
I’ll just take one more minute on the Luddites, then we can turn to the transhumanists. But one thing that I point out in the book is that a big chunk of our lives includes technology that was modern and novel and maybe even scary at some point. So we’re sitting in different parts of the country right now talking over the internet, recording a podcast. But you know what? It’s kind of good because we didn’t have to take up a whole two or three-day trip to go drive and see each other and we can still have this conversation. Think about even things like the clothes we wear, right? We’ve perfected ways of making winter clothing. I live in Chicago where having really good winter clothing is a good thing and it helps me get through the winter in a way that simple wool clothes might not have.
Stump:
And I think fair to say, even I really like this example that you’ve given because it helps to break down what seems to be too easy of a distinction that’s made sometimes between simply relieving suffering versus enhancement as though we can draw a straight line and say that enhancement’s always bad and relieving suffering’s always good. So clothing has been an enhancement.
Vukov:
It’s absolutely an enhancement. Here’s another good one that I use in the book that I think is helpful for bringing in this gray area. And I think that is, again, what we’re talking around here, but to make it more explicit, it poses a problem for some Luddites. Because if the Luddite is saying, “Any enhancement is bad,” here’s a problem for you: how do you define an enhancement?
Well, it’s tricky, but a definition broad enough that most will agree with it is that an enhancement takes us beyond healthy functioning in some way. So whereas therapy or treatment is something that’s aimed at restoring human health, enhancements try and take us beyond human health. So here’s two examples. On those definitions, my glasses work as a treatment. I have horrible eyesight. I need to wear glasses in order to see things. Also on those definitions, my binoculars absolutely count as an enhancement.
Actually, on any definition I can think of, a pair of binoculars is a human enhancement. They take human eyesight beyond how healthy human eyes typically function, but I don’t know of anyone that thinks binoculars are morally problematic. So already you have these kinds of examples that sort of raise this question mark about, well, maybe enhancement itself isn’t a bad thing. Maybe it’s something else that’s going on behind the scenes. And that I think is the sort of thing that Luddites can… If Luddites draw a stark line in the sand in saying no to new technology, no to enhancement, I think there’s just all these sorts of things that we’ve already adopted and are ethically okay with that are really hard to maintain that kind of position consistently.
Stump:
Yeah. Well, still, I’m conscious of where we are in my questioning that I want you to talk about the other extreme still of transhumanism, but while we’re still talking about the Luddites, another response that’s sometimes, at least superficially or as a caricature, made of Luddites is that they say that the argument is, “Well, we shouldn’t be playing God.” And you give a nice response in your book here to that as well. If you could say a little bit, what’s wrong with saying technology is bad or enhancement is bad because we shouldn’t play God?
Vukov:
So it’s something that I inevitably get whenever I teach these topics or I’m presenting on them. Inevitably, somebody among the first two or three raised hands says, “Well, I know why we shouldn’t pursue human enhancement. It’s playing God.” And part of me thinks it’s a good response, and I want to acknowledge that, but I think it’s more complicated than it initially seems. I think when people say, “It’s wrong to play God,” they mean something like, “We shouldn’t be messing with nature in ways that go beyond what humans are allowed to do.” And I think that’s problematic for at least two reasons. First of all, humans by nature are things that mess around with nature.
So you can think here about the ant that builds the anthill. It’s not like you look at the ant building the anthill and say that ant is usurping its ant hood by messing with nature and making all these hills. Well, no, making anthills is part of what it does. And same thing, humans, part of what we do is we innovate, we invent, we build cars, we build buildings, we make air conditioning. We do all these sorts of things that, they’re not going against human nature, and a lot of times they are affecting nature in ways that haven’t been affected before. But I think it’s overly simplistic to say, just because we haven’t done something before, it goes against our nature.
Stump:
They might even say there’s a biblical mandate for such too. “Fill and subdue the earth.” Right? God didn’t make the world filled.
Vukov:
Exactly. We could have a whole conversation about what subdue the earth means there. But yeah, no, I mean I think that part of what that’s getting at is that humans, we affect nature in the way, by the way, that all sorts of other organisms do. There’s ways that I think we do maybe overstep our bounds, but just affecting nature can’t be the problem. Also, it could be some people interpret the Imago Dei, the Image of God, as being our creative capacities too. So there’s that whole conversation that maybe it’s not only part of our nature, but it’s like the divine spark, the Image of God part of us is the part that we can be creative and how we affect nature around us. The second reason I think that the playing God charge doesn’t quite land for me is that the motivation here is to preserve God’s sovereignty, which is an absolute good one.
As Christians, we want to keep God sovereign. However, I actually think it has the opposite effect, because by saying that we’re messing with things only God can mess with, it’s kind of like we’re putting God in a bubble and saying, “Here’s the things that we do. Here’s the things God does. And we best not compete with God.” But that’s actually diminishing God’s sovereignty. God is not sovereign over just his bubble. God’s sovereign over everything, and everything we do is in the purview of God’s watchful eye and his control. So I think this idea that there are some things humans just shouldn’t do is motivated by the right thing, but ultimately, I think, diminishes God in a certain way, because it suggests that God gets this stuff over here, but not the stuff we’ve been doing up until now.
[musical interlude]
Interview Part Two
Stump:
Okay. Let’s switch over to that other side, the more extreme sorts of transhumanism. You’ve already started to describe some of that, but give a little bit of the “here’s perhaps why this is too far, here’s why we don’t want to go to this extreme.”
Vukov:
Yeah. I think the transhumanist problem comes down to trying to dig into why we pursue all these projects. So again, just to give you a laundry list that’s not exhaustive or complete: things like extending our life by 50 years, or using genetic enhancement to make our genes better in some way, whether that’s to be disease resistant or for aesthetic purposes, or things like the really cool research into memory modifications.
So there’s things that we’re starting to be able to learn how to do to modify memories or Elon Musk’s Neuralink, hooking up our brains to iPhones and monitoring our brains and using them for prosthetics and things like that. I think the question I always want to ask is, “Well, why are you pursuing those things? What’s the motivation?” And I think there’s two things transhumanists can say, and one is problematic and one is disappointing.
So the problematic one first. You might say something like, and this is what I suspect not only transhumanists but a lot of us sort of think at the back of our minds, that that kind of life, a really long-lived, efficient, high-IQ, beautiful life, is the best kind of human life, that's like humanity at its pinnacle. I think the Christian especially, but really anyone, has reason to be suspicious of that. Christ said, "Blessed are the meek, blessed are the poor." Not blessed are the 130-year-olds with 130 IQ who have razor-sharp memories. I think Christians need to be concerned about what it is we're elevating as the most blessed and the pinnacle of humanity. And a lot of Christian teaching subverts that, either by holding up the poor as being especially blessed, or, even a step back from that, by saying all humans have equal human dignity. So it isn't something like living a long, smart, efficient life that gives us dignity, but rather being human, being made by God, that gives us dignity.
Stump:
Yeah. So let me see if I can speak on behalf of them for a second. I totally, totally agree with you on this, that particularly technologies like this end up benefiting the rich, right? They’re not distributed, and that’s, I think, a serious concern in this. I wonder though whether the reasoning you’re giving there might equally be applied to other sorts of technology, other sorts of medical interventions. Do we say, implicitly, when we try to cure cancer that people who don’t have cancer are better, their lives are objectively worth more than people who do have cancer? Or is there a difference in your mind between, again, trying to relieve suffering and bring us to a normal healthy state as opposed to enhancement working in that regard, particularly with respect to this question of the value of a life? Does that question even make sense to you?
Vukov:
It absolutely does, and I think there’s a couple of things I want to say. I really appreciate the question, Jim. The first thing is something you were hinting at towards the end of it, which yeah, I think there is an important distinction between interventions that are aimed at enhancing us and interventions that are aimed at treating us for something.
So Christ says, "I came that you might have life and have it to the fullest." In the Catholic tradition, there's this quote from a saint whose name is now escaping me: "The glory of God is the fully alive human," or something like that. And so there's this whole tradition, going all the way back to the gospels, that says part of what Christ wants for us is to flourish. So if there are interventions that can help us get rid of cancer and help us flourish better as human beings, then that's a good thing, and that certainly is part of what Christ wants for us and what we should be aiming for.
The second thing, though, I wanted to say is even when it comes to those sorts of things, I think we have to make a distinction between a thing that is good for us and the thing that makes our life valuable as a whole. So is living longer a good thing for us? Well, I want to say, yeah, of course, it’s good. Is being healthy a good thing for us? Well, yeah, sure it is. Is being sharp, mentally sharp? Is that good? Well, sure, that’s a good thing. I mean, that lets us make the world a better place or it can do that, but is it the thing that confers our fundamental worth onto us? That’s where I want to stop short and say, no, that’s not the thing that confers fundamental worth.
So yeah, I think that at the end of the day, the transhumanist can't say, without falling into these problems, that the smarter, faster, longer-lived life is the intrinsically best human life, so they have to fall back to just saying, "Well, no, it is better, or it's a good thing, to live longer." That's true, but kind of disappointing, because it doesn't tell us how to balance that good with other goods. It doesn't say that's the thing that's ultimately conferring value; it's one of the good things that are out there. It's almost as disappointing as saying something like, "My mom's carrot cake is really good." Well, yeah, it is really good, but that's not going to tell you exactly how I should live in light of recognizing that thing as a good.
Stump:
What do you make of… So here’s an example of a potential enhancement that I haven’t read very widely on, but it’s kind of piqued my interest and attention. So there’s been a few people that have said, one of the ways that we could really make the world a better place, and maybe even to use the language you were just using there of human flourishing would be if we could figure out how to genetically modify ourselves to be nicer people.
So there’s a movement here of some sort that thinks that we might eventually be able to figure out, can we use CRISPR to look at these embryos and select for the ones that don’t have as bad of a temper? Would that be a good thing? And can we genetically engineer the sorts of things that might really lead to human flourishing and a better existence for us here?
Vukov:
Yeah, good. So it is something I actually have been thinking about recently.
Stump:
Good.
Vukov:
And here's my long story short: it's not as easy as the headlines sometimes make us think it is. Why? Well, because even if we are able to, let's say, locate a gene that correlates in a fairly substantial way with niceness or generosity or empathy or take your pick, we could try to modify humans so that more of them carried that gene. Even if we do that, though, when you think about what it takes to be a flourishing human being, it's not just the predispositions you have, but it's also the willingness to go along with those predispositions. So forget about genetic enhancement, there's all sorts of things we know lead to more moral human beings: good upbringings, good education, certain kinds of socialization, forming lasting friendships, not engaging in certain antisocial behaviors. All of those are part of the recipe. But I think sometimes this genetic intervention, sometimes it's called moral enhancement, gets pitched as, "Here's the switch we can flip." And there's no aspect of human morality that works like that. There's just nudges that we can give people in one direction or another. So that's the first thing I would want to say: I'm highly doubtful that there's any switch we could flip.
Stump:
It’s not a point mutation somewhere, you can’t change one letter of the genome.
Vukov:
Exactly. So because again, humans are complicated, and even if we’re getting a nudge in one direction, we might say, “I don’t want to go in that direction. I want to go in a different direction.” So there’s this thing called the human will that can get in the way of the nudges that are pushing us to and fro.
That said, are there interventions that could be okay for us to pursue to try and make us more moral? I'm open to it. So we could get into the weeds if you want to. I think with certain interventions, especially genetic enhancements, some of the procedures, there might just be reason not to pursue them on their own, sort of independently. But yeah, I mean, if Elon Musk's Neuralink finds a way to give people a nudge towards better morality, and we've checked off a whole lot of the ethical concerns you might have with that, I'm open to it. So one thing in this book is that I lean towards the Luddites, but I'm also open to new technologies, especially if they're in pursuit of the good in some way.
Stump:
Yeah. So speak a little bit then about another issue you bring up in the book, what you call the Moral Parity Problem, where many of the reasons that we give for resisting transhumanist technology could equally be applied to our iPhones or automobiles or vaccines or clothing. Unpack a little bit what that problem is.
Vukov:
Exactly. So the Moral Parity Principle, the idea here, it's a mouthful, and this is the philosopher in me coming out. I had to give it a sort of fancy name, I guess, but it's a simple idea. And the idea is: if you want to say, consistently, that two things are morally different, you have to point to the difference between them, and it has to be a distinctively moral difference.
So here's a non-enhancement example. Suppose you have a friend, Tina, and Tina is morally opposed to drinking beer, but she's okay with wine. And you ask Tina, "Well, why the moral opposition?" And she says, "Because beer has alcohol in it." You're going to be cut short, because you're going to think, well, then why are you okay with drinking wine, which has alcohol in it as well? And if she says, "Oh, no, I guess that's not it. The difference is instead that I'm okay drinking red things but not brown things," you think, well, that's a difference, but it's not a moral difference. There's no moral difference between drinks of different colors. The reason you're puzzled here is that she's not satisfying the Moral Parity Principle. She hasn't found something that is morally different between the two things she says are morally different.
So apply that to conversations about enhancement. And again, you might say something like, “I’m against messing with my brain using Neuralink because I am against using any sort of external stimulus to make myself think differently.” At that point, I’m going to point to your cup of coffee and say, “You better put that thing down pretty quick because that thing is messing with your brain in ways that affect your behavior and your attitude.”
So yeah, it's this idea that we have these knee-jerk reactions to a lot of new enhancing technologies, and the Moral Parity Principle challenges that reaction: what is the moral thing going on here that makes you not okay with this new thing, even though there's all sorts of other things we're already doing, like drinking coffee, like wearing efficient parkas in the winter, that are enhancing us in ways that, at the end of the day, aren't that different from some of this new, and I'll be the first one to admit it, scary stuff coming down the pipeline?
Stump:
Yeah. Okay. Well, let's give you a chance here then to chart a middle course between the transhumanists, or human enhancement, human-plus, on the one side, versus the Luddites or the Neo-Luddites on the other side. How do we find that middle way between those two extremes?
Vukov:
Yeah, I think, so there's two concepts I talk about in the book, and we can talk about both of them. The first is a view I call The Fallen Dignity View, which is sort of distinctive to the book's discussion. And there's another one that's less distinctive: just thinking about what a good life is. So we can tackle the first one first.
The Fallen Dignity View is this view that I take to be explicitly Christian, but not only available to Christians. In a way, I think it's a view that a lot of folks today would hold to. So The Fallen Dignity View tries to balance this idea that humans have fundamental dignity, that we are all equal in an important sense, that every human life matters, and that, yeah, there is this intrinsic value that comes along with being human. And then it balances that with the idea that we are also fallen. We're not perfect. We fall short. You might want to talk about that in terms of sin or original sin, or just this idea that humans are wandering in our motivations, we are imperfect in our aspirations, we don't give people the benefit of the doubt. I mean, take your pick. I don't know of anyone who thinks that human beings are just perfect in every way.
And I think when you balance those two ideas, so again, human equality and dignity on the one hand, and human fallenness on the other hand, I think if you keep that in mind, I think you’re steered away from the excesses of both the Luddites and the transhumanists. So you’re encouraged not to think about human perfection as being faster, longer lived, smarter, stronger. The idea of human dignity and human equality says, “No, that’s not where human value comes from.” So we ought to be cautious in pursuing projects that are assuming that’s where our human value comes from.
On the other hand, this idea of dignity pushes back on, or challenges, the Luddite view by saying humans are pretty interesting and great in some ways, and just because something is new doesn't mean it's necessarily bad. Maybe part of what human dignity consists in is our ability to alter ourselves in certain ways and alter nature in certain ways. And of course, we have to do that with caution, but maybe that's part of what comes along with human dignity. And then the fallen side of the view, so it's The Fallen Dignity View, and the fallenness part, really pushes us, again, to challenge this idea of human perfection being something attainable this side of eternity, and to think about the ways in which we do fall short, in which we are not going to be perfect, and in which we're always going to be fallen in some way or another.
Stump:
Good. So this is a helpful way of bringing theology more specifically to bear on this conversation. And in that vein, so back before the pandemic, I actually went to a meeting of the Christian Transhumanist Association, which you referenced a little bit in your book.
Their stated goal is to use science and technology to participate in the work of God, to cultivate life and renew creation. That sounds okay, doesn’t it?
Vukov:
Yeah. No, I mean, it's interesting, and your listeners can look up the Christian Transhumanists online, and you'll find them very easily. They've got a whole manifesto and stated list of goals, and they're the sorts of things you just mentioned, Jim: to work with God in pursuing a more perfect creation and to fully realize our humanity. And those are things that sound very Christian in a lot of ways. The problem here is, so the analogy I give in the book is this: if you're looking at a world atlas, you might say something like, "Cleveland is really close to Chicago." Right? If you back up far enough, two points might look fairly close. But when you're actually on the ground, as I have been, I've driven from Chicago to Cleveland, and the Ohio Turnpike is a long drive of not much, it turns out that those two points are not that close after all.
And I think something similar is going on here: if you boil down transhumanism and Christianity to a certain set of lowest common denominators, then sure, you can find some overlap between the two. It's like looking at the world atlas and saying, "Yeah, Chicago and Cleveland look kind of close to each other." But when you actually start looking closely at the two of them, I think they start pulling apart. And this circles back to what we were talking about earlier, where you really look at what Christ meant when he said, "I want you to be perfect." Well, He wants you to be like Him, and what's His life like? Not much at all like the transhumanist life that's getting held up. So I think that, again, the language is maybe similar, but it's a superficial similarity, and when we dig down, I think the transhumanist goals and the Christian goals don't really align. Does that mean that a Christian can't pursue any form of human enhancement? No, because again, I think that…
Stump:
We’re already doing that.
Vukov:
That’s inconsistent with our practices. Yeah, we’re already doing it.
Stump:
Let me see how this response to that strikes you, and it might lead you then into this last point you wanted to make about the good life. So I've given a couple of talks about transhumanism and technology and such, and one of the things I've said, and it may not be original to me, I can't remember who actually said it first, is that too often we treat technology as this completely neutral sort of thing: some people use it for good, and some people use it for bad, so let's just make sure we use it for good.
And I want to say technology is neither good nor bad in and of itself, but neither is it neutral. It does something to us. And I like to use an example of a friend of mine who moved from Europe to the US and lives in a fairly big Midwestern city where there isn’t very good public transportation. And she was reflecting on this quite a bit of how having to use a car, which you drive into your garage and put the garage door down and you go in, keeps you from interacting with people. She said, this has just been a fundamental difference that I don’t interact with my neighbors now because I don’t walk to the end of the block where we all get onto the bus, we all drive our cars into our garages, and we never see each other. So this isn’t to say that automobiles in and of themselves are bad, but it certainly means that it does something to us. It does something to our way of life. And I wonder if that’s a kind of way of evaluating some of these technologies with regard to transhumanist proposals as well. And let me launch that up there for you to give a shout-out to Ignatius of Loyola and the quoting of him you do about the good life at the end of the book and how you might evaluate some of these things.
Vukov:
Good. I was hoping we would get to St. Ignatius, because that is the direction I was going to head with this. So yeah, I mean, I think what sometimes gets missed in these conversations, and the Luddites are less guilty of this than the transhumanists, is this deep reflection on what kind of life we're ultimately trying to pursue.
Now, St. Ignatius is close to me. I teach at Loyola University of Chicago, a Jesuit school, and the Jesuits were founded by St. Ignatius. So we talk about St. Ignatius quite a bit here. He's famous for founding the Jesuits, and therefore all the Jesuit institutions, Georgetown, St. Louis University, Fordham, all the Loyolas, but he maybe should be more famous for this work of spirituality called The Spiritual Exercises. And The Spiritual Exercises are kind of what they sound like. It's a group of meditations and prayers, and he called them exercises. He was a former soldier, so everything gets put into very military terms, but they are things to do to work yourself out spiritually. And there's one part in The Spiritual Exercises that's really central, and it's so difficult, but it's where I ultimately end the book, and I think it's the right way to think about human enhancement. He's got this long litany of different things that humans might aim for, and he tries to ground them in our pursuit of Christ.
So he talks about how… I'm really paraphrasing here, I don't have the text right in front of me, but: "I don't want either health or sickness. I don't want riches or poverty. I don't want…" And he goes on and on through all these different things. But what's the takeaway? I only want those things insofar as they help me become closer to Christ, or, the way he puts it, to praise and reverence and serve God. So he's identified the good life: it's the praise and reverence and service of God. And he's going to evaluate all the other things in his life, even things that are goods, like health and riches, and say, those could be good, but only if they are relevant to and helping me seek out this good life that I ultimately think is what a good human life looks like.
And guess what? If that's not helping me, then bring on the sickness and poverty, because maybe that's going to help me in service of this good life. So if you're a Christian, you probably agree at least in outline with St. Ignatius on this, that that's ultimately the thing that makes a human life a good life. And if so, great, I think that's the right way to think about the new forms of human enhancement you were just talking about, Jim: what does a good life actually look like? Well, part of it might mean serving God, but maybe down the pipeline a little bit, it's things like actually getting to know my neighbors. So maybe this new form of technology, everyone driving cars rather than taking public transportation, is not good for that, so I shouldn't be adopting it.
Obviously, your friend here can't help it, but if I had the choice, maybe it is actually detracting from a good life, even though it's helping me in terms of efficiency and comfort. And I think that's a really good way to think through new enhancing technology as well: this new thing, maybe it's not good, maybe it's not bad, but how does it help me grow closer to God, if you're a Christian? But even if you're not a Christian, just substitute in there: how does this help me flourish as a human being? And I think a lot of times, things that initially seem very exciting and new are going to kind of be deflated. We're going to think, "Oh, maybe that makes life a little bit more comfortable, but it's actually not really in service of what I recognize to be a full and flourishing and good human life."
Stump:
Well, in typical philosopher fashion, you’ve not given us a neat and tidy formula or algorithm that answers all the questions, but instead have pushed us to think more carefully and deeply about some of these bigger questions. What is the good life? How do I take my use of technology and have it conform to my desire to have a good life? Are there any next steps for people who are taken with that to say, “All right, I’m all in. I want my life to be a good one. What do I do next then in trying to evaluate my use of various technologies or think about in a Christian way, technologies that may be coming down the pipe?”
Vukov:
Yeah, I mean, I think one thing is to really take time and think about what we mean as Christians or more broadly, what we mean as human beings by a good life. What does that look like? What role does prayer play? What role does friendship play? What role does family play? What role does active participation in your church community and in your local community look like?
So I think that's the first step: some real soul-searching. Again, I think most of your listeners could probably trot out a couple of sentences on what it means to live a good life as a human, and it would probably be about right. But then really think through the details of, no, what does that actually look like in practice? And then, having done that, I think going through the same Ignatian exercise, which is so difficult, which is being able to pray sincerely and to live sincerely things like: if health and riches aren't helping me achieve that, or, we can substitute in, if comfort and efficiency aren't helping me (I think those are among the big things a lot of American society is structured around), if those things aren't helping me achieve what I've discerned to make life valuable, then why would I want them? They're not ultimately serving those ends. So I think it's cultivating a prayerful attitude that helps us discern how things coming down the pipeline do or don't fit into those larger goals we've discerned, and then putting that into practice, which is also difficult, because at times it might look counter-cultural, and it might require us to be open to the less efficient, less comfortable, less rich life as the one we've got to end up pursuing.
And like you said, Jim, I don't think there is a formula by which you can decide, should I ever try out Neuralink? Or is memory modification ever okay? I think it's thornier than that, because there's all sorts of considerations going on in the background, and all sorts of different life circumstances we find ourselves in, that might make the answer different at different places and times. In the same way, St. Ignatius would've said that for some of us, a life of wealth actually is okay and in service of God, because maybe we need access to wealth to distribute it to others, or to have access to certain social circles. So for St. Ignatius too, that's why he lists all the options: there isn't one right one. It's the unfortunate conclusion, but I also think the true conclusion, that life is more difficult and messy than that.
Stump:
For sure. Sounds like maybe a philosophy course or two might be in order for us to work through these things.
Vukov:
Yeah, that would maybe be the first thing. Go sign up for a philosophy course. Exactly.
Stump:
All right. Well, we’re at the end of our time here. We’ve just covered a good deal of the book you’ve written. We like to end these interviews by asking what books you’ve been reading lately. Any interesting ones on your list?
Vukov:
Yeah. One thing I've been reading lately, and it's embarrassing that I've never read it before, is C. S. Lewis' Space Trilogy. It's embarrassing because I do work on futuristic, science-fiction-type stuff from a Christian perspective, but I've never read his Space Trilogy. So I am about a third of the way into the second book, Perelandra, and loving every minute of it. I'm a huge C. S. Lewis fan, and I've read tons of his nonfiction and read Narnia to my kids, but never The Space Trilogy. So I don't know, give me another few months and then I will be able to say I have now read The Space Trilogy as well.
Stump:
Well, very good. Well, Joe, this has been a fun conversation for me. I really appreciate it and I look forward to talking again sometime about some of your other work you have going. So thanks so much for talking to us.
Vukov:
Yeah, thanks a lot, Jim. This was a blast, and thanks for having me.
Credits
BioLogos:
Language of God is produced by BioLogos. It has been funded in part by the Fetzer Institute, the John Templeton Foundation, and by individual donors and listeners who contribute to BioLogos. Language of God is produced and mixed by Colin Hoogerwerf. That’s me. Our theme song is by Breakmaster Cylinder.
BioLogos offices are located in Grand Rapids, Michigan in the Grand River Watershed. If you have questions or want to join in a conversation about this episode, find a link in the show notes for the BioLogos forum, or visit our website Biologos.org, where you’ll find articles, videos, and other resources on faith and science. Thanks for listening.
Featured guest
Joe Vukov
Joe Vukov is an Associate Professor in the Philosophy Department at Loyola University Chicago and the Associate Director of the Hank Center for the Catholic Intellectual Heritage.
Join the conversation on the BioLogos forum
At BioLogos, “gracious dialogue” means demonstrating the grace of Christ as we dialogue together about the tough issues of science and faith.