
Successful “Book Bath” on Surveillance and New Technologies

“A magic hour,” said one of the attendees at the seminar at the House of Literature in Bergen. Read the full 60-minute conversation below.

Jan H. Landro and Deborah G. Johnson. Photo: Astrid Gynnild/ViSmedia


She was referring to the Nobel Peace Prize-like conversation between Deborah G. Johnson, professor of applied ethics, and reporter and author Jan H. Landro. The topic was new technologies and their impact on surveillance and transparency in society.

ASTRID: Hello everyone. Good to see that so many people are here on such an Indian summer night. It is so hot outside that we don’t know how we could best dress; it is hard to believe that this is going to last, but it is supposed to last for a week. So here we are. My name is Astrid Gynnild and I am professor of Media Studies at the University of Bergen. I am very happy to introduce our two discussants tonight. Deborah G. Johnson, who is professor of Applied Ethics at the University of Virginia in the United States. Deborah has been working on applied ethics in engineering and technologies for several decades. I am very happy that I discovered Deborah a couple of years ago, on the Internet actually, because she is one of the very few people I have found who work in this field of applied ethics and technologies. And she has published several books with Routledge, which is a foremost academic publisher. She has worked particularly much on drone technologies, especially military drones, but now she is into autonomy, robotics, privacy, surveillance, transparency and responsibility, which is what this discussion, or interview, is going to be about. Then we have Jan Landro; I guess most of you know him from before. Jan has been a journalist at Bergens Tidende, and he does how many book baths a year?

JAN: 30. 35.

ASTRID: Yes. That means that he also reads a lot of books. In Norwegian we call him a ”revolver-intervjuer,” a rapid-fire interviewer. This is kind of an experiment. We challenged Jan to do a book bath on Deborah’s Routledge book, and also three of her latest papers. And these are very academic texts, they are quite abstract in form, and it does take quite a lot of work to decipher Deborah’s texts, so I’m really excited to hear what is going to happen now.

JAN: So am I.

ASTRID: Finally, I would say that this is also an experiment because usually we would have invited Deborah to do a guest lecture at the University of Bergen. She is part of an international research project called ViSmedia, which is about visual surveillance technologies in the news media. She has been here at a workshop, and this morning she actually gave a guest lecture to our students, but we wanted this session to be out in the public, so here we are. The stage is yours.

JAN: Thank you. Deborah, you are professor of Applied Ethics, what does that imply?

DEBORAH: Ooh. So actually the term is controversial, I’m not happy with it, but the idea is that ethics is this broad theoretical general set of ideas, but it is too abstract in itself, and so it needs to be applied to a context. So actually the first field was medical ethics, where technologies were causing all these questions to be raised in medicine, and philosophers and ethicists were called upon to help sort out things like how to allocate particular resources, or, you know, euthanasia and other issues. Anyway, so, it started out in medical ethics, then business ethics also got developed. But the “applied” part I don’t like, because it implies that there is ethics, and then you take that and you apply it.

JAN: Applied as opposed to theoretical?

DEBORAH: Right, because the theoretical is supposed to be the kind of philosophical, abstract part, so it’s applied. But people like me prefer something like “practical,” I think, sort of down-to-earth. In academia we talk of this model of principles and then applying the principles, and then it gets tricky because, you know, even if you take a principle like ”tell the truth,” when you then have to do that in the case of medicine, let’s say as a doctor, do you tell the patients everything? How much do you tell? It’s not so simple, it’s not just applying the principles, you actually have to figure it out. So it is not applied.

JAN: I see. Your particular concern in research seems to be the responsibility, or accountability, for robotic technological innovations and what they can do. Why? Why is it so important to you?

DEBORAH: Good question. Why is it so important to me. Well, it seems to me to go to the heart of how to manage, control, or do something about technology: holding people accountable for what happens. It seems to me we lose track of that; people think that technology has its own power or, you know, control, or its own trajectory, and it distresses me that people actually think that, when it’s actually people that are making the technology and making the decisions about when to use it and when not to use it. And so my thing in life is to draw attention to the people that are connected.

JAN: But some scientists already envision a future where the robots can’t be controlled. What then?

DEBORAH: So, I mean, I actually thought a lot about this, and I think it is possible, but if it happens it won’t be because the robots have taken over, it will be because humans have conceded or have given…

JAN: Well, that is more or less the same thing, isn’t it?

DEBORAH: Well, yes.

JAN: I mean, the practical consequences of robots?

DEBORAH: The consequences will be the same, but I don’t think it is inevitable that the robots take over. Let me unpack “people” a little bit.

JAN: Go ahead.

DEBORAH: So there are some people and there are other people. When it comes to robots taking over, it really has to do with certain institutions and organizations and certain people that have an interest in robots being developed and taking over certain activities. And if robots end up being in charge it will be because those people won over other people who don’t want that to happen.

JAN: Yes. And that might happen?

DEBORAH: When I say it might happen, I am, yeah, I am worried in particular about, I mean, we already have, in some sense, robots making a lot of decisions: diagnosing your illness, deciding whether you get credit from a bank, what size loan you get, modelling space or whatever. We already have them making decisions, but I am particularly worried about the fact that some people want to not just develop robots to make those decisions, but they want to develop robots that look like humans.

JAN: Humanoids.

DEBORAH: Humanoid, yes, humanoid robots. And so I have thought a lot about this: could there ever be a point (this is what the philosophers argue) where we would consider the robots to have the same moral status as humans, so that we would be reluctant to turn them off?

JAN: How come? How could that be?

DEBORAH: Well, so, the more I have thought about that, having looked at humanoid robots, the more I think we could come to feel that way. Because they are doing a very good job developing these robots to look like humans and to draw on our sensibilities, or sentiments, whatever you call it, so that we would come to feel that we didn’t want to turn them off. I think that is possible, but I don’t think it is inevitable.

JAN: But it is still within us?

DEBORAH: It is within us. But some people argue, and I tend to agree, that we shouldn’t build them to look like humans, we should build them to look like robots, which we have always known…

JAN: You think that will make a difference?

DEBORAH: Yes, I do.

JAN: Mentally?

DEBORAH: Well, definitely. Because I think we treat machines differently than we do humans.

JAN: Well, we tend to at least.

DEBORAH: We tend to. Yes, it’s true, you are right, we also tend to yell at our computers like they are people and… But I think we should know whether we are dealing with a robot or a person.

JAN: In one of your papers I saw you cited a researcher whose name I can’t remember, he was describing robots as a race.

DEBORAH: Yes, I don’t remember who that was.

JAN: No, it doesn’t matter, but a race? How can robots be a race? That is a human category, or animal.

DEBORAH: So I think, these philosophers who think that we should give computers moral status…

JAN: You are not among them.

DEBORAH: No, I am not among them. They make the argument that human beings have been wrong many times. We thought that slaves weren’t human, we thought that animals weren’t worthy of respect, we thought that the environment wasn’t worthy of respect, so they say that we are making the same mistake about robots, and that we ought to extend to robots the same status or value or recognition that we give to these other entities.

JAN: What do you think of that?

DEBORAH: I think it is wrong. I think it’s kind of silly.

JAN: Quite. I agree.

DEBORAH: But they get a lot of attention, and a lot of followers.

JAN: You are the co-editor of this book, and you have contributed to it, titled ’Transparency and Surveillance as Sociotechnical Accountability’. It is a bit too technical for this discussion, but to me, transparency and surveillance sound like contradictory terms.

DEBORAH: Yes, because you are thinking it is about watching people?

JAN: Yes. Secretly.

DEBORAH: And transparency is about somebody revealing…

JAN: Yes.

DEBORAH: So you see, that is where I started out. I had done a lot of work on surveillance, then I started looking into transparency, and I realized that there was this similarity. I was thinking mostly of the transparency of corporations and companies: the purpose of transparency is that we want companies and corporations to reveal something about themselves, as a way of actually kind of controlling them. If they are forced to tell us about their business and how much money they are making, they are less likely to do all kinds of illegal things.

JAN: Yeah.

DEBORAH: So if you read the stuff on surveillance, it is a similar motive; at least the original concept of surveillance is that it is done on people to make sure they don’t misbehave. So both transparency and surveillance are a kind of means of diminishing misbehavior. And that is what got me thinking: ”Wait a minute, what is the difference?” And then there was this great case, I don’t know if you read it, that really got me going, which went from transparency to surveillance. I might have mentioned this some other time, in a project meeting. In the U.S. we have, and I’m sure you have something similar, a campaign finance disclosure system?

JAN: Yes?

DEBORAH: Called ’transparency’? The whole point of this is that you make candidates or campaigns reveal who has given to them, in order to pressure them not to take illegal contributions. The idea is that the sunlight will cleanse, that if you keep it out in the open the candidates won’t do any bad things. And this case involved a referendum on gay marriage in California. The campaign revealed all the donors that had donated to the campaign, and the people who were on the other side took up that information and used it to track down donors and harass them. So it started out as a system to make campaigns transparent, and it flipped to a system where the donors were being surveilled by people on the other side.

JAN: Which only goes to show that any system can be abused.

DEBORAH: Well, I don’t know if it was an abuse, that is the thing. I see it more like this: when you make things transparent, you are making them transparent for a particular reason, but certainly when transparency is done through the Internet or through the Web, there are all kinds of ways that the information can be used that you didn’t think about when you came up with the system for transparency. You know what I mean? We were just thinking about holding campaigns accountable, and in the old days that meant sending the information to an election commission who kept it in files, and if journalists wanted to look at it they could go and publish it, but there was a natural distance for that information to be revealed. When you put it on the Internet, then all of a sudden it is open to everybody, and you get something you didn’t plan.

JAN: Talking of campaign funding, do you know the level of campaign funding in Norway? I’ll tell you.

DEBORAH: Different system?

JAN: Not very different. In Norway, the Labour Party, the biggest party, last year got between seven and eight million dollars. That was all.

DEBORAH: Oh. But it is a smaller population?

JAN: But still.

DEBORAH: Yeah.

JAN: Consider the presidential campaigns. Madness in the system.

DEBORAH: I just got an e-mail from Hillary, saying that Donald Trump raised 95 million dollars last week. And this is a real problem, so I need to donate to make…

JAN: You have to think of this as a salary. They would have to raise your salary. Well, anyway, when I think of surveillance and transparency as contradictory terms, I think you have the kind of surveillance which nobody is supposed to know of, and where we could use some transparency, but there is none.

DEBORAH: I am hesitating because I am thinking, so, the problem is who is transparent and who is being surveilled. The real reason I tried to do this stuff about accountability was that I was hoping I could figure out why we don’t like Google keeping track of all of our searches. So it had an element of us, the users, customers or whatever, being held to account by Google, because they are developing these accounts of us and using them to sell. But the point I was going to make is, there is an inequality in the sense that they keep track of us, but they don’t reveal anything about what they do.

JAN: Would it be unfair to describe Google as having the maximum of surveillance and the minimum of transparency?

DEBORAH: I would agree.

JAN: And still, maybe Facebook is even worse?

DEBORAH: Facebook is, yeah, is even more complicated, in the case of Facebook the transparency is person to person. I mean, what they think they are doing is making us transparent to one another, but they are not transparent.

JAN: Not at all.

DEBORAH: Not at all.

JAN: Some years ago you wrote, and I quote, ”privacy is perhaps the most important of the ethical issues surrounding computer and information technology.” Two years ago, you wrote: ”Today, privacy issues are pervasive.” Where are we heading here?

DEBORAH: I don’t know. I don’t know. So the thing I have struggled with from the beginning is why privacy is so hard to protect. And maybe you have more of it here in Norway than we do.

JAN: We like to believe that we have more.

DEBORAH: Yeah, and I hope you do. So I don’t know why it is so hard to protect. One thing I think I have learned is that it is not so much an individual good, it is not something that each of us individually values; its real value is social rather than individual. It is the kind of world that is created when you have privacy as opposed to when you don’t, and it is not something which, you know, can work in the market. It just doesn’t work in the market. So it is a mystery to me why we can’t stop it, stop the invasion of privacy.

JAN: Do you have any suggestions?

DEBORAH: For how to stop it? None that are very realistic. I mean, this is a case where we are caught because the technology makes all this information gathering possible, and there are great benefits to some in collecting that data, and people just seem to be willing to take the benefits. Deals have been struck. Like Google: the deal is that I get to use that search engine for free, it doesn’t cost me anything, and the reason is of course that I am not their customer, I am their product. So how that deal got set up…

JAN: And how easily we accepted.

DEBORAH: We accepted. I have to confess, I accepted, because it is so convenient.

JAN: I did too.

DEBORAH: But I would much prefer that they did not use the data in the way they do.

JAN: I am afraid they know very well what they are doing.

DEBORAH: I do too, yes, exactly. So I am not optimistic, but I don’t know how change could come about at this point.

JAN: Do you see no reason for it whatsoever?

DEBORAH: Well, only in the sense that it is so hard to control.

JAN: What do you think it does to an individual, knowing that every step he or she takes is surveilled, and that the data may be stored somewhere and used for some unknown purpose?

DEBORAH: So there are at least two, and probably more, things, but two come to mind. One is, I think there is a general effect of people being reluctant to behave in ways that are perfectly legitimate. My favorite example is someone who wanted to rent an apartment and couldn’t figure out why they weren’t being accepted when they applied. It turned out that this person in the past had a really bad landlord who didn’t fix things or take care of things, so they sued the landlord and won. Then they went to look for another apartment and discovered there was a list of all the people who had sued their landlords. It was of course sold to landlords, so new landlords didn’t want this person who had sued. Now, the reason that I like this case is that you have every right to demand from your landlord, you know, what is promised in the lease agreement.

JAN: Yes.

DEBORAH: So this person was simply pursuing their legal right, and yet. So it squashes that kind of freedom, your rights, squashes your rights. So that is the first thing, that there is a kind of chilling of rights, people are reluctant. And the other thing is, it shapes who we are, so, now we switch from Google, we can do Google or we can go to Amazon, you guys have Amazon. So Amazon is keeping track of all the things that I buy, and then giving me more of what I want. Facebook too, giving me the news I like, customized. So instead of me being exposed to variety and different things, I am becoming more and more the kind of person I already am. Pretty scary. So people come to think of themselves in the categories that these organizations use, the watchers…

JAN: But you can break out and not use these systems.

DEBORAH: You can.

JAN: After all Amazon is not your…

DEBORAH: Right. And there we have another one of those deals. It turns out that some of what Amazon does is actually convenient and useful. You know, they send me things that I am actually interested in. I mean, I still think Amazon is kind of stupid in the sense that if I go shopping for a lamp, then I’m buying a lamp…

JAN: They will offer you twenty more.

DEBORAH: They keep giving me ads for lamps, it never occurs to them that I no longer need a lamp.

JAN: I’m going to Rome the day after tomorrow. I booked my hotel three months ago. And every day I get new offers, ”hotel in Rome”, ”hotel in Rome”.

DEBORAH: And maybe they will get better at that.

JAN: Hopefully, yes. But about this surveillance which we don’t like: in this country, and probably also in yours, there are a lot of people who ask for more surveillance. What do you think?

DEBORAH: I think they are sort of short-term thinkers, rather than long-term thinkers.

JAN: Or not thinkers at all?

DEBORAH: Or not thinkers at all. I mean, they are assuming there is a trade-off between security and privacy, they are willing to give up privacy for security, and it never occurs to them that they could have both, if we work at it.

JAN: But of course, maybe this is too late?

DEBORAH: I don’t, when you say it is too late, I told you I wasn’t optimistic, it’s not so much that it is too late, but it’s going to take something dramatic or big I think, to shift. Or not.

JAN: Or not. That’s where I am. Or not.

DEBORAH: I mean, I would love to be optimistic and say that maybe we will just learn. As in the case of security: if people were creative, there are lots of ways to get security without giving up your privacy. But it takes a little bit more thinking about the algorithms, and the systems…

JAN: And maybe cost more money.

DEBORAH: And maybe cost a little bit more… Well, and the question is who it costs more.

JAN: In the end it will be you and me. It always is.

DEBORAH: Yeah, it doesn’t have to be, yes. But I also, maybe optimistically, like to think that the solutions may turn out to be less expensive; we are just less likely to find them, because there are interests who want it to be more, you know, people want to make money.

JAN: Let us go back to Google for a moment. Imagine that the two of us were making identical search queries, say, on transparency and surveillance. You on your machine, I on mine. Do you think Google would deliver the same answers to us both?

DEBORAH: It does not.

JAN: It does not.

DEBORAH: We know that.

JAN: Yes.

DEBORAH: We know that. Because it has kept track of your search queries, and it has kept track of mine, and it has used those in deciding what I’m interested in and what you are. And that is enormously, again, it doesn’t even have to do with privacy, that is just enormously problematic for knowledge, what we call knowledge: what you are going to get as the answer and what I am going to get as the answer are completely different.
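
To make the mechanism Johnson describes concrete, here is a minimal sketch of history-based re-ranking in Python. The documents, interest profiles, and scoring rule are invented for illustration; this is not Google’s actual system.

```python
# Minimal sketch of history-based re-ranking: two users issue the same
# query, but results are reordered by each user's interest profile.
# Documents, profiles, and scoring are invented for illustration only.

def personalized_rank(results, profile):
    """Order results by overlap between document topics and the profile."""
    return sorted(results,
                  key=lambda doc: sum(profile.get(t, 0) for t in doc["topics"]),
                  reverse=True)

results = [
    {"title": "Surveillance and the state",   "topics": ["politics", "law"]},
    {"title": "Transparency in corporations", "topics": ["business"]},
    {"title": "Top 10 surveillance gadgets",  "topics": ["gadgets", "shopping"]},
]

deborah = {"politics": 3, "law": 2}      # profile inferred from past searches
jan     = {"gadgets": 4, "shopping": 1}  # same query, different history

print([d["title"] for d in personalized_rank(results, deborah)])
print([d["title"] for d in personalized_rank(results, jan)])
# The two orderings differ: each user gets a different "answer."
```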

JAN: But I would like to get what you get.

DEBORAH: Yeah. You think, I mean, they don’t think you do. Google does not think you do.

JAN: I don’t deserve it.

DEBORAH: You don’t deserve the same thing as what I get?

JAN: Since you are the expert, I am the amateur.

DEBORAH: Well, whatever, they think that they have figured out what you want better than what you think you want.

JAN: Yes. True.

DEBORAH: Because they have all this information about what you did in the past. It is a real problem. And also it means by the way, if we both went to Amazon and searched for the same product and decided to buy the same product, you might be charged more than I am charged.

JAN: Is that so?

DEBORAH: Yes. To me it is a very fundamental principle that prices should be the same for everybody. That is not true in online shopping.

JAN: You know that as a fact?

DEBORAH: Yes, they tell me that.

JAN: I see.

DEBORAH: In fact, somebody recently told me, and I don’t know if it’s true, but it was a reliable person, that when you search for an airline ticket, they can now keep track of how often you come to the site, and the prices are going to go up, not down.
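
The mechanism Johnson relays here (and flags as second-hand) can be shown with a toy model. The base fare and markup rule below are invented, not any airline’s actual pricing logic.

```python
# Toy model of visit-based dynamic pricing: each repeat visit to the fare
# page nudges the quoted price upward. The numbers are invented; real
# airline pricing is far more complex and not publicly documented.

visit_counts = {}

def quote_fare(user_id, base_fare=200.0, markup_per_visit=0.03):
    """Return a fare that rises with how often this user has checked it."""
    visits = visit_counts.get(user_id, 0)
    visit_counts[user_id] = visits + 1
    return round(base_fare * (1 + markup_per_visit * visits), 2)

for _ in range(4):
    print(quote_fare("jan"))  # 200.0, 206.0, 212.0, 218.0
```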

JAN: Yes, I’ve noticed that, and I don’t like it. Alright. New technological capacities open up new scenarios, and to take the extremes: some people think that computers and information technology will potentially enhance democratic institutions as never before. At the other extreme, we have those who predict that this will ultimately lead to totalitarian control. What is your position?

DEBORAH: My position is that this is a case where the technology can go either way, and what makes the difference is people, or the society, not the technology. So there is this argument; people used to say that the Internet is a democratic technology.

JAN: It is not.

DEBORAH: Well, it can be.

JAN: It can be, yes.

DEBORAH: It can be. But it is not inherently democratic. And today I would say, the way I think of it is, we have now built this infrastructure which is being used in a way that sort of gives us the global world that we have, but with a little switch in politics and leaders, we have built the perfect infrastructure for totalitarian control.

JAN: Is it a little Trump-ish?

DEBORAH: I don’t know if it is Trump-ish, but when I first started working on this stuff, in like 1981 or 1982, there was this article that said something like: ’What would Hitler have done with a computer?’ And the idea was that if you had Hitler, and you had this technology… Boy. I haven’t seen that kind of thinking since that time, but it seems to me it is highly relevant.

JAN: It is.

DEBORAH: I don’t know about whether Trump knows about the Internet.

JAN: How do you think the world has changed in the 15 years since 9/11?

DEBORAH: You don’t ask small questions.

JAN: I’m paid to ask big ones.

DEBORAH: When you ask how the world has changed, are you talking about the U.S., or?

JAN: Well, the U.S. is the world, isn’t it?

DEBORAH: Okay. So, because that is such a big question, I think the easy thing to say is that that trade-off between security and privacy just got… People now just want security in our world and give up privacy, even though it didn’t have to be that way. I think we could have gotten more security and still have privacy. But as you say, it might cost more, it might be more complicated. That is certainly one of the ways that the world has changed. At the same time we have become more intensely globalized; that is a trend that has continued.

JAN: Intensely, yes.

DEBORAH: More intensely, yeah. It has always been global, it’s just, the intensity, the scale and frequency.

JAN: Given the present state of technology and computation, what is the worst case scenario you could envision?

DEBORAH: Do you mean in terms of what kind of world that we…?

JAN: Yeah.

DEBORAH: I guess, I don’t know, that is not so easy, the worst case scenario? The worst case scenario would actually be to continue on the trend that we are now in, in which these deals are realized to the extent that you get massive populations who are quite content to have their needs satisfied by these technologies and just become consumers, and are not very critical citizens. You know, they are not involved, they are not capable of democracy. But you would always have some small percentage of the population that has to make things work, that has to run things, so. I don’t know. That is the answer, I don’t know.

JAN: No. And of course, there is a military side to this too.

DEBORAH: Very much so.

JAN: Far too much too.

DEBORAH: Right.

JAN: And what about this scenario, I have jotted it down: Some artificial agents or robots learn as they operate. Thus, those who design or deploy them may not be able to control them or even predict what their agents will do. There is an ethical but also a very practical side to this.

DEBORAH: So, just to give you an example, this is the stuff that I am now hearing from people who work in artificial intelligence, who are designing these systems that are huge and make decisions. And just to give it some reality, think about something like the international stock exchanges, how those systems are in place and becoming more and more automated. So what they are saying, which is somewhat true now, is that those systems are so complicated, and will become even more complicated, that no humans really understand how they work. And to some extent that is already true, although there are ways to monitor and figure it out. So the kind of dark futuristic scenario would have these things so much in control that no one knows how to do anything when things start going wrong.

JAN: In the initial phase of the industrial revolution, we had these workers smashing the spinning jennies. Is that what we will have to do? Smash these robots?

DEBORAH: I don’t know. But that is the kind of thing I think about when I think about these trading systems. Right now we are kind of in an escalating warfare. I was at one of these conferences in which these things were being discussed, and there were people there who are designing the counter-technologies. Because, you know, they send out algorithms, bots, to do trades, and there are certain rules about what kind of trades you can make and under what conditions. But these are learning algorithms, and you tell the algorithm to figure out which stocks to buy to make the most money. So one of the scenarios which I found very interesting was that there were some people who were worried that the algorithms, the bots, might figure out ways to make illegal transactions. So they would put in bids to buy something to get the price up, and then not buy it, or something like that, which violates the rules. What they were worried about is the fact that nobody would be able to detect this or figure it out, so you would have these algorithms, these bots, out there working for Goldman Sachs, making lots of money for them, illegally. So these people were trying to design algorithms to monitor those.
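
The illegal pattern Johnson describes, placing bids to push the price up and then cancelling them, is commonly called spoofing. Here is a minimal sketch of the kind of rule-based monitor she mentions, with invented order data and thresholds; real exchange surveillance is far more sophisticated.

```python
# Minimal sketch of flagging a spoofing-like pattern in an order log:
# many buy orders cancelled before execution, followed by an actual sell.
# Order records and thresholds are invented for illustration.

from dataclasses import dataclass

@dataclass
class Order:
    trader: str
    side: str        # "buy" or "sell"
    size: int
    cancelled: bool  # True if pulled before it could execute

def looks_like_spoofing(orders, cancel_ratio=0.8, min_buys=5):
    """Flag when nearly all of a trader's buy orders were cancelled
    and at least one sell actually executed."""
    buys = [o for o in orders if o.side == "buy"]
    executed_sells = [o for o in orders if o.side == "sell" and not o.cancelled]
    if len(buys) < min_buys or not executed_sells:
        return False
    return sum(o.cancelled for o in buys) / len(buys) >= cancel_ratio

log = [Order("bot1", "buy", 1000, True) for _ in range(6)]
log.append(Order("bot1", "sell", 500, False))
print(looks_like_spoofing(log))  # True: the pattern the monitors hunt for
```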

JAN: But you know, Goldman Sachs doesn’t need bots to do this, they can do it on their own.

DEBORAH: But they can do it on a much bigger scale with bots.

JAN: That is true.

DEBORAH: I mean, you are right, they could do it on their own.

JAN: As we know. Why do you think there is so little popular debate and discussion on where the progression of technology takes us?

DEBORAH: I think these deals that have been implicitly made have somehow pacified people. I don’t really understand it.

JAN: Perhaps it is too complex?

DEBORAH: Too complex?

JAN: Yes, too complicated to understand.

DEBORAH: I don’t really know that people want to understand.

JAN: That is a good point.

DEBORAH: That somehow the inconvenience… we have been framed into consumers, and we are treated as consumers and we are rarely called upon as citizens.

JAN: And we love to be treated as consumers. Well, anyway.

DEBORAH: Well, I don’t know about you, but for me it is also a complete failure of our educational system in the U.S. Probably not here, but.

JAN: More or less. I would like to quote you again. In one of your papers you write: ”Public understanding of AI is being shaped in a way that may ultimately impede AI research.”

DEBORAH: Why am I saying that?

JAN: Yes. And what would you like to do?

DEBORAH: This is kind of me hoping for the self-destructive technique. So what is happening now is very complicated: all these AI researchers who are developing these algorithms are expressing this AI anxiety, and they are talking about how they are worried about AI taking over the world and… So if you step back and think about what we have been talking about, how does the public get informed at all? These stories are actually telling the public to be afraid, rather than to accept. So a part of me is sort of like, oh, this is good.

JAN: So good for the film industry.

DEBORAH: Well, it’s true. And people love sci-fi stories, but ultimately they have to see the connection between their lives and these scenarios.

JAN: But you want another discourse.

DEBORAH: I want another discourse, yes.

JAN: Shortly.

DEBORAH: As soon as possible. Another discourse, but the question is also where that discourse should take place and what it should be. Because I’m well aware that you can’t expect individuals and citizens to know everything there is to know about these technologies. I mean, you should be educating them, but I can’t know everything about nuclear power, nanotechnology, et cetera. So it seems to me that we need some other ways of… And one way is this kind of responsibility thing, where the experts in those areas would take some responsibility for what they are making, and would actually make an effort to do it well, rather than to do it out of control.

JAN: Is that what you think they are doing?

DEBORAH: Well, it is certainly what they say when they… Yeah, I had a moment at this conference in which I was saying: ”Why are you creating all this anxiety?”, and this guy basically said: ”Because we are worried and want something to be done.” The problem is that when they express this anxiety about AI, they also suggest it is out of human control, and I’m trying to say that it is in human control.

JAN: In one of your articles in this book, you cite a researcher by the innocent name of ’Sparrow’, who writes: ”Weapons that require human oversight are likely to be at a substantial disadvantage in combat with systems that can do without”. He has a point, hasn’t he?

DEBORAH: Yes. So he’s imagining that if you have autonomous robots in warfare, they will do better than people who don’t have autonomous… So what he is worried about there is that competition in warfare will lead to more and more technologies that are out of human control. I don’t know that that’s true, but I certainly do worry that the forces of competition, and not just war competition but capitalist competition, push things toward more autonomous systems. It is really efficiency and, you know, competition that pushes us to more and more… So yeah. But my gripe with him is really more that when they call these things autonomous, they are somehow misleading us. Because they are not really.

JAN: There is someone behind them.

DEBORAH: There is someone behind them, if not in creation, then at the point of operation. And it is always a person, you know, the military, the government, who puts that autonomous thing out there.

JAN: By the way, speaking of the people behind: we have a guy by the name of Barack Obama, who has introduced us to the killer drones. Where is the ethics in this procedure?

DEBORAH: I don’t know. What is interesting to me is, I went to a conference at a military camp with military folks who have been out there, and they actually said that drones are causing us to lose more, because we are losing the hearts and minds of people in these countries.

JAN: I wish that were right.

DEBORAH: Well, I think it’s true that we are losing their hearts and minds, but somehow our…

JAN: For how long?

DEBORAH: Well, I think it’s just that my leaders don’t care.

JAN: Who is protesting?

DEBORAH: There are some people who protest, and there is a kind of movement to try to get autonomous weapons banned; there is a U.N. declaration. I mean, it hasn’t passed, but. Not very much.

JAN: You might get the impression that someone has decided that some people don’t deserve a fair trial, or a trial at all. Just kill them.

DEBORAH: Yeah. I don’t know what to say about that.

JAN: Not much.

DEBORAH: I have watched it happen.

JAN: Yes, we all did.

DEBORAH: There was a time when assassinating someone was just wrong, and now we publicly discuss whether to assassinate someone.

JAN: And with this Nobel Peace Prize winner at the head of it. Wonderful.

DEBORAH: Yeah. I don’t know, I mean that is a complicated story.

JAN: Yes. We will see what happens to the next president.

DEBORAH: Yes.

JAN: Our time is running out, but one final question. As you know, news media in many countries, including this one, are considering publishing their news stories and feature articles through Facebook. What would you recommend?

DEBORAH: Well, I mean, I haven’t thought that through very much. I think the question is… I certainly wouldn’t want that to be exclusive; in other words, I wouldn’t want it to mean that if they publish their stories there, they can’t publish them elsewhere. But no, I would think that would be supporting…

JAN: But the problem is more… Like the story with this Vietnam picture. Nudity and whatever, which the Facebook editors don’t accept.

DEBORAH: So, the question is, we’re not going to avoid censorship.

JAN: Right.

DEBORAH: The question is: ”Who do you want doing the censoring?” I don’t want it to be a private company like Facebook. On the other hand, Facebook does have, and I’m just trying to be fair here, a history of its users protesting what it is doing, and then Facebook sort of coming around, so there is some responsiveness there.

JAN: But most often protesting in vain? I think.

DEBORAH: In the cases I can think of, Facebook actually did come around. But I think there is a lot of stuff they already censor, especially if it gets in the way of them making money. I just don’t see the point of journalists publishing on Facebook. I mean, it must be because they think they get a bigger audience, maybe? I can see that, but it still worries me. Not everyone is on Facebook, so.

JAN: Yes. We have been bouncing around and of course we could have done much more bouncing. But this was the hour, thank you very much.

ASTRID: Thank you so much. I was wondering, Jan, do you think we could, if there are a couple of questions, is that okay? Yeah?

Audience: I don’t know if I have a question, I’m trying to make up a question, but. Listening to you was very interesting. I speak for myself, I am old. When I see young people today, they are in a way living in a virtual world. They are playing with computers, they are playing games, whatever, even really young people. Especially against this privacy complex which at least we old people have, do you think there is a sort of evolution here, where young people, when they get older, don’t need that much privacy anymore, because they have lived in a totally new world, a virtual world? They have lived more intensely together, without being together. They are sitting all over, playing with the Chinese, playing with the Brazilians, playing with Norwegians, and they have a sort of community where privacy is going away. I’m not sure, I’m not sure what the question was.

DEBORAH: I mean, I have the same back-and-forth in my thinking… You know, they have grown up in a different world, with a different experience, and it is very different from my own. And I used to think maybe privacy wouldn’t be as important, but my experience now, in teaching, is that once they ”get it,” they do care about privacy. And a lot of what they tell me now, you should ask around, is that a lot of these young people are a lot more careful about what they put on Facebook than they used to be. So part of this is that we have been in a learning and transition phase, and it may be, if I am optimistic, that Facebook somewhat morphs into being a little different than what it is now. That is my hope, because the reality is that the lack of privacy will affect many of them personally. So we tell them, these are again students, that when they go to interview for jobs, employers will look at their Facebook sites, you know. So their lives will be affected by the lack of privacy. But it seems that it is a slow learning process, and it may be too late, I don’t know. So I go back and forth. I mean, I do try to give them credit; they know something that we don’t know. They understand something, their lives are different than our lives. They are much more peer-to-peer than we are, than I ever was.

Audience 2: You said something about the education system lacking somehow; could you explain more what you meant by that?

DEBORAH: Yeah, I just think, I mean, I don’t understand it as well as I should, but I don’t see the educational system in the U.S. preparing students for democratic citizenship, and there is a lot of variation in the U.S. from state to state and community to community. I have really good students at the University of Virginia, and they are capable of a kind of formal critical thinking, if you know what I mean; if you ask them to write an essay critiquing something, they can sort of do it, but they are not big critical thinkers, you know what I mean. They don’t think about their lives critically and make choices. Now, why that is, I don’t… I mean, what I can tell you, and what it seems like, is that there has been so much pressure about getting jobs and getting ahead. My students are very focused, and so they don’t have the luxury, it seems; they don’t even consider a wide range of options.

Audience 2: So they are not critical of the system. Is that what you would want them to be, more critical of the systems?

DEBORAH: Yeah, critical thinkers in a positive way, not necessarily against everything, but critical, wanting to change the world. So there was a kind of phase, and I think it still exists; we do have a lot of students who care about the environment and do want to change the world in that way, but they… and maybe through that they get political, but they are not political. It is very much like the election you are watching in the U.S.: you grow up among Republicans, or you grow up among Democrats, and that’s it.

Audience 3: I was just thinking about that. Could that have something to do with income inequality? Like in the 50s and 60s there was a lot more income equality, and maybe the kids felt safer to think big and weren’t as anxious, you know, ”I have to..”.

DEBORAH: But the anxious people are the middle class as well; it’s not just the poor kids that are thinking narrowly. My experience is just the opposite: I have these kind of well-off students who are very narrowly focused.

Audience 3: Yeah, they would be more likely to have parents who are more pushy.

DEBORAH: Yeah, exactly. It is like… But they follow what their parents tell them to do much more than people used to, than my generation did. That is one of the things that I do puzzle about. I don’t really understand it, but I think our educational system has also fallen into the consumer market. We are certainly a victim of that at the university, where we have to make our students happy. We are not allowed to challenge them too much or say certain things in the classroom or whatever. It’s just such a different world, I don’t know what to say, and I don’t know what it’s like here, but when I went to college, I went with the understanding that it would change me.

JAN: And it did?

DEBORAH: It did. Now students come just for this layer of education.

Audience 3: It’s a product.

DEBORAH: It’s a product that they’ll add to their resume. I mean, they are better than that, I hate to be so critical, but there is something, there is this change going on. It does have to do with this technology that makes you more insular. So even friends, you know: you now have much more control over who your friends are, through the technology, instead of figuring out how to live with the people in your neighborhood. Yeah.

Audience 3: Yeah. I have another question. It’s a very interesting topic, because I was thinking about that as well: could it have something to do with urbanization, the fact that so many of us live in big cities, and that makes it easier for us to pick friends as well?

DEBORAH: It is urbanization together with this technology. You know, I was raised in the city, right. You would think that being urban would make us closer together, and therefore have to deal with each other and be more interdependent, but because of this technology we are allowed to create these little isolated communities online, and so we move closer to some and more distant from others.

ASTRID: Okay, so thank you so much for interesting questions and answers. And who knows, maybe there is somebody watching us here, or there could have been, that we don’t know about. Earlier we talked about that Norwegian-developed drone that weighs 16 grams and is sold to friendly militaries in other countries.

JAN: Friendly, indeed.

ASTRID: Yeah. And at 16 grams, like you said, it could go anywhere here and we wouldn’t see it. But we trust that we are here alone. Thank you so much.