On Wednesday, I hosted a discussion with former secretary of defense Ashton Carter, who is now the director of the Belfer Center for Science and International Affairs at the Harvard Kennedy School. The conversation was part of WIRED’s CES programming, which tackled the biggest trends that will shape 2021, from medicine to autonomous driving to defense. We took questions from viewers in real time. The conversation has been lightly edited for clarity.


Nicholas Thompson: You’ve had an incredible 35-year career in the US government and in the private sector, working for Republicans and Democrats, always trying to identify what the most important issue of our time is, the smartest solutions to it, and the fairest ways to think about it.

When you were secretary of defense, you had a very rational policy that in every kill decision, a human would have to be involved. So if there were an artificial intelligence weapon, it could not make a decision to fire from a drone. And the question I’ve been wondering about is whether that justification remains the same for defensive weapons. You can imagine a future missile defense system that can more accurately than a human identify that there is a missile incoming, more accurately than a human aim the response, and more quickly than a human make the decision. Do we need to have humans in the loop in defensive situations as well as offensive situations?

Ash Carter: Well, defense is easier in the moral sense than offense. No question about it. By the way, Nick, you used the phrase “person in the loop.” I don’t think that’s really practical. It’s not literally possible, and hasn’t been for quite some time, to have a human in the decision loop. What you’re talking about instead, or we’re both talking about, is how do you make sure that there is moral judgment involved in the use of AI? Or, said differently, if something goes wrong, what’s your excuse? You stand before a judge, you stand before your shareholders, you stand before the press, and how do you explain that something wrong was not a crime or a sin? And so let’s be very practical about that.

If you’re selling ads, of course, it doesn’t matter that much. OK, I pitched an ad to somebody who didn’t buy anything: a type-one error. Or I failed to pitch an ad to somebody who might have bought something: a type-two error. Not a big deal. But when it comes to national security, the use of force, law enforcement, or the delivery of medical care, the stakes are much too grave for that.

Now, within defense, offense is the most somber responsibility, and defense less so. For example, nuclear command and control is heavily human-loaded, starting with the president of the United States. I, as secretary of defense, had no authority, and the people below me had no authority. To the extent it is possible, we were disabled from launching nuclear weapons; we required a code from the president. I carried a code myself that would authenticate me, because that is the gravest act of all.

A simpler case is launching an interceptor missile at an incoming missile. Now, if that goes wrong or you do it by mistake, a missile goes up in the air and explodes, you’ve wasted some money, and it’s embarrassing, but there’s no loss of life. That decision the president delegated to me, and I in turn delegated it to the commander of the United States Northern Command, General Lori Robinson. And it was OK that it went down to a lower echelon, because it’s a less grave act. In between is the authority to shoot down an airliner, which is also grave, but the president did delegate that to the secretary of defense, so that was kind of an in-betweener, and I bore it on my shoulders every day. And you’d be surprised how many times that happens, where an airplane goes off course, the radio’s not working, it’s heading for the US Capitol, and that’s a no-win situation. So it does depend.
