12:04PM May 15, 2020
Note: This conversation was published in "Welt am Sonntag" (May 31, 2020, in German). http://weigend.com/files/press/KahnemanWeigend2020_de.pdf
"Nothing in life is as important as you think it is when you are thinking about it."
This, Daniel Kahneman told me, is his favorite line in "Thinking, Fast and Slow". My name is Andreas Weigend. I was the chief scientist at Amazon. I would like to invite you to this 40-minute conversation with Daniel Kahneman on decisions and data. We will discuss trade-offs in decision making, like the current one between lives and livelihoods, and we'll talk about some of the fears people have, including fears about their data, the fear that data can be used against them. We will also suggest some ideas, including differential privacy, that might be helpful in the current crisis. But first, here is the former chief executive and president of NPR.
My name is Vivian Schiller. I'm the Executive Director of Aspen Digital. We are a program of the Aspen Institute. And I'm very pleased to be your moderator for this session. And I'm incredibly excited to hear that conversation. The topic of this conversation is incredibly timely, not just because we're in the middle of a pandemic, but because of the particular moment of the pandemic that we're in. We're now eight weeks into lockdown and around the country, in a patchwork manner, parts of the country are beginning to open up. So at this critical moment, data has never been more important. It has also never been so inconsistent, for reasons having to do with both logistics and politics. And most of all, and this is where this session comes in: Never has our imperfect human reasoning and decision making in the face of data been so consequential. How we choose to conduct ourselves literally puts lives at risk, our own and others. Luckily, we have just the people to unpack what's going on when it comes to our data, and our very human responses to it. Their bios are so spectacular that I can hardly do them justice in just a few words. So with deep apologies for what I leave out, let me introduce our two panelists who you're going to meet in a minute.
First is Andreas Weigend. He is an expert on the future of big data, social mobile technologies and consumer behavior. He teaches at Stanford, Berkeley, and in China. He was previously Amazon's chief scientist, where he helped create the firm's data strategy, and he is the author of "Data for the People: How to Make Our Post-Privacy Economy Work for You". With him is Danny Kahneman, Daniel Kahneman. He's been called the world's most influential living psychologist. He basically created the field of behavioral economics. He's a professor at Princeton, and a fellow at Hebrew University. He's been the recipient of countless awards, among them the little-known Nobel Prize in Economic Science and the Presidential Medal of Freedom. He is the author of many books, including "Thinking, Fast and Slow", the contents of which will no doubt inform this conversation. Again, we're so honored to have both of them with us. I'm about to turn it over to Andreas, who will have an in-depth conversation with Danny. I'll be back after their conversation in about half an hour to ask a few of my own questions, but mostly to ask your questions. So with that, I'm going to turn it over to Andreas. We'll bring both Andreas and Danny up on screen now. Hello to you both. Hello. And I will say goodbye to you for a little while. I'll be back later. And I'll leave it to you for the conversation.
Thank you, Vivian, and hello, Danny! So when you introduced me, Vivian, you pointed out that I was the Chief Scientist at Amazon. Many people are wondering, what would a chief scientist at Amazon do? And the answer is, what any scientist does: experiments. When I was there, we had maybe 50 or 100 million subjects. And it was amazing to come up with hypotheses about human behavior, set up the experiment, and to then see, sometimes within minutes, how people actually react. Now that, of course, was nothing compared to what we have now. Namely, an experiment called COVID that involves 7 billion people. When I was reflecting on what insights I might have - for instance, how has my behavior changed? - I was wondering what the underlying general principles are. For example, all my friends started cooking, but you know, this is not a cooking show. I am curious, what can we learn from this experiment? The other thing I want to refer to that you mentioned is the subtitle of my book, the "post-privacy economy". Meanwhile....
... meet my friend Otter. The website is otter.ai. It is an amazing app that records audio and transcribes in real time. So for example, if you right now go to weigend.com/kahneman, you see the transcript happening in real time. Now imagine that this app opens when you get up in the morning, when you start talking (or maybe even before if you talk in your sleep), and closes in the evening, except maybe a couple of moments when you actually don't want it to record. How would that change how you do things? ("When Everything Is Recorded" is also how my book starts.)
I remember showing this app to Danny a year ago. In terms of context: a few years prior to that, I showed Danny another app, Google Latitude, that showed on my website my geolocation in real time. I thought that was amazing. And Danny only said, "I'm not sure whether I want the world to know that." So I think one topic in this conversation about data is data about individuals, and about the behavior between people. And of course, in the context of COVID, data for tracking and tracing.
Maybe I'll tell one more story. A friend of mine went back to Hong Kong, and was given a bracelet upon arrival at the airport. He went with his mom, his mom downloaded an app. And he paired his bracelet with his mom's app. And, you know how moms are, they don't always think about charging their phone, and he didn't know about it. The next day at noon--knock knock--two policemen came to check in on them. I think it's interesting that we live in a time where in some way, maybe we should be giving up civil liberties, like privacy. But my worry is: what will happen down the road when we don't have the COVID problem anymore? Will governments keep on checking up on us?
But I'm jumping ahead here. I'm just excited to be here. And I would like to start by asking Danny a question I have always been curious about: Danny, what is a good question?
Oh, come on.
A good question, for one thing, is a question that has an answer. A good problem is a problem that has an answer. That rules out many problems that don't have answers. Further, a good question has an intelligent answer, an answer that is new, one that tells people something they haven't known before. That's a good question.
I also think that a good question is a question where the answer points to potential trade-offs, not just "yes or no". It shows that there are different aspects. Good questions, I think, reveal those trade-offs. And then it's up to us, both as individuals and as society, to think about the big questions, including COVID. It's up to us to figure out what weights we want to put in front of the different terms.
You are talking about a special kind of question--you're talking about decisions, about questions about what you want to do. There, of course, trade-offs are of the essence. And the most interesting ones are trade-offs that we don't want to face. Because the important questions do have that characteristic. Trading life against anything, or trading love against anything... Those are the trade-offs that people don't want to think about. And in some situations, like in the present, those are the trade-offs that you MUST think about. So that's the problem that's with us these days.
I remember last year, I had just come back from Germany, from a meeting with Angela Merkel and the "Digitalrat", the German Digital Advisory Council, and a lunch presentation to some CEOs about what I am passionate about, namely making the implicit explicit. And it turned out that they just did not want to hear it. When I saw you a couple of days later in New York, the first thing I told you was that I felt that lunch presentation had really failed. And you asked me, "What did you tell them?" and I told you what I told them. And then you said--to my surprise--that you were NOT surprised. And you said, "Actually, Andreas, here are some reasons why decision analysis is dead." That sentence of yours has stuck with me for the last year. I would love to understand where decision analysis is well and alive, and which aspects of it are dead, and why.
Well, about 50 years ago, decision analysis was in its prime. Today, you can't even be sure that people will know what decision analysis is! It was a way to elicit from people both their beliefs about what's going to happen, and the values that they attach to things, for example, the trade-offs that are involved. And after obtaining from the relevant people the values and the probabilities, you could compute the values of different options. And I remember thinking at that time, with my colleague Amos Tversky, that this is going to conquer the world. It seemed to be so obviously the way to think rationally. But it turns out it hasn't caught on. And it hasn't caught on, I think, for an interesting reason. Which is that you have leaders in place in organizations who do not want a program that will compute decisions. They want to make them themselves. And so decision analysis, in fact, hasn't caught on at all. I wouldn't say that it's dead, because there are many people who call themselves "decision analysts", and there is a thriving Society for Decision Analysis. But the kinds of problems where decision analysis is used are in places like oil companies, which need to make decisions about where to drill. I think that Chevron has hundreds of decision analysts. Now, what makes that a good question is that this is a question that the CEO of Chevron is quite willing to delegate. It doesn't engage him personally. He wants it mechanically to be done well. It's important; billions of dollars are at stake. But it's a repeated decision that he is willing to delegate. In contrast, decisions that are unique, the CEO, the leader, doesn't want to delegate. They won't want them computed either.
What I'm interested in is, of course, anything about people. Think about face recognition. Think about emotion recognition, not only through the visual, but also through the voice. And think about a future where there will be no more secrets. Danny, as you probably remember, I had a house in Shanghai for 17 years, from around when we met until two years ago. And I've worked with Alibaba. I've worked with Tencent, and I know a bit how these companies think about data, and what the role of the government is. And that's part of the reason why I'm out of China (laughter). The key question is: Given we know so much about people, what should we do? How, as individuals, should we deal with it? I do want these data of the people and by the people to be data for the people. Data we actually benefit from.
Speaking for myself, I don't think I really care if there is somebody watching me. When I hear this--what they can do with it, and what they will do with it--I think, Otter transcribing what I say in my sleep really doesn't matter. What really matters is whether somebody can use that and exploit that in some way.
Actually, I never really understood your enthusiasm for knowing, and for recording, where you are, at every minute. It doesn't seem to me to be very useful. Because, you know, it will take you in real time a whole day to know where you've been all day. And what's the point of that, that I never understood. But there are certain things where the automatic recording of things is useful, for example, the fact that photographs label themselves, also with a date stamp. Some features are useful. Other features I don't care about. But I want guarantees that they won't be used against me, or that they won't be used in a bad way against anyone.
Yes. My fascination with geolocation is that if you know time and space, you can recall many activities you did. For example, if you want to know when you were at a certain restaurant, you just search for that restaurant, etc.
Geolocation is very timely these days, because there have been those proposals. Google and Apple have been developing an app that uses geolocation to figure out whether somebody who turned out to have COVID was near other people. Israel has been following people for a long time. They have that facility off the shelf. Now, interestingly enough, what I hear from my European friends is that this is a non-starter in Europe. This use of technology to follow people around, even for their own good, is a no-no. In Europe, there is a real deep fear of data. And it's true, I suppose, that if the data exist, it's going to be difficult to guarantee that they won't be misused. If we could guarantee that they won't be misused, if there was an organization we could trust not to misuse them, and to have those intentions forever, the existence of data wouldn't be a problem. But in Europe, there is what seems almost an allergy to technology, certainly to American technology, to any technology that threatens privacy. Is that what you encountered in Germany?
Absolutely. I have been wondering quite a bit what the reasons are for people's genuine data fears. There is no question. They're real. And in Germany, we can see, after terrible things, Hitler-Germany, Stasi-Germany, that people are afraid for data to exist. I think it's time to try to figure out what these fears really are. I don't know the answer. But understanding data fears would be a great topic for researchers to spend some time on. And we need to address those that can be addressed. For example, the crypto world has solutions for problems that initially, without crypto, could not be solved. Differential privacy is an example of how you can learn about the behavior of a group without revealing anything about the specific individuals. There are some good examples in my book about that. Differential privacy is an example of how to protect the privacy of the individual, and, at the same time, be able to learn something about the group.
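The mechanism behind this idea can be sketched in a few lines. The following toy example (not from the book; all numbers are made up for illustration) uses randomized response, one classic building block of differential privacy: each individual's answer is noisy enough to be deniable, yet the group-level rate can still be recovered.

```python
import random

def randomized_response(true_answer: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer with probability p_truth; otherwise flip a coin.
    No single response proves anything about the individual."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5

def estimate_rate(responses, p_truth: float = 0.75) -> float:
    """Recover the group rate: E[observed] = p_truth * rate + (1 - p_truth) * 0.5."""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Simulate 100,000 people, 30% of whom have some sensitive attribute.
random.seed(42)
population = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(x) for x in population]
print(round(estimate_rate(reports), 3))  # close to the true rate of 0.30
```

The point is exactly the trade Kahneman and Weigend describe: each stored record is plausibly deniable, but the statistic the public health authority needs survives the noise.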
In general, actually, I'm always very suspicious of false trade-offs. When authorities tell you that something is a trade-off, but it isn't really, it is so they can perpetuate their authority. One of the questions I always have is when somebody says, "Well, this is your choice" is: "Is it really?"
Let me use health as an example, because I fear insurance companies could abuse the information that they see in your patterns. They could infer that you might be higher risk, and then, in a very non-transparent way, quote you a much higher rate than they would have, had they not known that about you. That's why I used health data as an example where I would worry. What other things do I worry about? I worry about things that I say that governments might not like, and then lock me up behind bars.
Nobody can argue with you on those fears. The real issue here is: What guarantees do citizens have, in this day and age, against high-tech organizations that are enormously powerful, and against governments that are enormously powerful? This is a problem because, obviously, more and more power is being concentrated. We need more guarantees. But ultimately, this is going to happen.
I was wondering why you were interested in geolocation. Geolocation data are being collected. There's no question. And they're going to be available. The issue is really how they're going to be used and what controls we can have. As for health, it's true private insurance companies are going to be interested in anything that can help them cherry-pick their patients. And in a way, to the extent that they have shareholders, this is, by their own admission, their responsibility to their shareholders. So I'm not sure that they should exist. But that's another problem.
So what I'm curious about is: We met 20 years ago, when you were in your 60s. I'm now in my 60s. What have you learned in those 20 years?
Oh, I hope I've learned a lot! But... what kind of an answer would you like? This is a question that gives me no hint about what's the proper answer.
There is no proper answer! I'm just curious as your friend who has seen you in many different situations: When you look back over those 20 years, what was important to you?
Over the last 20 years? You know, it's been work and family all my life. It's been work and people. So that hasn't changed. And, you know, my guess is that I have forgotten more than I have learned over the last 20 years. This is certainly the case over the last few years. I'm forgetting more than I'm learning.
But I remember reading about how social media and preference analysis leads to polarization.
And in preference analysis, all you have is an algorithm that finds out what you're interested in, and what your basic position is. And then it presents you with versions of your basic position that are more extreme than your current position, because you will find them more interesting than other things. So any algorithm that maximizes clicks or maximizes attention is going to create that sort of polarization.
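The dynamic Kahneman describes can be made concrete with a toy simulation (my construction, not his; the positions, step sizes, and update rule are invented for illustration): the algorithm always shows content slightly more extreme than the user's current position, and the user drifts toward what they are shown.

```python
def recommend(position: float, pull: float = 0.1) -> float:
    """Offer content slightly more extreme than the user's current position
    (positions live on the interval [-1, 1])."""
    extreme = 1.0 if position >= 0 else -1.0
    return position + pull * (extreme - position)

def simulate(position: float, steps: int = 50, adapt: float = 0.5) -> float:
    """The user's position drifts toward the content they consume."""
    for _ in range(steps):
        shown = recommend(position)
        position += adapt * (shown - position)
    return position

print(round(simulate(0.1), 3))   # a mildly positive user drifts toward +1
print(round(simulate(-0.1), 3))  # a mildly negative user drifts toward -1
```

Even with a tiny per-step pull, the feedback loop is one-directional, so two users who start almost at the center end up at opposite poles; that is the polarization mechanism in miniature.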
That is why it's so important to negotiate, as a society, what objective function we want a company like Facebook to optimize for. For Amazon, it is simple, sell as much stuff as possible. Nobody questions that. For Facebook, it's very tricky. It can't just be the clicks. In a data-literate society, this is what we should be talking about.
Well, I'm going to jump in here on that very, very big question that I don't know we're going to get a chance to answer today. I'm tempted to let you guys just talk for the rest of the hour because it's so interesting. But I do want to make sure that we can get to questions. We've got a number of questions coming in from people who are watching this right now. I'm going to start with a few of my own. Listening to this conversation, there are so many different threads we could pull, including what you're talking about now: Facebook and Amazon. And trust. But that's going to be a whole other Socrates session that we should set up for another time.
Now, I'm going to look at everything through the lens of the current pandemic. I noticed there are two main threads that each of you represent here, both in your comments and in your bodies of work. Andreas, in the trade-off here between public health and privacy, how do we protect data? How do we protect human privacy while serving public health? Danny, how do people behave, respond? How do they process what's happening, and how do they synthesize the data in determining how they're going to behave? Danny, at the beginning of our session, you said something quite intriguing that I want to follow up on. You said, the idea of trading life for something, trading love for something, for anything... no one wants to think about it. And you then followed up with a very provocative notion: But now they have to! So I want to ask you to talk a little bit more about that. We are in a situation where all over the world, certainly, but specifically in the US, individuals are having to make trade-offs in the face of uneven guidance and direction at the local, state, and federal level about how they should behave, which effectively does come down to: What are the trade-offs we are making for lives, ours and others'? What does the current pandemic say about human decision making?
The trade-offs that I was thinking about are at the societal level, not at the individual level. For me the most important statistic about COVID is the fact that (if I'm to believe what I heard this morning) 97% of the dead are 65 years old and older. That means that there is an enormous difference, and really an enormous conflict of interest, between the young and the old.
This is the trade-off that people don't want to face, and that societies don't want to face. And I think they should. I really think it is important, even if it's very uncomfortable for people like me: There is a conflict of interest that we refuse to recognize at this time between the young and the old. This is an example of a societal trade-off that's hugely important in the lives of individuals. Individuals have to make decisions that they're not really equipped to make. Primarily, they've got to decide how afraid they should be. People are afraid of things they shouldn't be afraid of, and they are not afraid of things they should worry about. And that is a problem. Not so much, I think, a trade-off because people don't think that way. But I think at the moment, and in the near future, when reopening takes place, fear is going to be a determining factor in the behavior of lots of people. That is going to be an interesting issue.
It's interesting that you identify fear as the determining factor in terms of individual decision making. Point taken about what's happening at the political level. It strikes me, and I'm curious about your view, that a lot of it has to do with how worried I am about my own health, my own life, and my own livelihood, versus what is my responsibility to society? Do you see that there is different reasoning or thinking going on in the face of the same information in terms of my responsibility to myself versus my responsibility to my fellow citizens?
I would say that, in the same way that soldiers in wars don't really fight for the country, they fight for their buddies, the responsibility that people feel about, for example, self-quarantine, I first thought was an altruistic act. I'm told that I was in contact with somebody infected--I don't even know who--and now I'm supposed to leave my work for 14 days. I thought this is a very difficult thing to ask for. And I was talking to my daughter, and she said, it's actually extremely easy. You are responsible for the people you work with, and they will know that you are infected, and they will know that you might have suspected you were infected, and you just don't want that to happen. So it's at that level. I don't think there is a conflict. I don't think that people do a great deal in terms of their responsibility to society. You know, they do recycling, but that is not hard. A difficult thing like self-quarantining for 14 days, not working, and giving up income, those are the kinds of things that people will certainly do for their acquaintances and for their friends. I don't think that they will do it for an abstract idea. You cannot count on this.
Yeah, your analogy of the soldier in war actually is fascinating. The idea that they're not fighting for the country but they're fighting for their buddies. But in the end, it has the same effect.
Andreas, I want to just turn to you and then we're going to go to the audience. Andreas, your book "Data for the People" really foretells the dilemma we are in today when it comes to contact tracing apps in many ways. And the points that you made about privacy, of course, are paramount, depending on where you are in the world.
Have your views about privacy changed now that we're talking about the potential of saving human lives, as with contact tracing apps?
I still believe that we should have rights to our data. And those rights come in two flavors, if you will. First, transparency rights, to see what is being collected about you (let's say with that wristband in Hong Kong), and to see what the data refineries do with it. Second, agency rights, empowering you to act upon the data. There is no big change in my view: those rights should rest with the consumer, with the citizen. And our duty is to help people become more data literate, to help people understand what really is going on, or what could be going on, so they can make better informed decisions. I was actually surprised, when I reread some parts of the book a couple of weeks ago, how relevant it is in COVID time.
I want to interject here. I think that the current situation is one which I think you did not anticipate in your book. It's a situation where people create externalities. The reason that two policemen come to your door, or to somebody's door in Hong Kong, is to protect citizens from somebody who heedlessly could infect them. I'm not sure that people have the right to infect other people. This is the kind of right that can be agreed on by society. And you know, it's debatable. There clearly are trade-offs involved. But, in principle, it is not absurd for a society to abridge the rights to geolocation to prevent people from infecting others.
Your comments are perfectly distilling the societal debate. I do want to move now to the questions from the audience. With apologies for anybody's name I mangle, I'm going to start with a question coming from Sergio Maldonado who is with PrivacyCloud in Spain. Sergio's question is for you, Danny. He asked: "Is it fair to say that there is no such thing as purely data-driven decision making, as long as it is humans who make such decisions?"
Well, as long as it is humans who make the decisions, I think it's right that there is no such thing as purely data-driven decision making, because emotions and values are part of any decision we make. So data are never sufficient to guide the decision. There are values somewhere, and you cannot do it with data alone.
Thank you. The next question is from Aparna Mukherjee. She asks: "Is there an opportunity for a concept like deliberative polling to enable better public health decision making? I'm trying to think of how to address the data fears that Andreas mentioned." Andreas, how about if you take that one?
I don't quite know what deliberative polling is, to be honest. And you, just reading the question, might not know either. May I change the question a bit, and reflect it to Danny, because, after all, he is the person I ask the questions I don't know the answers to. How do people actually know what they want? You spent your life, Danny, showing people that their assumptions about how they think are mostly wrong. You changed the way the world thinks about thinking. So talking about polling, what do you think?
Deliberative polling is a very interesting innovation. It's the interesting idea of having people think more deeply about the questions that they are asked, and in many cases, thinking together, with other people. This kind of questioning of the public requires groups to get together and to think before answering a question. This technique is interesting for democracies, and I think it should be used more.
Great, thank you. I didn't know what deliberative polling was either. So thank you for that. The next question comes from Kathryn Myronuk. She asks what current reactions to COVID-19 are most surprising to you. This could be groups reacting well or reacting badly; what is missing or present that you think caused this reaction? Danny, since we have you up, let's start with you. Is there anything about the human responses--either at the societal or at the individual level, as we were discussing today--that has surprised you?
Well, I surprised myself. It's something that I knew in principle. I knew that people have a lot of difficulty with anything exponential. And that's the characteristic of an epidemic. It's exponential. And that's something I knew. I must have mentioned it in my teaching. But on March 6, I was about to fly from Los Angeles to Paris. And in Paris at that time, there were about 100 cases. And it looked like not very much, not a lot. But you know, it was growing at 27% a day at that time, which means that over a month, it would grow by a factor of 1000, unless it was checked. I was incapable of seeing that. I really didn't see that. I didn't go to Paris, but not because I appreciated that 100 is actually a very threatening number when it's a number that's associated with an exponential function. So that's one of the things that surprised me personally, the difficulty that I have thinking about exponentials, and the difficulty the public has, including epidemiologists, by the way. There were surveys of epidemiologists about what they thought the situation was going to be 10 or 12 days later. And they were missing it, because it's so non-intuitive: When something is doubling every three days, you know, it will be multiplied by 16 in a very short time.
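The arithmetic Kahneman walks through is worth writing out, because the numbers really are that counter-intuitive (the 100 cases and 27% daily growth are his figures; the code is just the compounding):

```python
def cases_after(initial: float, daily_growth: float, days: int) -> float:
    """Project case counts under constant exponential growth."""
    return initial * (1 + daily_growth) ** days

# 100 cases growing 27% a day: roughly a factor of 1000 over a month
print(round(cases_after(100, 0.27, 30)))  # on the order of 130,000 cases

# Doubling every 3 days multiplies cases by 16 in just 12 days
print(2 ** (12 / 3))  # 16.0
```

A hundred cases looks harmless as a level; attached to a 27% daily growth rate, it is a hundred thousand cases a month later.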
Well, if someone like you who deeply understands exponentials still has a hard time making that intuitive leap, then that explains why the rest of us have had a hard time as well. So thank you for that. Andreas, I'm going to put the same question to you: What has been most surprising to you? By the way, I got a correction on the pronunciation of Kathryn's last name. It's Kathryn Myronuk. Yeah. So now I'm on the record.
Let me first respond to Danny's point. I absolutely agree that the concept of the exponential, of a process being characterized by the time it takes to double, for instance, is something which is super important for people to understand. And it really is brought home with COVID. But as a physicist, as an experimental physicist, I also think that there is another point which is important to be understood. That is that measurements have errors--false positives, false negatives, etc., in the tests. So, if anything good comes out of COVID, maybe people will become more data literate, as they are thinking for themselves and maybe discussing with others what the implications are.
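Why test errors matter so much is itself a small Bayes calculation. The sketch below uses purely illustrative numbers (they are not the characteristics of any actual COVID test): when few people are infected, even a quite accurate test produces mostly false positives.

```python
def positive_predictive_value(prevalence: float,
                              sensitivity: float,
                              specificity: float) -> float:
    """Probability that a positive test result means actual infection (Bayes' rule)."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# Illustrative: 1% prevalence, 95% sensitivity, 98% specificity
print(round(positive_predictive_value(0.01, 0.95, 0.98), 2))  # 0.32
```

With these assumed numbers, fewer than a third of the people who test positive are actually infected, which is exactly the kind of implication a data-literate public would want to think through before acting on a single test result.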
To answer the question about surprise: I am surprised how happy I am. I did not expect that. Had you asked me to imagine staying home for weeks, I would have said that that's terrible--not having dinner with my friends, not being on airplanes, etc. But it turned out that if I take a, say, one-month moving average of my happiness, this is among the happiest times of my life. And when I told Danny that in the prep talk we had a couple of days ago, he said, well, and what did you learn from that?
(laughter) And what was your answer?
I think what I love is to have my assumptions challenged, to see the world with new eyes. Now, here is a big caveat. I say this as a person who is lucky to have a nice house in San Francisco, who can afford the food, etc. Sadly, it is a terrible situation for the many homeless people here -- their situation is really a disaster. But from my perspective, what I learned is, being home is not that bad.
Well, I have to admit it, I feel the same. Unfortunately, we have so many more questions than we have time to answer. But this has just been an incredible conversation. I want to thank you. I'm actually now going to turn it over to Cordell Carter who's going to say a couple of words.
Please join me in thanking Vivian, Andreas, and Danny, a Nobel Prize winner, for joining us today. This is exactly the type of engaging content we're trying to bring to our global community of almost eight thousand leaders every single week. And we'll see you next time. Thank you all so much, and see you soon.
Thank you all.
Thank you. Bye bye!
So that's it. Thank you for taking the time to listen to this conversation with Daniel Kahneman on decisions and data. I am Andreas Weigend. I don't have a regular podcast or blog or anything like that. But if you're interested in more, please go to weigend.com/next . That takes you to a simple Google form where you can let me know who you are, and what you are interested in. Thank you.