“Do Nothing” leadership doesn’t mean that you can play golf every day. Instead, it means doing less than you did in your last job so you can focus your time and effort on facilitating and orchestrating. Thus, “Do Nothing” leaders don’t really do nothing in a literal sense. Instead, they think of great strategies and help others implement them. They spend their time preparing for the future. They take a broad, comprehensive view, even as they attend to key details, so they can confidently choose the right forks in the road, and sometimes even create those forks in the road. In essence, “Do Nothing” leaders think, make key decisions, help people do their jobs better, and add a touch of organizational control to make sure their final recipes come out okay.
Are there ever times when it pays for a leader to literally “Do Nothing”? Obviously, leaders might find it useful to pause and consider options when they really don’t know what to do, i.e., when they are faced with a novel situation that requires insights and skills that they don’t have. These are times when it is critical to avoid doing anything until you can get relevant information.
Recent research has also identified another key time when leaders should literally “Do Nothing.” Two new research papers, one that was just published in a leading management journal and the other that will soon appear in a leading psychology journal, both suggest that leaders who face moral decisions should stop and Do Nothing!
Shaul Shalvi, Ori Eldar, and Yoella Bereby-Meyer investigated whether people would limit their dishonest behavior if they had enough time to think about their choice and if they didn’t have an easy justification for being dishonest. In their experiments, people rolled a die and reported the results of their own roll. They knew that a six would pay more than a five, which would pay more than a four, and so on. Because they rolled the die inside an upside-down cup with a hole in the top, only they could see their result – and they knew that no one else could see it. This gave them an easy opportunity to inflate their results, since no one would be the wiser.
In the first of their two studies, people were told that they would only report the results of their first roll of the die, but they should also roll the die two additional times. The instructions emphasized that their payoff would only depend on the outcome of their first roll.
The extra rolls made it easy for people to construct what psychologists call ‘counterfactuals,’ i.e., thoughts about what could have happened (even though it didn’t). People think of counterfactuals all the time, e.g., “If only I had set a second alarm clock, I wouldn’t have slept through the first one and missed my plane” or “If you had warned me, I would never have done something so foolish.”
When subsequent rolls of a die yield higher outcomes than the first roll, as they often do, it is easy for people to think “If only I could report that result rather than my first.” Thus, previous research has shown that people lie more when they roll a die several times.
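The arithmetic behind this temptation is simple: a single fair roll averages 3.5, but the best of three rolls averages nearly 5, so an attractive counterfactual is almost always on the table. A quick simulation (my own sketch, not part of the published studies) shows the gap:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

N = 100_000
sum_first = 0
sum_best_of_three = 0
for _ in range(N):
    rolls = [random.randint(1, 6) for _ in range(3)]
    sum_first += rolls[0]          # what an honest reporter must report
    sum_best_of_three += max(rolls)  # the tempting counterfactual

print(f"average first roll:    {sum_first / N:.2f}")          # ~3.50
print(f"average best of three: {sum_best_of_three / N:.2f}")  # ~4.96
```

In other words, roughly speaking, two times out of three a later roll beats the first one, which is exactly the justification the extra rolls hand to would-be liars.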
In addition to rolling the die three times, the people in this first study were split into two groups: half were asked to report their outcome quickly (within 20 seconds), while the other half could report at their leisure, i.e., with no time limit at all.
Odds tell us that rolling a 1, 2, 3, 4, 5, or 6 is equally likely. Thus, if everyone truthfully reports the outcome of their first die roll, the average report should be right around 3.5. Instead, the average for the slow reporters was 3.87, suggesting that some of them reported a better outcome than they actually received. The quick reporters, in contrast, lied even more, reporting an average of 4.56. This suggests that waiting and doing nothing before making a moral decision increased truth-telling.
In their second experiment, a different group of people rolled the die only one time. Previous research has shown that this reduces lying because people don’t have an immediate counterfactual that they can use to justify lying. Again, people were required to report their outcome either right away – within eight seconds – or with no time pressure at all.
This time the data suggest that the people who had no time pressure did not lie at all: their average report, 3.42, is just below the 3.5 we would expect from honest reports of a fair die. People who had time pressure, however, again seemed to inflate their outcomes, reporting an average of 4.38.
Many previous research studies have shown that, when people think that they are acting anonymously and lying pays, many people lie. These results suggest that people act more ethically and lie less if they take their time deciding and they don’t have easy justifications for lying.
In another, independent study, my colleagues Brian Gunia, Long Wang, Li Huang, Jiunwen Wang and I also presented people with an opportunity to lie to obtain a better payoff. Rather than rolling a die, the people in our experiment had a very clear choice: they could send a truthful message to an anonymous person in another room and receive a payoff of $4 or they could send a message that was not true to that same person and markedly increase their chances of getting $6. Thus, we tempted them to lie so that they could boost their pay by $2.
Some of the people in our study experienced what we called a “contemplation condition”: they sat in front of a computer screen for three minutes before they could send their message. The entire time, the screen displayed the message, “Please think very carefully about which message to send.” Once the three minutes were up, these people knew what they were going to do, and they chose very quickly.
In contrast, people who were in our “immediate choice condition” were asked to send their message right away. On average, it took them about 11 seconds. Thus, their choices were not literally immediate, but they certainly didn’t contemplate for long.
The results? Almost 90 percent of the people in our contemplation condition sent truthful messages, compared to just more than half (56%) in our immediate choice condition.
People in both the contemplation and the immediate choice conditions could also embellish their message by adding “this is the truth” if they wished. (This made a lie even more unethical.) None of the contemplators told a lie and called it the truth, but 18 percent of the immediate choosers did. We might call them ‘really big liars.’
After the experiment was over, it was interesting to see what people said about their choices. Truth-tellers emphasized the fact that this was a moral situation and that social norms supported telling the truth. People who lied, in contrast, felt that this same situation focused on and supported self-interest. Unlike the truth-tellers, they also said that they viewed the situation as being more business-oriented. Thus, individuals’ explanations for their decisions were internally consistent: they were very good at justifying their actions, whether those actions were to lie or to tell the truth. (To their credit, the people who lied did express a bit of regret for their actions.)
The bottom line from these two studies is very clear: don’t rush into moral decisions. Instead, take your time and think things over for a little while. Theories of morality suggest that our immediate reactions are often rooted in our evolutionary heritage: long ago, acting in one’s own interest helped our ancestors survive. If they passed these successful tendencies down to us, as evolution suggests they did, then our instinctual, immediate response to temptation may well be to act only in our own personal interest.
Over our lifetimes, and particularly in today’s civilized social world, we learn that lies are immoral acts that often have reverberating consequences, ones that rarely provide a simple satisfaction of our self-interest. Thus, when we take the time to do nothing but think about our moral decisions for a little while, it increases the likelihood that we will overcome our temptations and do the right thing. In the end, then, to Do Nothing may truly help us Do the Right Thing.
Gunia, B. C., Wang, L., Wang, J., Huang, L., & Murnighan, J. K. (2012). Contemplation and conversation: subtle influences on moral decision making. Academy of Management Journal, 55, 13-33.
Shalvi, S., Eldar, O., & Bereby-Meyer, Y. (2012, forthcoming). Honesty requires time (and lack of justifications). Psychological Science.