by Leon Neyfakh
You hear it all the time: We humans are social animals. We need to spend time together to be happy and functional, and we extract a vast array of benefits from maintaining intimate relationships and associating with groups. Collaborating on projects at work makes us smarter and more creative. Hanging out with friends makes us more emotionally mature and better able to deal with grief and stress.
Spending time alone, by contrast, can look a little suspect. In a world gone wild for wikis and interdisciplinary collaboration, those who prefer solitude and private noodling are seen as eccentric at best and defective at worst, and are often presumed to be suffering from social anxiety, boredom, and alienation.
But an emerging body of research is suggesting that spending time alone, if done right, can be good for us — that certain tasks and thought processes are best carried out without anyone else around, and that even the most socially motivated among us should regularly be taking time to ourselves if we want to have fully developed personalities, and be capable of focus and creative thinking. There is even research to suggest that blocking off enough alone time is an important component of a well-functioning social life — that if we want to get the most out of the time we spend with people, we should make sure we’re spending enough of it away from them. Just as regular exercise and healthy eating make our minds and bodies work better, solitude experts say, so can being alone.
One ongoing Harvard study indicates that people form more lasting and accurate memories if they believe they’re experiencing something alone. Another indicates that a certain amount of solitude can make a person more capable of empathy towards others. And while no one would dispute that too much isolation early in life can be unhealthy, a certain amount of solitude has been shown to help teenagers improve their moods and earn good grades in school.
“There’s so much cultural anxiety about isolation in our country that we often fail to appreciate the benefits of solitude,” said Eric Klinenberg, a sociologist at New York University whose book “Alone in America,” in which he argues for a reevaluation of solitude, will be published next year. “There is something very liberating for people about being on their own. They’re able to establish some control over the way they spend their time. They’re able to decompress at the end of a busy day in a city…and experience a feeling of freedom.”
Figuring out what solitude is and how it affects our thoughts and feelings has never been more crucial. The latest Census figures indicate there are some 31 million Americans living alone, which accounts for more than a quarter of all US households. And at the same time, the experience of being alone is being transformed dramatically, as more and more people spend their days and nights permanently connected to the outside world through cellphones and computers. In an age when no one is ever more than a text message or an e-mail away from other people, the distinction between “alone” and “together” has become hopelessly blurry, even as the potential benefits of true solitude are starting to become clearer.
Solitude has long been linked with creativity, spirituality, and intellectual might. The leaders of the world’s great religions — Jesus, Buddha, Mohammed, Moses — all had crucial revelations during periods of solitude. The poet James Russell Lowell identified solitude as “needful to the imagination”; in the 1988 book “Solitude: A Return to the Self,” the British psychiatrist Anthony Storr invoked Beethoven, Kafka, and Newton as examples of solitary genius.
But what actually happens to people’s minds when they are alone? As much as it’s been exalted, our understanding of how solitude actually works has remained rather abstract, and modern psychology — where you might expect the answers to lie — has tended to treat aloneness more as a problem than a solution. That was what Christopher Long found back in 1999, when as a graduate student at the University of Massachusetts Amherst he started working on a project to precisely define solitude and isolate ways in which it could be experienced constructively. The project’s funding came from, of all places, the US Forest Service, an agency with a deep interest in figuring out once and for all what is meant by “solitude” and how the concept could be used to promote America’s wilderness preserves.
With his graduate adviser and a researcher from the Forest Service at his side, Long identified a number of different ways a person might experience solitude and undertook a series of studies to measure how common they were and how much people valued them. A 2003 survey of 320 UMass undergraduates led Long and his coauthors to conclude that people felt good about being alone more often than they felt bad about it, and that psychology’s conventional approach to solitude — an “almost exclusive emphasis on loneliness” — represented an artificially narrow view of what being alone was all about.
“Aloneness doesn’t have to be bad,” Long said by phone recently from Ouachita Baptist University, where he is an assistant professor. “There’s all this research on solitary confinement and sensory deprivation and astronauts and people in Antarctica — and we wanted to say, look, it’s not just about loneliness!”
Today other researchers are eagerly diving into that gap. Robert Coplan of Carleton University, who studies children who play alone, is so bullish on the emergence of solitude studies that he’s hoping to collect the best contemporary research into a book. Harvard professor Daniel Gilbert, a leader in the world of positive psychology, has recently overseen an intriguing study that suggests memories are formed more effectively when people think they’re experiencing something individually.
That study, led by graduate student Bethany Burum, started with a simple experiment: Burum placed two individuals in a room and had them spend a few minutes getting to know each other. They then sat back to back, each facing a computer screen the other could not see. In some cases they were told they’d both be doing the same task; in other cases they were told they’d be doing different things. The computer screen scrolled through a set of drawings of common objects, such as a guitar, a clock, and a log. A few days later the participants returned and were asked to recall which drawings they’d been shown. Burum found that the participants who had been told the person behind them was doing a different task — namely, identifying sounds rather than looking at pictures — did a better job of remembering the pictures. In other words, they formed more solid memories when they believed they were the only ones doing the task.
The results, which Burum cautions are preliminary, are now part of a paper on “the coexperiencing mind” that was recently presented at the Society for Personality and Social Psychology conference. In the paper, Burum offers two possible theories to explain what she and Gilbert found in the study. The first invokes a well-known concept from social psychology called “social loafing,” which says that people tend not to try as hard if they think they can rely on others to pick up their slack. (If two people are pulling a rope, for example, neither will pull quite as hard as they would if they were pulling it alone.) But Burum leans toward a different explanation, which is that sharing an experience with someone is inherently distracting, because it compels us to expend energy on imagining what the other person is going through and how they’re reacting to it.
“People tend to engage quite automatically with thinking about the minds of other people,” Burum said in an interview. “We’re multitasking when we’re with other people in a way that we’re not when we just have an experience by ourselves.”
Perhaps this explains why seeing a movie alone feels so radically different than seeing it with friends: Sitting there in the theater with nobody next to you, you’re not wondering what anyone else thinks of it; you’re not anticipating the discussion that you’ll be having about it on the way home. All your mental energy can be directed at what’s happening on the screen. According to Greg Feist, an associate professor of psychology at San Jose State University who has written about the connection between creativity and solitude, some version of that principle may also be at work when we simply let our minds wander: When we let our focus shift away from the people and things around us, we are better able to engage in what’s called meta-cognition, or the process of thinking critically and reflectively about our own thoughts.
Other psychologists have looked at what happens when other people’s minds don’t just take up our bandwidth, but actually influence our judgment. It’s well known that we’re prone to absorb or mimic the opinions and body language of others in all sorts of situations, including those that might seem the most intensely individual, such as who we’re attracted to. While psychologists don’t necessarily think of that sort of influence as “clouding” one’s judgment — most would say it’s a mechanism for learning, allowing us to benefit from information other people have access to that we don’t — it’s easy to see how being surrounded by other people could hamper a person’s efforts to figure out what he or she really thinks of something.
Teenagers, especially, whose personalities have not yet fully formed, have been shown to benefit from time spent apart from others, in part because it allows for a kind of introspection — and freedom from self-consciousness — that strengthens their sense of identity. Reed Larson, a professor of human development at the University of Illinois, conducted a study in the 1990s in which adolescents outfitted with beepers were prompted at irregular intervals to write down answers to questions about who they were with, what they were doing, and how they were feeling. Perhaps not surprisingly, he found that when the teens in his sample were alone, they reported feeling a lot less self-conscious. “They want to be in their bedrooms because they want to get away from the gaze of other people,” he said.
The teenagers weren’t necessarily happier when they were alone; adolescence, after all, can be a particularly tough time to be separated from the group. But Larson found something interesting: On average, the kids in his sample felt better after they spent some time alone than they did before. Furthermore, he found that kids who spent between 25 and 45 percent of their nonclass time alone tended to have more positive emotions over the course of the weeklong study than their more socially active peers, were more successful in school, and were less likely to self-report depression.
“The paradox was that being alone was not a particularly happy state,” Larson said. “But there seemed to be kind of a rebound effect. It’s kind of like a bitter medicine.”
The nice thing about medicine is it comes with instructions. Not so with solitude, which may be tremendously good for one’s health when taken in the right doses, but is about as user-friendly as an unmarked white pill. Too much solitude is unequivocally harmful and broadly debilitating, decades of research show. But one person’s “too much” might be someone else’s “just enough,” and eyeballing the difference with any precision is next to impossible.
Research is still far from offering any concrete guidelines. Insofar as there is a consensus among solitude researchers, it’s that in order to get anything positive out of spending time alone, solitude should be a choice: People must feel like they’ve actively decided to take time apart from people, rather than being forced into it against their will.
Overextended parents might not need any encouragement to see time alone as a desirable luxury; the question for them is only how to build it into their frenzied lives. But for the millions of people living by themselves, making time spent alone productive may require a different kind of effort. Sherry Turkle, director of the MIT Initiative on Technology and Self, argues in her new book, “Alone Together,” that people should be mindfully setting aside chunks of every day when they are not engaged in so-called social snacking activities like texting, g-chatting, and talking on the phone. For teenagers, it may help to understand that feeling a little lonely at times may simply be the price of forging a clearer identity.
John Cacioppo of the University of Chicago, whose 2008 book “Loneliness” with William Patrick summarized a career’s worth of research on all the negative things that happen to people who can’t establish connections with others, said recently that as long as it’s not motivated by fear or social anxiety, then spending time alone can be a crucially nourishing component of life. And it can have some counterintuitive effects: Adam Waytz in the Harvard psychology department, one of Cacioppo’s former students, recently completed a study indicating that people who are socially connected with others can have a hard time identifying with people who are more distant from them. Spending a certain amount of time alone, the study suggests, can make us less closed off from others and more capable of empathy — in other words, better social animals.
“People make this error, thinking that being alone means being lonely, and not being alone means being with other people,” Cacioppo said. “You need to be able to recharge on your own sometimes. Part of being able to connect is being available to other people, and no one can do that without a break.”
The world is obsessed with stories of success. There is a well-known concept in the management literature called “the survivor bias,” which refers to the erroneous conclusions that researchers draw from focusing excessively on successful organizations and people. Pick up any magazine, and you will see the survivor bias in action: the stories are almost always about the successful; very few stories focus on the failures.
At one level, the focus on the successful is understandable; after all, we all want to be successful, and so, focusing on those who have already “been there and done that” would seem appropriate.
There is, however, a flip side. An obsession with success can have negative side effects on what arguably matters even more in life: being happy.
One of the key drivers of success is perseverance and a “never say die” attitude. This is epitomized in a variety of sayings, such as “the harder you work, the luckier you get” (a quote by the South African golfer Gary Player) and “Never, never, never give up” (Winston Churchill’s famous exhortation). The focus on hard work and achieving success appears to have reached a fever pitch in recent years: Even kids in kindergarten are reminded of the importance of perseverance, as the picture of the mural that I took at my son’s school indicates. Children today are so overworked that they don’t get the requisite amount of sleep. All this hard work and focus on goals has probably enhanced our productivity, but what is not as well known is the cost at which such success is earned.
Are successful people necessarily happier?
At least two streams of research are relevant for addressing this question. First is research on ego depletion, which suggests that willpower is a limited resource, much like muscle strength or mental energy. When a person spends willpower on one activity (to study for an exam), the amount of willpower left for a subsequent activity (to overcome temptation to have a dessert) is diminished. This means that when one is obsessed with a particular goal (getting good grades at school), other goals (going to the gym, maintaining healthy relationships, etc.) whose achievement also depends on the same pool of willpower, naturally flounder. Ego-depletion theory would thus predict that the more one is driven to achieve success, the less one will be able to focus on other important determinants of life-satisfaction.
Findings from research on hyperopia also lead to similar conclusions. Hyperopia is the opposite of myopia, and myopia, as we all know, refers to the tendency to be too shortsighted and impulsive. Most of us are constantly warned against being myopic: “Don’t be extravagant, save for the future,” or “avoid unhealthy food for the sake of future health,” or, “exercise regularly to be healthy,” etc. Perhaps as a result of exposure to such messages, many of us are habituated to thinking about the future consequences of our present actions. For instance, rather than choosing to work in an area that is intrinsically motivating, many of us choose to work in an area that we think will be “hot” in the near future. Likewise, when buying a home, we focus too much on whether it is a good investment rather than on whether we will enjoy living in it.
In other words, most of us sacrifice our present-day enjoyment for the sake of a future that may never really arrive, as a set of studies by Kivetz and Keinan showed. These researchers interviewed people in the winter years of their lives and asked them what they would change about their past if they could re-live their lives. Findings from one study revealed that people consistently wished that they had been a little less work-oriented — that is, a little less focused on being successful — and a little more pleasure-oriented — that is, a little more focused on enjoying life. Other studies, both by these authors and by others, yielded similar results.
What this suggests, then, from the perspective of maximizing well-being and happiness, is that it may be more important to give up on goals that take too much out of us than to pursue them at all costs. Studies by Wrosch and his colleagues confirmed this thesis. Across three studies, they found that people who are able to disengage from unattainable goals are happier than those who continue to pursue them. Studies from another paper, also by Wrosch and his colleagues, showed that those who disengage from goals that are exceedingly difficult to attain experience health benefits, like lowered levels of cortisol (the stress hormone).
The million-dollar question, of course, is: how does one decide when to give up a particular goal? This is not an easy question to answer, which is why deciding which goals to give up, and when, is an art rather than a science. Perhaps no single answer is appropriate for everyone. However, if you feel that you are highly stressed (e.g., if you need sleeping pills to fall asleep), and if you feel that your stress is mainly due to your obsession with goal-attainment (as opposed to, say, failing health or poor relationships), you could take it as a sign that you are too goal-directed for your own good.
This is not to say, of course, that any goal that produces stress should be abandoned; indeed, in an earlier post, I argued against giving up too fast — before reaching the tipping point of expertise. To figure out which goals to keep at and which ones to jettison, ask the question: What am I trying to prove — and to whom — by achieving the goal? The only goals worth stressing about are those that help you grow as a person, either by helping you enhance your expertise in a domain or by helping those around you. Goals that are pursued for the sake of making even more money than needed, or for the sake of signaling superiority, are simply not worth losing sleep over.
But even more basic than figuring out which goals to pursue and which ones to abandon is having the clarity to make the goal of leading a happy and fulfilling life your number one priority. Do you have it? If not, be aware that you may grow to be one of those who, like the participants in Kivetz and Keinan’s study, regret having sacrificed enjoyment for the sake of success.