Paying attention: Focus and distraction in the digital age

Today’s most popular social media platforms are essentially ad-revenue-creating attention machines. Their success hinges on their power to bring content to users, users to content, and users to users in a way that maximizes engagement. Photo from iStock.

Most of us are familiar with the headline-grabbing report by Microsoft five years ago that our average attention span had shrunk to a meagre eight seconds. That’s less, we were told, than that of the humble goldfish (gasp!). It didn’t seem to matter much that subsequent efforts to locate and evaluate the research behind this claim came to naught. Nor did it matter that it makes little sense to talk about overall “attention span,” or make comparisons to a species that lacks a neocortex and therefore any executive functions such as attentional control. In fact, no amount of debunking could put this pseudo-fact to rest. Why?

We believed it because it felt right. It captured the prevailing sentiment that we’ve become spread too thin for our own good—attending to too much, over-connected, chronically interrupted, overly distracted, unable to concentrate as needed. Homo distractus emerged as the new face of humanity, at least for those on the “haves” side of the digital divide.

All this attention to attention is hardly new. Literary scholar Natalie Phillips identified it as a preoccupying theme in fiction as far back as the 18th century. Focus and distraction are, it might be said, the yin and yang of modernity, complementary forms of our increasingly overtaxed capacity for attention.

In the 1930s, the Anglo-American poet T.S. Eliot described urban dwellers caught up in the grind of mechanical time as “Distracted from distraction by distraction / Filled with fancies and empty of meaning / Tumid apathy with no concentration.” That sounds eerily familiar. Like holding up an antique mirror to see ourselves in the present. But is it a fair reflection?

Attention, as a dialectic of focus and distraction, has become central to understanding mediated life in the information economy. If we are our experiences, and there are more possibilities to experience than ever before, then the highly selective allocation, surrender, and capture of attention determines what we become—and all that we can become—as individuals and as a society. William James understood this over a century ago when he wrote, “My experience is what I agree to attend to.” James, however, could not foresee a world where the concentrating and diverting of attention would become as much a matter of seduction and compulsion as one of autonomous choice.

The new millennium has provoked a good deal of hand-wringing over the battle for attention in our wired world. “What are the costs to a society,” asks media law scholar Tim Wu, “of an entire population conditioned to spend so much of their waking lives not in concentration and focus but rather in fragmentary awareness and subject to constant interruption?” Author and blogger Andrew Sullivan goes further: “This new epidemic of distraction is our civilization’s specific weakness. And its threat is not so much to our minds, even as they shape-shift under the pressure. The threat is to our souls. At this rate, if the noise does not relent, we might even forget we have any.”

Celebrated tech writer Nicholas Carr joins the scrum to describe our Faustian pact with our digital devices: “We willingly accept the loss of concentration and focus, the division of our attention and fragmentation of our thoughts, in return for the wealth of compelling or at least diverting information we receive.” In the same vein, former Microsoft exec Linda Stone laments: “In a 24/7, always-on world, continuous partial attention as our dominant attention mode contributes to feeling overwhelmed, overstimulated, and unfulfilled. We are so accessible, we’re inaccessible.”

There’s no doubt that the acceleration of digital life has made us busier and more scattered than ever before. So much so that we’re even distracted by things that merely remind us of distraction. Research by Adrian Ward and his colleagues reveals that simply having our smartphones in sight, turned off and face down, is enough to reduce attentional capacity and impair cognitive performance. The iPhone as kryptonite.

Today’s most popular social media platforms—Facebook, Instagram, YouTube, Twitter, Pinterest, Reddit, Snapchat, and WhatsApp—are essentially ad-revenue-creating attention machines. Their competitive success hinges on their power to bring content to users, users to content, and users to users in a way that maximizes engagement. Success is reflected in the number of active users, time spent on the platform, and depth of participation (communication, sharing, purchasing, linking, content creation, etc.).

In a proliferating mediascape suffused with alternatives, platform providers understand that the effective modulation of affect is key to winning the attention game. Users navigate these virtual spaces as a form of emotional regulation, in a desultory bid for amusement, excitement, sexual arousal, communion, corroboration, acknowledgment, shock, awe, disgust, voyeuristic or exhibitionistic thrills, enlightenment, and inspiration, depending on their mood.

Charles Baudelaire’s poetic image of the 19th-century flâneur, meandering the streets of Paris in search of stimulation and diversion, has been reconstituted as today’s social media user chasing the next dopamine rush. To enhance the chances that users will find or receive what they’re looking for, the platform relies on a personal history of clicks, visits, likes, shares, posts, and social connections to custom-fit content to desire. The result: a continually refreshed reservoir of individualized distraction.

Popularized terms like multitasking, continuous partial attention, and hyper-reading reflect the reality that mobile digital life has spawned a culture of fractionated and fickle attention. Take, for example, the evolution of the phenomenon known as “phubbing”: preferring the company of your smartphone to that of the people you’re physically with. A decade or so ago, head-shaking Gen X and Boomer nostalgics would post “look-what-we’ve-come-to” pictures of jaded-looking Millennials dining together at restaurants. In the pictures, no one was looking at anyone else; all were staring intently at their phones. It’s not uncommon to see the youth of today looking much the same, but also managing to carry on some semblance of conversation with others at the table as they blithely text, scan, or browse. Behold the increasing powers of the divided and unfocused mind! Such is progress.

Recent jeremiads about the hazards of digital distraction remind us that the issue is fundamentally a moral one. We learn this early in life. Beginning in preschool, we’re praised for focusing on the task at hand and faulted when we succumb to distraction. By middle school, the stakes are raised. Too much daydreaming, doodling, fidgeting, or talking in math class would land you in the hallway or principal’s office a generation or two ago. Now you run the risk of being diagnosed with ADHD and prescribed Adderall. Whatever the institutional response, the guiding precept remains the same: staying focused is good, losing it is bad.

The distrust of distraction runs deep in Judeo-Christian thought. In the Book of Proverbs, Solomon exhorts his children to “Let your eyes look straight ahead; fix your gaze directly before you…Do not turn to the right or the left; keep your foot from evil.” This casting of distraction as a pathway to sin reached its high-water mark in the 17th-century theological musings of Blaise Pascal. “The only thing that consoles us for our miseries is distraction,” wrote Pascal in the Pensées, “yet that is the greatest of our wretchednesses. Because that is mainly what prevents us from thinking about ourselves and leads us imperceptibly to damnation.”

Pascal’s warning strikes the 21st-century reader as extrême. A private moral panic of sorts. Few of us fear for our immortal souls when sneaking in a quick round of PUBG or Subway Surfers at the bus stop. And yet the denunciation of distraction and valorization of focus carries on today. In our therapeutic age, it no longer hinges on the threat of perdition. It’s now about finding happiness, success, and personal fulfilment in a mediated world of excessive distraction. Just as our cultural obsession with thinness has grown in proportion to our expanding waistlines, our concern with mental focus has intensified in step with the growing range and power of our distractions. A sampling of popular self-help titles tells the story: Living beyond Distraction, Habits of Purpose for an Age of Distraction, Staying Focused in Times of Distraction, Indistractable, Find Your Focus Zone, Delivered from Distraction.

The last of these deals with the challenges of living with ADHD (attention-deficit/hyperactivity disorder), the diagnostic totem of our distractible society. There has been a lot of passionate debate in recent decades over whether ADHD is a “real illness,” although much of that turns on what exactly is meant by real and illness. Suffice it to say that it’s real and disruptive enough at the high end to have received medical notice for at least two-and-a-half centuries. Writing in 1775, the German philosopher-physician Melchior Adam Weikard characterized patients with profound inattentiveness as follows:

He studies his matters only superficially; his judgements are erroneous and he misconceives the worth of things because he does not spend enough time and patience to search a matter individually or by the piece with the adequate accuracy. Such people only hear half of everything; they memorize or inform only half of it or do it in a messy manner. According to a proverb they generally know a little bit of all and nothing of the whole.


This does sound a lot like the attention deficit dimension of ADHD. But it also sounds like just about every Gen Z digital native staring at a screen today! And there’s the rub. ADHD is not what philosophers like to call a “natural kind.” Natural kinds (things like tigers, H2O, and coronaviruses) have fairly clear-cut classificatory boundaries that remain indifferent to our preferences, purposes, and biases. Simply calling ADHD a “neurodevelopmental disorder” and pointing to a host of weak biomarkers doesn’t transform it into a natural kind. The truth is that the definition and diagnosis of ADHD is and always has been inextricably bound up with cultural standards of what is acceptable, appropriate, manageable, adaptive, and functional in the range of human behaviour. And cultural standards rest on shifting sands.

The increase in the diagnosis of ADHD since the 1970s has many causes, to be sure. Critics have argued that one is the pathologization of fairly common behaviours, increasingly seen as detrimental both to individuals’ educational and occupational prospects and to the task of managing those individuals within private and institutional settings. The ADHD literature is rife with discussion of “under-performing” and “difficult” children and, more recently, adults. Given this threat, it is little surprise that 70 percent of children and youth diagnosed with ADHD in Ontario are put on some rectifying medication, usually a stimulant. Nor is it surprising that an increasing percentage of post-secondary students report using these same stimulants non-medically to enhance their academic performance.

Another cause of the apparent growth of ADHD, some have argued, is the increasing penetration of advanced information, communication, and entertainment technologies into everyday life. A kaleidoscopic array of easily accessible, mobile, and insistent digital distractions can prove irresistible to those with pre-existing attention control problems. In such a hyper-stimulating environment, these problems can easily intensify into a diagnosable mental disorder.

Induction into the networked world of distraction appears to be occurring at younger and younger ages. Who hasn’t seen that toddler in the park swiping manically on an iPad instead of marveling at the mystery of a butterfly or bonding with parents? Some say they’ve seen infants doing the same, but I’m doubtful. Or perhaps just not ready to believe.

Even so, before we rush off to sign up for that mindfulness class, commit to a digital detox, or download the latest “attention management” app, we should give distraction its due. After all, there are times when it’s good for us.

First, we often rely on distraction to take our minds off physical or emotional pain that would only worsen in the moment if we focused on it. Think of those early childhood vaccinations. Similarly, distraction can be effective in resisting obsessions, compulsions, and addictions.

Second, releasing the mind to wander from sensation to sensation or thought to thought can reveal unconscious patterns that help us better understand ourselves. The technique of free association in Freudian psychoanalysis is a well-known example. Here, giving consciousness full license to travel where it wants can bring to light repressed psychic content of which we’re unaware.

Third, distraction can help us escape the “cognitive fixation” that blocks us from coming up with creative solutions to problems. It can also invite imaginative connections that inspire innovation, insight, and artistic expression. The German chemist August Kekulé claimed to have solved the puzzle of the benzene molecule by daydreaming about a snake swallowing its tail.

Fourth, there are situations in which a deliberate “deconcentration of attention,” as the Russian psychologist Oleg Bakhtiyarov calls it, is highly adaptive. Avoiding attentional focus in these situations allows for simultaneous awareness of the entire perceptual field and therefore more effective responding. Combat shooting and the monitoring of a complex array of indicators in a nuclear power plant are examples.

Returning to the ethics of attention, it should be noted that interpreting a situation as an instance of focus rather than distraction is often a matter of evaluative perspective. Take Leonardo da Vinci’s well-known painting of the Annunciation, which hangs in the Uffizi in Florence. It shows the Archangel Gabriel informing Mary that she will give birth to the Messiah. Mary has her right hand on the book she was reading before Gabriel arrived with the big news. When describing the event as depicted, few would claim that Mary was “distracted” from her reading by Gabriel. If anything, more would be likely to say that she was distracting herself by reading before Gabriel arrived to focus her attention on who and what is of far greater import for everyone concerned.

“Annunciation”, a painting attributed to the Italian Renaissance artist Leonardo da Vinci, dating from circa 1472–1475. It is housed in the Uffizi Gallery in Florence, Italy.

The point here is that what counts as mere distraction and what as corrective change of focus depends on our value commitments with regard to what one ought to be doing in a given situation. Parents accuse teens of distracting themselves from their homework with silly video games such as Final Fantasy or Call of Duty. Teens, in turn, accuse parents (and teachers) of distracting them from these same games with silly homework. Duck or rabbit? It depends on the moral squint.

So where does all this leave us? Are we really giving less attention to each of the myriad experiences that make up our lives? Perhaps. The average web page visit lasts less than a minute, we’re told. And only 20 percent of the words on that average page are read. But isn’t that to be expected when there are over four billion web pages out there on over eight million web servers? And where content providers are buying or ensnaring our limited attention with greater and greater ingenuity?

Then there’s our ever-expanding rhizome of virtual connections, people whose flipbook lives we feel compelled to follow, acknowledge, support, criticize, comment on, and interleave with the projected minutiae of our own. In this context, is it any wonder that we’re no longer willing to devote much time and attention to any one thing or person?

Unwilling or unable? That is the crucial question. Are we still able to read, watch, and listen closely and with profound consideration? To take the time to look under, behind, and around things to see their full meaning when insight is sorely needed? To sit and reflect at length on a single issue when answers elude us?

Hearing some aging professor complain that today’s students no longer have the patience to read Tolstoy’s War and Peace, or produce a sustained 10,000-word argument, doesn’t tell us much in this regard. It may simply be that few students today are interested in reading novels of over a thousand pages, or writing long, turgid essays. It doesn’t necessarily speak to their capabilities. Performance is not competence.

What matters most is whether changes in our habitual patterns of attention—our increased openness to interruption and accelerated pursuit of distraction—have somehow impaired our ability to persevere single-mindedly if and when we want to. And that remains unclear. I am perversely reassured, however, by images of steely-eyed teenagers playing the same video game to the point of hunger, dehydration, and exhaustion. Watching these harbingers of our attentional future as they grind away on sweaty controllers or keypads, can any of us say with confidence what “slouches towards Bethlehem to be born”?

Romin W. Tafarodi is Associate Professor of Psychology at the University of Toronto. He can be reached at [email protected].
