
Game Changers But Not Brain Changers

With technology evolving so rapidly and our brains remaining the same, we have to learn to cherish our most prized possession -- our humanware.


In her TEDTalk, Louise Leakey explained how archaeology has informed us about where we came from and, perhaps, where we are headed as a human species. I was particularly interested in Leakey's explanations of how our brains differ from those of our ancestors and how that affects our development as a relatively young species.

I have spent the past 30-plus years studying the "psychology of technology," investigating how children, teenagers and adults use technology. Back in the mid-1980s the most common technology was the television, and it was nearly always viewed alone. By the early 1990s the Internet was becoming available to the general public through the World Wide Web, and email was becoming an important workplace tool. These changes, however, took years to penetrate society. Using the consumer adoption metric of reaching 50 million people, the personal computer took nearly two decades to reach that milestone; the Web took a mere four years. And the trend has accelerated, with new technologies now penetrating society in a matter of months rather than years or decades.

In the 1980s and well into the 1990s, the typical techie might access a handful of devices on a daily basis, nearly always one at a time. Now, with what I consider the first true game changer since the Web -- the smartphone -- even the typical user has so many options that people access their technology multiple times an hour, and studies show that a single technology is rarely used alone. The old television set, which produced "couch potatoes," is now rarely, if ever, watched without an accompanying smartphone. In one study we discovered that nearly three in four teenagers and young adults check their smartphones every 15 minutes or less, no matter what they are doing at the time.

With a powerful device in our pocket or purse, we now have nearly infinite options when we "slide to unlock." On my iPhone's front screen alone I have 26 options, including a camera, music player, text messaging app, email client, newsreader and calendar. And that is one screen out of seven! Right now those apps are all clamoring for my attention, and it is virtually impossible for me to ignore the red alert indicators and the constant beeping, ringing and vibrating telling me of even more incoming alerts. And that doesn't count the fact that my brain keeps reminding me I haven't responded to that Facebook post by my daughter.

It is certainly great that we can connect with our world through so many vehicles and find information about anything with a few taps of our fingers, but research indicates that having so many available options leads us to try to cram more into the same amount of time, which can only be accomplished by multitasking or rapid task switching. While there is some contradictory evidence about whether we can truly multitask, it is clear that our brains are not evolving in a way that allows us to handle so many sources of information and communication. While Leakey talks about brains evolving over millions of years, we are looking at a brain that is incapable of handling the incoming onslaught with the level of attention we once devoted to solitary television viewing. And that brain is not going to evolve any time soon.

We are faced with a dilemma. We have tools that allow, encourage and entreat us to attend to their alerts, yet we have a brain that simply cannot handle the incoming onslaught at any meaningful level without allocating scant attention to everything. At best we can give "continuous partial attention" to the incoming information and communication, which may result in shallower assimilation and the stress of constant task switching. In one recent study, for example, we found that teens and young adults, even knowing they were being observed while studying, were able to focus for only three to five minutes before having their attention diverted, usually by an incoming text alert or an internal reminder to check social media.

If we are to thrive with nearly infinite technological options and our finite human brains, we have to learn (or relearn) the skill of attention. We have to learn to avoid both the pesky external distractions and the silent, internally self-generated interruptions. It will take practice. In recent Huffington Post blogs I have offered some ways to help you get the most from your brain, including "technology breaks" to expand your ability to forgo the alerts for up to 30 minutes, and "brain resetting" breaks every 90 minutes to let your brain calm down from incessant multitasking and technology use. With technology evolving so rapidly and our brains remaining the same, we have to learn to cherish our most prized possession -- our humanware -- above the clarion call of our software and hardware.

