A version of this post originally appeared on Forbes. Sign up for Caroline's newsletter to receive her latest articles to your inbox.
The name of Microsoft's first and still most popular operating system, Windows, came from a then-revolutionary development: windows on a computer screen.
Before the mid-1980s, most personal computers were text-only and displayed just one task at a time. Then Xerox, Apple and eventually Microsoft designed graphical user interfaces that imitated real desks, with folders, multiple papers and tasks available simultaneously.
The "desktop metaphor" was initially done with tiles (think: Microsoft's colorful four-box logo as applications side by side), but subsequent versions favored an overlapping system with minimizing and maximizing windows.
This function was novel. So novel, in fact, that Windows 1.0 included a computer game that relied on mouse control rather than a keyboard to accustom users to clicking, selecting and toggling between tasks on the screen.
Today, of course, we need no prodding. Millennials were born after the invention of windows, so we've never known life or work without them. Multitasking with technology has become so ingrained in our work processes that it's bewildering to imagine we ever did anything without it.
But, then, you've heard the problem: Multitasking compromises our visual awareness, divides our attention, distracts us, reduces our job satisfaction, stunts our memory, impairs cognitive function and sabotages our performance.
These consequences are typically presented as side effects. But the biggest problem with multitasking isn't what it causes; it's what multitasking makes us into.
The U.S. Chamber of Commerce reports that multitasking is rewiring our brains, enabling "multiple tasks to be processed in more rapid succession." As our brains adapt to task management, we lose "our ability to think deeply and creatively," Nicholas Carr writes in The Shallows. The better we get at multitasking, the worse we get at creative problem solving. Summing up the research, Carr explains that multitaskers are "more likely to rely on conventional ideas and solutions rather than challenging them with original lines of thought."
What this means for developing brains and advancing technology is uncertain. Paul Gardner-Stephen, a telecommunications fellow at Flinders University, believes we'll one day become dependent on the Internet to solve our problems. Or, more moderately, perhaps people will just solve easier, less urgent problems. Our tendency -- maybe even our new biological instinct -- will be to address small, Google-able annoyances rather than large, systemic, time-consuming challenges.
In other words, as Carr observed, we'll become more like computers ourselves: quick, efficient task executors. In fact, "multitasking" was initially a computer word, not a human word. "Multitask" first appeared in a 1965 IBM paper, referring to a computer's ability to process multiple tasks simultaneously.
The word was applied to humans several decades later. Writing for The New Atlantis, Christine Rosen notes that in the late '90s and early 2000s, "advertisements started celebrating the use of technology for doing lots of things at once," and multitasking became a defined skill on resumes.
Today we use the words "efficient," "effective," "methodical," "productive" and "fast processor" to describe what we want in ourselves, not just for our technology. Meanwhile, millennial multitasking is considered a boon to the workforce.
When I recently returned to my alma mater, Colorado College, I had dinner with the Student Alumni Association. A career counselor asked the seniors how they planned to pitch the block plan -- CC's unique schedule, where students take just one class at a time for three-and-a-half weeks -- to potential employers. They all answered similarly: We can do lots at once, balance extracurriculars with an intensive course load, handle relentless assignments, and so on.
But I see the strength of CC's schedule as exactly the opposite: in this multitasking machine world, we need to train young people to focus on one problem at a time.
This isn't just because they'll do better in the modern job market as a result (though they will); it's because sustained focus keeps them fundamentally human, and that's something we seem to be losing.
Caroline Beaton is a workplace psychology journalist.