Embrace video games: From Pokemon to Pac-Man, they’ve been changing the world -- and getting girls into tech -- for over 35 years

Gamers still play #Pac-Man more than 35 years later. It also helped get women into technology and made kids interested in bringing personal computers into their homes.

Technological advances always have a downside, as I was reminded last year when the batteries died in the door lock to my hotel room, rendering the key card useless and forcing the staff to summon an off-duty manager to come fix it. That said, with most technology, the good far outweighs the bad. We can presume that hundreds of thousands of lives have been saved through medical advances and communication facilitated by computers and the internet. Recently I've read a barrage of essays about the perceived self-centeredness of today's young adults (and, well, older adults) who walk around glued to their phones, but the ability to contact friends and family in an emergency seems more important than any such self-absorption. The recent Pokemon craze, too, has earned jeers since people began playing the "augmented reality" game on July 6. Players have wandered the streets aiming their phones at parks to catch the imaginary Rattatas or Snorlax said to be hiding there. Some have ventured onto private property, disturbed landmarks, and allegedly caused accidents.

But it’s easy to forget that, besides providing amusement and challenge in a world where we can always use diversions, video games have changed our culture and helped further scientific advancement for more than thirty-five years.

In late 1980, Midway, a Chicago-based manufacturer of amusement games, released the Pac-Man arcade game in the U.S. The game was an instant hit thanks to its distinctive maze, its colorful main character, and gameplay that could go on for hours. Pre-teens like me bonded at pizza parlors and challenged each other to collect all 54 "Pac-Man stickers" while we asked for computers and home gaming systems for Christmas. The game had been developed earlier in 1980 in Japan by Namco, an amusement ride and game company, and then licensed for U.S. release by Midway. Because the game was "friendlier" than shoot-'em-up or sports-heavy predecessors like Space Invaders and Pong, it was credited with bringing more women into the gaming phenomenon (myself included).

I started poring over magazines like Enter, put out by the Children's Television Workshop; Vidiot (from the publishers of Creem); Joystik; and the less interestingly named Electronic Games. My junior high school offered a computer class for the first time. At the same time, my younger brother asked our parents for (and got) a newfangled Timex Sinclair 2068 personal computer, seemingly unprecedented with its 72K of memory, so we could program and play our own games. After much trial and error, we figured out how to program our own version of Pong using the rudimentary BASIC I learned in school and hints from the magazines. The key, we realized, was that when the ball's coordinates matched the paddle's, the ball should bounce back, so we wrote statements like LET A = A - 1 and LET B = B - 1, where A and B were the ball's coordinates. Heady stuff for two pre-pubescent kids, but we really wanted to create our own games. It was that initial interest in the games that spurred an interest in computers and technology.
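For anyone curious what that bounce check amounts to, here is a minimal sketch in modern Python rather than the Sinclair BASIC we actually used; the playfield size, paddle position, and variable names are illustrative assumptions, not a reconstruction of our original program.

```python
# A minimal, modern sketch of the bounce idea described above, written in Python
# rather than 1980s BASIC. All dimensions and names here are illustrative.

WIDTH, HEIGHT = 20, 12                            # text-mode playfield: columns 0-19, rows 0-11
paddle_col, paddle_top, paddle_len = 18, 8, 4     # right-hand paddle covering rows 8-11

ball_col, ball_row = 10, 5                        # ball position (like A and B in our BASIC)
d_col, d_row = 1, 1                               # ball velocity: one cell per tick in each direction

for tick in range(40):
    ball_col += d_col
    ball_row += d_row

    # Bounce off the top and bottom walls.
    if ball_row <= 0 or ball_row >= HEIGHT - 1:
        d_row = -d_row

    # Bounce off the left wall (a second player's paddle would go here instead).
    if ball_col <= 0:
        d_col = -d_col

    # The "key" realization: when the ball reaches the paddle's column and its row
    # lines up with the paddle, reverse its horizontal direction so it bounces back.
    if ball_col == paddle_col and paddle_top <= ball_row < paddle_top + paddle_len:
        d_col = -d_col

    print(tick, ball_col, ball_row)
```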

We were far from alone: suddenly, teenagers were getting rich designing games. In February of 1983, People magazine ran a cover proclaiming an "Invasion of the Vidkids," profiling young programmers who had earned thousands of dollars creating games for software companies before the age of 20. Cultural anthropologist Margaret Mead, in her book Culture and Commitment: A Study of the Generation Gap (1970), had sorted cultures into three categories: postfigurative, cofigurative, and prefigurative. In a postfigurative culture, adolescents learn survival skills from their predecessors; in a prefigurative culture, the future changes so rapidly that parents must learn from their children. By that measure, the 1980s may have been the first prefigurative decade.

At the time, fewer adults had home offices than today (telecommuting would have been difficult back then, despite a seminal 1979 Washington Post editorial encouraging the practice one or two days a week to save gas). So adults didn't necessarily need the newly advertised personal computers in their homes right away. But during the holidays of 1981, 1982, and 1983, kids asked for computers in order to design and play their own games, shepherding along the PC craze more quickly than might have happened otherwise. "Home computers were presented as a friendly introduction to a technology that was going to change the world," wrote British author Emma Mason in "The 1980s Home Computer Boom," published in BBC History Magazine this past February. Computers were advertised to parents as educational, but long before parents might have thought to buy a "PC" for themselves, their kids wanted one.

The prospect of a technological revolution had been brewing well before the early 1980s, of course, but the popularity of games certainly helped. One of the chief architects of the modern digital age was a young game designer. In 1972, Nolan Bushnell, a California engineer who had worked for a manufacturer of coin-operated machines, founded Atari with a co-worker. His game Pong and, later, Asteroids began replacing pinball machines in bars. The company then developed a series of home video game systems (the Atari 2600 and 5200, in 1977 and 1982) and the Atari 400 computer in 1979. At the same time, college dropouts Steve Jobs and Steve Wozniak were debuting microcomputers of their own that competed with Bushnell's, having built their first in Jobs' parents' garage in the 1970s.

Not only did video games increase kids' interest in technology at a very young age, but they have even helped special-needs students fit in and learn. Last year, two researchers at Penn State published a study in Language, Speech, and Hearing Services in Schools finding that "parents indicated video game play was positive for their children with ASD [Autism Spectrum Disorder]" and noting that the games could help the kids bond with mainstream peers. Earlier this summer, a commenter on a British news website wrote, "My 17 yr old has Asperger's and he is so excited about PokemonGo that I took him to NYC, which he doesn't normally want to do. We met a family from Utah and some Canadians all chasing Pokemon! It was the best day ever." And a recent article noted that strangers were playing Pokemon with an autistic teen who had been bullied.

Over the last few decades, video games have helped accelerate changes in technology and culture. Certainly, a game like Pokemon can prove a dangerous distraction for some, but those who exercise caution and respect the law are finding it a gleeful challenge and a gateway to an interest in computers, just as the "friendly" video games of the early 1980s were for me and my peers. While I eventually veered away from programming and into journalism, I still smile when I see kids excited about a new game, remembering how I witnessed the dawn of an unprecedented revolution. And I can't help but wonder: when will the next major technological breakthrough come, and who among today's computer-savvy young people will exercise their brains by being part of it?
