We all know in our heads that science, engineering, and the work of creative people influence our everyday lives. But sometimes it feels more personal.
Chad Ruble's mother had a stroke some years ago and was left with aphasia. According to the National Aphasia Association, about 1 million Americans have aphasia, an impairment of the ability to process language -- in speaking and listening, and usually also in reading and writing. Today's networks can connect us with almost everyone else in the world, but for people with severe aphasia, more is needed. Ordinary phone conversation and email pose challenges.
Chad built a system that helps his mother "Kinecticate" with her friends and family. The system presents a visual interface of icons (emoticons) and intensity levels that the user can select with gestures. An email message is composed and sent off: "I'm feeling very happy," in translation, or, "I'm feeling a bit sad." The system is implemented in the Processing language, taking input from the Microsoft Kinect and using off-the-shelf gesture recognition code. Kinecticate is an evolutionary development of Chad's earlier Arduino-based system, Iconicate, which relied on a panel of physical dials and a button for its user interface.
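The icon-and-intensity idea can be sketched compactly. Here is a minimal illustration in Java (Processing sketches compile to Java); the class and method names are my own assumptions for the sake of the example, not Chad's actual code:

```java
// Hypothetical sketch of the core Kinecticate mapping: a selected
// feeling icon plus an intensity level become a short email sentence.
public class MessageComposer {
    // Intensity levels the user can dial up or down with a gesture.
    static final String[] QUALIFIERS = {"a bit", "", "very"};

    // Compose a sentence from an icon label and an intensity index.
    static String compose(String feeling, int intensity) {
        String qualifier = QUALIFIERS[intensity];
        if (qualifier.isEmpty()) {
            return "I'm feeling " + feeling + ".";
        }
        return "I'm feeling " + qualifier + " " + feeling + ".";
    }

    public static void main(String[] args) {
        System.out.println(compose("happy", 2)); // I'm feeling very happy.
        System.out.println(compose("sad", 0));   // I'm feeling a bit sad.
    }
}
```

The point of a design like this is that the user never types: the whole message space is reachable from a handful of icons and a one-dimensional intensity control, which is exactly what a gesture interface can select reliably.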
(Kinecticate, from Chad Ruble's dadhoc blog, used with permission)
I talked with Chad, via email, about his projects. "Assistive tech has been a focus of my interest for many years now -- all triggered from watching my mom deal with new disabilities following her stroke," he said. "With some ingenuity and motivation, people can leverage and customize very powerful technologies. More and more, we are going to see really cool solutions to people's challenges."
Throughout their history, computers have appealed to do-it-yourselfers. Some of the earliest personal computers, like the Altair 8800, were kits that their owners would assemble. These kits have been succeeded by systems that seem to invite re-engineering by their users. The Kinect has been used for projects ranging from virtual puppetry to robot control (even quadcopters). The Lilypad Arduino, developed by Leah Buechley at MIT, is a set of components specialized for electronic textiles and wearable computing; hundreds if not thousands of unique projects (most created by girls and women) can be found online. Coming back to assistive technology, in my own research lab we've used the Arduino in a system for people with vision impairment. An Android phone, worn in a harness on the chest, captures video of a tabletop in front of the user. Recognized objects are named aloud through the user's headset, and a wristband with vibrating actuators guides the user's hand toward an object he or she chooses.
The computer components we can work with today are enormously more capable than their predecessors. We have infrared projectors, depth sensors, sophisticated microphones, and so forth, along with software to make sense of the information and to provide meaningful output. Engineering advances aren't locked up in systems like the Arduino and the Kinect; they're open to us for extensions or even improvements.
And the science? The popularity of gaming has inspired researchers to consider gesture-based games and applications for purposes other than entertainment. Kinect-based systems are being tested in stroke rehabilitation, physical rehabilitation, and assistance for people with cognitive impairments. We typically think of science as making discoveries about the world and what's possible for us to do, followed up by engineering and individual creativity to bring practical improvements to our lives. But sometimes, especially with applied science, it can happen in the reverse direction, as well.
I asked Chad about his background. Surely he must be a scientist or engineer working on Kinecticate and Iconicate as a side project? "My background is in broadcast journalism," he said. "Up until two years ago, I had very limited programming experience."
How inspiring is that?