Today's computers calculate, automate processes, and organize data. A new class of machines, however, is starting to solve problems with logic more akin to how humans tackle challenges. To do this efficiently, such systems must also experience the world much the way people do.
Over the next five years, computers will start to use the five senses -- sight, hearing, smell, touch, and taste -- to identify and then solve problems. These senses, the ways humans perceive their environment, will give computers the context they need to analyze problems in novel ways, helping make the world a more sustainable, productive, and enjoyable place.
In the process, these new systems will propel the world further into the age of cognitive computing, where systems learn from interacting with their environment, and help people make smarter decisions using that learning.
Sight: Computers will understand pictures
Computers will begin to automatically make sense of the reams of videos, photos, and drawings being created today. Rather than waiting for people to tag their visual creations, computers will learn the task themselves: shown examples of what to look for, they will learn to recognize similar patterns of interest to the user on their own.
Computer systems will use "brain-like" capabilities to extract key information, understand its context, and determine the "meaning" of the content in a given image. Consumers and companies will be able to interact with such systems, asking questions that push them to deepen their understanding of what they are seeing. That will make it possible to pull new insights out of digital visual data, whether it's examining medical MRIs to help diagnose a health condition or spotting patterns in home photo albums to organize them automatically.
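The learning-by-example idea above can be sketched as a nearest-neighbor lookup: the system is shown labeled examples, then tags new items by similarity to what it has already seen. This is a toy illustration, not how any real product works -- the feature vectors and labels below are invented stand-ins for real image features.

```python
from math import dist

# Hypothetical labeled examples: (feature vector, tag) pairs.
# Real systems would extract rich features from pixels; these
# two-number vectors are invented for illustration only.
training = [
    ((0.9, 0.1), "sunset"),
    ((0.8, 0.2), "sunset"),
    ((0.1, 0.9), "portrait"),
    ((0.2, 0.8), "portrait"),
]

def tag(features):
    """Tag a new image by the label of the closest seen example."""
    _, label = min(training, key=lambda ex: dist(ex[0], features))
    return label

print(tag((0.85, 0.15)))  # close to the "sunset" examples
```

The key property the article describes is that no one tags the new image by hand: the earlier examples do the work.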
Hearing: People will listen only to what they want
Hearing is fundamental to human understanding of the world, but much of what people hear is irrelevant to them. Imagine a system that could speak directly to individuals, telling each person about what he or she is seeing based on gaze, mood, and behavior.
In five years, people will control and personalize their experience at the events they attend through the use of multiple announcing systems. These systems will follow, interpret, and describe events on the field or at a show, using local sensors, including cameras and microphones, to understand where a person's attention is focused. At a soccer game, for instance, a fan sitting in the stands watching a goalie could receive live commentary about that player's history and current performance. Sensors will observe the spectator's gaze and body movements to tailor the commentary to match an individual's interests, and it will be delivered directly to him or her using directional audio -- technology that can project a beam of sound so narrow that only one person can hear it.
Smell: Machines will smell when something is going wrong
Systems today sense environmental conditions such as temperature, humidity, and air flow, but soon they will sniff out problems as well.
IBM is working with health care organizations to equip patient rooms with sensors that monitor one of the biggest challenges in health care: hygiene. Hundreds of sensors will sniff for cleanliness, detecting and identifying the chemical compounds found in the rooms and even on the hands of patients and staff to pinpoint, for instance, whether a patient's room has been cleaned.
Sensors will also sniff out potential diseases. By using devices that can analyze biomarkers and thousands of molecules in a patient's breath, doctors will be able to diagnose and monitor many ailments, such as liver and kidney disorders, asthma, and diabetes, by detecting whether biomarker concentrations are normal or not.
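At its core, the screening step the article describes is a comparison of measured concentrations against normal ranges. The sketch below makes that concrete; the compound names follow real associations (breath acetone with diabetes, ammonia with kidney function, nitric oxide with asthma), but the numeric ranges and readings are invented for illustration -- real thresholds would come from clinical research.

```python
# Hypothetical normal ranges, in parts per billion. The values are
# invented for illustration, not clinical reference ranges.
NORMAL_RANGES = {
    "acetone": (300, 900),        # elevated levels associated with diabetes
    "ammonia": (400, 1800),       # elevated levels associated with kidney issues
    "nitric_oxide": (5, 25),      # elevated levels associated with asthma
}

def flag_abnormal(sample):
    """Return the compounds whose measured level falls outside its range."""
    return [name for name, level in sample.items()
            if not (NORMAL_RANGES[name][0] <= level <= NORMAL_RANGES[name][1])]

# An invented breath reading with two out-of-range compounds.
reading = {"acetone": 1500, "ammonia": 900, "nitric_oxide": 40}
print(flag_abnormal(reading))  # ['acetone', 'nitric_oxide']
```

A real system would feed such flags to a clinician rather than issue a diagnosis itself.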
Touch: Smartphones will reach out and touch
Imagine a world where a father away from home on a business trip can hold his child's hand before she goes to sleep. Or where a patient can be operated on by a team of specialists halfway across the world.
This future is not so far away. Already, technologies that reproduce the variable-frequency vibration patterns associated with different physical situations are built into mobile and gaming devices to simulate the sensation of, say, driving over a rough surface or being in a car collision.
Now, researchers are experimenting with using these technologies in other industries, making it possible for online merchants, for instance, to let customers touch merchandise before they buy it. Using the phone's vibration motor, the texture of a piece of clothing can be simulated when the shopper brushes his or her finger over the item on the screen. Each type of fabric will have its own unique vibration pattern, with silk having a softer vibration, for instance, and linen having a stronger one.
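The fabric-to-vibration mapping can be sketched as a simple lookup table: each fabric corresponds to a haptic pattern, with silk softer than linen as the article describes. The amplitude and frequency values below are invented for illustration -- real patterns would be tuned against measured textures.

```python
# Hypothetical haptic patterns: amplitude on a 0-1 scale and pulse
# frequency in Hz. The specific numbers are invented for illustration.
FABRIC_PATTERNS = {
    "silk":  {"amplitude": 0.2, "frequency_hz": 250},  # soft, fine vibration
    "linen": {"amplitude": 0.7, "frequency_hz": 120},  # stronger, coarser
    "denim": {"amplitude": 0.9, "frequency_hz": 80},
}

def haptic_pattern(fabric):
    """Look up the vibration pattern to play while a finger is on the item."""
    return FABRIC_PATTERNS[fabric]

print(haptic_pattern("silk"))
```

On an actual phone, a pattern like this would drive the device's vibration hardware while the touchscreen reports the finger's position over the product image.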
Taste: Computers will learn to cook
Creativity defines humanity. But is technology capable of becoming computationally creative? IBM researchers are exploring this question through a demonstration of a system that experiences flavor.
The system, designed to cater to diners' personal preferences and dietary restrictions, as well as constraints such as which local foods are currently in season, will break ingredients down to the molecular level. It will then combine the chemistry of the compounds found in those foods with the psychology behind the flavors and smells humans like, and with a database of millions of recipes, to generate novel and tasty mixtures -- bringing science a step closer to answering the question of creativity.
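One simple idea behind blending chemistry with recipes is the "food pairing" heuristic: ingredients that share flavor compounds often combine well. The sketch below scores ingredient pairs by shared compounds; the compound lists are invented stand-ins for real flavor chemistry data, and a real system would weigh many more signals than overlap alone.

```python
# Hypothetical flavor-compound sets per ingredient. The entries are
# invented stand-ins for real flavor chemistry databases.
FLAVOR_COMPOUNDS = {
    "chocolate": {"pyrazine", "vanillin", "furaneol"},
    "blue_cheese": {"pyrazine", "methyl_ketone"},
    "strawberry": {"furaneol", "linalool"},
    "cucumber": {"nonadienal"},
}

def pairing_score(a, b):
    """Count the flavor compounds two ingredients have in common."""
    return len(FLAVOR_COMPOUNDS[a] & FLAVOR_COMPOUNDS[b])

# Rank possible partners for chocolate by shared chemistry.
partners = sorted((i for i in FLAVOR_COMPOUNDS if i != "chocolate"),
                  key=lambda i: pairing_score("chocolate", i), reverse=True)
print(partners)
```

The surprising pairings such a score surfaces -- combinations no cookbook suggests -- are where the claim to computational creativity comes in.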