As people continue to push the boundaries of technology, computers are now starting to process data by learning from the way the human brain solves problems. Soon, they will experience the world the way people do, using the five senses.
During the next five years, computers will rely on smell, touch, taste, sight, and sound as context and data points for analyzing problems, producing new insights that will help people make smarter decisions, improve sustainability and productivity, and break down barriers, whether of distance or language.
Sight: A pixel will be worth a thousand words: It is nearly impossible to navigate the massive amounts of digital visual information being created today. But that will change, because computers will soon be able to extract information, meaning, and context from videos, photos, and other digital images using pattern recognition and by being taught what to look for.
These systems will sift through huge amounts of data to draw insights, even allowing people to interact with them by, for instance, asking questions to get further explanation of what the systems are seeing. This will enable computers to analyze and compare digital files of MRI, CT, and X-ray images in order to diagnose health conditions, or to look at data streaming in from millions of municipal cameras, and even crowdsourced images, to monitor for public safety.
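The idea of "being taught what to look for" can be illustrated with a minimal sketch: a nearest-neighbor classifier that labels a new image by comparing its feature vector against examples a person has already tagged. The feature values and labels below are toy data invented for the example; a real system would extract features from the pixels themselves.

```python
# Hedged sketch of taught pattern recognition: nearest-neighbor labeling.
# Feature vectors here are invented toy numbers, not real image features.

def classify(features, labeled_examples):
    """Label a new image by its closest labeled example (Euclidean distance)."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    best_label, _ = min(
        ((label, distance(features, example))
         for label, example in labeled_examples),
        key=lambda pair: pair[1],
    )
    return best_label

# Toy training data: (label, feature vector) pairs a person has tagged.
examples = [("beach", [0.9, 0.1]), ("city", [0.2, 0.8])]
print(classify([0.85, 0.2], examples))  # closest to the "beach" example
```

The same compare-against-taught-examples pattern scales, in principle, from two toy vectors to the millions of medical or municipal images the article describes.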
Hearing: Computers will hear what matters: Mass broadcasting will be a thing of the past. People will soon be able to personalize their experiences at games or concerts through multiple announcement systems. These systems will track, make sense of, and describe events on the field or at a show using sensors, including cameras and microphones. Then, using directional audio technology that can project a beam of sound so narrow that only one person can hear it, these systems will direct commentary to each individual based on their interests.
For instance, at a soccer game, a fan who is avidly watching a specific striker will mostly hear commentary about that player’s background and plays on the field. Sensors within the stadium will track where the fan is looking and the player’s body movements to determine how the system should tailor the commentary.
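The selection step described above can be sketched very simply: given which player the fan's gaze is fixed on, route that player's commentary feed to the fan, falling back to general commentary otherwise. All the names and feed data below are hypothetical, invented for illustration.

```python
# Illustrative sketch only: route commentary by the fan's gaze target.
# Feed names and contents are invented; a real system would use live
# sensor data and directional audio hardware.

def select_commentary(gaze_target: str, commentary_feeds: dict) -> list:
    """Return the commentary stream for the player the fan is watching,
    or the general feed if no player-specific feed exists."""
    return commentary_feeds.get(gaze_target, commentary_feeds["general"])

feeds = {
    "general": ["Kickoff at the stadium."],
    "striker_9": ["Number 9 made his debut in 2010.", "He shoots left-footed."],
}
print(select_commentary("striker_9", feeds))
```

The interesting engineering is in the two pieces this sketch assumes away: inferring the gaze target from camera data, and the narrow-beam audio that delivers the chosen feed to one seat.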
Smell: Computers will have a sense of smell: It is common today for sensor systems to monitor various environmental conditions, such as humidity and temperature, to ensure that operations are running well. Soon, the sense of smell will play a role in these kinds of systems as well, tracking everything from hygiene to potential disease.
My company, for instance, is partnering with healthcare organizations to put sensors in patients’ rooms that will sniff out chemical compounds on objects and people to determine whether the rooms have been cleaned properly and whether employees have washed their hands.
Specialized equipment in doctors’ offices could help diagnose and track illnesses, such as liver and kidney disorders, by smelling molecules in a patient’s breath and analyzing whether concentration levels are normal.
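The analysis step, once the sensor has measured the molecules, reduces to comparing concentrations against normal ranges. The sketch below shows that comparison; the molecule names and threshold values are invented for the example and are not real clinical reference ranges.

```python
# Illustrative sketch only: flag abnormal breath-biomarker concentrations.
# Thresholds and values are hypothetical, not clinical reference ranges.

NORMAL_RANGES_PPB = {          # parts per billion, invented values
    "ammonia": (10.0, 150.0),
    "limonene": (1.0, 30.0),
}

def screen_breath(sample: dict) -> dict:
    """Compare measured concentrations against their normal ranges."""
    results = {}
    for molecule, level in sample.items():
        low, high = NORMAL_RANGES_PPB[molecule]
        if level < low:
            results[molecule] = "below normal"
        elif level > high:
            results[molecule] = "above normal"
        else:
            results[molecule] = "normal"
    return results

print(screen_breath({"ammonia": 220.0, "limonene": 12.0}))
```

In practice the hard problem is the sensing itself; once concentrations are digitized, the screening logic is this straightforward.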
Touch: You will be able to touch through your phone: One of the hardest senses to replicate in an online world is touch. Yet, a great amount of good will be possible when it is perfected, whether helping families to stay in touch or enabling a surgeon to operate virtually.
The foundation for these advances already exists. Technology that uses variable-frequency vibration patterns associated with different physical sensations is already included in mobile and gaming devices to recreate, for example, the feeling of rolling over a rough surface.
Now we’re working to apply this technology to the retail sector, helping online merchants provide a richer shopping experience by letting customers “touch” merchandise before they purchase it. Using the phone’s vibration capabilities, the texture of a piece of clothing can be simulated as the shopper brushes a finger over the item on the screen.
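One way to picture the texture-to-vibration mapping is as a function from a fabric's coarseness to a pulse pattern: coarser fabric, longer and more closely spaced pulses. The sketch below is an invented mapping for illustration; it is not the actual haptics algorithm, and a real implementation would drive the device's vibration hardware with these intervals.

```python
# Hedged sketch: map fabric coarseness (0 = smooth, 1 = rough) to a
# vibrate/pause interval pattern in milliseconds. The formula is invented.

def vibration_pattern(coarseness: float, duration_ms: int = 200) -> list:
    """Return alternating [pulse, gap, pulse, gap, ...] intervals (ms)."""
    coarseness = max(0.0, min(1.0, coarseness))  # clamp to [0, 1]
    pulse = int(10 + 40 * coarseness)            # pulses lengthen as texture roughens
    gap = int(30 - 20 * coarseness)              # gaps shrink as texture roughens
    pattern = []
    elapsed = 0
    while elapsed < duration_ms:
        pattern += [pulse, gap]
        elapsed += pulse + gap
    return pattern

silk = vibration_pattern(0.1)   # short pulses, long gaps: smooth feel
denim = vibration_pattern(0.9)  # long, dense pulses: rough feel
print(silk[:4], denim[:4])
```

The same interval-list shape is how both Android's vibrator APIs and the web Vibration API accept patterns, which is why this representation is a natural fit for a phone-based demo.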
Taste: Digital taste buds will help you to eat smarter: Computers are making huge leaps forward, moving from simply solving problems through if/then computations to actually drawing conclusions based on the observations they make. Yet, are computers capable of taking the next step and becoming creative?
IBM researchers are probing this question by testing a system that experiences flavor. The system will begin by breaking ingredients down to the molecular level. Then it will combine an analysis of the chemistry of those compounds, data about the psychology of human preferences for flavor and smell, and a database of millions of recipes to suggest new combinations of foods.
For example, who would ever imagine that roasted chestnuts would pair well with foods such as cooked beetroot, fresh caviar, and dry-cured ham? The system will also be tuned to diners’ personal preferences, including likes and dislikes, dietary restrictions, and food allergies, as well as constraints such as which local foods are currently in season.
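The chestnut-and-caviar style of suggestion can be sketched with a simple scoring rule in the spirit of the description above: rank candidate pairings by how many flavor compounds two ingredients share at the molecular level. The compound sets below are toy data invented for the example, not real chemistry, and the article does not specify that this exact scoring is what the IBM system uses.

```python
# Hedged sketch of compound-sharing flavor pairing. Compound lists are
# invented toy data; a real system would use measured flavor chemistry.

FLAVOR_COMPOUNDS = {
    "roasted chestnut": {"furaneol", "pyrazine", "vanillin"},
    "cooked beetroot":  {"geosmin", "furaneol", "pyrazine"},
    "caviar":           {"trimethylamine", "pyrazine"},
    "strawberry":       {"furaneol", "esters"},
}

def pairing_score(a: str, b: str) -> int:
    """More shared flavor compounds suggests a more promising pairing."""
    return len(FLAVOR_COMPOUNDS[a] & FLAVOR_COMPOUNDS[b])

def best_pairings(ingredient: str) -> list:
    """Rank all other ingredients by shared-compound count, best first."""
    others = [i for i in FLAVOR_COMPOUNDS if i != ingredient]
    return sorted(others, key=lambda o: pairing_score(ingredient, o), reverse=True)

print(best_pairings("roasted chestnut"))
```

Layering the diner-specific filters the article mentions, such as allergies or seasonal availability, would simply prune the candidate list before ranking.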
Bernie Meyerson, IBM Fellow and VP of Innovation