“Siri, send a message to Bob canceling our 2:00 pm meeting.” “Siri, what’s the weather going to be like tomorrow?” “Siri, does the car smell like old cheese?” While iPhone users have become accustomed to uttering the first two phrases, the third seems like a stretch. After all, computers can’t smell or touch…can they?
While all five of the human senses might be out of reach for computers today, this may not always be the case. IBM has released its annual end-of-year “5 in 5” list, which predicts five trends in computing likely to appear within the next five years. This year, it’s all about the potential for computers to taste, smell and touch, in addition to seeing and hearing.
The approach, called “cognitive computing,” involves computer systems that can actually learn rather than relying passively on their programming. The emerging technologies that result from cognitive computing will continue to push the boundaries of human limitations, enhancing and augmenting our senses with machine learning, artificial intelligence (AI), advanced speech recognition and more. And given that these are computers, their senses will probably be far more acute than our own.
Given how much we already rely on computers today to “sense” for us, this will be a timely development.
"[It’s] a foundationally different way of thinking of computing," Bernie Meyerson, IBM's vice president of innovation, told Mashable in a recent interview. "You have to change how you think about absorbing data. You can't just take a picture and file the picture. You have to treat the picture as an entity at a very high level, as opposed to just a bunch o' bits."
So how might it benefit you if your smartphone can “touch”? Imagine you’re shopping for sheets. You find a good bargain online, but you don’t want to risk spending the money only to find the sheets feel like washed cardboard. By using a combination of haptic, pressure-sensitive and temperature-sensitive technologies that plug into your smartphone, you might be able to compare sheets by actually “feeling” them over an Internet connection.
“Using digital image processing and digital image correlation, we can capture texture qualities in a Product Information Management (PIM) system to act as that dictionary,” said IBM. “Retailers could then use it to match textures with their products and their products’ data – sizes, ingredients, dimensions, cost, and any other information the customer might expect. The dictionary of texture will also grow and evolve as we grow our appetite, usage and understanding of this kind of technology.”
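IBM doesn’t spell out how such a texture dictionary would match products, but the idea can be illustrated with a toy sketch: compute a simple descriptor for a texture patch (here, a histogram of differences between neighboring pixels, a crude stand-in for the image-correlation features IBM describes) and look up the closest stored product. All product names, patches and function names below are hypothetical.

```python
from math import sqrt

def texture_descriptor(pixels, bins=8):
    """Normalized histogram of absolute differences between
    horizontally adjacent pixels in a grayscale patch (0-255).
    Rough textures produce large differences; smooth ones, small."""
    counts = [0] * bins
    for row in pixels:
        for a, b in zip(row, row[1:]):
            d = min(abs(a - b), 255)
            counts[d * bins // 256] += 1
    total = sum(counts) or 1
    return [c / total for c in counts]

def match_texture(sample, dictionary):
    """Return the product whose stored descriptor is closest
    (Euclidean distance) to the sample patch's descriptor."""
    desc = texture_descriptor(sample)
    def dist(entry):
        return sqrt(sum((a - b) ** 2 for a, b in zip(desc, entry[1])))
    return min(dictionary.items(), key=dist)[0]

# Hypothetical dictionary: product name -> stored texture descriptor.
smooth = [[10, 11, 10, 12] * 4] * 4   # tiny "smooth sateen" patch
rough  = [[0, 200, 5, 190] * 4] * 4   # tiny "coarse linen" patch
dictionary = {
    "smooth sateen": texture_descriptor(smooth),
    "coarse linen": texture_descriptor(rough),
}

print(match_texture([[9, 10, 9, 11] * 4] * 4, dictionary))  # smooth sateen
```

A real system would use richer features (the digital image correlation IBM mentions) and store descriptors alongside the product data in the PIM system, but the lookup step would follow this same pattern: descriptor in, nearest product out.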
According to a recent New York Times article, this trend reflects the increasing convergence of computing and human biology.
“The principles of biology are gaining ground as a tool in computing. The shift in thinking results from advances in neuroscience and computer science, and from the prod of necessity,” wrote Steve Lohr for the NYT.
In fact, computer scientists are increasingly collaborating with neuroscientists in an entire field of study called “computational neuroscience,” whose goal is to make computers work a bit more like the human brain. While computers are generally faster than the human brain, they still can’t rival the nuance and complexity our brains handle, particularly when it comes to sensory input.
According to those on the cutting edge of computing, in the next few years, this may begin to change.
Want to learn more about the latest in communications and technology? Then be sure to attend ITEXPO Miami 2013, Jan. 29 to Feb. 1 in Miami, Florida. Stay in touch with everything happening at ITEXPO. Follow us on Twitter.
Edited by Brooke Neuman