IBM 5 IN 5, 2012
Anurag S. Vasanwala
Dalpat J. Prajapati
IBM®, the IBM logo, and ibm.com® are trademarks or registered trademarks of International Business Machines Corp., registered in many jurisdictions
worldwide. Other product and service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on the web
at “Copyright and trademark information” at www.ibm.com/legal/copytrade.shtml.
International Business Machines, abbreviated IBM and
nicknamed "Big Blue", is a multinational computer technology and IT
consulting corporation headquartered in Armonk, New York, United
States. The company is one of the few information technology
companies with a continuous history dating back to the 19th century.
IBM manufactures and sells computer hardware and software (with a
focus on the latter), and offers infrastructure services, hosting
services, and consulting services in areas ranging from mainframe
computers to nanotechnology.
IBM has been well known through most of its recent history as
one of the world's largest computer companies and systems integrators. It employs
scientists, engineers, consultants, and sales professionals in over 170 countries.
One of the most intriguing aspects of this shift is the ability to give machines
some of the capabilities of the right side of the human brain.
New technologies make it possible for machines to mimic and augment the
senses. Today, we see the beginnings of sensing machines in self-parking cars and
biometric security – and the future is wide open.
In 2012, IBM focused the IBM Next 5 in 5, its forecast of inventions that
will change your world in the next five years, on how computers will mimic the senses.
In the era of cognitive computing, systems learn instead of passively relying
on programming. As a result, emerging technologies will continue to push the
boundaries of human limitations to enhance and augment our senses with machine
learning, artificial intelligence (AI), advanced speech recognition and more. No need
to call for Superman when we have real super senses at hand.
In 2012, IBM presented the 5 in 5 in five sensory
categories, through innovations that will touch our lives and see us into the future.
Processing sights and sounds requires eyes, ears and, most important, a brain.
But what if your hardware shared your senses?
Within the next five years, your mobile device will let you touch what
you’re shopping for online. It will distinguish fabrics, textures, and weaves so that
you can feel a sweater, jacket, or upholstery – right through the screen.
Touch: You will be able to
touch through your phone
Haptic devices such as gloves or “rumble packs” used in gaming have
existed for years. But we use them in closed environments where the touch doesn’t
actually connect to where we are in reality. IBM Research thinks that in the
next five years our mobile devices will bring together virtual and real-world
experiences, letting us not just shop, but feel the surface of produce and get
feedback on data such as freshness or quality.
It’s already possible to recreate a sense of texture through vibration. But
those vibrations haven’t been translated into a lexicon, or dictionary of textures
that match the physical experience. Variable-frequency patterns of vibration can
be matched to physical objects, so that when a shopper touches what the webpage
says is a silk shirt, the screen emits vibrations that match what our skin mentally
translates to the feel of silk.
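Such a texture lexicon can be illustrated with a short sketch. All the values here (the fabric names, frequencies, and amplitudes) are invented for illustration, not IBM's actual haptic encoding; the point is only that each fabric maps to a distinct vibration pattern.

```python
import math

# Hypothetical "texture lexicon": each fabric maps to vibration parameters
# (frequency in Hz, amplitude 0..1). These numbers are illustrative only.
TEXTURE_LEXICON = {
    "silk":   {"freq_hz": 40.0,  "amplitude": 0.2},   # smooth: low, soft buzz
    "denim":  {"freq_hz": 120.0, "amplitude": 0.7},   # coarse: fast, strong buzz
    "velvet": {"freq_hz": 60.0,  "amplitude": 0.4},
}

def vibration_samples(fabric, duration_s=0.05, sample_rate=1000):
    """Synthesize a short vibration waveform for a fabric from the lexicon."""
    params = TEXTURE_LEXICON[fabric]
    n = int(duration_s * sample_rate)
    return [params["amplitude"] *
            math.sin(2 * math.pi * params["freq_hz"] * t / sample_rate)
            for t in range(n)]

samples = vibration_samples("silk")
print(len(samples))  # 50 samples: 50 ms at 1 kHz
```

In a real haptic system a waveform like this would drive the phone's vibration motor, so the lexicon entry, not the on-screen image, is what the shopper's fingertip would feel.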
People say a picture is worth a thousand words, but to computers, pictures are just thousands
of pixels. But within the next five years, IBM Research thinks that computers will not only be able to
look at images, but help us understand the 500 billion photos we’re taking every year (that’s about
78 photos for each person on the planet).
Getting a computer to see:
The human eye processes images by parsing colors and looking at edge information and
texture characteristics. In addition, we can understand what an object is, the setting it’s in and what
it may be doing. While a human can learn this rather quickly, computers traditionally haven’t been
able to make these determinations, instead relying on tags and text descriptions to determine what
the image is.
One of the challenges of getting computers to “see” is that traditional programming can’t
replicate something as complex as sight. But by taking a cognitive approach, and showing a computer
thousands of examples of a particular scene, the computer can start to detect patterns that
matter, whether it’s in a scanned photograph uploaded to the web, or some video footage taken with
a camera phone.
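The training-by-example idea can be sketched in a few lines. The scene labels, the three-number feature vectors (stand-ins for color, edge, and texture statistics), and the nearest-centroid rule below are illustrative assumptions, not IBM's actual vision system.

```python
import random
random.seed(0)

# Toy stand-in for "show the computer thousands of examples": each scene class
# is represented by noisy feature vectors around an unknown true average.
def make_example(center, noise=0.1):
    return [c + random.uniform(-noise, noise) for c in center]

CLASS_CENTERS = {           # hypothetical per-scene feature averages
    "beach":  [0.8, 0.7, 0.2],
    "forest": [0.2, 0.8, 0.3],
    "city":   [0.5, 0.5, 0.9],
}

# "Training": average many labeled examples into one centroid per class.
training = {label: [make_example(c) for _ in range(1000)]
            for label, c in CLASS_CENTERS.items()}
centroids = {label: [sum(dim) / len(exs) for dim in zip(*exs)]
             for label, exs in training.items()}

def classify(features):
    """Label a new image by its nearest learned centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(features, centroids[label]))

print(classify([0.78, 0.72, 0.25]))  # -> beach
```

The pattern is the cognitive one the text describes: nothing here hand-codes what a "beach" looks like; the label emerges from the statistics of the examples.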
Sight: A pixel will be worth
a thousand words
Imagine knowing the meaning behind your child’s cry, or maybe even your pet
dog’s bark, through an app on your smartphone. In the next five years, you will be able to do
just that, thanks to algorithms embedded in cognitive systems that will understand any sound.
Hearing: Computers will
hear what matters
Each of a baby’s cries, from pain, to hunger, to exhaustion, sounds different – even if
it’s difficult for us to tell them apart. Dimitri Kanevsky and some of his colleagues patented a way to take the
data from typical baby sounds, collected at different ages by monitoring brain, heart and
lung activity, to interpret how babies feel. Soon, a mother will be able to translate her baby’s
cries in real time into meaningful phrases, via a baby monitor or smartphone.
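A cry-to-phrase translator of this kind can be sketched as nearest-prototype matching on acoustic features. The prototypes, feature scales, and phrases below are invented for illustration, not taken from the patent.

```python
# Hypothetical acoustic prototypes for cry types; features are
# (pitch_hz, burst_length_s, intensity 0..1). A real system would learn
# these from monitored data, as the patented approach describes.
CRY_PROTOTYPES = {
    "pain":       (550.0, 4.0, 0.9),
    "hunger":     (400.0, 1.5, 0.6),
    "exhaustion": (300.0, 0.8, 0.4),
}

PHRASES = {
    "pain":       "Something hurts!",
    "hunger":     "I'm hungry.",
    "exhaustion": "I'm tired.",
}

def translate_cry(pitch_hz, burst_length_s, intensity):
    """Match a cry's features to the nearest prototype and return a phrase."""
    def dist(label):
        ref = CRY_PROTOTYPES[label]
        feats = (pitch_hz, burst_length_s, intensity)
        scales = (100.0, 1.0, 0.2)  # rough normalization per feature
        return sum(((f - r) / s) ** 2 for f, r, s in zip(feats, ref, scales))
    return PHRASES[min(CRY_PROTOTYPES, key=dist)]

print(translate_cry(540.0, 3.8, 0.85))  # -> Something hurts!
```

A baby monitor or smartphone app would extract these features from audio in real time and emit the matching phrase.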
Predicting the sound of weather:
Sensors already help us with everything from easing traffic, to conserving water.
These same sensors can also be used to interpret sounds in these environments. What does a
tree under stress during a storm sound like? Will it collapse into the road? Sensors feeding
the information to a city datacenter would know, and would be able to alert ground crews before
the tree comes down.
An extraordinary dining experience of perfectly cooked food, with unique flavor combinations meticulously
designed on a plate, heightens all of our senses.
But we may not realize that the way we perceive flavors and the characteristics of a “good” meal are
fundamentally chemical and neural. In five years, computers will be able to construct never-before-heard-of recipes to
delight palates – even those with health or dietary constraints – using foods’ molecular structure.
Lessons from Watson: inductive reasoning
Whereas traditional computing uses deductive reasoning to solve a problem with a definitive answer, IBM's
research team uses inductive reasoning to model human perception. Watson was a concrete example of this inductive
type of computing: a system that interprets natural language and answers vague and abstract questions.
IBM’s research team is designing a learning system that adds one more dimension to cognitive computing.
The system analyzes foods in terms of how chemical compounds interact with each other, the number of
atoms in each compound, and the bonding structure and shapes of compounds. Coupled with psychophysical data and
models on which chemicals produce perceptions of pleasantness, familiarity and enjoyment, the end result is a unique
recipe, using combinations of ingredients that are scientifically flavorful.
So unlike Watson, which used known information to answer a question with a fixed answer, this system is
creating something that’s never been seen before. It’s pushing computing to new fields of creativity and quickly giving
us designs for novel, high-quality food combinations.
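One simple heuristic consistent with this compound-level view is to score ingredient pairs by the flavor compounds they share. The compound sets below are invented for illustration; a real system would draw on chemical analyses and the psychophysical models described above.

```python
from itertools import combinations

# Hypothetical flavor-compound sets per ingredient (illustrative only).
FLAVOR_COMPOUNDS = {
    "strawberry": {"furaneol", "linalool", "hexanal"},
    "basil":      {"linalool", "eugenol", "estragole"},
    "chocolate":  {"furaneol", "vanillin", "pyrazine"},
    "parmesan":   {"butanoic_acid", "hexanal", "pyrazine"},
}

def pairing_score(a, b):
    """Count flavor compounds two ingredients share."""
    return len(FLAVOR_COMPOUNDS[a] & FLAVOR_COMPOUNDS[b])

def best_pairings(top_n=2):
    """Rank all ingredient pairs by shared-compound count."""
    pairs = combinations(FLAVOR_COMPOUNDS, 2)
    return sorted(pairs, key=lambda p: -pairing_score(*p))[:top_n]

for a, b in best_pairings():
    print(a, "+", b, "->", pairing_score(a, b), "shared compound(s)")
```

Shared compounds are only one signal; the system described above would combine them with bonding structure and models of perceived pleasantness before proposing a recipe.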
Taste: Digital taste buds
will help you eat smarter
Within the next five years, your mobile device will likely be able to tell you you’re getting a cold before your
very first sneeze.
With every breath, you expel millions of different molecules. Some of these molecules are biomarkers, which
can carry a plethora of data about your physical state at any given moment. By capturing the information they carry,
technology can pick up clues about your health and provide valuable diagnostic information to your physician.
What’s that smell?
In this evolving new era of cognitive computing, computers are increasingly able to process unstructured
data, draw conclusions based on evidence, and learn from their successes and mistakes. This makes them progressively
more valuable diagnostic tools to help humans solve problems and answer questions.
A version of the former quiz-show champ Watson is now attending medical school at the Cleveland Clinic,
learning from medical students how to identify and process multiple symptoms and patient scenarios to help doctors
diagnose conditions with increasing confidence and accuracy.
However, to learn, one has to sense first.
Tiny sensors that ‘smell’ can be integrated into cell phones and other mobile devices, feeding the information
contained in the biomarkers to a computer system that can analyze the data.
Similar to how a breathalyzer can detect alcohol from a breath sample, sensors can be designed to collect
other specific data from the biomarkers. Potential applications could include identifying liver and kidney disorders,
diabetes and tuberculosis, among others.
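A minimal sketch of threshold-based breath screening follows. The biomarkers, cutoff values, and condition notes are hypothetical; a deployed system would use calibrated sensors and learned models, and diagnosis would remain with a physician.

```python
# Hypothetical biomarker thresholds (arbitrary units) and what an elevated
# reading might suggest. Illustrative values only, not medical reference data.
BIOMARKER_FLAGS = {
    "acetone":      (2.0, "possible diabetes indicator"),
    "ammonia":      (1.5, "possible kidney disorder indicator"),
    "nitric_oxide": (3.0, "possible airway inflammation indicator"),
}

def screen_breath(sample):
    """Flag biomarkers in a breath sample that exceed their thresholds."""
    findings = []
    for marker, level in sample.items():
        threshold, note = BIOMARKER_FLAGS.get(marker, (float("inf"), ""))
        if level > threshold:
            findings.append((marker, note))
    return findings

sample = {"acetone": 2.7, "ammonia": 0.9, "nitric_oxide": 3.4}
for marker, note in screen_breath(sample):
    print(marker, "elevated:", note)
```

This is the breathalyzer analogy from the text in miniature: the sensor supplies concentrations, and the software layer turns them into diagnostic clues for a physician to review.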
Smell: Computers will
have a sense of smell