Did you know that those few places on your body where you cannot grow hair are by far the most sensitive? Like the bottoms of your feet?

That’s why the young woman with the metal probe is scratching away at a rough surface in a Johns Hopkins University lab. Suppose you wanted to know what something thousands of miles away felt like — as easily as you could see what it looks like by aiming a remote Internet camera.

What happens if that smart probe transmits the sensation to all those dense nerve receptors along your tingly arch?

After all, there are some occasions when only touch will do, aren’t there?

This has been the year computers began to deliver feelings to us in a mainstream way. Following their uncanny ability first to interact with our eyes via screens and then our ears through speakers, now tens of millions of them are acquiring touch feedback. You touch the machine, it nuzzles you back.

Feel matters. It’s the pea under the princess’s mattress. “The world is going digital, but people are analog,” says Gayle Schaeffer, a marketing vice president at Immersion, a leader in touch feedback. “We like real things. We touch real things all day long. We need to interact with something that feels real. In the digital world, touch is so much more personal and private and nonintrusive.”

Computer screens that you can usefully touch are as common as ATMs and airport check-in kiosks. With the explosive popularity of the Apple iPhone, it became clear that soon, everyone was going to have a touch screen in her pocket.

The touch-surface juggernaut marches relentlessly toward the day when push buttons that physically move in and out are gone forever. Already being conquered are televisions, washers, ovens, printers and workout machines, says Steve Koenig, director of industry analysis at the Consumer Electronics Association. Touch screens are now invading dashboards, desktop phones, remote controls, music players, navigators and cameras.

The problem is that no matter how much you gussy it up, touching a flat computer screen feels like touching a flat computer screen. It can have as many flashing, beeping pictures of buttons as you like, but there’s something about the human brain that doesn’t trust those little icons. We mash them again and again, our brains not believing those icons are responding to us — because it feels all wrong.

Now we’re trying to solve that. The multibillion-dollar goal is for smart devices to make our fingers feel as if they are actually working with the good old three-dimensional physical objects that evolution has taught us to trust.

That’s why competitors to the iPhone are focusing on the main thing it has yet to offer. Advertising directors for the BlackBerry Storm are doing their level best, this holiday season, to make sure you know that their product is not just touchy, but touchy-feely. Hit its screen and you get a hint of a tactile response. This means a lot.

Touch can be spoofed. You can persuade visitors to a “haunted house” that they’re feeling eyeballs when they’re touching peeled grapes. That’s basic to the science and magic of touch, says Allison Okamura, director of the Haptics Laboratory at Johns Hopkins in Baltimore.

“Haptics,” as it is called, refers to the ability of people to sense the world around them through touch. Haptics is to touch as optics is to sight. “Haptics technology” refers to our ability to capture and transmit the vast array of information we get from feeling our three-dimensional world.

Okamura takes command of an experimental surgical robot that could help operate on your eyes. “This eliminates tremor,” she says, maneuvering the robot’s business end over the eye socket of a model skull. “If I shake, it holds me steady. I can force it to make me move very slowly and deliberately, so it makes me extremely accurate.”

Robot-assisted surgery has been around for some time, but the surgeon usually stares at a screen to see where the scalpel is going. The way to achieve superhuman steady hands, Okamura explains, is by engaging touch. Computer-mediated feedback makes one’s hands feel as if they are maneuvering through goo. Or suppose you want to peel a very thin membrane off the back of the retina without puncturing the retina itself: virtual feedback can guide the surgeon’s hands.
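
The goo effect Okamura describes can be sketched as a simple damping law: the robot pushes back against hand velocity, so fast, trembling motion meets strong resistance while slow, deliberate motion slips through. Here is a minimal sketch in Python; the damping gain, update rate and tremor figures are illustrative assumptions, not numbers from the Hopkins lab.

```python
import numpy as np

DAMPING = 40.0  # N*s/m, resistance against hand velocity (assumed value)
DT = 0.001      # 1 ms update step; haptic loops typically run near 1 kHz

def resisting_force(hand_velocity: np.ndarray) -> np.ndarray:
    """Force the robot applies against the surgeon's hand.

    A pure damper: force opposes velocity, so tremor (fast, rapidly
    reversing motion) is fought hard while slow motion barely is.
    """
    return -DAMPING * hand_velocity

# Example: a 10 Hz tremor of 2 mm amplitude vs. a slow 1 mm/s drift.
t = np.arange(0, 1, DT)
tremor_v = 0.002 * 2 * np.pi * 10 * np.cos(2 * np.pi * 10 * t)  # m/s
slow_v = np.full_like(t, 0.001)                                  # m/s

print("peak force against tremor:", np.abs(resisting_force(tremor_v)).max(), "N")
print("force against slow drift: ", np.abs(resisting_force(slow_v)).max(), "N")
```

Under these assumed numbers the tremor meets roughly a hundred times more resistance than the deliberate drift, which is the whole trick.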

Aren’t surgeons wary of robot assistance invading their turf?

“Not here,” says Okamura. “We’ve got the best in the world. They’re real cowboys. They want the newest and best. They come down here and try the stuff out, and no matter what we’re working on, they say, ‘Can’t you make it feel more realistic?’

“I look at them and say, ‘That’s my life’s work.’”

Meanwhile, Kathryn Smith, the undergrad observed messing around with people’s feet, wants to know if she can record the feeling of something and convey it to your brain through your skin’s sense of touch, the way a microphone and speakers can pick up sound and engage your ears.

When you feel the difference between a sheet of notepaper and a sheet of sandpaper, it’s because you’re judging which causes your skin to vibrate more. Those skin vibrations are what your nerve endings pick up, causing your brain to read “rough.”
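
That judgment can be approximated with nothing fancier than signal energy: the rougher the surface, the harder the skin (or a probe) vibrates. A minimal sketch of the idea; the two traces below are synthetic stand-ins for real probe recordings.

```python
import numpy as np

def roughness_score(vibration: np.ndarray) -> float:
    """Root-mean-square amplitude of a vibration trace.

    Bigger vibrations mean a rougher surface, which is roughly the
    judgment the paragraph above describes the brain making.
    """
    return float(np.sqrt(np.mean(vibration ** 2)))

rng = np.random.default_rng(0)
notepaper = 0.05 * rng.standard_normal(1000)  # gentle vibration (assumed)
sandpaper = 0.80 * rng.standard_normal(1000)  # violent vibration (assumed)

print("notepaper score:", roughness_score(notepaper))
print("sandpaper score:", roughness_score(sandpaper))
```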

Can a prosthetic hand be made to feel? The haptics lab’s fingerlike probe can pick up the vibrations caused by rough surfaces, but how do you get that useful information to the brain?

Suppose you connect the probe to what amounts to a sophisticated vibrator not unlike the one that drives an audio speaker. Suppose you place that at a sensitive part of your body — such as your foot. Could your brain use the nerve receptors there to read the roughness signal correctly?
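
In code, the experiment is essentially a microphone-to-speaker loop. This sketch stubs out the hardware: read_probe_sample() and drive_actuator() are hypothetical stand-ins for whatever data-acquisition gear and voice-coil driver the lab actually uses.

```python
import random

SAMPLE_RATE = 2000  # Hz (assumed); skin vibration sensitivity sits well below 1 kHz

def read_probe_sample() -> float:
    """Stand-in for one vibration sample from the fingerlike probe."""
    return random.gauss(0.0, 1.0)

def drive_actuator(sample: float) -> None:
    """Stand-in for pushing one sample to the vibrator on the foot."""
    pass  # real code would write to the actuator at SAMPLE_RATE pacing

def relay(seconds: float) -> None:
    """Pipe probe vibrations straight to the foot actuator,
    the way a microphone feeds a speaker."""
    for _ in range(int(seconds * SAMPLE_RATE)):
        drive_actuator(read_probe_sample())

relay(1.0)
```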

Our touch is also exquisitely sensitive to temperature. How would you make a computer convey that? Put your hand on one of the concrete uprights on the lab’s wall. It’s cool to the touch. Then put your hand on a metal chase. It feels colder. A wooden desk feels warmer. Actually, they are all the same — room temperature.

The metal “feels cold because the heat rapidly moves from my hand into the object,” says David Grow, one of the haptics lab’s grad students. “But it’s no colder than the concrete next to it,” says Okamura.
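
The physics behind the demonstration is a standard contact-temperature result for two bodies pressed together: the interface settles at a temperature weighted by each material’s thermal effusivity, the square root of conductivity times density times specific heat. A back-of-the-envelope sketch, using typical textbook effusivity values rather than anything measured in the lab:

```latex
% Interface temperature when skin touches a material
% (semi-infinite body model); e is thermal effusivity.
\[
  T_{\text{contact}}
    = \frac{e_{\text{skin}}\,T_{\text{skin}} + e_{\text{mat}}\,T_{\text{mat}}}
           {e_{\text{skin}} + e_{\text{mat}}},
  \qquad
  e = \sqrt{k \rho c_p}
\]
% Taking skin at 33 C with e of roughly 1500, and the room at 21 C with
% e of roughly 14000 for steel, 2000 for concrete and 400 for wood
% (all in W s^{1/2} m^{-2} K^{-1}), the interface lands near 22 C on
% steel, 26 C on concrete and 30 C on wood. Identical room temperatures,
% very different felt temperatures.
```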

There are far more females in the haptics lab than is typical of mechanical engineering departments. Why is that?

“Part of it,” Okamura says, “is that I encourage them. But it may not be too far-fetched to think that females are drawn to the idea of engineering touch. The reason I’m in mechanical engineering is that I like to put stuff together and make it work. But haptics also slops over into fields like physiology and psychology — it’s grounded in the business of figuring out exactly how humans tick. Psychology, as a field, is loaded with women.”

Pricey niche products were the first to offer touch feedback — high-end Mercedes-Benzes and BMWs, medical devices, the Wii controller and some casino and bar-top games. But now more than 35 million touch-feedback cellphones have shipped. Most of them are far more sophisticated than the BlackBerry’s mechanical, spring-loaded screen.

The game world has shown how touch can be integrated with vision and hearing. When you “hit” a tennis ball with a Wii controller, not only are your eyes on the screen, but when you “connect” with the virtual ball, triggering the vibration that fires your touch nerves, the device sounds a resounding thwack. Arguably, it’s the sound that really has you thinking you’ve hit a tennis ball, not a baseball. But, as in life, it’s the combination of senses that your brain processes.

“So how do we move from wow and games and the fun part into practical business tools that you can’t live without?” asks Chuck Joseph, general manager of the touch interface products group at Immersion.

He remembers getting the attention of the CEO of a multibillion-dollar company by taking a sophisticated surveying tool and making it something you could understand without looking at it.

“It has a touch screen, but when the surveyor is walking around looking at that screen and trying to touch it, he’s tripping, he’s falling, he’s got a backpack on, he’s got an antenna at the end of the pole.” So Joseph’s crew transformed it into something like a touch Geiger counter. The closer the target, the stronger the vibration.
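
The mapping behind that Geiger-counter feel is about as simple as haptics gets: vibration strength rises as range falls. A hypothetical sketch; the 50-meter range and the linear ramp are invented for illustration.

```python
MAX_RANGE_M = 50.0  # beyond this distance, no vibration (assumed)

def vibration_strength(distance_m: float) -> float:
    """0.0 (at or past max range) up to 1.0 (standing on the target)."""
    if distance_m >= MAX_RANGE_M:
        return 0.0
    return 1.0 - distance_m / MAX_RANGE_M

for d in (60, 40, 10, 1):
    print(f"{d:>3} m away -> vibrate at {vibration_strength(d):.2f}")
```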

“Imagine that coming into your friend-finder,” Schaeffer says. “Teenagers at the mall. Or you’re trying to figure out where you’re going and sometimes you can’t hear on a busy street corner. So your GPS can have that feeling to turn left or right, or keep coming.

“As a mom, you can have messaging and alerts that feel different. I’ll know it’s my son, even if I have my sound off. And I’ll know what priority it is. If this is an SOS, I would walk out of this meeting to take the call. It could feel like whatever we wanted to make it feel like — a heartbeat.”
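
What Schaeffer is describing amounts to haptic ringtones: a distinct vibration pattern for each sender or priority. A sketch of the idea, with invented pattern timings and a print statement standing in for a real phone’s vibration-motor call:

```python
import time

# Each pattern is a list of (vibrate_seconds, pause_seconds) pulses.
# Names and timings are invented for illustration.
PATTERNS = {
    "son":       [(0.10, 0.10), (0.10, 0.40)],  # quick double tap
    "sos":       [(0.50, 0.10)] * 3,            # three long pulses
    "heartbeat": [(0.08, 0.08), (0.12, 0.60)],  # lub-dub
}

def vibrate(seconds: float) -> None:
    """Stand-in for the device's vibration-motor API."""
    print(f"bzz {seconds:.2f}s", end="  ")
    time.sleep(seconds)

def play(name: str) -> None:
    for on, off in PATTERNS[name]:
        vibrate(on)
        time.sleep(off)
    print()

play("heartbeat")
```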