Apple's iPhone 7 made haptics exciting. Now Nintendo's Switch Joy-Con is making it fun. So, what happens next?

From the moment I first held an iPhone 7 at the September Apple event, I knew the Taptic Engine was something special.

Apple's second-generation haptic feedback motor for iPhone — the one the headphone jack, in part, gave up its space for — is a substantial improvement over its predecessor.

Where before you could press firmly on your iPhone 6s display to trigger 3D Touch and get a reassuring "thump" in response, with iPhone 7 you get a broader, deeper, more sophisticated range of responses.

Some of them are delightfully subtle: Spin through a date or time picker and you can feel a slight "tock" for each number. Thumb across alternate characters on the keyboard and you can feel a little "tick" for each accent.

Others reaffirm the interface. Try to zoom too close or swipe too far, and a small "knock" will inform your finger that you've reached a limit. It's not the "right" feeling and not an exact match to the perfectly visualized rubber banding effect iOS has had since launch, but, in context, you barely notice. The sensory input is in sync, and hence amplified, and you know exactly, unmistakably, what it means.
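Developers can reach for the same vocabulary in their own apps through UIKit's `UIFeedbackGenerator` subclasses, introduced in iOS 10 and backed by the Taptic Engine on iPhone 7 and later. A minimal sketch (the controller and method names here are illustrative, not from any shipping app):

```swift
import UIKit

final class PickerHapticsController {
    // Selection generator: the light "tick" used by pickers and the keyboard.
    private let selection = UISelectionFeedbackGenerator()
    // Impact generator: a firmer "knock" for hitting a boundary.
    private let limit = UIImpactFeedbackGenerator(style: .medium)

    func scrollWillBegin() {
        // Waking the Taptic Engine ahead of time reduces tap latency.
        selection.prepare()
    }

    func valueDidChange() {
        // One subtle tap per row, like the date picker's "tock".
        selection.selectionChanged()
    }

    func didHitScrollLimit() {
        // A firmer thump when the user swipes past the end.
        limit.impactOccurred()
    }
}
```

The `prepare()` call matters: the engine spins up in advance, so the tap lands in sync with the visual change — the same sensory alignment the system apps get.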

So, too, iMessage effects. If you've run the iOS 10 beta on a previous iPhone, you've seen the fireworks or lasers. But with iPhone 7, you feel them. The bursts or waves of light sizzle and rumble in your hand.

It's not the sloppy, annoying buzzing other manufacturers have been shipping for years, either. Nor is it localized to one half of the phone, the way some vibration motors are, so that turning the device sideways leaves only one of your hands feeling anything.

No, Taptic Engine is haptics done right, and the potential is enormous.

Those core effects are still delightful, and developers have begun integrating advanced haptics into their apps as well. I remain incredibly bullish about the technology's future, especially now that word is filtering out about the Nintendo Switch.

Based on Nintendo's event alone, the ice cube segment seemed... odd. It wasn't given any specific context. In light of the early reports on the Joy-Con's haptics, though, the context is clear: Nintendo is taking haptics — simulated sensation — to new heights. (I can't wait for the teardown to see how they're doing it.)

It might seem like games or tricks right now, but take what Apple is doing, add what Nintendo is doing, and push it forward over time, and the potential is remarkable.

The idea of picking up my iPhone 8 or iPhone 9 or whatever and feeling every key I press, every knob I slide, every cell I swipe, is easy enough to imagine. It's everything else that I'm looking forward to. Ice cubes in cups or marbles in a container are fun, but the interactivity, and maybe even the accessibility, behind them is compelling.

For a long time interfaces were mainly pixels on screens, Braille terminals and a few other alternatives aside. Now voice interfaces are becoming more common. Tactile interfaces, thanks to technology like Apple's Taptic Engine and Nintendo's Joy-Con, will be as well.

(Especially in virtual reality (VR) and augmented reality (AR), which both companies may be interested in but neither currently ships.)

And as multi-sensory humans, the more affordances we have, the better.