Apps Have Become Boring. Haptic Feedback Would Make Them Fun Again.
Smartphones and other computers have had powerful vibration motors for years, but we’ve neglected to use them as a tool to immerse us in what we’re doing.
HD Rumble on the Nintendo Switch may have been a simple pleasure, but it’s hard to say it wasn’t a compelling one.
Nintendo featured the intricate, precise haptic feedback of its handheld/console hybrid most prominently in its launch title, the failed Wii Sports successor 1-2 Switch, which used the Switch’s unique Joy-Con controllers for a variety of goofy minigames. One of the strangest, “Ball Count,” makes you hold a Joy-Con horizontally in your hand and feel how many “balls” are rolling around “inside” it.
Despite the Joy-Con being a solid object, its vibration motors are able to simulate the sensation of balls rolling around with haptic feedback alone. It was a legitimate “wow” moment the first time I felt it, a unique experience from what was already a pretty weird console.
Haptic feedback is a sensory element of modern computing that is hiding in plain sight.
Since the Switch’s launch in 2017, almost no third-party games have tried to take advantage of the Switch’s haptic finesse, and next to no Nintendo titles have relied on haptic feedback to communicate information the way 1-2 Switch did. That neglect stands out because detailed haptic feedback is now everywhere, from our smartphones to our laptops, yet in many ways it has never felt more invisible. Haptic feedback is a sensory element of modern computing that is hiding in plain sight, and it should play a bigger role in how we use the computers of the future.
In the land of linear resonant actuators
Cellphones and other pocketable personal electronics have had haptics and vibration motors for ages, but they didn’t really get good until relatively recently. In the early days, eccentric rotating mass (ERM) motors were the be-all and end-all. Crack open an early pager and you’ll find an ERM at work: a DC motor with a weight attached off-center to one end of its shaft. Spin the shaft and the asymmetric weight jiggles the whole motor, creating a vibration. Spin it at different speeds and you can create different types, strengths, and patterns of vibration to suit a variety of attention-getting purposes.
Plenty of older smartphones used these too, and in early versions of Android, Google even enabled vibration out of the box, greeting literally every tap of your finger on the display with a dull buzz. In the years since, saner heads have prevailed. ERMs were traded out for linear resonant actuators (LRAs), and mobile software is far more conservative about how and when it vibrates your phone. LRAs, AC-driven motors that oscillate in a straight line and are far more responsive and energy-efficient, have become key to Apple’s breakthroughs in haptics via the Taptic Engine, Force Touch, 3D Touch, and Haptic Touch.
Apple’s Taptic Engine is a custom LRA that first appeared in the Apple Watch, where it provided the feedback for Force Touch, a hard press that revealed hidden interface elements, as well as the “taps” that make up the smartwatch’s notifications. That could be a buzz for a call, double taps for messages, or a distinct pattern of taps to signal which way to turn during turn-by-turn navigation.
The Taptic Engine was later added to the iPhone 6S alongside a pressure-sensing layer in the display to allow for 3D Touch, Apple’s version of a right-click on steroids, and a year later it enabled the iPhone 7’s solid-state home button. 3D Touch was short-lived (the iPhone XR launched without it), but its notions of “peeking” to preview links before “popping” to jump into them and of turning the iOS keyboard into a trackpad were novel and useful additions that made using an iPhone easier, provided you knew about them. Once Apple ditched the pressure-sensitive display, 3D Touch gave way to Haptic Touch, enabled once again by the subtle vibrations of the Taptic Engine.
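Third-party apps reach the same hardware through Apple’s Core Haptics framework, which describes a single “tap” as a transient event with an intensity and a sharpness. Roughly, and with error handling left out, a custom tap looks something like the sketch below; it’s an illustration of the public API, not how Apple builds Haptic Touch internally.

```swift
import CoreHaptics

// Plays one crisp, Taptic Engine-style tap via Core Haptics.
func playTap() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // A transient event is a single short impulse, the building block
    // behind most "tap" sensations.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
        ],
        relativeTime: 0
    )

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```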
But, of course, the biggest win of Apple’s foray into haptics was the MacBook trackpad. The 2015 12-inch MacBook was so slim and compact that the company removed its physically clicking trackpad in favor of a solid piece of glass that vibrates back at you when you “click” it, with some pressure-sensitive Force Touch features thrown in to sweeten the deal. The change was so successful that all Apple laptops now use this type of trackpad, and plenty of other manufacturers have adopted similar designs to slim down their laptops too.
A button replacement
As wildly successful as it has been, the disappointing thing about this conservative approach to haptic feedback is that, by default, it turns vibrations into little more than button replacements. Look no further than the iPhone 15 Pro. The Action Button is a neat way to replace a switch that was most often left “off” with a multifunctional button that can trigger shortcuts and other features on the iPhone. The sensation of “pressing” the Action Button is entirely enabled by the Taptic Engine.
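That kind of simulated press is trivial for any app to reproduce. On iOS it takes a couple of lines with UIKit’s feedback generators; the sketch below is a generic example, not Apple’s own Action Button code.

```swift
import UIKit

// Produces a short, button-like "click" from the Taptic Engine.
final class FakeButtonFeedback {
    private let generator = UIImpactFeedbackGenerator(style: .rigid)

    // Wakes the Taptic Engine so the click fires with minimal latency.
    func prepare() {
        generator.prepare()
    }

    // Fires a single firm impact, e.g. when a custom control is pressed.
    func press() {
        generator.impactOccurred()
    }
}
```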
There are obvious reasons why smartphone and laptop manufacturers don’t go hog-wild with haptics, the biggest being battery life. You don’t need to spend long with haptic feedback enabled on the iOS keyboard to notice it eating into your battery, and switching to an interface built around vibration could have a serious impact on whether your phone or laptop makes it through a full day. But as iFixit notes in a blog post digging into what makes Apple’s haptics unique, the company has regularly increased the size of the Taptic Engine as iPhones have grown larger. The satisfying feel of the iPhone, or of the MacBook Pro’s trackpad, comes from careful tuning of software and hardware together. There’s no reason Apple or any other manufacturer couldn’t find ways to keep battery life in check while leaning into good vibrations.
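Developers already have a few levers for that trade-off. Core Haptics, for instance, lets an app tell the haptic engine to power itself down whenever nothing is playing, so richer haptics don’t have to mean an actuator that is always awake. The snippet below sketches those settings; it’s illustrative, not a full power analysis.

```swift
import CoreHaptics

// Configures a haptic engine that conserves power when idle.
func makeFrugalEngine() throws -> CHHapticEngine {
    let engine = try CHHapticEngine()

    // Let the engine shut itself down automatically between patterns
    // instead of keeping the actuator's signal chain powered.
    engine.isAutoShutdownEnabled = true

    // Skip the audio half of haptic patterns if all we want is touch.
    engine.playsHapticsOnly = true

    return engine
}
```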
We should feel more
A new social networking app called ID by Amo stands out for more than a few reasons, not the least of which is its out-of-left-field interpretation of social feeds as collaborative collages and mood boards. But what I first noticed about the app was its haptics. As you flip through updates to your friends’ boards, you zoom in and out of someone’s board by tilting your phone, which vibrates in time with your rotations. Hop into a new board and you get a pleasant buzz too. It’s not over the top, but it’s more expressive than your standard text-message vibration.
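It’s easy to imagine the rough shape of an effect like that: a continuous haptic whose intensity is steered in real time by the phone’s motion sensors. The sketch below is my own guess at how something similar could be wired up with Core Haptics and Core Motion; it is not Amo’s actual code.

```swift
import CoreHaptics
import CoreMotion

// Runs a continuous haptic whose intensity follows the phone's roll,
// roughly how a "vibrate as you tilt" interaction might be wired up.
final class TiltHaptics {
    private let engine: CHHapticEngine
    private let player: CHHapticAdvancedPatternPlayer
    private let motion = CMMotionManager()

    init() throws {
        engine = try CHHapticEngine()
        try engine.start()

        // One long continuous event; its intensity starts low and is
        // modulated on the fly as the device tilts.
        let hum = CHHapticEvent(
            eventType: .hapticContinuous,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.2),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
            ],
            relativeTime: 0,
            duration: 30
        )
        let pattern = try CHHapticPattern(events: [hum], parameters: [])
        player = try engine.makeAdvancedPlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)

        // Map the device's roll (in radians) to a 0...1 haptic intensity.
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let roll = data?.attitude.roll else { return }
            let intensity = Float(min(abs(roll) / .pi, 1.0))
            let update = CHHapticDynamicParameter(
                parameterID: .hapticIntensityControl,
                value: intensity,
                relativeTime: 0
            )
            try? self.player.sendParameters([update], atTime: CHHapticTimeImmediate)
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
        try? player.stop(atTime: CHHapticTimeImmediate)
    }
}
```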
Not Boring Software, the creator of (Not Boring) Weather and Calculator, among others, similarly sprinkles playful vibrations throughout its apps as you navigate their menus. Neither of these is a revelatory use of haptics, but they’re proof that our personal electronics can do more than just recreate physical buttons or accent existing software elements. We engage with our computers aurally, visually, and, yes, tactilely, yet mainstream consumer technology has left that tactile element grossly under-explored.
... shouldn’t we be doing more to engage senses outside of just sight and sound?
If the future is, as Meta and Apple propose, going to be driven by head-mounted, “immersive” spatial computing, shouldn’t we be doing more to engage senses outside of just sight and sound? Why don’t any of these headsets vibrate? Haptics are all over gaming (the PlayStation VR2 headset vibrates against your face, after all), but their role in traditional computing is mostly to keep up appearances. There are hurdles to overcome, but there’s no reason we shouldn’t feel our computers just as much as we see and hear them.