
Gesture control systems work best when they are simple, quick, and easy to learn. They also have to feel natural and intuitive enough that you actually remember to use them, and most of all, they need to be reliable. Apple’s Double Tap on the Apple Watch Series 9 and Apple Watch Ultra 2 is a good example of gesture control done well.

I think it’s a cool, fun feature, and I’m not alone. However, it’s not the first of its kind, and history shows us that, unfortunately, gestures on mobile devices are more likely to be abandoned and forgotten than they are loved and widely adopted.

Motion Sense on the Pixel 4


Without going way back to phones like the 2009 Sony Ericsson Yari, which used the front-facing camera to track body movements and control pre-loaded games, many tech fans will instantly think of the 2019 Google Pixel 4 and Pixel 4 XL when gesture controls are mentioned.


These two devices were the first to feature Google’s Project Soli chip, which used radar to recognize even the tiniest of movements around it. The resulting feature, called Motion Sense, allowed you to swipe over and around the phone to play or pause music, silence alarms, or mute an incoming call. It also unlocked the phone when you picked it up.


It was technically exciting, but in practice, it didn’t work reliably enough, and regulatory issues around the radar chip meant it couldn’t be used globally, hurting sales potential. The Pixel 4 ended up being the only smartphone outing for Project Soli’s gesture controls, but the chip has lived on without Motion Sense in the Google Nest Hub, where it helps measure breathing while you sleep.

eyeSight Technologies


Motion Sense on the Pixel 4 is probably the best-known failed gesture control system on a phone, but other companies had been working on similar systems long before it. In 2011, Korean electronics brand Pantech released the Vega LTE, which supported basically the same gestures as the Pixel 4 but used a software-based system that relied on the front camera to “see.” It was developed by a company called eyeSight Technologies.

For several years after the Pantech Vega, eyeSight Technologies pushed hard to make gesture controls on mobile devices a thing. It was counting on its platform-agnostic Natural User Interface (NUI) software for success, as it could be built directly into a device’s operating system, or even into individual apps, using the camera to add gesture controls.

The company worked with Indian smartphone brand Micromax on the A85 Superfone, which used gestures similar to the Pantech Vega’s, made NUI available on Android and iOS, and showed off its technology at trade shows on multiple occasions. It boasted partnerships with companies ranging from Nokia to AMD and tried to capitalize on the early VR craze in 2016, too. Despite all these efforts, it never reached the mainstream, and the company eventually changed its name to Cipia and pivoted to in-car tech.

Air Gestures on the Samsung Galaxy S4


Around the same time that eyeSight Technologies was promoting its software-based gesture system, Samsung introduced a small set of gesture controls called Air Gestures on the then-new Galaxy S4 smartphone. The phone used an infrared sensor to spot basic hand movements over the screen, allowing you to check the time without touching it, accept calls, interact with apps, and even scroll through web pages with just a swipe.
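If you’re curious how a sensor like that turns raw readings into a “swipe,” here’s the basic idea: two infrared channels brighten at slightly different times as your hand passes over them, and the order tells you the direction. The sketch below is my own simplified illustration, not Samsung’s implementation; the two-channel sensor model and the threshold are assumptions.

```python
# A simplified sketch of IR swipe detection. The two-channel sensor model
# and the threshold are illustrative assumptions, not Samsung's actual
# Air Gestures implementation.

def detect_swipe(left_samples, right_samples, threshold=50):
    """Infer a horizontal swipe from two IR reflection channels.

    A hand passing left-to-right reflects IR into the left diode first,
    then the right one, so we compare when each channel first crosses
    the threshold.
    """
    def first_crossing(samples):
        for i, value in enumerate(samples):
            if value > threshold:
                return i
        return None

    left_t = first_crossing(left_samples)
    right_t = first_crossing(right_samples)

    if left_t is None or right_t is None:
        return None  # the hand never covered both channels: no gesture
    if left_t < right_t:
        return "swipe_right"  # left channel triggered first
    if right_t < left_t:
        return "swipe_left"
    return None  # simultaneous crossings are ambiguous, so ignore them


# Example: reflected-IR intensity over time for a left-to-right pass
left = [0, 10, 60, 90, 40, 5, 0, 0]
right = [0, 0, 5, 30, 70, 95, 50, 10]
print(detect_swipe(left, right))  # -> swipe_right
```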

Air Gestures worked quite well, but the sensor’s short range meant you were almost touching the screen anyway, making it appear more gimmicky than the cool tech perhaps deserved. The feature continued on in Samsung’s repertoire but was slowly phased out and replaced by Air Actions, which uses gestures made with the S Pen stylus to perform similar actions without the need for an infrared sensor.


So far, we’ve seen radar, camera-based software, and infrared sensors used to understand hand motions and control features on our phones, showing how keen companies were to experiment, and that there wasn’t one recognized “best” way of adding gesture recognition to a smartphone. But we’re not done yet.

Elliptic Labs


Fast forward to 2017, and interestingly, the Galaxy S4 was called into action again in a demonstration of gesture recognition technology from Elliptic Labs, which, like eyeSight Technologies, spent a great deal of time and effort trying to get us waving at our smartphones and other devices.

Elliptic Labs’ technology used ultrasound to detect movement, which allowed a wider field of movement and a greater variety of gestures, with no reliance on light, lower power consumption, and better accuracy. It planned on licensing the ultrasound gesture technology to device makers, but it never seemed to get far beyond the demo and concept stage, despite adapting the same system to take advantage of the Internet of Things (IoT) boom and integrating it into speakers and lights.
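The core idea is easy to picture: a speaker emits an ultrasonic chirp, a microphone listens for the echo, and the round-trip delay reveals how far away your hand is; track that distance over time and you get simple approach-and-retreat gestures. Here’s a toy sketch of the principle, with made-up numbers; it’s not Elliptic Labs’ actual system, which tracked far more than plain distance.

```python
# A toy sketch of ultrasonic gesture sensing: convert echo round-trip
# delays into distances, then label the trend. All values and thresholds
# are illustrative assumptions.

SPEED_OF_SOUND_M_S = 343.0  # in air at room temperature

def echo_delay_to_distance_m(delay_s):
    """Convert a round-trip echo delay into a one-way distance in meters."""
    return delay_s * SPEED_OF_SOUND_M_S / 2

def classify_motion(delays_s, min_change_m=0.05):
    """Label a series of echo delays as a hand approaching or retreating."""
    distances = [echo_delay_to_distance_m(d) for d in delays_s]
    change = distances[-1] - distances[0]
    if change <= -min_change_m:
        return "approach"
    if change >= min_change_m:
        return "retreat"
    return "idle"

# Example: echoes arrive sooner and sooner as a hand moves toward the phone
print(classify_motion([0.0018, 0.0014, 0.0010, 0.0006]))  # -> approach
```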

Instead, the technology found a different niche: Xiaomi used Elliptic Labs’ ultrasonic system to replace the traditional proximity sensor and minimize the bezel on the original Mi Mix. Today, Elliptic Labs still works on proximity detection and has now eliminated the dedicated hardware entirely, offering software-driven proximity detection that can be found on the Motorola Razr Plus and Razr (2023) compact folding phones.

Air Motion on the LG G8 ThinQ


Elliptic Labs and eyeSight Technologies, along with other companies like Neonode, experimented with gesture controls between 2010 and 2017, but without making much of an impact outside of tech trade shows like CES and MWC. When the Pixel 4 reignited interest in gesture controls in 2019, it was joined by another big-name device: the LG G8 ThinQ.

LG, which has since stopped making smartphones entirely, loved to try new things with its phones, whether it was modular hardware with the LG G5 or secondary screens on phones like the LG V10 and the V50 ThinQ. The G8 ThinQ’s contribution was Air Motion, which used the front-facing camera and a time-of-flight (ToF) sensor to detect various hand motions, including mimicking the twist of a volume knob to adjust the music player’s volume.


Like all close-proximity gesture control systems, its usefulness was questionable when the touchscreen was right there, mere inches from your twiddling fingers. It also wasn’t particularly reliable, which discouraged people from using it. The LG G8 ThinQ marked the end of the line for LG’s G series, and along with Project Soli, Air Motion was perhaps the last gesture control system to be heavily promoted by a phone maker.

What about smartwatches?


Until recently, gesture controls have mostly been demonstrated or featured on smartphones. But what about smartwatches? Double Tap on the Apple Watch Series 9 and Apple Watch Ultra 2 owes its existence to an accessibility feature called AssistiveTouch, which has been part of watchOS for several years. Samsung provides a very similar accessibility feature on the Galaxy Watch 6, too.

Space is tight inside a smartwatch, and there’s very little spare room for cameras, proximity sensors, or other complex bits of hardware. Double Tap instead uses the heart rate sensor, the accelerometer, and software to recognize when you tap your fingers together, adding yet another recognition method to the list.
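Conceptually, that detection boils down to spotting two sharp bursts of wrist-sensor activity close together in time. The toy sketch below shows the idea using accelerometer magnitudes alone; Apple’s real detector fuses multiple sensors through a machine learning model, so treat every threshold and timing here as an illustrative assumption.

```python
# A toy illustration of the idea behind Double Tap: two short spikes in
# acceleration close together in time. Apple's actual detector is a
# machine-learning model over multiple sensors; all values below are
# illustrative assumptions.

def detect_double_tap(accel_magnitudes, sample_rate_hz=100,
                      spike_threshold=1.8, min_gap_s=0.08, max_gap_s=0.5):
    """Return True if two tap-like acceleration spikes land close together.

    accel_magnitudes: acceleration magnitude per sample, in g.
    """
    spike_times = []
    for i, g in enumerate(accel_magnitudes):
        t = i / sample_rate_hz
        # Register a spike, debouncing samples that belong to the same tap
        if g > spike_threshold and (not spike_times or t - spike_times[-1] > min_gap_s):
            spike_times.append(t)

    # Any two consecutive spikes inside the window count as a double tap
    return any(b - a <= max_gap_s for a, b in zip(spike_times, spike_times[1:]))


# Example: quiet wrist, two sharp taps about 0.2 seconds apart, quiet again
samples = [1.0] * 50 + [2.5] + [1.0] * 20 + [2.6] + [1.0] * 50
print(detect_double_tap(samples))  # -> True
```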

Outside of the Apple Watch and Double Tap, Google demonstrated Project Soli inside a smartwatch, but it never made it to the eventual Google Pixel Watch. The bizarrely named Mad Gaze Watch apparently used bone conduction to enable a range of different gesture controls, from finger snaps to arm taps. In 2015, a company called Deus Ex crowdfunded the Aria, an add-on gesture module for the Pebble Watch. And while it’s not a smartwatch, Google used head nods to add hands-free control to Google Glass.

Simple gestures are best


All these examples show there are more failed attempts at making gesture control systems popular on our phones and smartwatches than there are successes. However, several simple gestures have proven effective and reliable, to the point where we don’t even consider them special. The best known is raise-to-wake, where lifting or tilting a device’s screen toward your face turns on the display: a perfect case of a natural movement activating a feature. It could be argued that anything beyond this is simply too complicated.
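Raise-to-wake also shows how little logic a reliable gesture really needs: the device watches the gravity vector from its accelerometer and wakes when the screen swings from steeply angled to roughly face-up. The sketch below captures that idea; the thresholds and axis conventions are my own assumptions, and real implementations filter noise and run on a low-power sensor core.

```python
# A minimal sketch of raise-to-wake logic. The axis convention and the
# threshold are illustrative assumptions.
import math

def should_wake(prev_accel, curr_accel, tilt_threshold_deg=40.0):
    """Wake if the screen swings toward face-up by a large angle.

    prev_accel / curr_accel: (x, y, z) gravity readings in g, where z is
    perpendicular to the screen (z is about +1 when the screen faces up).
    """
    def screen_tilt_deg(accel):
        x, y, z = accel
        norm = math.sqrt(x * x + y * y + z * z)
        # Angle between the screen normal and straight up (0 = face up)
        return math.degrees(math.acos(max(-1.0, min(1.0, z / norm))))

    # Wake when the screen rotates from steeply angled to nearly face-up
    return (screen_tilt_deg(prev_accel) - screen_tilt_deg(curr_accel)
            >= tilt_threshold_deg)


# Example: wrist hanging down (screen near vertical), then raised to the face
print(should_wake((0.0, -0.9, 0.1), (0.1, -0.3, 0.95)))  # -> True
```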

Even wrist flicks and twists, which were used on the Moto 360 to aid scrolling, seem to be a gesture too far, and they have rarely been seen since the Fossil Hybrid HR. Outside of these few isolated examples and essential accessibility features, gestures have not transformed the regular, everyday use of a wearable or smartphone for most people.

Double Tap has the potential to join raise-to-wake as one of the few widely used gestures on a smartwatch, though, as it’s simple, natural, and works really well. Sadly, history shows gesture control systems on mobile devices simply haven’t captured our interest yet, and I hope Double Tap doesn’t end up on a future list of promising yet quickly abandoned gesture control systems. It’s too interesting to suffer that fate.
