What you need to know
- A new Apple Watch patent has revealed how Watch could one day detect gestures your hand is making.
- It would use sensors to measure changes characteristic of different gestures.
- It could have accessibility uses for people who struggle with verbal commands or performing touch gestures.
An Apple Watch patent published today has revealed how sensors in your Apple Watch could one day detect gestures you're making with your hand, potentially improving usability for those with accessibility needs.
The patent is titled 'Motion and gesture input from a wearable device', and its abstract states:
This disclosure relates to detecting hand gesture input using an electronic device, such as a wearable device strapped to a wrist. The device can have multiple photodiodes, each sensing light at a different position on a surface of the device that faces the skin of a user. Examples of the disclosure detect hand gestures by recognizing patterns in sensor data that are characteristic of each hand gesture, as the tissue expands and contracts and anatomical features in the tissue move during the gesture.
The patent mostly relates to, but is not limited to, the Apple Watch:
This disclosure relates to detecting hand gesture input using an electronic device, such as a wearable device strapped to a wrist.
The Watch would have multiple photodiodes, each sensing light at a different position against the skin to measure how the tissue in your wrist moves. It's much the same technology Apple already uses to measure heart rate on your wrist, but rather than tracking your BPM, it could track how your hand is moving:
This relates generally to detecting a user's motion and gesture input to provide commands to one or more devices. In particular, a device can use one or more sensors to determine a user's motion and gesture input based on movements of the user's hand, arm, wrist, and fingers....
Examples of the disclosure detect hand gestures by recognizing patterns in sensor data that are characteristic of each hand gesture, as the tissue expands and contracts and anatomical features in the tissue move during the gesture.
In one example, the device can be trained on sensor data as the user performs a plurality of hand gestures. For example, during a first period, a user can perform a hand flap gesture and sensor data can be collected as the gesture is performed. During a second period, a user can perform a hand clench gesture and further sensor data can be collected as the gesture is performed. The sensor data can then be processed to calculate signal characteristics (e.g., peak/trough extraction, phase detection, etc., as described below) based on the sensor data for each period.
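The approach described above, collecting sensor data per gesture, extracting signal characteristics such as peaks and troughs, and then matching new readings against them, can be illustrated with a short sketch. Everything here is hypothetical: the feature set, the nearest-centroid matching, and the sample numbers are illustrative assumptions, not Apple's actual method.

```python
# Hypothetical sketch of the patent's idea: summarize a window of
# photodiode samples with simple peak/trough features, average those
# features per gesture during "training", then classify a new window
# by finding the nearest trained gesture. All data is made up.

def extract_features(samples):
    """Summarize a window of sensor samples as (peak, trough, span)."""
    peak = max(samples)
    trough = min(samples)
    return (peak, trough, peak - trough)

def train(labeled_windows):
    """Average the features of each gesture's windows into a centroid."""
    centroids = {}
    for label, windows in labeled_windows.items():
        feats = [extract_features(w) for w in windows]
        n = len(feats)
        centroids[label] = tuple(sum(f[i] for f in feats) / n for i in range(3))
    return centroids

def classify(samples, centroids):
    """Match a new window to the nearest trained gesture centroid."""
    feats = extract_features(samples)
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(feats, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Illustrative training data: a "flap" gesture swings the signal widely,
# while a "clench" produces a smaller, steadier signal.
training = {
    "flap":   [[0.1, 0.9, 0.2, 0.8], [0.0, 1.0, 0.1, 0.9]],
    "clench": [[0.4, 0.5, 0.45, 0.5], [0.42, 0.48, 0.44, 0.5]],
}
centroids = train(training)
print(classify([0.05, 0.95, 0.15, 0.85], centroids))  # → flap
```

A real implementation would work on far richer signals (multiple photodiodes, phase information, as the patent notes), but the train-then-match structure is the same.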
What possible use could this have? Well, it seems the patent is primarily aimed at accessibility:
Some existing portable electronic devices accept voice or touch input to control functionality of the devices. For example, a voice command system can map specific verbal commands to operations such as initiating a voice call with a particular contact in response to speaking the contact's name. In another example, a touch input system can map specific touch gestures to operations such as zooming out in response to a pinch gesture on a touch sensitive surface. However, there may be situations where the user's ability to speak a verbal command or perform a touch gesture may be limited.
This could give people with accessibility needs a new way to input commands to a phone or tablet, simply by making gestures with their hands. Of course, this is just a patent, so there's no guarantee it will ever become a reality. That said, Apple seems to have all of the tech in place to make it happen.