Google has filed a patent for a system that recognises pressure and swipe gestures on the user’s skin, intended for smartwatches and earbuds. Google has built up an impressive array of consumer devices. The Google Pixel 6 and 6 Pro are extremely popular and are the first Google phones to use a Tensor processor. The Pixel 6a, the company’s second low-cost model, will be released in May 2022. The first Google Pixel Watch is also anticipated around the same time, and it is probable that new earbuds will be revealed around then as well.
Last year, the inexpensive Pixel Buds A-Series were launched. It is not out of the question that a new Pro model will be released this year as a successor to the Pixel Buds 2, which were released in 2020.
In comparison to the newer Buds A-Series, the latter earbuds include a few extra capabilities, such as the ability to control the volume using gestures.
Meanwhile, Google has filed a patent that appears to build on this feature. It concerns the operation of wearables via a ‘skin interface’, or gesture control through the user’s skin. The technology could be applied to a wide range of wearable products, from earbuds and smartwatches to virtual reality headsets and smart glasses.
Google Pixel wearables that use the user’s skin for gesture control
Google LLC filed a patent for a ‘Skin interface for Wearables: Sensor fusion to increase signal quality’ with WIPO (the World Intellectual Property Organization) in mid-2020. The 27-page document was approved and published on 3 March 2022.
Wearables are currently controlled mostly by voice commands, via a built-in microphone, and by touch input. However, a physical input such as a touch or a tap can cause the user to experience unwanted noise or discomfort. According to Google, touch can also impair the antenna’s function. The American company has devised a solution: skin gesture control.
The user performs a swipe or tap motion on the skin near the wearable. The gesture generates a mechanical wave, which is picked up by sensors including an accelerometer, and the device determines the type of gesture. Sensor fusion is employed for this: data from multiple sensors is combined for more precise detection.
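The patent does not disclose the actual fusion or classification algorithm, so the following is only a minimal sketch of the idea: two hypothetical sensor channels (a contact microphone and an accelerometer) are normalised and averaged so that noise present in only one channel is attenuated, and the fused signal is then classified by how long its amplitude stays above a threshold. All names, thresholds and sample rates here are illustrative assumptions, not values from the patent.

```python
import numpy as np

def fuse_channels(mic, accel):
    """Hypothetical sensor fusion: average two normalised channels so
    that noise appearing in only one sensor is attenuated."""
    norm = lambda x: (x - x.mean()) / (x.std() + 1e-9)
    return (norm(mic) + norm(accel)) / 2.0

def classify(signal, rate_hz=1000, threshold=1.5):
    """Toy classifier: a short burst above the threshold is a 'tap',
    a longer one a 'swipe'; no activity returns None."""
    active = np.abs(signal) > threshold
    duration_s = active.sum() / rate_hz
    if duration_s == 0:
        return None
    return "tap" if duration_s < 0.05 else "swipe"

# Synthetic example: a 20 ms burst appearing in both channels.
sig = np.zeros(1000)
sig[100:120] = 5.0
print(classify(fuse_channels(sig, sig)))  # → tap
```

A real implementation would of course use trained models and calibrated sensor data, but the principle of cross-checking several sensors before accepting an event is the essence of the ‘sensor fusion’ the patent title refers to.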
The gesture determines which input command is executed. The user can swipe vertically or sideways, and can also tap with one or more fingers, using a short or long tap. This method can be used to control a variety of functions, including answering or ending a phone call, adjusting the volume, skipping or rewinding music, and so on.
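Conceptually this is a lookup from a recognised gesture to a device command. The patent lists the kinds of controls (call handling, volume, track skipping) but not which gesture triggers which action, so the assignments below are purely hypothetical:

```python
# Hypothetical mapping from (gesture, variant) pairs to commands.
# The actual assignments are not specified in the patent.
GESTURE_COMMANDS = {
    ("swipe", "up"): "volume_up",
    ("swipe", "down"): "volume_down",
    ("swipe", "forward"): "next_track",
    ("swipe", "backward"): "previous_track",
    ("tap", "single_short"): "play_pause",
    ("tap", "single_long"): "answer_call",
    ("tap", "double_short"): "end_call",
}

def dispatch(gesture, variant):
    """Return the command for a recognised gesture, or 'ignored' for
    combinations that have no assigned action."""
    return GESTURE_COMMANDS.get((gesture, variant), "ignored")

print(dispatch("swipe", "up"))    # → volume_up
print(dispatch("tap", "triple"))  # → ignored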
The wearable can tell the difference between movements that are meant as input and those that are not. This allows the user to nod, chew, scratch, walk and/or talk without these everyday movements being misinterpreted as an input gesture.
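One plausible way to reject such everyday motion, again only a sketch under assumed numbers rather than the patent’s method, is to exploit frequency content: walking, nodding or chewing are slow, largely periodic movements, while a fingertip tap or swipe on the skin produces a much higher-frequency transient. The cutoff and energy ratio below are illustrative assumptions:

```python
import numpy as np

def is_intentional(signal, rate_hz=1000, cutoff_hz=20.0, ratio=0.6):
    """Hypothetical gate: accept an event only if enough of its energy
    lies above the cutoff frequency, where deliberate skin gestures
    would show up, rather than in the slow band of body movement."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate_hz)
    high = spectrum[freqs >= cutoff_hz].sum()
    return bool(high / (spectrum.sum() + 1e-12) >= ratio)

t = np.arange(1000) / 1000.0
walking = np.sin(2 * np.pi * 2.0 * t)                   # slow periodic motion
tap = np.sin(2 * np.pi * 150.0 * t) * np.exp(-t * 30)   # sharp decaying transient
print(is_intentional(walking), is_intentional(tap))     # → False True
```

In practice a shipped product would likely combine several such cues (frequency, duration, agreement across sensors) rather than rely on a single spectral test.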
The patented technology offers a number of benefits. Because the user no longer needs to physically touch the earbuds, the chance of accidentally pushing them further into the ear, which can be painful, is reduced. Furthermore, the larger input surface makes gestures easier to perform.
These benefits apply to a smartwatch to a lesser extent. Google will most likely focus on developing this technology for the Pixel Buds first, before extending it to other wearables such as smartwatches.
It is unclear whether Google will actually use the patented technology in future Pixel Buds and/or Pixel Watch devices. Now that Sony has released the first commercial device with this capability, it is not unreasonable to expect other manufacturers to follow suit in the coming years.
Google has taken the Pixel line-up seriously lately. The Google Pixel Notepad, the company’s first foldable smartphone, is expected later this year as a competitor to the Samsung Galaxy Z Fold 3. The Pixel Watch also appears to be an intriguing timepiece. There are plenty of appealing products to look forward to.
Official documentation: Google Pixel Skin interface for Wearables.