Imagine you're walking down the street with your headphones on and want to change songs. Normally you would press the buttons on the cable or take out your phone and tap the screen. But what if you could control the music by squeezing the headphone cable? For example, twisting the cable left or right, or sliding your finger up it to raise the volume. It sounds futuristic, but Google has done it using artificial intelligence and smart fabrics.
The technical name for this technology is "E-Textile Microinteractions", that is, micro-interactions with electronic textiles. Google has focused on cables because of their modular use in garment drawstrings and their ability to carry wired data connections. Let's see how it works.
HSM and machine learning for a smart cable
HSM stands for Helical Sensing Matrix. It is a braid of conductive threads and passive support threads (cotton) that, in a nutshell, records the user's gestures. When we make a gesture, we activate a set of electrodes that can track the movement. The braid has a fixed structure that repeats along the entire cord, so gestures can be performed anywhere on it.
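To make the idea concrete, here is a minimal sketch of how per-electrode readings along a braid could be turned into a touch estimate. The electrode count, threshold and signal values are illustrative assumptions, not Google's actual implementation:

```python
# Hedged sketch: estimating where (and how hard) a cable braid is touched
# from per-electrode capacitance readings. All numbers are invented.

def sense(readings, threshold=0.2):
    """Given per-electrode capacitance deltas (0..1) along the cord,
    return the estimated touch position (0..1 along the cord), the
    contact area (number of active electrodes) and the total signal
    strength (a crude pressure proxy), or None if nothing is touched."""
    active = [(i, r) for i, r in enumerate(readings) if r >= threshold]
    if not active:
        return None
    total = sum(r for _, r in active)
    # Weighted centroid of the active electrodes gives the touch position.
    centroid = sum(i * r for i, r in active) / total
    return {
        "position": centroid / (len(readings) - 1),
        "area": len(active),
        "strength": total,
    }

# A finger pressing near the middle of an 8-electrode braid:
print(sense([0.0, 0.1, 0.6, 0.9, 0.5, 0.1, 0.0, 0.0]))
```

Because the same electrode pattern repeats along the whole cord, this kind of estimate works regardless of where the user grabs the cable, which is exactly the property the repeating helical structure provides.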
The advantage of this design is that it can detect proximity, contact area, contact time, twist and pressure. With simple gestures, the user can perform most music-control actions. To provide feedback, necessary to know whether the cord has detected an action or not, the cable also contains optical fibers that light up with variable intensity in real time.
And what gestures are possible? Google ran a gesture elicitation study, asking 12 volunteers to perform eight gestures nine times each: 864 samples in total, with varying intensities. With this data, Google trained a machine learning model that recognizes gestures with 94% accuracy. You can tap once to pause and play, double-tap to skip to the next song, and twist left to lower the volume. Simultaneous gestures are also possible, such as pinch and slide, or tap and pinch.
Finally, Google built several functional prototypes to put its system to the test: E-Textile USB Type-C headphones, a hoodie drawstring (to control music through clothing) and a cable for smart speakers. Judging from the GIFs and videos Google has published, the system seems to work well, but for now it is little more than a research project.
Google hopes to keep "advancing textile user interfaces and inspiring the use of micro-interactions for future wearable interfaces and smart fabrics." The company claims its E-Textile system is faster than the button controls on headphones, and that feedback gathered during the study shows a certain preference for this type of interaction. It remains to be seen whether this technology reaches the consumer market or remains just another research project, although a few smart garments are already available.
More information | Google