- Commands can be given in six ways, such as twisting, flicking, sliding, pinching, grabbing and patting the cable
- Google's I/O Braid recorded gestures with an accuracy of around 94 percent
May 21, 2020, 05:25 PM IST
New Delhi. Soon there will be no need to speak (voice command) or even touch the phone to control music. Google has carried out a major experiment in textiles and technology so that, in future, electronic devices can be given commands through ordinary cords and cables. The company claims the cable has been designed to understand gesture commands by combining machine learning technology with braided fiber. In the details released about the new research, the company explained how a user can control media through the headphone cable simply by tapping, pinching, squeezing and twisting it.
Provides visual feedback in response to input
Google's research builds on the company's earlier interactive e-textile architecture work. It developed a new prototype, I/O Braid, made from a combination of touch-sensing textiles and fiber optics. I/O Braid is not only able to understand input given by the user but also provides visual feedback. A Helical Sensing Matrix (HSM) has been used to develop its sensing capabilities.
Google said in a blog post: "As cords can be made to detect basic touch gestures through capacitive sensing, we have designed a helical sensing matrix (HSM), which generates a large gesture space. HSM is a braid composed of electrically insulated conductive textile yarns and passive support yarns, where conductive yarns wound in opposite directions take the role of transmit and receive electrodes to enable mutual capacitive sensing. The capacitive couplings at their intersections are modified by the user's fingers, and these interactions can be sensed anywhere on the cord, since the braided pattern repeats along its length."
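The mutual-capacitance principle described in the quote can be sketched in a few lines. The following is an illustrative toy model, not Google's implementation: electrode counts, baseline values and the touch attenuation factor are all made-up numbers chosen only to show how a touch lowers the coupling at the crossings it covers.

```python
# Illustrative sketch (not Google's code): mutual-capacitance scanning
# over a helical sensing matrix. Yarns wound in one direction act as
# transmit (TX) electrodes and yarns wound the other way as receive (RX)
# electrodes; a finger shunts field lines at the crossings it covers,
# lowering the mutual capacitance there.

N_TX, N_RX = 4, 4     # hypothetical electrode counts
BASELINE = 100.0      # arbitrary capacitance units at each crossing

def scan_matrix(touched):
    """Return an N_TX x N_RX grid of capacitance readings.

    `touched` is a set of (tx, rx) crossings covered by a finger;
    a touched crossing reads lower than the baseline.
    """
    return [
        [BASELINE * (0.6 if (tx, rx) in touched else 1.0)
         for rx in range(N_RX)]
        for tx in range(N_TX)
    ]

def detect_touches(grid, threshold=0.8):
    """Flag crossings whose reading dropped below threshold * baseline."""
    return {
        (tx, rx)
        for tx, row in enumerate(grid)
        for rx, value in enumerate(row)
        if value < threshold * BASELINE
    }

# Because the braid pattern repeats along the cord, the same touch
# signature appears wherever along its length the user grips it.
grid = scan_matrix({(1, 2), (2, 2)})
print(detect_touches(grid))  # {(1, 2), (2, 2)}
```

Scanning every TX/RX pair in this way yields a small "image" of the grip, which is what a downstream gesture recognizer consumes.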
Understands 6 types of interactions
Google says I/O Braid can sense six different interactions on the cable: twisting, flicking, sliding, pinching, grabbing and patting. Google is still experimenting with the technology, and it will take some time before a more refined version is released as part of mass-market products.
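Mapping sensor readings to one of the six gestures is a classification problem. The sketch below is a deliberately minimal stand-in for the machine-learning step, assuming made-up (rotation, pressure, duration) features and a nearest-centroid rule; Google's actual pipeline and feature set are not described in this article.

```python
# Illustrative sketch (not Google's pipeline): nearest-centroid
# classification of the six I/O Braid gestures. The feature vectors
# (rotation, pressure, duration) and centroid values are hypothetical,
# chosen only to demonstrate the classification step.
import math

CENTROIDS = {
    "twist": (1.0, 0.2, 0.8),
    "flick": (0.1, 0.3, 0.1),
    "slide": (0.0, 0.4, 0.6),
    "pinch": (0.0, 0.9, 0.3),
    "grab":  (0.0, 1.0, 0.7),
    "pat":   (0.0, 0.6, 0.1),
}

def classify(features):
    """Return the gesture whose centroid is closest to `features`."""
    return min(
        CENTROIDS,
        key=lambda gesture: math.dist(features, CENTROIDS[gesture]),
    )

print(classify((0.9, 0.25, 0.75)))  # "twist"
```

A real recognizer would learn such decision boundaries from labeled recordings rather than hand-set centroids, which is where the article's reported 94 percent accuracy figure would come from.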
Works with an accuracy of 94 percent
The company says the results of the experiments so far have been encouraging. Gestures were recognized on I/O Braid with an accuracy of about 94 percent. It also stated that twisting the e-textile is "faster than existing headphone button controls" and "comparable to a touch surface". In future, this technology could be used in many ways, such as adding touch and gesture control to a headphone cord, a hoodie drawstring or a smart speaker cord.
Regarding the future of the technology, the tech giant said, "We look forward to advancing textile user interfaces and promoting the use of microinteractions for wearable interfaces and smart fabrics in the future, where eyes-free access and casual, compact and efficient input are beneficial."