The popular term from the 90s appears to be making a comeback, albeit in a different incarnation, courtesy of a team of four students at Carnegie Mellon, two of whom are, you guessed it, desis. [Pittsburgh Post Gazette] (tip Karthik via email)
img: via Post Gazette
It is a sensor-equipped glove, known as HandTalk, that can translate gestures into spoken words on a cell phone. It was developed by students at Carnegie Mellon University as part of a class research project.
Three of the four team members, senior computer engineering students Bhargav Bhat, Hemant Sikaria and Jorge L. Meza, demonstrated the prototype yesterday at Carnegie Mellon’s “Meeting of the Minds” expo of undergraduate research projects.
It’s all in a noble cause: to help the deaf communicate with people who are not familiar with American Sign Language.
The technology is pretty simple (not really) and cheap – desi ishtyle (really).
Underneath the hood of this system are several relatively inexpensive pieces of technology.
Along each finger and the thumb of the glove are flexor strips, which change their electrical resistance, depending on how much the digits are curled. The positions of the fingers are read by a chip and transmitted wirelessly to a cell phone, which is loaded with a vocabulary that corresponds to the gestures.
The cell phone then types the words as text messages, and an off-the-shelf program translates them into speech.
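The pipeline described above — curled fingers change a flex sensor's resistance, a chip reads the digit positions, and the phone matches them against a stored gesture vocabulary — can be sketched roughly as follows. This is a minimal illustration, not the team's actual implementation; the normalized readings, tolerance, and the tiny placeholder vocabulary (which does not reflect real ASL hand shapes) are all assumptions for the sake of the example.

```python
# Hypothetical sketch of the recognition step: normalized flex readings
# (0.0 = straight, 1.0 = fully curled) for the thumb and four fingers are
# matched against a stored vocabulary of gesture patterns. The patterns
# below are illustrative placeholders, not actual ASL hand shapes.
GESTURE_VOCABULARY = {
    (0.2, 1.0, 1.0, 1.0, 1.0): "A",
    (1.0, 0.0, 0.0, 0.0, 0.0): "B",
    (0.5, 0.5, 0.5, 0.5, 0.5): "C",
}

def closest_gesture(reading, vocabulary=GESTURE_VOCABULARY, tolerance=0.25):
    """Return the letter whose stored pattern is nearest the reading,
    or None if every pattern differs by more than `tolerance` on some digit."""
    best_letter, best_dist = None, float("inf")
    for pattern, letter in vocabulary.items():
        # Worst-case difference across the five digits.
        dist = max(abs(r - p) for r, p in zip(reading, pattern))
        if dist < best_dist:
            best_letter, best_dist = letter, dist
    return best_letter if best_dist <= tolerance else None

print(closest_gesture((0.1, 0.9, 1.0, 0.95, 1.0)))  # near the "A" pattern, prints A
print(closest_gesture((0.0, 0.0, 0.0, 0.0, 0.0)))   # nothing within tolerance, prints None
```

The recognized letter would then be typed out as text on the phone, with an off-the-shelf text-to-speech program doing the final step, as the article describes.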
The prototype can currently recognize only 15 of the 26 letters of the American Sign Language alphabet and needs major upgrades, but the team is optimistic about testing the gloves with hearing-impaired people in a few months. But wait, the desi connection doesn’t end with the team of students.
The HandTalk project is one of several created this year in Dr. Narasimhan’s Embedded System Design course, in which teams of four have to develop a product prototype in 15 weeks.
Among the other projects this term are a lighted jump rope that changes colors the faster someone skips and can deliver digital messages on how many calories are being burned, and a cell phone system that allows students sitting “in the nosebleed section” of a hockey game to tap into any video camera feed within the building and see it on their phone screens.
One project from last year’s class already has resulted in a local spin-off company, Dr. Narasimhan said. The students developed a bar code reader for blind people that gets product and price information from the Web and reads it back to the shopper.
All in a day’s work in this class at Carnegie Mellon.