My first reaction to this article was, "That really is very cool and could undoubtedly be of great use to the hearing impaired." But as I considered it, I wondered how that particular application might actually be utilised. After all, if the hearing-impaired were viewing the event via a video link and watching the hand motions of the sign language interpreter, then they would have no need of further exposition. And if there were need and means for a visual transcript, then there would be no need for a sign language interpreter. Is there an obvious extended use to this technology that perhaps I'm missing? Like communicating with extraterrestrials? Or dolphins?
Interacting with the computer itself, a la "Minority Report", etc. Computers can already track your hands/fingers. The next logical step is for the computer to interpret gestures as commands.
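Once a tracker can already label hand poses, the "gestures as commands" step the comment describes is essentially a lookup from recognized gesture to UI action. A minimal sketch, with entirely hypothetical gesture names and commands (real systems like those in "Minority Report"-style demos would use continuous tracking, not just discrete labels):

```python
# Hypothetical gesture-to-command dispatch table.
# The gesture labels and command names are invented for illustration;
# a real recognizer would emit its own vocabulary of labels.
GESTURE_COMMANDS = {
    "swipe_left": "previous_page",
    "swipe_right": "next_page",
    "pinch": "zoom_out",
    "spread": "zoom_in",
}

def interpret(gesture: str) -> str:
    """Translate a recognized gesture label into a UI command.

    Unrecognized gestures are ignored rather than raising,
    since trackers produce plenty of incidental motion.
    """
    return GESTURE_COMMANDS.get(gesture, "ignore")
```

The dispatch-table shape keeps recognition and command interpretation decoupled, so the same recognizer could drive different applications just by swapping the table.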