This project explores a new wearable system, called TYTH, that enables a novel form of human-computer interaction based on the relative location of, and interaction between, the user’s tongue and teeth. TYTH allows its user to interact with a computing system by tapping the tongue on the teeth. This form of interaction is analogous to typing on a keypad with a finger, except that the tongue substitutes for the finger and the teeth for the keyboard. We study the neurological and anatomical structure of the tongue to design TYTH so that the obtrusiveness and social awkwardness of the wearable are minimized while its accuracy and sensing sensitivity are maximized. From behind the user’s ears, TYTH senses the brain and muscle signals that control tongue movement and captures the subtle skin-surface deformation caused by that movement. We model the relationship between tongue movement and the recorded signals, from which we derive a tongue localization technique and a tongue-teeth tapping detection technique.
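To make the tapping-detection idea concrete, the sketch below shows one simple way such behind-the-ear signals could be processed: band-pass filtering to isolate muscle-activity frequencies, then flagging short windows whose energy spikes above a baseline as candidate taps. This is a minimal illustration under assumed parameters (sampling rate, frequency band, window size, threshold); it is not TYTH's actual pipeline, which also performs tongue localization from the modeled signal–movement relationship.

```python
# Hypothetical sketch of tongue-teeth tap detection from behind-the-ear
# signals. The sampling rate, band limits, window length, and threshold
# below are illustrative assumptions, not the authors' parameters.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250             # assumed sampling rate (Hz)
BAND = (20.0, 90.0)  # assumed muscle-activity-dominant band (Hz)

def bandpass(x, fs=FS, band=BAND, order=4):
    """Zero-phase band-pass filter to isolate muscle-activity frequencies."""
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="band")
    return filtfilt(b, a, x)

def detect_taps(x, fs=FS, win_s=0.1, k=3.0):
    """Return start indices (samples) of windows whose short-time energy
    exceeds k times the median window energy (candidate taps)."""
    filtered = bandpass(x, fs)
    win = int(win_s * fs)
    n_win = len(filtered) // win
    energy = np.array(
        [np.sum(filtered[i * win:(i + 1) * win] ** 2) for i in range(n_win)]
    )
    threshold = k * np.median(energy)
    return np.flatnonzero(energy > threshold) * win

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(0, 5 * FS)                     # 5 s of synthetic signal
    signal = 0.05 * rng.standard_normal(len(t))  # background noise
    # Inject a short 60 Hz burst around t = 2 s to mimic a tap.
    burst = slice(2 * FS, 2 * FS + FS // 10)
    signal[burst] += 0.5 * np.sin(2 * np.pi * 60 * t[burst] / FS)
    print("candidate tap onsets (samples):", detect_taps(signal))
```

In practice, a per-tooth keypad would additionally require localizing where on the teeth the tap occurred, e.g. by classifying features of the multi-channel signals; the energy-threshold detector above only marks when a tap happens.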
Video Demo