By Max Greenwald '17 and Nora Bradley '18
BuzzMe was created for Haben Girma, a lawyer, disability rights advocate and public speaker who is deafblind. Girma wanted a device that would allow her to gauge audience engagement and responses during her public speaking presentations. This project embodies the idea of transformations because it uses technology to translate information that is typically transmitted through visual and auditory cues into haptic signals. BuzzMe’s transformations are useful for people with vision or hearing impairments, but the device can also be used in a number of other artistic and practical situations to relay information passively.
Facebook Live reactions allow a large audience to provide feedback to a speaker who cannot hear or see them by sending “emojis” representing different emotions that “float” across the screen in real time. BuzzMe uses a similar method: it collects self-reported crowd information through an easily accessible website and transmits it to the user. Based on input from Haben and our own brainstorming, the website collects and transmits messages corresponding to: haha, love, wow, confusion, an alert (to notify the user that more information is necessary), and a percentage poll response. Research into haptic signaling shows that a person can learn vibration signals the same way they can learn a language; as long as the signals are short and distinct, a person can subconsciously associate a vibration signal with a particular meaning.
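One way to make each message short and distinct is to assign it a fixed pulse pattern. The sketch below is a hypothetical illustration, not BuzzMe's actual firmware: the pattern names match the messages above, but the specific timings, the `(on_ms, off_ms)` pulse format, and the one-pulse-per-ten-percent poll encoding are our own assumptions for the example.

```python
# Hypothetical mapping from self-reported reactions to vibration patterns.
# Each pattern is a list of (on_ms, off_ms) pulses; a real device would
# drive its motor pins with these timings. All timings here are invented.

REACTION_PATTERNS = {
    "haha":      [(100, 50), (100, 50)],              # two quick buzzes
    "love":      [(400, 0)],                          # one long buzz
    "wow":       [(100, 50), (100, 50), (100, 50)],   # three quick buzzes
    "confusion": [(200, 100), (200, 0)],              # two medium buzzes
    "alert":     [(600, 200), (600, 0)],              # two slow, heavy buzzes
}

def poll_pattern(percent: int) -> list:
    """Encode a 0-100 poll result as one short pulse per ten percent
    (assumed encoding; at least one pulse so the result is never silent)."""
    pulses = max(1, round(percent / 10))
    return [(80, 80)] * pulses

def pattern_duration_ms(pattern: list) -> int:
    """Total time a pattern takes, useful for keeping signals short."""
    return sum(on + off for on, off in pattern)
```

Keeping every pattern under about a second, and making pulse counts and lengths differ between patterns, is what lets a wearer learn to distinguish them without conscious effort.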
We had the opportunity to work with our client to get feedback on the prototype version of our device, and we will be making a number of changes based on her comments. These include:
Feedback messages will come from an interpreter, rather than from processed audience feedback.
The motors will change from four (one on each corner) to two (one on either side of the leg), and the signals will change slightly. A smaller battery and microcontroller will also be used to reduce the overall size of the device.
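With two motors instead of four, left/right location becomes a second dimension for distinguishing signals beyond pulse timing. The sketch below is a hypothetical illustration of that idea under our own assumptions; the actual revised signal set is not specified in this write-up.

```python
# Hypothetical two-motor signal scheme, assuming one motor on each side
# of the leg ("L" and "R"). Each signal is a sequence of (motor, on_ms)
# steps, so which side buzzes carries meaning as well as the timing.

TWO_MOTOR_SIGNALS = {
    "haha":      [("L", 100), ("R", 100)],             # quick left-right alternation
    "love":      [("L", 300), ("R", 300)],             # slow left-to-right sweep
    "wow":       [("R", 100), ("R", 100), ("R", 100)], # repeated right-side pulses
    "confusion": [("R", 200), ("L", 200)],             # reversed (right-to-left) sweep
    "alert":     [("L", 500), ("R", 500), ("L", 500)], # long alternating buzz
}

def signal_length_ms(name: str) -> int:
    """Total buzz time of a named signal, ignoring gaps between steps."""
    return sum(ms for _, ms in TWO_MOTOR_SIGNALS[name])
```

Encoding direction (left-to-right vs. right-to-left) as well as pulse count means two motors can still produce a set of signals as distinguishable as the original four-corner layout.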