Gesture Assistant Timeline Calibration and Prototype Design

no eye-tracking, as suggested during critiques (too flashy/complicated and not enough positive results)

advertise as an “adult system” to avoid issues with adolescents

move the timeline forward as “2030” plan

originally we didn’t agree with this idea, but recent articles suggest that detecting and interpreting thoughts is possible, so we now think our system is more achievable

will address “broader impacts,” since our system is flexible enough to solve more than just this pollution problem:

teaching sign language

assisting those with limited motor functions

universal communication (e.g. across cultures, deaf community, etc.)

Overall, we have decided to advertise our system as a preventative solution to severe pollution. The pollution will eradicate healthy speech within the next 20-50 years, so our system will preemptively solve this problem by revolutionizing communication, ultimately preserving verbal communication in a degraded environment and restoring trust in environmentalists and the government to solve this problem.

Timeline

Week Five: System refinement/polish/documentation of design process

April 19-20

develop video

scenes/script

everyone

develop presentation

Somn/Danny

formally write up refinements

Brittany

pick up/construct/test prototype

everyone

start video production

Sam with materials and editing

everyone else with acting

April 21

present evolution plan

TBD based on setup

Week Six: Stress testing/evaluation/videos

April 26-27

stress test prototype

everyone

evaluation study among users

Brittany/Danny develop script

Somn/Sam conducting study

producing video

use “scare tactics” to convey impending doom of pollution and need for this system

show how this system is educational and will not produce dependent users

Danny/Brittany acting

Sam/Somn filming

April 28

system presentation

TBD based on setup

Week Seven: final presentations

May 2: ICAT day

Brittany behind the scenes translator — acting as CPU

Danny demoing the TENS unit

Danny/Sam conversing with audience member

Somn evaluating audience

**Roles subject to change due to time availability on this day

May 3: final class exhibit and presentation

Same setup as ICAT day

Prototype Setup/Design

We want to go with a Wizard of Oz approach. The materials we ordered include:

this won’t be a part of the “prototype” but will be used to demonstrate how it would feel to actually have the electrodes move your upper extremities

We will have one person acting as the “CPU” unit and remaining behind the scenes. The “camera” and wireless earphones will be attached to the headband and function as the wearable unit. We will also fashion and attach a non-functional, placeholder CPU.

Once the user is wearing this, we will place placebo electrodes on their upper extremities.

The user will “think” what they want to say by choosing from a list of phrases that we will have displayed, and saying it aloud so the “CPU” can hear. The “CPU” will then “translate” this and describe what to do with their arms, hands, and/or fingers (simulating the electrode stimulation and control). Once this is complete, the other person’s “camera” will detect these gestures, and the “CPU” will translate by emitting the translation into their ear through the wireless earphones. This process can then be repeated.

The connections between the “CPU” and the users will be established through a phone call.

Additionally, there will be a board of phrases to choose from, and a drawing of upper extremities with labeled positions. This will cut down on confusion when directing users how to position their upper extremities.
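The round trip above (phrase → directed gestures → observed gestures → spoken translation) can be sketched as a small simulation. Everything here is illustrative: the phrase board entries, gesture labels, and function names are our own placeholders, since in the actual prototype the “CPU” is a person acting over a phone call, not software.

```python
# Hypothetical sketch of the Wizard-of-Oz message flow.
# The "phrase board" maps each displayed phrase to a labeled gesture
# sequence, matching the drawing of upper extremities with labeled positions.
PHRASE_BOARD = {
    "hello": ["raise right hand", "wave fingers"],
    "thank you": ["flat right hand to chin", "move hand forward"],
}

def wizard_direct_gestures(phrase):
    """The hidden 'CPU' hears the chosen phrase and directs the speaker's
    arms/hands/fingers (standing in for electrode stimulation)."""
    return PHRASE_BOARD[phrase]

def wizard_read_gestures(gestures):
    """The 'camera' step: the wizard watches the gestures and maps them
    back to a phrase, to be spoken into the listener's earphone."""
    for phrase, sequence in PHRASE_BOARD.items():
        if sequence == gestures:
            return phrase
    return None  # gesture sequence not on the board

def exchange(phrase):
    """One full round trip between the two users via the 'CPU'."""
    gestures = wizard_direct_gestures(phrase)   # speaker side
    return wizard_read_gestures(gestures)       # listener side

print(exchange("hello"))  # round trip returns "hello"
```

A phrase that comes back unchanged confirms the loop is consistent; restricting users to the board is what keeps the human “CPU” from having to improvise translations.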