Humans communicate their emotions in large part through facial expressions. We developed a novel technique to study the static and dynamic aspects of facial expressions. The stimulus consisted of 4 parts: (1) a dynamic face, (2) two smaller versions of its starting and end states, (3) a label indicating the target dynamic expression, and (4) sliders that could be adjusted to change the facial characteristics. Participants were instructed to adjust the sliders so that the face most closely matched the target expression. Participants had access to 53 sliders, allowing them to manipulate static and dynamic characteristics such as face shape, eyebrow shape, mouth shape, and gaze. Preliminary data from 4 participants and 7 conditions revealed interesting effects. Some expressions are marked by unique facial features (e.g. anger by a frown; surprise and fright by an open mouth and widened eyes; pain by partially closed eyes; happiness by an upward curvature of the mouth). Some expressions appear to develop non-linearly in time, that is, they include an intermediate state that deviates from a linear transformation between the starting and ending states (e.g. anger, surprise, pain). These results demonstrate the method's validity for measuring the optimal representation of a given facial expression. Because the method does not rely on facial expressions recorded from, or derived from, actors, we believe it is a more direct measure of internal representations of emotional expressions. Current work focuses on building a vocabulary of emotions and emotional transitions, towards an understanding of the facial expression space.
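The notion of non-linear temporal development can be made concrete: each face state is a vector of the 53 slider settings, and an expression develops linearly if its intermediate state lies on the straight-line interpolation between the starting and ending vectors. The following is a minimal sketch of that test; the function names, the random example vectors, and the use of Euclidean distance are illustrative assumptions, not part of the actual stimulus software.

```python
import numpy as np

def linear_morph(start, end, t):
    # Linearly interpolate slider settings between start and end states, t in [0, 1].
    return (1.0 - t) * start + t * end

def deviation_from_linear(start, mid, end):
    # Distance of a reported intermediate state from the midpoint of the linear morph.
    # A value well above zero suggests a non-linear temporal development.
    return float(np.linalg.norm(mid - linear_morph(start, end, 0.5)))

# Hypothetical 53-dimensional slider vectors (one value per facial characteristic).
rng = np.random.default_rng(0)
start = rng.uniform(-1.0, 1.0, 53)
end = rng.uniform(-1.0, 1.0, 53)

# A purely linear expression has zero deviation at the midpoint; expressions such
# as anger, surprise, or pain would instead show a measurable deviation here.
mid_linear = linear_morph(start, end, 0.5)
print(deviation_from_linear(start, mid_linear, end))
```

Under this framing, comparing participants' adjusted intermediate states against the linear midpoint gives a simple per-expression index of non-linearity.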