I am currently teaching at USCB as an Assistant Professor of Media Arts, developing a Media Arts track that applies Studio Arts skills to interactive media production such as 3D animation and videogame design.

Our Media Arts classes are based on a STEAM (STEM+Art) approach to applied interdisciplinary research at the intersection of Studio Arts and Computational Science.

July 2013, Achievement Unlocked! I taught videogame design to gifted high school students from all over California in the UCSC COSMOS summer program. They learned how to design and program games in GameMaker, Processing, and Unity 3D, as well as how to make art assets in GIMP, Audacity, and Blender. Get more info and see the games they made...

February 2013, Technical Report (UCSC-SOE-01-13): In this article we review the state of the art in virtual character control research, proposing a critical technical practice that makes aesthetics a central concern. We argue that increased acting logics and affordances for playing a role in popular videogames are pushing character interaction towards the performing arts domain. To align procedural play with viewer preferences for skilled performance in media, we propose an expressive role-play model based on classical acting. Our performative ontology draws from theory and practices in the arts, artificial intelligence, and psychology to provide a firm theoretical grounding for our approach. We use our model to analyze several examples of playable media, and describe initial experiments to infer the aesthetic quality of poses in motion capture data as a first step towards generating expressive character features...

March 2012, Presentation: I gave a non-technical presentation of my research to the undergrad game design survey course HAVC 81: Video Games as Visual Culture. I framed my work within the context of Michael Mateas' recent symposium talk 'Revisiting the Photoshop of AI Debate', which was in response to Chris Hecker's 'Structure vs Style' GDC 2008 talk. My presentation was entitled...

We chose to visualize the ESESC memory hierarchy in the Processing graphics environment. My part was to design the interactive model, develop it in Processing, and make sure the model could print to a PDF file.

CMPS203 Programming Languages: Survey of Functional Programming for Games (Faculty: Cormac Flanagan) by Topher Maraffi and David Seagal.

My contribution is the second part of our FRP survey (starting at slide #11 in the presentation, or Part 4 in the paper), focusing on how the reactive media DSLs of Conal Elliott and Paul Hudak could impact future NUI game development.

From the Conclusion of our paper "Leveling Up: Could Functional Programming Be a Game Changer?"...

"It is a possibility that FP has not taken off for game development because there wasn’t an ideal fit for it. However, the semantics of FRP, with continuous time and behavior/signals, seems particularly well suited for streaming Natural User Interface (NUI) applications that use Kinect or Wii technology. Could the paradigms of FRP and NUI be merged to provide a new kind of artist friendly tool that enhances content generation for games? The imperative method of keyframing a rig’s joints on a discrete timeline is closer to antiquated stop motion technique than to modern motion capture. FRP could fundamentally change the way animations are produced by virtualizing gesture into performance signal functions, which could then be composed directly through a NUI device like Kinect. This could fuse Hudak’s concept of virtual instruments with Elliott’s concept of tangible values in a new FRP approach for gestural performance in games."
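The FRP idea in our conclusion above can be illustrated with a minimal sketch. Here behaviors are modeled as functions of continuous time in the Elliott/Hudak sense; the names (`lift`, `wrist_height`, `arm_angle`) and the stand-in sine gesture are my illustrative assumptions, not code from the paper:

```python
import math

# A behavior modeled as a function from continuous time to a value.
def lift(f, *behaviors):
    """Lift an ordinary function to operate pointwise on behaviors."""
    return lambda t: f(*(b(t) for b in behaviors))

# A hypothetical gesture signal: wrist height over time (a stand-in sine
# where a Kinect stream would go in a real NUI pipeline).
wrist_height = lambda t: math.sin(t)

# Compose behaviors declaratively instead of keyframing discrete frames:
# the avatar's arm angle continuously tracks the scaled gesture signal.
arm_angle = lift(lambda h: 45.0 * h, wrist_height)

# Sampling only happens at the edge, e.g. once per rendered frame.
samples = [arm_angle(t / 10.0) for t in range(5)]
```

The point of the sketch is that the animation is defined over continuous time and composed as signal functions, with discrete sampling pushed to the rendering boundary, rather than authored as keyframes on a discrete timeline.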

The concept for this project was to use the HyperNEAT neuro-evolutionary algorithm to implement simulated mimicry as performer modeling, recognition, prediction, and synthesis through a NUI device. This is still a work in progress...

The concept for the Movienator! project was to design a task-oriented agent modeled after the distinctive personality of Arnold, which uses natural language processing (NLP) to help users get movie information from the IMDB database. The kicker is that the Movienator agent prefers Arnold movies over anything else, always trying to steer the user towards one of his many films, or towards a film starring someone he worked with, or at least a movie in the action genre. Filled with Arnold movie trivia and famous one-liners, the Movienator is meant to be entertaining as well as functional. I designed the new architecture to interface with a standard DM, NLU, NLG, and DB moviebot skeleton architecture from the previous year's class. I also developed the Arnold ontology as a list of tuples containing movie titles and quotes-taglines, with system preferences for genres and actors, and wrote getArnold and Bayes functions to interact with the Arnold knowledge base and to drive the system mood.
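The tuple ontology and preference-driven steering described above could be sketched as follows. This is a hedged approximation, not the actual class code: the knowledge base entries, the preference weights, and the function names (`get_arnold`, `score_recommendation`) are illustrative stand-ins for the real getArnold and Bayes functions:

```python
import random

# Illustrative slice of the Arnold ontology: (title, quote-tagline) tuples.
ARNOLD_KB = [
    ("The Terminator", "I'll be back."),
    ("Predator", "Get to the choppa!"),
    ("Total Recall", "Consider that a divorce."),
]

# System preferences biasing recommendations towards Arnold and action films.
PREFERENCES = {"genre": "action", "actor": "Arnold Schwarzenegger"}

def get_arnold(kb):
    """Pull a random (title, quote) pair to steer the dialogue
    towards an Arnold film (stand-in for the real getArnold)."""
    return random.choice(kb)

def score_recommendation(movie_genre, movie_actors):
    """Toy preference score standing in for the Bayes function:
    favor the preferred genre and anything starring Arnold."""
    score = 0.5  # neutral prior for any movie
    if movie_genre == PREFERENCES["genre"]:
        score += 0.3
    if PREFERENCES["actor"] in movie_actors:
        score += 0.2
    return score
```

A dialogue manager built this way would rank candidate answers by `score_recommendation` before responding, which is what produces the agent's persistent pull towards Arnold's filmography.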

The concept for the Dance Mario Dance! (DMD) project was to design a simple platformer game where the levels are dynamically generated from the in-game music, so that navigating the resulting level structures requires rhythmic gameplay, similar to Guitar Hero or DDR. Using Infinite Mario, we set out to generate level content (hills, tokens, etc.) that would compel the player to make Mario dance through the level in real-time to the song. To complete the level successfully, a player should manipulate the game controls like they are playing an instrument, with difficulty levels being based on the musical complexity and tempo of the music. Our initial prototype in Java produced hills that reflected the shape of an MP3 waveform. I did most of the design, project management, and music processing research, while Ron did most of the level programming. Ideally, in future versions, I would like to generate level content from individual midi instruments, replace standard controls with a controller-less system like Microsoft Kinect, and replace the standard Mario sprites with dancing versions. I think it would be really cool to have a player's gestures drive Mario's movement, so that the embodied dance skills of the player become critical to leveling up in DMD!
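The waveform-to-hills mapping from our prototype can be sketched in a few lines. Our actual prototype was in Java; this Python sketch is my assumption of the basic approach (the function name, window size, and scaling are illustrative):

```python
def waveform_to_hills(samples, max_height=10, window=4):
    """Map audio amplitude samples to platformer hill heights by
    averaging |amplitude| over fixed windows and scaling to tiles,
    so louder passages yield taller terrain."""
    hills = []
    for i in range(0, len(samples) - window + 1, window):
        window_samples = samples[i:i + window]
        avg = sum(abs(s) for s in window_samples) / window
        hills.append(int(avg * max_height))
    return hills

# A quiet-then-loud stand-in waveform produces rising hills.
hills = waveform_to_hills([0.1, -0.1, 0.1, -0.1, 0.8, -0.9, 0.9, -0.8])
# hills == [1, 8]
```

Tying hill height to amplitude is what forces the rhythmic play: jumps line up with the loud beats of the song, so the terrain itself encodes the music's tempo and dynamics.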

You can see a proof-of-concept of my PhD research direction by watching video documentation of my Digital Arts and New Media MFA thesis performances. I completed my MFA at UCSC in June 2010, consisting in part of a performance study (Mimesis & Mocap) that visualized the effect of making an autonomous agent mimic and improvise with an embodied actor through dramatic gesture. My live stage shows used pantomime and dance, within the comedic context of the classic Marx Brothers' Mirror Gag, to create the illusion of expressive play between myself and my 3D Avatar. Here is a short preview video that shows excerpts from both The Avatar Dance group show and The Magic Mirror Game two-actor-one-man show: