Assignments

Assignment 1, due 9/20

Post on Piazza one example of interactive computer/technology used in live performance (could be a software program, a digital instrument or controller, a performance, ...). Choose something you find exciting or inspiring. Provide a URL and/or citation. Describe how the technology works (at a high level) and how the human(s) interact with the technology. What do you find exciting about it? Any problems you see with it, or criticisms you might offer (including aesthetic or technical concerns)? How does the technology impact or constrain the type of interaction that is possible, and the type of music that is possible or easy to make?

(Post publicly on Piazza, use hashtag #assignment1.)

Assignment 2, due 10/2

Choose two or more synthesis methods to experiment with in a music programming environment of your choosing. (Suggestions: Max/MSP, pd (a free Max/MSP-like environment), ChucK, SuperCollider, ???).
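If you want a feel for what a synthesis method boils down to before opening one of those environments, here is a minimal sketch of two-operator FM synthesis in plain Python. All names and parameter values here are our own illustrative choices, not anything prescribed by the assignment:

```python
import math

def fm_samples(carrier_hz=220.0, mod_hz=110.0, index=2.0,
               dur=0.5, sr=44100):
    """Two-operator FM: sin(2*pi*fc*t + I * sin(2*pi*fm*t))."""
    n = int(dur * sr)
    two_pi = 2 * math.pi
    return [math.sin(two_pi * carrier_hz * t / sr
                     + index * math.sin(two_pi * mod_hz * t / sr))
            for t in range(n)]

samples = fm_samples()  # half a second of a 220 Hz FM tone
```

The same few lines translate almost directly into a ChucK shred or a pd patch; the point of the assignment is to compare how each environment lets you hear and control parameters like `index` in real time.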

Post a thoughtful critique of the methods to Piazza, considering the quality of sounds that you can produce with a given method, the ease with which you can control the method, and any other characteristics that might influence someone's choice of whether to use the method in a performance or composition.

Use hashtag #assignment2 in your post.

If you've never used an audio programming environment before and want some tips, just post to Piazza. Feel free to start with existing code & tutorials on the internet. Feel free to share code and programming tips with one another, but do the experimentation and response individually.

Assignment 3, due 10/9

Create a gesturally-controlled "instrument" that allows you to interactively control sound in real-time. Use an explicit mapping strategy that you program in whatever environment(s) you choose to use (i.e., no machine learning). Reflect on what was easy and hard to do in creating the mapping, what you found rewarding or frustrating about the process, and the process by which you chose the mapping you did. Submit your response on Piazza using #assignment3.
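As a sketch of what an "explicit mapping" can mean in practice, here is a hypothetical linear-scaling helper in Python; you would write the equivalent in whatever environment you choose, and the input range and note range below are placeholder assumptions, not requirements:

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamping."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = min(max(t, 0.0), 1.0)  # clamp out-of-range input
    return out_lo + t * (out_hi - out_lo)

# Hypothetical mapping: trackpad x in [0, 1] -> MIDI note 48..84,
# then MIDI note -> frequency in Hz (standard 12-TET conversion).
midi_note = scale(0.5, 0.0, 1.0, 48, 84)
freq_hz = 440.0 * 2 ** ((midi_note - 69) / 12)
```

Even a mapping this simple forces design decisions worth writing about: which sensor axis controls which parameter, whether the scaling is linear or perceptual (e.g., pitch vs. frequency), and what happens at the edges of the sensor's range.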

Feel free to build on any of your previous assignments. Easy-to-use controllers include the built-in laptop inputs (see http://smelt.cs.princeton.edu), the Wiimote (OSCulator is recommended if you're on a Mac), or joysticks (we have some you can borrow).

OpenSoundControl is a good tool for patching together code in different environments, e.g. if you want to use Smelt to capture motion sensor input and send it to pd, or if you want a ChucK program to receive Wiimote messages from OSCulator. Google for OSC examples for the languages you're using, and/or post to Piazza and get others to share their code with you.
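Under the hood, an OSC message is just a null-padded address string, a type-tag string, and big-endian binary arguments, usually sent over UDP, which is why it patches so easily across environments. A minimal hand-rolled sketch in Python (the address, argument values, host, and port here are placeholders for illustration; use whatever your receiving program actually listens on):

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    msg = _pad(address.encode("ascii"))                    # address pattern
    msg += _pad(("," + "f" * len(args)).encode("ascii"))   # type-tag string
    for x in args:
        msg += struct.pack(">f", x)                        # big-endian float32
    return msg

# Send one message over UDP (OSC's usual transport):
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/tilt", 0.5, -0.25), ("127.0.0.1", 6448))
```

In practice you'd use an existing OSC library for your language rather than encoding by hand, but seeing the wire format makes it easier to debug when two environments refuse to talk to each other.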

Assignment 4, due 10/11

Write 1 paragraph (or more if you really want) reflecting on your experiences with the Fauvel seminar last week. Please post to Piazza using #assignment4. Feel free to start a new thread, or to join an existing thread and chime in on someone else's post. (If you do that, please offer thoughtful commentary in response to their post; if what you have to say is unrelated, start a new thread instead.) You can write about any aspect you want, but here are some ideas for starting points:

Did the seminar change any of your thinking around what "interactive technology" is (and has been, historically)?

Musicology involves ways of studying and reasoning about the world that are quite different from those used in computer science. What might be some of the practices, perspectives, or research goals we have in common? How might computer scientists benefit from understanding more about musicology or other humanities disciplines?

Did your ideas about how to digitize Fauvel -- either for scholars or for the public -- change in any surprising and interesting ways following the two seminar sessions led by musicologists?

Assignment 5, due 10/23

Build at least one gesturally-controlled instrument using Wekinator, with the controller(s) and synthesis method of your choosing.

Post a response to Piazza (using #assignment5), including the following: 1) Describe what controller, gestures, learning algorithm(s), and synthesis method you used. 2) Reflect on what was easy, what was difficult, and how you might improve the software. 3) Also reflect on how the experience of building with Wekinator compared to your previous assignment of building a mapping explicitly using programming.

Be sure to read the Wekinator instructions to help you get started. Please run the walkthrough ahead of time to verify that the code works for you. We have about a 95% success rate running on arbitrary machines, but if your machine happens to be Wekinator-unfriendly, we'll want to know ASAP. (Also, note that you may have to turn off your firewall & antivirus for OSC to work properly on your machine.)

Assignment 6, due 10/25

Look through the table of contents of at least one of the conference proceedings or journals below, covering at least the last 2 years. Choose a conference/journal that you expect to be closely related to your final project.

Write a response that includes the following: 1) A description of the type of work you generally find published at this venue (and what, if anything, is surprisingly absent). 2) List 10 papers that you find exciting, intriguing, or potentially useful. Provide a 1-sentence description of each one, and say briefly why you've included it. You don't (necessarily) have to read these papers-- it's fine to base your choice on the title and abstract. Post your response on Piazza using #assignment6.

International Conference on New Interfaces for Musical Expression (NIME): NIME grew out of the ACM CHI (human-computer interaction conference), and it's mostly centered around hardware and software interfaces for performance and composition. However, there is a good mixture of other topics, as well. (Note that 2012 proceedings might not be posted yet at the above link; 2012 is available here.)

International Conference on Digital Audio Effects (DAFx): DAFx certainly includes some signal-processing-heavy research on cool-sounding audio effects, but it also includes broader topics like analysis (and synthesis) algorithms, spatial audio, interactive performance issues, audio perception, and others.

Computer Music Journal: A journal with very diverse content, spanning both technical and artistic considerations in computer music. Breadth is similar to ICMC. Sometimes there are special issues on particular topics-- see the website if any special issues appeal to you!

Organised Sound: Another interdisciplinary journal; it tends to focus more on musical issues than technical ones, but still includes a wide range of topics. Same as CMJ with respect to special issues.

International Conference on Music Information Retrieval (ISMIR) (www.ismir.net/all-papers.html): Includes not just music "information retrieval," but music informatics more generally. Lots on audio analysis, most (but not all) of it not targeted at performance. The interactive aspects of systems are not usually explicitly considered, but there are definite exceptions. Lots of cool machine learning work here.

Sound and Music Computing Conference: Another international conference with a very broad focus on many issues related to sound, music, and computing. Lots of interesting stuff here. (Note that SMC offers a "summer school" session before each conference, targeted at students in the field-- something to consider!)

International Conference on Computer Music Modeling and Retrieval: A conference that is maybe not quite as broad as ICMC, focused on modeling and retrieval. Includes music emotion analysis, spatial audio, synthesis, computer models of perception and cognition, music information retrieval, computational musicology, and others. (You'll have to google for each year's proceedings individually.)

If there is another venue you think would be appropriate to add, just say so (on Piazza or in class).

Written final project proposals due November 4. More info will be posted soon & discussed in class.

You will schedule a 30-minute meeting during the week of November 5 to discuss your project proposal.

20 September: Synthesis algorithms & brief history of live electronic music

Tutorials: 5-minute lightning tutorials on synthesis methods

Tutorial leaders: sign up below. You can use slides, chalkboard, whatever you want. Please provide some URLs/references for people to find more information. Please also play some sound examples in your presentation.

To view this book chapter online, go to http://library.princeton.edu/, type in "The Cambridge companion to electronic music" into the "Books+" search box, and select the first result. This chapter is within the book section called "Electronic music in context."

Today's readings are a bit different-- both are by the same author, but one gives a technical presentation for IEEE and one discusses the problem for an audience of musicians & music technologists. Please read the IEEE paper to get as much of an understanding of the technical detail as possible, and read the SMC paper to get a feel for the argument the author is making to that audience. (i.e., spend more time on reading 1). Respond to both using the same response link.

Possible "cool systems" to highlight

Reactable

Theremin

Ondes Martenot

The Hands

George Lewis' Voyager

Monome

The Continuator

???

General Research Resources

Leading a discussion:

Our discussions will focus both on the technical points in a paper (i.e., how was something done?) and on a broader critical examination (why was it done? how is this work useful? what other questions does it raise? what are some shortcomings? etc.).