COS597b Fall2012


Welcome

Welcome to COS 597B, Interactive Music Systems.

The homepage for the course is http://www.cs.princeton.edu/courses/archive/fall12/cos597B/index.php.

We will be using this Wiki to manage the course schedule and readings, and to sign up for presentation/discussion leader slots.

Course participants

Add your name here, with link to your webpage if you want:

  • Rebecca Fiebrink, instructor
  • Edward Zhang '13
  • Jeff Snyder '13
  • Avneesh Sarwate '14
  • Joe Tylka, first year grad student
  • Reid Oda, second year grad student
  • Rahulram Sridhar, G1
  • Katie Wolf, second year grad student
  • Sasha Koruga, G2
  • Ohad Fried, first year grad
  • Abu Saparov '13
  • Danielle Bragg, second year grad student
  • Daniel Ryan '13
  • Alejandro Van Zandt-Escobar, CS '14
  • Zeerak Ahmed '13
  • Nikitas Tampakis '14
  • Jennifer Guo, first year grad
  • Tobe Nwanna, G2

Assignments

  • Assignment 1, due 9/20
    • Post on Piazza one example of interactive computer/technology used in live performance (could be a software program, a digital instrument or controller, a performance, ...). Choose something you find exciting or inspiring. Provide a URL and/or citation. Describe how the technology works (at a high level) and how the human(s) interact with the technology. What do you find exciting about it? Any problems you see with it, or criticisms you might offer (including aesthetic or technical concerns)? How does the technology impact or constrain the type of interaction that is possible, and the type of music that is possible or easy to make?
    • (Post publicly on Piazza, use hashtag #assignment1.)
  • Assignment 2, due 10/2
    • Choose two or more synthesis methods to experiment with in a music programming environment of your choosing. (Suggestions: Max/MSP, pd (a free Max/MSP-like environment), ChucK, SuperCollider, ???).
    • Post a thoughtful critique of the methods to Piazza, considering the quality of sounds that you can produce with a given method, the ease with which you can control the method, and any other characteristics that might influence someone's choice of whether to use the method in a performance or composition.
    • Use hashtag #assignment2 in your post.
    • If you've never used an audio programming environment before and want some tips, just post to Piazza. Feel free to start with existing code & tutorials on the internet. Feel free to share code and programming tips with one another, but do the experimentation and response individually. (A minimal additive-synthesis sketch in Python appears just after this list, if you want a starting point.)
  • Assignment 3, due 10/9
    • Create a gesturally-controlled "instrument" that allows you to interactively control sound in real-time. Use an explicit mapping strategy that you program in whatever environment(s) you choose to use (i.e., no machine learning). Reflect on what was easy and hard to do in creating the mapping, what you found rewarding or frustrating about the process, and the process by which you chose the mapping you did. Submit your response on Piazza.
    • Feel free to build on any of your previous assignments. Easy-to-use controllers include the built-in laptop inputs (see http://www.smelt.cs.princeton.edu), the Wiimote (OSCulator is recommended if you're on a Mac), or joysticks (we have some you can borrow).
    • OpenSoundControl is a good tool for patching together code in different environments, e.g. if you want to use Smelt to capture motion sensor input and send it to pd, or if you want a ChucK program to receive Wiimote messages from OSCulator. Google for OSC examples for the languages you're using, and/or post to Piazza and get others to share their code with you. (A small OSC mapping sketch in Python also appears just after this list.)
  • Assignment 4, due 10/11
    • Write 1 paragraph (or more if you really want) reflecting on your experiences with the Fauvel seminar last week. Please post to Piazza. Feel free to start a new thread or join an existing thread to chime in on someone else's post. (If you do, please offer thoughtful commentary in response to their post rather than posting something unrelated without starting a new thread.) You can write about any aspect you want, but here are some ideas for starting points:
      • Did the seminar change any of your thinking around what "interactive technology" is (and has been, historically)?
      • Musicology involves ways of studying and reasoning about the world that are quite different from those used in computer science. What might be some of the practices, perspectives, or research goals we have in common? How might computer scientists benefit from understanding more about musicology or other humanities disciplines?
      • Did your ideas about how to digitize Fauvel -- either for scholars or for the public -- change in any surprising and interesting ways following the two seminar sessions led by musicologists?
  • Assignment 5, due 10/23
    • Build at least one gesturally-controlled instrument using Wekinator, with the controller(s) and synthesis method of your choosing.
    • Post a response to Piazza that includes the following: 1) Describe what controller, gestures, learning algorithm(s), and synthesis method you used. 2) Reflect on what was easy, what was difficult, and how you might improve the software. 3) Also reflect on how the experience of building with Wekinator compared to your previous assignment of building a mapping explicitly using programming.
    • Be sure to read the Wekinator instructions to help you get started. Please run the walkthrough ahead of time to verify that the code works for you. We have about a 95% success rate running on arbitrary machines, but if your machine happens to be Wekinator-unfriendly, we'll want to know ASAP. (Also, note that you may have to turn off your firewall & antivirus for OSC to work properly on your machine.)
  • Assignment 6, due 10/25
    • TODO.
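
Example sketch for Assignment 2 (optional): the assignment points you toward real-time environments like ChucK, pd, or SuperCollider, but if you just want to see what a synthesis method boils down to, the following is a minimal offline additive-synthesis sketch in Python. It is not course-provided code; the fundamental frequency, partial amplitudes, and output filename are arbitrary choices for illustration, and it uses only the Python standard library.

    # Minimal additive-synthesis sketch (hypothetical example, not course-provided code).
    # It sums a few sine partials over a fixed fundamental and writes the result to a
    # 16-bit mono WAV file, using only the Python standard library.
    import math
    import struct
    import wave

    SAMPLE_RATE = 44100        # samples per second
    DURATION = 2.0             # seconds of audio to generate
    FUNDAMENTAL = 220.0        # Hz (A3); arbitrary choice for illustration
    PARTIALS = [(1, 1.0), (2, 0.5), (3, 0.33), (4, 0.25)]   # (harmonic number, amplitude)

    num_samples = int(SAMPLE_RATE * DURATION)
    samples = []
    for n in range(num_samples):
        t = n / SAMPLE_RATE
        # Additive synthesis: the output is simply a sum of sinusoids at harmonic frequencies.
        samples.append(sum(amp * math.sin(2 * math.pi * FUNDAMENTAL * k * t)
                           for k, amp in PARTIALS))

    # Normalize to the 16-bit sample range and write the WAV file.
    peak = max(abs(s) for s in samples)
    with wave.open("additive_example.wav", "wb") as out:
        out.setnchannels(1)
        out.setsampwidth(2)                 # 2 bytes = 16-bit samples
        out.setframerate(SAMPLE_RATE)
        out.writeframes(b"".join(struct.pack("<h", int(s / peak * 32767)) for s in samples))

Hearing the same partial recipe in a real-time environment (e.g., as a bank of SinOsc ugens in ChucK) is the more interesting version of the exercise, since you can change the amplitudes while the sound is playing.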
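
Example sketch for Assignment 3 (optional): OpenSoundControl is just how the pieces talk to each other. The sketch below shows one possible glue layer in Python that listens for controller data arriving as OSC (e.g., accelerometer messages forwarded by OSCulator) and sends an explicitly mapped frequency on to whatever synth you build. It assumes the third-party python-osc package; the OSC addresses, ports, and the linear tilt-to-pitch mapping are placeholders to replace with whatever your controller bridge and synth actually use.

    # Minimal explicit-mapping sketch (hypothetical example, not course-provided code).
    # Listens for controller data arriving as OSC (e.g., accelerometer messages forwarded
    # by OSCulator) and forwards an explicitly mapped frequency to a synth, also over OSC.
    # Assumes the third-party python-osc package (pip install python-osc); all addresses,
    # ports, and the mapping itself are placeholders to replace with your own setup.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer
    from pythonosc.udp_client import SimpleUDPClient

    synth = SimpleUDPClient("127.0.0.1", 6449)      # port your ChucK/pd/SuperCollider synth listens on

    def accel_to_pitch(address, x, y, z):
        # Explicit mapping: tilt (x assumed to lie roughly in -1..1) -> frequency in 220..880 Hz.
        freq = 220.0 + (x + 1.0) * 0.5 * (880.0 - 220.0)
        synth.send_message("/synth/freq", freq)

    dispatcher = Dispatcher()
    dispatcher.map("/wii/1/accel", accel_to_pitch)  # address your controller bridge is set to send

    server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
    print("Listening for controller OSC on port 9000...")
    server.serve_forever()

The point of routing through OSC is that the gesture-capture side and the synthesis side stay decoupled, so each can live in whichever environment (Smelt, ChucK, pd, Max/MSP, ...) you find most convenient.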

Schedule

  • 13 September: First day!
    • Course overview & introduction
    • Link to lecture slides
    • A brief history of computers in music composition, performance, listening, and scholarship
    • Assignment 0, due 9/18:
      • Sign up for Piazza
      • If there are any other course management tools for blogging, discussion, Q&A, etc. that you would like to use, say so! (e.g., using a Piazza post)
      • Read through the course syllabus and get in touch with any questions you have.
      • Get a Princeton CS Wiki account by e-mailing csstaff@cs.princeton.edu (CC fiebrink@cs.princeton.edu); say you're requesting an account to use to edit the course Wiki site.
      • Add your name (and possibly homepage) to the list of participants above.
      • Familiarize yourself with research paper reading & discussion leading resources at the bottom of the wiki.
      • Sign up for at least one or two discussion leader slots, for one tutorial slot on 9/20, and possibly also for 5-minute lightning presentation slots for "System/Work of the Day."
      • And, of course, do the reading & response for the next class! Readings will be discussed the day they show up on the schedule. Reading responses are due at 8am that day.
  • 18 September: Introduction to digital sound and music
    • Tutorial: Physics, perception, and digital representations of sound and music
    • Reading: Music and Computers, Chapters 1 and 2
    • Assignment 1, due 9/20: Post on Piazza one example of interactive computer/technology used in live performance (could be a software program, a digital instrument or controller, a performance, ...). Choose something you find exciting or inspiring. Provide a URL and/or citation. Describe how the technology works (at a high level) and how the human(s) interact with the technology. What do you find exciting about it? Any problems you see with it, or criticisms you might offer (including aesthetic or technical concerns)? How does the technology impact or constrain the type of interaction that is possible, and the type of music that is possible or easy to make?
      • (Post publicly on Piazza, use hashtag #assignment1.)
  • 20 September: Synthesis algorithms & brief history of live electronic music
    • Tutorials: 5-minute lightning tutorials on synthesis methods
      • Tutorial leaders: sign up below. You can use slides, chalkboard, whatever you want. Please provide some URLs/references for people to find more information. Please also play some sound examples in your presentation.
      • Additive synthesis: Katie Wolf
      • Subtractive synthesis: Jeff Snyder
      • Granular synthesis: Alejandro Van Zandt-Escobar
      • Wavetable synthesis: Rahulram Sridhar
      • Physical modeling synthesis: Ohad (demo shown in class)
      • Waveshaping synthesis: Reid Oda
    • Reading: Nicolas Collins, "Live electronic music" (Chapter 3 of The Cambridge Companion to Electronic Music, 2011)
    • If time permits, in-class discussion of Assignment 1
    • Assignment 2, due 10/2: Choose two or more synthesis methods to experiment with in a music programming environment of your choosing. (Suggestions: Max/MSP, pd (a free Max/MSP-like environment), ChucK, SuperCollider, ???).
      • Post a thoughtful critique of the methods to Piazza, considering the quality of sounds that you can produce with a given method, the ease with which you can control the method, and any other characteristics that might influence someone's choice of whether to use the method in a performance or composition.
      • Use hashtag #assignment2 in your post.
      • If you've never used an audio programming environment before and want some tips, just post to Piazza. Feel free to start with existing code & tutorials on the internet. Feel free to share code and programming tips with one another, but do the experimentation and response individually.
  • 25 September: Music synthesis and programming environments
  • 27 September: Introduction to Gestural Control of Sound
    • System/Work of the day: Alejandro Van Zandt-Escobar - Serato Live
    • Read: Bert Bongers, "Physical Interfaces in the Electronic Arts"
    • Read: Wanderley and Depalle, "Gestural Control of Sound Synthesis".
    • Assignment 3, due 10/9: Create a gesturally-controlled "instrument" that allows you to interactively control sound in real-time. Use an explicit mapping strategy that you program in whatever environment(s) you choose to use (i.e., no machine learning). Reflect on what was easy and hard to do in creating the mapping, what you found rewarding or frustrating about the process, and the process by which you chose the mapping you did. Submit your response on Piazza.
      • Feel free to build on any of your previous assignments. Easy-to-use controllers include the built-in laptop inputs (see http://www.smelt.cs.princeton.edu), the Wiimote (OSCulator is recommended if you're on a Mac), or joysticks (we have some you can borrow).
      • OpenSoundControl is a good tool for patching together code in different environments, e.g. if you want to use Smelt to capture motion sensor input and send it to pd, or if you want a ChucK program to receive Wiimote messages from OSCulator. Google for OSC examples for the languages you're using, and/or post to Piazza and get others to share their code with you.
  • 2 October: Musicology crash-course on Fauvel
  • 4 October: Meeting with Fauvel seminar: Mise en Page
  • 9 October: Mappings for digital musical instruments
    • Assignment 3 due (see above)
    • Brief discussion on Fauvel
    • System/Work of the day: Edward
    • Reading 1: Hunt and Wanderley, "Mapping performer parameters to synthesis engines," Organised Sound 7(2): 97–108, 2002.
    • Reading 2: Cort Lippe, "Real-Time Interaction Among Composers, Performers, and Computer Systems," Information Processing Society of Japan, SIG Notes, 123:1–6, 2002.
    • Assignment 4 assigned, due 10/11
      • Write 1 paragraph (or more if you really want) reflecting on your experiences with the Fauvel seminar last week. Please post to Piazza. Feel free to start a new thread or join an existing thread to chime in on someone else's post. (If you do, please offer thoughtful commentary in response to their post rather than posting something unrelated without starting a new thread.) You can write about any aspect you want, but here are some ideas for starting points:
        • Did the seminar change any of your thinking around what "interactive technology" is (and has been, historically)?
        • Musicology involves ways of studying and reasoning about the world that are quite different from those used in computer science. What might be some of the practices, perspectives, or research goals we have in common? How might computer scientists benefit from understanding more about musicology or other humanities disciplines?
        • Did your ideas about how to digitize Fauvel -- either for scholars or for the public -- change in any surprising and interesting ways following the two seminar sessions led by musicologists?
    • Supervised learning overview & Wekinator demo, time permitting (Rebecca)
      • TODO: Post slides
      • TODO: Put assignments in separate category
  • 11 October: Gesture analysis algorithms and tools: Part 1 of 2
    • Assignment 4 due (see above)
    • System/Work of the day: Sasha (ReplayGain)
    • Tutorial: 5-to-10-minute overview on HMMs
      • Leader: PLEASESIGNUP
    • Reading 1: Fiebrink, R., D. Trueman, and P. R. Cook. “A meta-instrument for interactive, on-the-fly machine learning.” Proceedings of New Interfaces for Musical Expression (NIME), Pittsburgh, June 4–6, 2009.
      • Response URL: TODO
      • Discussion leader: Rahulram Sridhar
    • Reading 2: N. Gillian, R. B. Knapp, and S. O’Modhrain. "Recognition of multivariate temporal musical gestures using n-dimensional dynamic time warping." In Proceedings of the 2011 International Conference on New Interfaces for Musical Expression (NIME11), Oslo, Norway, 2011.
      • Response URL: TODO
      • Discussion leader: PLEASESIGNUP
    • Assignment 5 assigned, due 10/23: Build at least one gesturally-controlled instrument using Wekinator, with the controller(s) and synthesis method of your choosing.
      • Post a response to Piazza that includes the following: 1) Describe what controller, gestures, learning algorithm(s), and synthesis method you used. 2) Reflect on what was easy, what was difficult, and how you might improve the software. 3) Also reflect on how the experience of building with Wekinator compared to your previous assignment of building a mapping explicitly using programming.
      • Be sure to read the Wekinator instructions to help you get started. Please run the walkthrough ahead of time to verify that the code works for you. We have about a 95% success rate running on arbitrary machines, but if your machine happens to be Wekinator-unfriendly, we'll want to know ASAP. (Also, note that you may have to turn off your firewall & antivirus for OSC to work properly on your machine.)
  • 16 October: Gesture analysis algorithms and tools: Part 2 of 2
    • System/Work of the day: PLEASESIGNUP
    • Tutorial: 5-to-10-minute overview on neural networks
    • Reading 1: F. Bevilacqua, B. Zamborlin, A. Sypniewski, N. Schnell, F. Guedy, and N. Rasamimanana. Continuous realtime gesture following and recognition. Springer Lecture Notes in Computer Science, Volume 5934, Gesture in Embodied Communication and Human-Computer Interaction, Pages 73-84, 2010. Please be sure to download directly from SpringerLink, as not all PDFs you'll find online include the complete text.
      • Discussion leader: PLEASESIGNUP
      • Response link: TODO
    • Reading 2: N. Rasamimanana, E. Flety, and F. Bevilacqua. Gesture analysis of violin bow strokes. In Proceedings of Gesture Workshop 2005 (GW05), pages 145–155, 2005.
      • Discussion leader: Alejandro Van Zandt-Escobar
      • Response link: TODO
  • 18 October: Composing the instrument; alternative views of interaction
    • System/Work of the day: PLEASESIGNUP
    • Reading 1: TODO
      • Discussion leader: Jennifer Guo
      • Response link: TODO
    • Reading 2: TODO
      • Discussion leader: PLEASESIGNUP
      • Response link: TODO
  • 23 October: State-of-the-art DMIs
  • 25 October: Wrap-up on gesturally-controlled instruments, discussion on Assignment 6 and final projects
    • Assignment 6 due (see above, TODO)
  • 30 October and 1 November: Fall break

Tentative further schedule

Schedule from this point on is still tentative!

  • 4 November: Written project proposals due
  • Week of 5 November: Schedule a 30-minute meeting to discuss your project proposal
  • 6 November: Either live coding or networked performance?
    • Read:
    • Read:
    • Discussion leader: Reid Oda
    • Supplemental material:
    • System/Work of the day:
  • 8 November:
    • Read:
    • Read:
    • Discussion leader:
    • Supplemental material:
    • System/Work of the day:
  • 13 November:
    • Read:
    • Read:
    • Discussion leader:
    • Supplemental material:
    • System/Work of the day:
  • 15 November: Laptop orchestras
    • Leaders: ??? (Could be 1-3 people)
    • Read: Why a laptop orchestra?
      • Discussion leader:???
    • Read: Wicked problems for laptop orchestras
      • Discussion leader: Zeerak Ahmed
    • Supplemental material:
    • System/Work of the day:
  • 20 November (May merge with another class or two?): Other collaborative / social systems
    • Tutorial: Overview of music production process and tools
    • Essl?
    • Networked performance?
    • Robots?
    • Smule apps?
    • Collaborative instruments?
    • System/Work of the day:
  • 22 November: Thanksgiving, no class
  • 27 November: Music production
    • Read: Pardo
      • Discussion leader: ???
    • Read: Duignan
      • Discussion leader: Joe Tylka
    • System/Work of the day:
  • 29 November: Interaction with music recommendation systems
    • Read: Kulesza
      • Discussion leader: ???
    • Read: ???
    • System/Work of the day:
  • Week of 3 December: Schedule a 30-minute meeting to discuss your project progress
  • 4 December: Revisiting interactive systems in music scholarship
    • Read: ???
      • Discussion leader: ???
    • Supplemental material:
    • System/Work of the day:
  • 6 December: "Creativity support tools" in HCI
    • Read: Shneiderman, Resnick, et al. on creativity support tools
      • Discussion leader: Danielle
    • Read: Latulipe et al.
      • Discussion leader: ???
    • Supplemental material:
    • System/Work of the day:
  • 11 December: Grab bag topics ???
    • Read:
    • Read:
    • Discussion leader:
    • Supplemental material:
    • System/Work of the day:
  • 13 December: Grab bag topics ???, wrap-up discussion
    • Read:
    • Read:
    • Discussion leader:
    • Supplemental material:
    • System/Work of the day:

  • 8 January
    • Final project presentations
  • 10 January
    • Final project presentations
  • 15 January: Dean's date, final paper due (+ code, presentation slides, other materials)

Tutorial topics

  • Possible topic overviews:
    • Programming tools (Max/MSP, ChucK, SuperCollider, ?)
    • Music production / studio tools & practices (e.g., Logic demo; overview of editing, mixing, mastering processes)
    • BCIs & biosignals for music
    • ML tools (e.g., Marsyas, Wekinator, Gesture Follower)
    • Sound synthesis methods (e.g., additive, wavetable, waveshaping?, subtractive?, physical modeling, FM, granular) (high-level overview)
    • Summarize the state-of-the-art regarding gesture analysis for conducting analysis, dance (e.g., Laban analysis), or ancillary gestures of instrumentalists.
    • Kinect basics (how it works, how to program for it)
    • Basic real-time audio processing methods (not synthesis) (e.g., vocoders, Autotune, other live effects)?
    • Summarize state-of-the-art regarding audio analysis of some sort (e.g., onset detection, beat tracking, pitch tracking)
    • Summarize state-of-the-art regarding MIR topics (e.g., recommendation, tagging, playlist generation, collection visualization)
  • Possible technical overviews:
    • Digital audio (representation of audio in a computer, sampling & quantizing, Nyquist's theorem)
    • Audio feature extraction
    • Machine learning topics: classification, neural networks, graphical models
    • HMMs, DTW
    •  ??

Possible "cool systems" to highlight

  • Reactable
  • Theremin
  • Ondes Martenot
  • The Hands
  • George Lewis' Voyager
  • Monome
  • The Continuator
  •  ???

General Research Resources

Resources for Topics Discussed in Class