For my final, I want to combine musical performance with visuals, both produced algorithmically. I will generate a score for two guitars, based on an as-yet-undecided algorithm. The guitars will run through a digital effects processor (not necessarily integral to the project, other than for aesthetic purposes), then through a pitch-to-MIDI converter, and into Max, where the MIDI data will be sent out again to Processing. Once in Processing, a real-time graphic display will be updated.
I am investigating the Max object fiddle~ and the Processing library maxLink for this. I will send MIDI out from Max to my Processing sketch via maxLink.
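To start thinking about what the Processing side will do with the incoming notes, here is a rough sketch of the MIDI-to-visuals mapping, assuming each note arrives as a (pitch, velocity) pair. The class and method names are my own placeholders, not part of maxLink's API, and the canvas width is an assumption:

```java
// Placeholder mapping from incoming MIDI data to visual parameters.
// NoteMapper, pitchToX, and velocityToBrightness are hypothetical names;
// the actual receive mechanism (maxLink) is not shown here.
public class NoteMapper {
    static final int WIDTH = 800; // assumed canvas width

    // Map MIDI pitch (0-127) to a horizontal position on the canvas.
    static int pitchToX(int pitch) {
        return Math.round(pitch / 127.0f * (WIDTH - 1));
    }

    // Map MIDI velocity (0-127) to a brightness value (0-255).
    static int velocityToBrightness(int velocity) {
        return Math.round(velocity / 127.0f * 255);
    }

    public static void main(String[] args) {
        // e.g. middle C (pitch 60) played fairly hard (velocity 100)
        System.out.println(pitchToX(60));              // → 377
        System.out.println(velocityToBrightness(100)); // → 201
    }
}
```

The idea is that each incoming note would place a shape on screen, with pitch controlling position and velocity controlling brightness; the real sketch would do this inside Processing's draw loop.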
I am still nailing down the details of how I would like to proceed. It is my first time using Max, performing publicly on guitar, and doing live image processing, so there should be much to learn and post about along the way!
Friday, November 10, 2006
This was a pretty straightforward lab, although I did encounter one issue: all of the notes coming out of GarageBand were very soft. I had all of my volume levels up, yet I could barely hear the notes. This was probably due to my low MIDI velocity, though.
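To get a feel for how much low velocity can quiet a note, here is a small calculation using one common convention, where perceived amplitude is roughly proportional to the square of the normalized velocity. This is just one mapping among several that synths use, so treat the exact numbers as an approximation:

```java
// Rough estimate of how MIDI velocity affects loudness, assuming
// amplitude proportional to (velocity / 127)^2, a common convention.
public class VelocityGain {
    // Gain in decibels relative to full velocity (127).
    static double gainDb(int velocity) {
        return 40.0 * Math.log10(velocity / 127.0);
    }

    public static void main(String[] args) {
        System.out.printf("v=32  -> %.1f dB%n", gainDb(32));
        System.out.printf("v=127 -> %.1f dB%n", gainDb(127));
    }
}
```

Under this assumption, a note at velocity 32 comes out roughly 24 dB quieter than one at 127, which would easily explain barely audible notes even with the channel faders all the way up.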