Project Progress Review and Rebuttal
My progress report was reviewed by an anonymous classmate. I’ve reproduced their comments below, with my own responses inline.
The literature survey and project report provide clear overviews of related work. The idea seems novel and, beyond that, interesting. The project goals are clearly outlined and should be achievable within the 10-week time constraint. The documents were well written.
I liked the intro in the literature survey as it provided a succinct overview of the paper body.
The diagrams in the project overview were helpful.
I’m interested in seeing how the multi-touch gestures will work.
Me too! They’re coming along; we’ll see how solid they are for the demo.
I am assuming the music objects are virtual GUI objects and are not tangible objects as in the Reactable. This wasn’t clearly stated to be one way or the other.
Great point. Yes, these are virtual objects on a virtual table, displayed on the iPhone touch screen. I’ll make that clear in the final report.
It would have been nice to have some examples for Pd or the Pd objects you created and examples of OCS [sic]. An example of OSC patterns and bundles would be great.
Very good points. I was planning to save the examples for the final report, where I can show how the finished project actually works; for this report, I decided to let the references serve as examples. I will be sure to include concrete examples of the Pd objects, OSC messages, and bundles in the final report.
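In the meantime, a rough sketch of what OSC data looks like on the wire may help. The following Python snippet hand-encodes a minimal OSC message and bundle using only the standard library; the addresses such as /object/1/freq are made-up placeholders for illustration, not the actual iReactable message scheme.

```python
import struct

def osc_string(s: str) -> bytes:
    """OSC strings are ASCII, null-terminated, padded to a multiple of 4 bytes."""
    b = s.encode("ascii")
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message: address, type-tag string, then big-endian arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        else:
            tags += "s"
            payload += osc_string(str(a))
    return osc_string(address) + osc_string(tags) + payload

def osc_bundle(*elements: bytes) -> bytes:
    """Encode an OSC bundle: '#bundle', a 64-bit NTP timetag (1 = 'immediately'),
    then each element prefixed with its int32 byte count."""
    out = osc_string("#bundle") + struct.pack(">Q", 1)
    for e in elements:
        out += struct.pack(">i", len(e)) + e
    return out

# Two messages delivered together in one bundle (placeholder addresses):
freq = osc_message("/object/1/freq", 440.0)
amp = osc_message("/object/1/amp", 0.8)
packet = osc_bundle(freq, amp)  # ready to send as one UDP datagram
```

In practice these bytes are sent over UDP to the synthesizer's listening port; a library such as python-osc does this encoding for you, but the byte layout above is what Pd's OSC objects actually parse.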
Does the laptop send the sound back to the iPhone? Is the hardware sufficient to run everything off the iPhone in the future? It seems this would be better as people will not always be lugging their laptops around with them.
The laptop does not send audio back to the iPhone; the iPhone serves purely as a remote UI for the laptop synthesizer. At this point, the whole system is designed as more of a “personal installation.” If and when the iPhone gets a significant hardware upgrade, it could become powerful enough to run the entire system on its own. For now, though, the iReactable serves as a novel way to interact with your laptop synthesizer.
Do the other OSC controllers not have the ability to sync the objects the user creates in them? I am wondering if they can achieve the same function as the Reactable.
The existing iPhone OSC controllers allow the user to design arrays of knobs and sliders, and can be programmed to send some of the messages I have designed for the iReactable. However, the complete interface cannot be recreated in any of these existing applications.
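One concrete difference is OSC address pattern matching: a Reactable-style surface wants a single incoming pattern like /object/*/freq to address any object on the table, which a fixed grid of knobs and sliders does not model. Here is a minimal sketch of OSC pattern matching, assuming a hypothetical /object/N/param address layout rather than the actual iReactable scheme:

```python
import re

def osc_pattern_to_regex(pattern: str) -> "re.Pattern":
    """Translate an OSC address pattern into a compiled Python regex:
       '?' matches one char (not '/'), '*' a run of chars (not '/'),
       '[abc]' a char class, '{foo,bar}' an alternation."""
    out, i = [], 0
    while i < len(pattern):
        c = pattern[i]
        if c == "?":
            out.append("[^/]")
        elif c == "*":
            out.append("[^/]*")
        elif c == "{":
            j = pattern.index("}", i)
            alts = pattern[i + 1:j].split(",")
            out.append("(" + "|".join(re.escape(a) for a in alts) + ")")
            i = j
        elif c == "[":
            j = pattern.index("]", i)
            body = pattern[i + 1:j]
            if body.startswith("!"):   # OSC uses '!' for negation, regex uses '^'
                body = "^" + body[1:]
            out.append("[" + body + "]")
            i = j
        else:
            out.append(re.escape(c))
        i += 1
    return re.compile("^" + "".join(out) + "$")

matcher = osc_pattern_to_regex("/object/*/freq")
print(bool(matcher.match("/object/3/freq")))  # True: any object id
print(bool(matcher.match("/object/3/amp")))   # False: different parameter
```

A generic controller app can be configured to emit a handful of fixed addresses, but it has no notion of this kind of dynamic, pattern-addressable object space, which is why the complete interface cannot be recreated there.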
I hope the demonstration will be ready by week 9 since that is the week of our final presentations. This would be slightly faster than your provided timeline.
I believe the last week of class is the 10th week, but that is still right around the corner. It’s going to be close, as far as the demo goes.