Autonomous Virtual Instruments II

This is a short documentation excerpt from Autonomous Virtual Instruments as it was presented at the Masters of Aalto exhibition in Helsinki.

Since the last version, the mechanisms the instruments use to control their pendular movement have been greatly improved; they are now much more consistent and offer better control. The audio signal analysis that the instruments use to listen to each other is significantly more sophisticated: they now listen to the timing and pitch of their neighbours rather than just overall audio energy. Each instrument can also adjust the radius of its bell component and has some additional control over its pendular swing, giving it some control over its own pitch and timing so that it can respond to the information it hears from its neighbours. There are also some cosmetic changes.

Although the system is much more sophisticated overall and the instruments are listening and responding to their neighbours, the effect is still very similar to wind chimes: ambient and stochastic, with variation but not much perceptible organization. I am hopeful that another round of development will allow some emergent patterns or properties to become apparent. I would also like to move away from the orbiting camera look, perhaps with several viewports and/or navigation, and the audio could be spatialized so that the instruments are placed in the space aurally as well as visually.
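The excerpt does not describe how the listening is implemented, so the following is only a minimal sketch of the kind of analysis mentioned above: estimating a neighbour's pitch by autocorrelation and its onset timing from jumps in frame energy, written in Python with NumPy. The function names, frame sizes, and thresholds are illustrative assumptions, not the project's actual code.

    import numpy as np

    def estimate_pitch(frame, sr, fmin=100.0, fmax=2000.0):
        # Rough fundamental-frequency estimate via autocorrelation of one frame.
        frame = frame - np.mean(frame)
        corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
        lo, hi = int(sr / fmax), int(sr / fmin)
        if hi >= len(corr) or np.max(np.abs(frame)) < 1e-4:
            return None  # frame is too short or effectively silent
        lag = lo + np.argmax(corr[lo:hi])
        return sr / lag

    def detect_onsets(signal, sr, frame_size=1024, hop=512, threshold=2.0):
        # Onset times (seconds) where frame energy jumps above a local average.
        energies = np.array([
            np.sum(signal[i:i + frame_size] ** 2)
            for i in range(0, len(signal) - frame_size, hop)
        ])
        onsets = []
        for i in range(1, len(energies)):
            local_avg = np.mean(energies[max(0, i - 8):i]) + 1e-12
            if energies[i] > threshold * local_avg and energies[i - 1] <= threshold * local_avg:
                onsets.append(i * hop / sr)
        return onsets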
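Likewise, the mapping from bell radius to pitch and from pendular swing to timing is not specified here. As a rough illustration only, if the bells are treated as geometrically similar resonators whose fundamental frequency scales inversely with radius, and the swing is modelled as a simple pendulum, a target pitch and swing period could be converted to physical parameters along these lines (all names and constants are hypothetical):

    import math

    def radius_for_pitch(target_freq, base_freq=440.0, base_radius=0.1,
                         min_radius=0.02, max_radius=0.5):
        # Assumes fundamental frequency is inversely proportional to radius,
        # which holds only for geometrically similar bells; clamp to sane bounds.
        r = base_radius * (base_freq / target_freq)
        return max(min_radius, min(max_radius, r))

    def length_for_period(target_period, g=9.81):
        # Small-angle simple-pendulum relation T = 2*pi*sqrt(L/g), solved for L.
        return g * (target_period / (2.0 * math.pi)) ** 2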
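For the spatialization idea, one common approach (again a sketch under assumed conventions, not a description of the project) is constant-power stereo panning with inverse-distance attenuation, derived from each instrument's position relative to the listener. Here x is taken as the listener's right and z as forward, and positions are 2D (x, z) tuples:

    import math

    def spatialize(sample, instrument_pos, listener_pos, listener_facing=0.0):
        # Return (left, right) gains applied to a mono sample.
        dx = instrument_pos[0] - listener_pos[0]
        dz = instrument_pos[1] - listener_pos[1]
        distance = math.hypot(dx, dz)
        gain = 1.0 / max(distance, 1.0)                 # inverse-distance attenuation
        azimuth = math.atan2(dx, dz) - listener_facing  # angle relative to facing
        pan = max(-1.0, min(1.0, math.sin(azimuth)))    # -1 = hard left, +1 = hard right
        theta = (pan + 1.0) * math.pi / 4.0             # constant-power pan law
        return sample * gain * math.cos(theta), sample * gain * math.sin(theta)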