Friday, 19 August 2011

Small work for robot and insects (Andie Gracie)



'Small work for robot and insects' has evolved over a period of three years. v1 was a commission from digitalsummer01: inter[face] to produce a small sound-based piece of work, and formed part of a series of investigations into interfaces between natural and technological systems. The work consisted of a group of crickets and a simple quadruped robot existing in a large glass tank, separated by a glass divider. The sound of the cricket song was transmitted to the robot, which made a series of random movements in response.
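As a rough illustration of that v1 behaviour, the logic amounts to little more than the sketch below; the movement names and threshold are hypothetical and not taken from the actual piece:

```python
import random

# Illustrative only: any detected cricket song above a threshold simply
# triggers one of the quadruped's movements at random, as in v1.
MOVEMENTS = ["step_forward", "step_back", "turn_left", "turn_right", "raise_leg"]

def on_cricket_sound(sound_level, threshold=0.1):
    """Return a random movement whenever the cricket song exceeds a threshold."""
    if sound_level > threshold:
        return random.choice(MOVEMENTS)
    return None
```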

v2 was commissioned by Arnolfini Live and is a much more sophisticated work. A new hexapod robot was designed and built from scratch, and a neural network brain was programmed by Brian Lee Yung Rowe of Muxspace, New York. The robot is now able to listen intelligently to the cricket song and attempt to devise a unique language with which to communicate with the crickets or provoke them into certain behaviours.
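The details of that network are not described here, but the general shape of the exchange can be sketched as follows; the layer sizes, random weights and six-phoneme output are assumptions made purely for illustration, not a description of Rowe's implementation:

```python
import numpy as np

# Minimal sketch: spectral features of the cricket song in, a short numeric
# 'sentence' of phoneme indices out. All sizes and weights are assumptions.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 16))   # 8 input features -> 16 hidden units
W2 = rng.normal(size=(16, 6))   # 16 hidden units  -> 6 output phoneme slots

def respond(features):
    """Map one feature vector to six phoneme indices in the range 0-31."""
    hidden = np.tanh(np.asarray(features) @ W1)
    output = np.tanh(hidden @ W2)
    return [int((o + 1) / 2 * 31) for o in output]

print(respond(rng.random(8)))
```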

The work frames an interest in exploring connections between machine and nature that lie outside the typical areas covered by cybernetics or purely scientific study. It attempts to allow spontaneous relationships to emerge and provides a platform for us to witness the process. By forming a mechanical entity with vaguely lifelike qualities but the capacity to behave in a completely non-lifelike manner, we allow the formation of spontaneous relationships that inhabit a 'third state'. For meaningful communication and exchange to take place, rather than merely informed response, all parts of the system need to adapt to each other. If and when this third state is achieved, the entities can begin a mutual exchange of information which will continually evolve and develop nuance.

The robotic element of the installation is a custom-designed and built polycarbonate and aluminium hexapod with the capacity to express itself using movement, lights and sound. It has an initial library of 18 base motions, 7 base light sequences and 32 base sounds which can be considered as 'phonemes': a starting point from which any sequence or combination can be extracted to construct a complex and evolving vocabulary.
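One simple way to picture that library and the sentences drawn from it, with identifiers invented purely for the sketch:

```python
import random

# The counts (18 motions, 7 light sequences, 32 sounds) come from the text;
# the phoneme identifiers and sentence length are hypothetical.
PHONEMES = {
    "motion": [f"m{i:02d}" for i in range(18)],
    "light":  [f"l{i:02d}" for i in range(7)],
    "sound":  [f"s{i:02d}" for i in range(32)],
}

def compose_sentence(length=6):
    """Build one 'sentence' by chaining phonemes of mixed type."""
    sentence = []
    for _ in range(length):
        channel = random.choice(list(PHONEMES))
        sentence.append((channel, random.choice(PHONEMES[channel])))
    return sentence

print(compose_sentence())
```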

The incoming sound from the crickets is analysed using FFT processes in Max/MSP and the resulting numerical data is passed to the neural network. Once the neural network has done its processing, it passes the numerical representation of the 'sentence' back to Max/MSP, which relays it over a serial connection to the robot. The robot carries an onboard microprocessor (OOPic) loaded with a programme to control its functions, which are triggered by the data coming from the neural network.
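The actual analysis and relaying happens inside Max/MSP, but the same data flow can be sketched in Python roughly as below; the band count, serial port and baud rate are assumptions rather than details of the installation:

```python
import numpy as np
import serial  # pyserial

def analyse(frame, bands=8):
    """Reduce one audio frame of cricket song to a few spectral band energies."""
    spectrum = np.abs(np.fft.rfft(frame))
    return [float(b.sum()) for b in np.array_split(spectrum, bands)]

def relay_to_robot(sentence, port="/dev/ttyUSB0", baud=9600):
    """Send the network's numeric 'sentence' to the onboard controller as bytes."""
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(bytes(int(v) % 256 for v in sentence))
```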
