Delivered in partnership with the Australia Council for the Arts, the Synapse Residencies place Australian artists in science and research settings to pursue collaborative projects, with benefits accruing to both the resident and the host organisation.
Kirsty Boyle (NSW) + the Artificial Intelligence Lab (Switzerland)
Kirsty drew upon her extensive knowledge of Karakuri Ningyo (Japanese mechanical doll making) to develop girltron, a girl robot with a mechanical performance-based AI system. Girltron highlights the importance of fusing science with broader cultural and social concerns and recognises the role tradition plays in contemporary technology. Kirsty’s chief collaborator on the project is AI specialist Dr Lijin Aryananda.
Madeleine Flynn & Tim Humphrey (VIC) + the Garvan Institute of Medical Research (Australia)
Madeleine and Tim worked with Dr Shane Grey, head of the Gene Therapy and Autoimmunity Group, to investigate ways of sonifying information from new genetic analysis techniques that reveal the dynamics of cellular processes. The collaboration has the potential to advance the understanding of complex cellular patterns and networks, as well as to provide unique opportunities for the artistic rendition of processes at the heart of human existence.
http://flynnhumphrey.anat.org.au/
Tina Gonsalves (QLD) + Affective Computing Group, MIT (USA), Wellcome Trust Centre for Neuroimaging (UK), and the Brighton & Sussex Medical School (UK)
Tina’s project, Chameleon, drew upon earlier work developed in partnership with Emeritus Professor Chris Frith, Wellcome Principal Research Fellow. Chameleon synthesises neuroscientific and affective computing research to explore and provoke emotional processes by producing emotionally responsive audiovisual narratives. The work highlights awareness of our inner selves, as well as our innate tendency to synchronise and connect with others.
Tags: affective computing, art science collaborations, emotionally responsive audiovisual narratives, mechanical performance-based AI system, neuroscience, robotics, sonifying data, visual environmental inputs