The restaging of Trisha Brown's contemporary dance masterpiece Astral Convertible is two months past, but the arts-technology development that contributed to the staging continues. To support the performance, IACAT research programmer and CCG member Mary Pietrowicz created a gesture recognition system (i.e., Labanotation-trained machine learning software) that enabled performers garbed in wireless sensor networks (based on the work of Thecla Schiphorst of Simon Fraser University, who calls them "wearable architectures") to signal and trigger changes in the aesthetic effects of the performance environment: lights, sound, and projection. Mary presented the theory behind this gesture recognition system at the ACM CHI Conference on Human Factors in Computing Systems in Atlanta, Georgia (April 10-11, 2010), explaining how the arts serve as a special driver for full-body gesture interfaces and suggesting that the mapping of performance data to sensory effects is a new art form that has yet to be fully explored.
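To illustrate the idea of mapping performance data to sensory effects, here is a minimal hypothetical sketch. All names, gesture labels, feature values, and effect cues are invented for illustration; the actual system used Labanotation-trained machine learning, for which a simple nearest-centroid rule stands in here.

```python
import math

# Toy "trained" model: mean sensor-feature vectors per gesture label
# (values are invented; a real system would learn these from
# Labanotation-labeled movement data).
GESTURE_CENTROIDS = {
    "rise":   (0.1, 0.9, 0.2),
    "sink":   (0.1, -0.8, 0.1),
    "spread": (0.9, 0.1, 0.3),
}

# Illustrative mapping from recognized gesture to a stage-effect cue.
EFFECT_CUES = {
    "rise":   "lights:fade_up",
    "sink":   "sound:lower_drone",
    "spread": "projection:expand",
}

def classify(features):
    """Return the gesture label whose centroid is nearest to the features."""
    return min(GESTURE_CENTROIDS,
               key=lambda g: math.dist(features, GESTURE_CENTROIDS[g]))

def trigger(features):
    """Map a wearable-sensor feature vector to the effect cue it fires."""
    return EFFECT_CUES[classify(features)]
```

The design point is the separation of concerns the article hints at: the classifier turns raw movement data into gesture labels, while the label-to-effect mapping is itself a compositional choice, i.e., the "new art form" of mapping performance data to sensory effect.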
Mary and Thecla are continuing their collaboration and plan to further develop and apply the Labanotation-trained machine learning software created here. Also participating will be Thecla's graduate research assistant Diego Maranan, IACAT researcher Bob McGrath, Dance Professor and Director of Music for Dance John Toenjes (who led the Astral Convertible production), and CCG Director and IACAT co-PI Guy Garnett.
— Kelly Searsmith