Touching Affectivity: an Interactive Installation
Summer Researcher 2021 (McVey Research Assistant): Touching Affectivity is an interactive installation that includes a robot whose vocalizations are sonifications of the way it is touched. The robot experiences its world through pressure sensors and handmade conductive fur, responds to touch through haptic feedback and sound, and records the sensor data for analysis. In one exhibition, only the pressure-sensor readings were used to control a haptic purring sensation and a sonic “scream.” Anton Dimitrov ’23 will design and implement a coding scheme to analyze video recordings of human subjects interacting with the emotive robot at an exhibition and interpret the results. The robot’s sonification of sensor data is a new interface for emotive expression with applications in robotics and communication. This summer he will build a web platform for collecting human-subjects data on the robot’s expressed emotion in response to gestures, and will explore generative algorithms for producing emotive beeps and chirps.
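The touch-to-sound mapping described above could be sketched roughly as follows. This is a minimal illustrative sketch, not the installation's actual code: the thresholds, frequency range, and function name are assumptions, and it only shows the idea of routing gentle pressure to a purr and hard pressure to a rising "scream."

```python
def sonify_pressure(pressure, gentle_max=0.4, max_pressure=1.0):
    """Map a normalized pressure reading (0..1) to output parameters.

    Gentle touch produces a haptic purr whose intensity grows with
    pressure; a hard press produces a 'scream' whose pitch rises with
    pressure. All numeric ranges here are illustrative assumptions.
    """
    # clamp the reading into the expected range
    pressure = max(0.0, min(pressure, max_pressure))
    if pressure <= gentle_max:
        # purr intensity grows linearly from 0 to 1 over the gentle range
        return {"mode": "purr", "intensity": pressure / gentle_max}
    # scream pitch sweeps from 400 Hz to 2000 Hz as pressure increases
    t = (pressure - gentle_max) / (max_pressure - gentle_max)
    return {"mode": "scream", "frequency_hz": 400.0 + t * 1600.0}
```

In an exhibition loop, a function like this would be called once per sensor reading, with its output driving the haptic motor and the synthesizer.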
20/21 School Year (McVey Research Assistant)
The goal of this project is to test whether we can train a neural network to draw from an image in a way similar to humans. Many drawing algorithms are filters that use edge detection to determine where lines should be placed; prior research shows that this is not how people actually draw. We would like to see whether including eye-tracking data increases the accuracy of stroke prediction. For this project, we are exploring the correlation between a drawing and the artist’s eye movements while drawing, based on eye-tracking points obtained with an eye tracker together with the drawing data.
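Relating the two data streams first requires pairing each drawing point with a gaze sample. A minimal sketch of one plausible approach, nearest-timestamp matching, is below; the field layout (timestamp first in each tuple) and function name are assumptions for illustration, not the project's actual pipeline.

```python
import bisect

def align_gaze_to_strokes(stroke_points, gaze_samples):
    """Pair each (t, x, y) stroke point with the gaze sample whose
    timestamp is closest. Both inputs are assumed sorted by time."""
    gaze_times = [g[0] for g in gaze_samples]
    pairs = []
    for point in stroke_points:
        i = bisect.bisect_left(gaze_times, point[0])
        # the nearest sample is either just before or just after index i
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gaze_samples)]
        best = min(candidates, key=lambda j: abs(gaze_times[j] - point[0]))
        pairs.append((point, gaze_samples[best]))
    return pairs
```

The resulting pairs could then be inspected for correlations between where the eye looks and where the pen moves, or fed to a model as joint input.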
Preliminary data on where people look while drawing was collected at the University of California, Santa Barbara in 2018. This past summer, Changling Li ’22 and I pre-processed the data and visualized and animated the eye-tracking and drawing data. We created various graphs and animations of attention shifts and the drawing process to understand the data and prepare it for further exploration. This past school year, we used a pre-existing TensorFlow-based model to train on our drawing data. Next, I plan to redesign the neural network to incorporate the eye-tracking data. In the end, I will compare the results of a network trained only on the drawing data with those of a network trained on both the drawing and eye-tracking data. I also plan to use the results of these experiments to design an artwork about sight and drawing.
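The planned comparison between the two networks could be scored as sketched below: evaluate both stroke predictors on the same held-out points with a shared error metric. This is an illustrative harness under assumed names; the actual models, data format, and metric may differ.

```python
def mean_squared_error(predicted, actual):
    """Average squared distance between predicted and actual 2-D points."""
    total = sum((px - ax) ** 2 + (py - ay) ** 2
                for (px, py), (ax, ay) in zip(predicted, actual))
    return total / len(actual)

def compare_models(drawing_only_preds, with_gaze_preds, actual_points):
    """Score both models' predicted stroke points against the same
    ground truth; lower error is better."""
    return {
        "drawing_only": mean_squared_error(drawing_only_preds, actual_points),
        "drawing_plus_gaze": mean_squared_error(with_gaze_preds, actual_points),
    }
```

Holding the test set and metric fixed is what makes the eye-tracking network's contribution, if any, attributable to the added gaze input rather than to evaluation differences.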