Paint Your Face Away: workshop at Late at Tate Britain: PASSAGE
Paint Your Face Away is a drop-in digital face painting workshop by Shinji Toya. The digital face painting tool developed for this session is inspired by Frank Bowling’s paintings. Participants use the tool to create their profile pictures while real-time face detection runs on the image being painted; at some point in the painting process the profile picture stops being detected by the computer vision system. In this way, the digital paint acts as a kind of disruptive noise for the machine.
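The feedback loop described above can be sketched in a few lines of code. This is only a toy illustration, not the workshop's actual tool (which runs its detection via Runway): `detect_face` here is a hypothetical stand-in for a real detector, and `paint_stroke` simply overwrites pixels, standing in for the painterly gestures of the workshop.

```python
import random


def detect_face(image):
    """Hypothetical stand-in for a real face detector (the workshop
    uses a model hosted on Runway). Here the face counts as 'detected'
    while more than half of the original portrait pixels survive."""
    remaining = sum(1 for px in image if px == "face")
    return remaining / len(image) > 0.5


def paint_stroke(image, rng):
    """Overwrite one randomly chosen pixel with digital paint,
    i.e. add a small amount of 'disruptive noise'."""
    image[rng.randrange(len(image))] = "paint"


def paint_until_undetected(size=100, seed=1):
    """The loop from the workshop description: keep painting over the
    portrait until the detector no longer sees a face, then report
    how many strokes it took."""
    rng = random.Random(seed)
    image = ["face"] * size  # a crude 1-D 'portrait'
    strokes = 0
    while detect_face(image):
        paint_stroke(image, rng)
        strokes += 1
    return strokes
```

In a real implementation the detector would be an actual computer vision model and the strokes would come from the participant's brush; the structure of the loop, though, is the same: paint, re-detect, and stop once the machine loses the face.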
Just as Bowling’s work shifted from the figurative to the abstract, through techniques associated with the fluidity of paint and the patchiness of bounded areas, the workshop explores how the figuration of a recognisable “face” in the eye of the machine is obscured by fluid digital paint and discontinuous textures and patterns.
Face detection technologies can be used to detect and gather faces from images online. We live in an age where your face, found in online images, can be put into a public dataset for training facial recognition without your consent (see Adam Harvey’s investigation). Or a picture of your face could be released under a Creative Commons licence, so that it is no longer owned by you in a traditional sense and is available for any company or researcher to use to train facial recognition algorithms (as in the case of IBM and Flickr photographs).
Facial recognition technology serves multiple functions: it can be used for identification in policing and border control, and beyond that for targeted advertising through “mood detection”. In other words, the technology is increasingly used for regulatory systems and commercial gain.
The face painting we play with here could be used to resist the automated scraping of digitised faces, introducing useless noise into the biometric datafication of faces carried out in the service of facial recognition. As Hito Steyerl suggests, faces have become quantified pixels that are less and less owned by us. But could we try to claim that ownership back through painterly resistance?
The workshop is also inspired by Adam Harvey’s CV Dazzle, whose work and research provide many physical, facial make-up methods for obfuscating face detection technologies. However, relatively few examples seem to have been made through purely digital approaches, and made-up faces could themselves be used to further train facial recognition systems. The workshop’s approach was chosen in response to these aspects and to the context of online face-scraping. It also has a symbolic association with Rosemary Lee’s research, which suggests that object classification algorithms have difficulty identifying abstract paintings.
* The face detection used for the workshop runs via Runway.
Relevant Articles/Links: Data Economy of Faces
- Facial recognition’s ‘dirty little secret’: Millions of online photos scraped without consent – NBC News
- Who’s using your face? The ugly truth about facial recognition – Financial Times
- MS Celeb is a dataset of 10 million face images harvested from the Internet – Megapixels, Adam Harvey and Jules LaPlace
- IBM didn’t inform people when it used their Flickr photos for facial recognition training
- You No Longer Own Your Face. Students were recorded for research—and then became part of a data set that lives forever online, potentially accessible to anyone.
- Is FaceApp’s Data Collection Any Worse Than Facebook’s?
- Can you trust FaceApp with your face?
- Transnational Flows of Face Recognition Image Training Data – Megapixels, Adam Harvey and Jules LaPlace
- Facebook’s Face-ID Database Could Be the Biggest in the World. Yes, It Should Worry Us
Articles/Links: Some Uses of Facial Recognition
- How facial recognition advertising is becoming your new social contract
- Piccadilly Circus lights facial detection system ‘incredibly intrusive’
- London police’s face recognition system gets it wrong 81% of the time – MIT Technology Review
- Facial recognition wrongly identifies public as potential criminals 96% of time, figures reveal
- San Francisco Bans Agency Use of Facial-Recognition Tech