A computer security and privacy researcher who also is an investor in augmented reality took a look at Facebook’s latest smart-glasses project and found biometric privacy concerns both for the people being recorded and for the wearer doing the recording.
In an opinion article published by The Conversation, Indiana University computer science professor Apu Kapadia compares Facebook’s Ego4D project with what he and his IU team have learned in studying sociological facets of people walking the Earth with AI-supported recording and reporting devices on their faces.
The Ego4D dataset can be used for algorithmic training, from biometric recognition to robotics performance in unstructured environments. Facebook is keen to facilitate development that would make smart glasses almost a new external lobe of the human brain.
Kapadia writes that biometric privacy concerns and AI ethics demand that Facebook executives confront how dangerous and socially disruptive smart glasses could be if treated as just another revenue-producing app.
(The researcher is an investor in Snap Inc., owner of augmented-reality social media service Snapchat. He also has been funded by the National Science Foundation, the Department of Defense and Microsoft Research, and has twice received Google’s faculty research award.)
It is good advice, and timely, as Facebook defends itself against insider accusations that its leaders have chosen profits over the safety and health of their own subscribers, some of whom are children.
And, earlier this fall, the company had to disable an AI algorithm said to be the source of photos of Black men labeled “primates.”
Ego4D is an egocentric, or first-person, video dataset available to the public, with benchmarks. The video and audio data, totaling 3,025 narrated hours of mostly “unscripted” content, were collected by 855 participants in nine countries as they went about their lives wearing smart glasses for up to 10 hours a day.
Privacy concerns and angry reactions from people who realize they are being recorded by smart glasses are well-reported, spawning the term glasshole for wearers of Google’s Glass. And that was before an increase in claims, accurate or not, by some biometric software makers that they can discern emotions.
But Kapadia brings up lesser-known biometric privacy questions, chiefly: how much personal data are wearers themselves willing to give up?
Beyond recording a person’s surroundings, including their home’s interior, and their interactions, smart glasses would be able to track even fleeting glances, the kinds that might be difficult to explain to a jealous partner.
It is interesting to note that Kapadia’s research specifically mentions that study participants said they would not want the devices to operate in bathrooms. One of the representative clips displayed in an interactive Ego4D graphic is of a subject in a room that looks very much like a public bathroom.