Are surveillance systems using biometric scanning really privacy-friendly?
I read a post today on the IAPP’s Daily Dashboard talking about a “privacy-friendly” “positive side of facial recognition”. It suggests that Ontario’s Information and Privacy Commissioner Ann Cavoukian has endorsed this use of biometric encryption. Having a strong understanding of privacy and only a basic understanding of biometrics, I wondered how these claims were possible. The story points to an article in Business Week, which makes only two technical points regarding facial recognition in a casino:
First, it does scan the face of each person entering a casino, but if there is no match against the list of 15,000 addicts, the image is removed instead of being stored in a database. Second, the casinos use a form of biometric encryption for the face and personal information databases. Essentially, this means that the personal information is stored in an encrypted fashion and can be unlocked only when a face serves as a key. If a hacker were to break into the database, he would find only garbled strings of numbers and letters.
With regard to the first point, one should ask how it is verified that non-hit data is actually removed from the database. What are the repercussions if it is not? And what stops that information from being retained in the future, as is currently being requested with ALPR (automatic licence plate recognition)?
The article says non-hit images will be removed from the database, but if the data is a non-hit, why was it stored in the first place? Was the database backed up during that short window? How long does non-hit data sit in the database before it is removed?
The second point is the bigger mystery. Technical details were stripped from the article to maintain readability for the lowest-common-denominator reader, and the idea that a hacker would only find garbled strings of letters and numbers sounds great in theory, but everything in a computer looks like that. Even the word “password”, or a full-resolution 3D image of a face, is just numbers and/or letters to the software; that’s how computers work.
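To make that concrete, here is a trivial sketch (in Python, and nothing to do with the casino’s actual system): a plaintext string and a blob of random bytes standing in for ciphertext both come out as “garbled” strings when you inspect them, so the journalist’s description tells us nothing about whether real encryption is in use.

```python
import os

# The word "password" as the computer actually sees it: just bytes.
plaintext = "password".encode("utf-8")

# Stand-in for "encrypted" data: random bytes of the same length.
# Ciphertext from any decent cipher is indistinguishable from this.
ciphertext = os.urandom(len(plaintext))

print("plaintext bytes:  ", plaintext.hex())   # 70617373776f7264
print("'encrypted' bytes:", ciphertext.hex())  # e.g. 9f3ac1... (random)
```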
Perhaps it’s intended as a red herring, but the phrase “Essentially, this means that the personal information is stored in an encrypted fashion” is frightening. What does “essentially” mean? What is “an encrypted fashion”? Data is either encrypted, or it isn’t.
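For “a face serves as a key” to mean anything, it would have to mean something like this: a reproducible secret is derived from the enrolled face template, and the personal record is encrypted under that secret, so the database holds only ciphertext. Below is a rough sketch of that idea; it is my guess at what “biometric encryption” could mean here, not a description of the casino’s system, and it glosses over the hard part that real face templates are noisy and need a fuzzy extractor or similar key-binding scheme to yield a stable key. The names and data are hypothetical.

```python
import hashlib, hmac, os

def derive_key(face_template: bytes, salt: bytes) -> bytes:
    # Derive a 32-byte secret from the (assumed stable) biometric template.
    return hashlib.pbkdf2_hmac("sha256", face_template, salt, 200_000)

def keystream_encrypt(record: bytes, key: bytes) -> bytes:
    # Placeholder keystream cipher for illustration only; a real system
    # would use an authenticated cipher such as AES-GCM, not this.
    stream = b""
    counter = 0
    while len(stream) < len(record):
        stream += hmac.new(key, counter.to_bytes(8, "big"), "sha256").digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(record, stream))

salt = os.urandom(16)
# Hypothetical stable template; real scans vary from capture to capture.
template = b"hypothetical-stable-face-template"
key = derive_key(template, salt)

ciphertext = keystream_encrypt(b"name, address, self-exclusion details", key)
print(ciphertext.hex())  # without the face-derived key, this is all an attacker sees
```

Even under that generous reading, the privacy claim stands or falls on whether the key genuinely cannot be reconstructed without the live face, which is precisely the detail the article omits.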
There is no mention of the false positive rate in the article. In ALPR, which deals with comparatively simple letters and numbers, the false positive estimates I’ve read are between 11% and 38%; with the complexity of the points in a face, I can only imagine a much higher number.
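To put rough numbers on why that matters, here is some back-of-the-envelope arithmetic. The visitor count is my assumption and the rate is simply the low end of the ALPR estimates above, not figures from the article; the point is that scanning every single entrant multiplies even a modest error rate into a lot of innocent people being stopped.

```python
# Illustrative base-rate arithmetic; both inputs are assumptions.
daily_visitors = 10_000        # assumed number of people scanned per day
false_positive_rate = 0.11     # low end of the ALPR estimates cited above

wrongly_flagged_per_day = daily_visitors * false_positive_rate
print(f"Innocent visitors flagged per day: {wrongly_flagged_per_day:,.0f}")
print(f"Innocent visitors flagged per year: {wrongly_flagged_per_day * 365:,.0f}")
# Even at the ALPR low end, that is about 1,100 people a day pulled
# aside who are not on the self-exclusion list.
```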
While I understand that a lot of the technical details have been left out of the article, the few that are included scream of potential for abuse.
In summary, there is no evidence that:
– non-hit data is not being stored, or will never be stored, and there is no mechanism to detect it if it is
– the matching does not have a very high false positive rate, flagging innocent people and causing them to be interrogated
– all of the data collected and stored is encrypted in a way that a security-focused developer couldn’t easily reverse engineer
It seems this system will undoubtedly affect many non-addicts who never chose to participate in it, which suggests the technology is not privacy-friendly at all.
Cheers,
—
Kris Constable
Technical Advisor