In the world of porn, you are damned if you do, damned if you don’t.
People want us to use the latest and greatest technology to ensure that everyone who stars in a film is of legal age. I’ve always agreed with this. In the past, girls have been able to get by with fake IDs, and we can’t allow that. We need to do whatever it takes to make sure every performer is who they say they are.
The problem is that using these new technologies to match a person’s identity with their actual ID can get a company in trouble.
Because sometimes technology can be used for bad things, as in the latest case out of the UK.
Clearview AI is accused of illegally scraping 20 billion images of people’s faces from the web without their knowledge or permission and then using them to form a global facial recognition database.
In short, if you posted a photo to social media, Clearview would scrape it and add it to its massive database, all without your knowledge or permission.
The company then sells the data it collects: Clearview AI offers an app that lets customers upload a photograph of someone and try to identify them by matching it against this unlawful database.
John Edwards, the UK information commissioner, said the company “not only enables identification” of the people in its database “but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable.”
So while new technology can do many wonderful things, it can just as easily be abused.
Imagine someone building a database of porn stars from the photos they post to Twitter and Instagram. Suddenly you are in a database of known sex workers, one that could be used to discriminate against you.
Experts have warned that automatic facial recognition technology is even more privacy-invasive than the police collection of DNA and fingerprints.