AI tech will take away our sense of privacy; it seems inevitable.

With AI-powered deepfakes proliferating at breakneck speed, in a few years almost nobody (women especially) will have a shred of privacy left. It does not matter whether the photo is fake: nobody should be able to see "you" naked unless you allow it. But with the rise of tools that can run on consumer-level hardware, stopping this looks like a losing battle. How could we police what a person runs on their personal computer? That is another can of worms better left unopened, since the idea of some agency monitoring what you do on your PC is its own dystopia. Soon you can never be sure your neighbor or coworker hasn't deepfaked you, and now every time he looks at you he sees you as a sexual object. That is a highly uncomfortable thought, for sure.

Since we cannot possibly stop it, what is the best option moving forward? Normalizing it? Marginalizing it, since it is fake after all? Ignoring it? None of these options seems very good.

This goes well beyond the current framework of "revenge porn". With revenge porn the case is simple: unlawful distribution without consent. But what about unlawful generation for personal use, without consent? I cannot think of legal grounds that would make that a criminal offense, since we would soon have to ban even drawing lewd doodles with a pencil at home.
