"And that's why we, the benevolent and peaceful police, need to track all your movements at all hours of the day. For the children. You don't want to be anti-children do you? Skynet told us where you live."
Unexplainable results should never be probable cause, because you can't determine whether the decisions were made using protected traits, either directly or by inference.
Yes, the question of current purpose is nearly irrelevant. It's the question of possible purpose that's concerning, because once it's A) available & B) left to human subjectivity, then privacy & 'innocent until proven guilty' are no longer guaranteed.
Exactly. This is why this is scaring me. The police are vacuuming up data on everyone and who knows who else they'll go after, especially if the wrong person gets into power. Even on the state level. I sure hope DeSantis' Florida doesn't have this ability.
I'm less concerned about that if it's purely public data. If a police officer sat in a helicopter looking for drivers driving erratically, then notified a trooper on the ground to check on the car and perform a field sobriety test if there was cause to do so, I think that would fall within the confines of the law, even though thousands of cars could have been in their field of view and considered for potential DUI.
I am of the opinion that if the data is either directly in public view, or the user can opt out of persisting it and it is available to the general public, even if for a fee, then it's fine to use the data. I think any kind of AI algorithm's suggestions on their own should not be considered probable cause; you can use them to narrow down suspects, but you need actual evidence for a warrant or arrest.
I think the issue I have with this situation is collecting and storing such a vast amount of travel data on individuals without their consent. If leaked, that data could be used to track down victims of stalking and abuse, or political dissidents.
It's OK. Ordinary people will have no trouble at all making sure they use a different vehicle every time they drive their kid to college or collect an elderly relative for the holidays. This will only inconvenience serious criminals.
Is there any actual analysis showing this went down as written? This sets off two eyebrow alarms for me: 1. AI doing something revolutionary without serious issues, and 2. clean-cut police work, which never happens (at least not anymore).
Honestly, I'd put money down that the police caught him by chance and worked backwards to find a good explanation for how. I'd also be highly skeptical of an AI system actually catching drug dealers without also catching basically everyone else.
I would like to propose a toast to the end of the war on drugs, thanks to this technology which will surely be decisive in convincing people not to want drugs. Once we've dealt with all these pesky low level dealers the cartels will pack it all in and give up the chance of huge profits.
Kicking the same can down the road. Incredibly depressing and dumb. Stop voting for these idiots and join the likes of Portugal by legalising drugs and treating addiction as a health issue.
Also don't defund the program like Portugal did. The conservatives there didn't like that decriminalizing drug possession for personal use actually works, so they immediately worked to cut funding to the program by like 80% and surprise surprise the program stopped being as effective as it was at the start. Essentially every piece of data we have on Tough on Crime™ politics shows that the approach doesn't work. If you want people to stop using drugs, make it easy for them to do so without fear of being arrested/imprisoned.
the problem is the driver's life outside of the car being part of the equation. Imagine a headline like AI learns the driving patterns of anime fans. How is the traffic camera gonna know which cars are being driven by anime fans in the first place? Of course drug dealers are gonna be much less likely to have drug dealer bumper stickers that might tip the cameras off.