Body camera video equivalent to 25 million copies of “Barbie” is collected but rarely reviewed. Some cities are looking to new technology to examine this stockpile of footage to identify problematic officers and patterns of behavior.
Yeah, I share the same concerns about the “AI”, but this sounds like a good thing. It’s going through footage that wasn’t going to be looked at (because there wasn’t a complaint or investigation), and it’s flagging things that should be reviewed. It’s a positive step.
What we should look into for this program is:
- how the flags are being set, and what kind of interaction warrants a flag
- what changes are made to training as a result of this data
- how privacy is being handled, and where the data is going (e.g., don’t use this footage to train some model, especially since not every interaction happens out in public)
Would you rather these things never be reviewed? Isn’t something better than nothing?
You’ll literally never be able to afford (or hire) enough people to review the data they are taking in…
I mean unless we start killing billionaires and taking their shit.
Make it publicly accessible. It’ll most certainly get watched, and problems will be reported for further investigation.
Corporations would be delighted to analyze all this footage.
File a complaint, and you get to view the video. If nobody files a complaint, there is no need to view the video.
Indeed, nobody should be looking at the video unless a complaint is filed.