When I first learned about Daniel Francis and his AI startup, Abel, the idea of Francis riding shotgun during police chases seemed, well, unnecessary. How much of that is really about gathering data for the AI project, and how much is just the thrill of riding along in a high-speed chase?
But then I thought about it a little more, and it made better sense: cops in the U.S. alone currently spend something on the order of 100,000 man-hours annually to generate 5.26 million reports. Abel's bet is that these reports could be generated with much less effort if AI did the heavy lifting. After nearly a year of refining its algorithm, Abel can now turn the first draft of a police report into a finished document in about 10 minutes.
But this is where I start to wonder. AI in police work is thrilling. Sure, for now it's only helping with the paperwork, but what happens when AI takes over, analyzing the footage and even making the decisions?
I can’t decide whether this is:
a) the next exciting step toward revolutionizing an archaic profession, in which human cops might finally see some relief, or
b) a coiled spring toward a dystopian future, one where we might trust cops more but where the tech ain’t gonna be cheap. And there’s always a cost: revolutionizing an archaic profession ain’t without its downsides.
That’s a good question indeed. I suppose the human cop is still responsible for the final report, decision, verdict.
But whenever someone creates an AI model to help with this process, some of those decisions will be baked into the model or its prompts.
So it’s very important that the implementation is done carefully and not blindly trusted.