There are like a hundred movies on why this is a bad idea.

Bitter-Gur-4613

Minority Report is leaking.

It can only use bad data generated by generations of racial bias, with predictable results.

Can hardly wait til AI starts checking out politics…

This needs to be shut down. Right now

Whoops, the AI spoke truth by accident:

Data and social scientists from the University of Chicago have developed a new algorithm that forecasts crime by learning patterns in time and geographic locations from public data on violent and property crimes. The model can predict future crimes one week in advance with about 90% accuracy.

In a separate model, the research team also studied the police response to crime by analyzing the number of arrests following incidents and comparing those rates among neighborhoods with different socioeconomic status. They saw that crime in wealthier areas resulted in more arrests, while arrests in disadvantaged neighborhoods dropped. Crime in poor neighborhoods didn’t lead to more arrests, however, suggesting bias in police response and enforcement.

“What we’re seeing is that when you stress the system, it requires more resources to arrest more people in response to crime in a wealthy area and draws police resources away from lower socioeconomic status areas,” said [Ishanu Chattopadhyay, PhD](http://med-faculty.bsd.uchicago.edu/Default/Details/15944), Assistant Professor of Medicine at UChicago and senior author of the new study, which was [published this week in *Nature Human Behaviour*](https://www.nature.com/articles/s41562-022-01372-0).
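For anyone curious what “learning patterns in time and geographic locations” cashes out to, here is a minimal sketch of that kind of tile-based, one-week-ahead forecaster. It is not the published model; the grid size, the synthetic incident counts, and whatever accuracy falls out are all invented for illustration.

```python
# Minimal sketch of a tile-based, one-week-ahead crime forecaster.
# NOT the published model; it only illustrates learning spatio-temporal
# patterns from incident counts. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_tiles, n_weeks, n_lags = 400, 156, 4          # hypothetical city grid, ~3 years of weeks

# Synthetic weekly incident counts: each tile gets its own base rate.
base_rate = rng.gamma(shape=1.0, scale=1.5, size=n_tiles)
counts = rng.poisson(base_rate[:, None], size=(n_tiles, n_weeks))

# Features: the last n_lags weeks of counts in a tile.
# Label: whether that tile sees any recorded crime the following week.
X, y = [], []
for t in range(n_lags, n_weeks):
    X.append(counts[:, t - n_lags:t])
    y.append((counts[:, t] > 0).astype(int))
X, y = np.vstack(X), np.concatenate(y)

# Time-ordered split so the model never trains on its own future.
split = int(0.8 * len(y))
model = LogisticRegression(max_iter=1000).fit(X[:split], y[:split])
accuracy = accuracy_score(y[split:], model.predict(X[split:]))
print(f"one-week-ahead accuracy on held-out weeks: {accuracy:.2f}")
```

A figure like the quoted 90% is presumably this sort of held-out backtest against recorded incidents: a statement about where reports cluster, not about anyone’s intent.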

I’m sorry Dave, I can’t let you do that.

So, Psycho-Pass?

“someone will steal!”

Crime is subjective. This ends badly.

Hah. It’s not gonna be a cool “Minority Report” dystopia. It’s gonna be a really lame statistical machine that spits out statistically probable bullshit, but believing it to be true is going to make a lot of people’s jobs easier. And a senseless hell for everyone else.

Sooooo I work in EMS. We have days where not much happens, we have days where we are utterly slammed and could use like, ten times as many ambulances and crews.

For some reason, I believe that an AI could be trained to anticipate the ebb and flow of call volumes. It wouldn’t be 100% right, but I know I’d be glad if the service put up an extra truck for expected busy days.

Alas, private EMS would probably just use it to reduce full time staff and “save” money 😔
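A call-volume forecast like that doesn’t need anything exotic. Here is a rough sketch under made-up assumptions (daily call counts with a weekly cycle, an invented “extra truck” threshold) that already has the right shape:

```python
# Rough sketch: estimate a day's EMS call volume from day-of-week history
# and flag days busy enough to justify putting up an extra truck.
# The data, weekday pattern, and threshold are all invented numbers.
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(365)
weekday = days % 7
weekly_pattern = np.array([30, 28, 27, 29, 34, 42, 38])   # hypothetical Mon..Sun averages
calls = rng.poisson(weekly_pattern[weekday])               # one year of daily call counts

names = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
normal_capacity = 36                                       # hypothetical limit of the usual fleet

# "Model": mean calls per weekday plus one standard deviation as a safety margin.
for d, name in enumerate(names):
    history = calls[weekday == d]
    forecast = history.mean() + history.std()
    staffing = "add a truck" if forecast > normal_capacity else "normal staffing"
    print(f"{name}: expect ~{forecast:.0f} calls -> {staffing}")
```

A real system would fold in weather, events and seasonality, but even a day-of-week average plus a margin captures a lot of the ebb and flow described above.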

Next up in the line of new inventions: the torture nexus, from the award-winning book:

“Don’t create the torture nexus”

Wow, another way to streamline the process of class-divide acceleration! “Hey look – turns out poor people commit more property crimes! Let’s use this data to oppress them more!”

What, AI might be used by the rich to enforce divides and exacerbate existing problems instead of fixing them? Who could possibly have seen this coming?

Remember when the pseudoscience of profiling was the hot new thing?

And why it stopped being hot rather quickly?

This is amazing! What a victory for AI over its detractors.

I bet AI will be able to create a torment nexus next…

Seriously though, did no one appreciate the actual clever comeback?

>Forbes 30 under 30 has been doing this for years.

Maybe this AI could generate a report for the minority of people who are going to be committing these crimes…

Movies get it wrong in that they arrest people on suspicion of committing a crime.

“Beep boop beep I think people are going to overthrow a democratically elected president, let’s add security beep boop beep,” seems fair.

Or am I missing something?

Not just in movies. There are two well-known cases of how algorithms can be biased when used in the criminal sphere. Look up the COMPAS and PredPol cases.

Minority report

Can’t wait for Person of Interest season 6.

‘MINORITY’ report.

What’s a bad idea is charging them with it beforehand; knowing in advance would be perfect for preventing it.

I bet it just determines a person’s proximity to poverty and makes a guess based on whether they have access to social services or not. Maybe childhood poverty matters too…

What I’m trying to say here is that our capitalistic system is fucked up. You can clearly tell who the system is going to fail before they are failed.

Super AI will be the worst thing to happen to society

Listen, we’ve seen this before in fiction and know how it goes wrong. Let’s stop while we’re ahead, shall we??

Unpopular opinion: this is actually good, as long as you don’t have preemptive prosecution. I believe in either DC or Baltimore they have crime stoppers volunteers who intervene when they get a sense that gang-type crime is going to escalate, and they have programs to locate likely offenders and divert them to mentorship programs before they commit the crime. The part that was bad about Minority Report was punishing people for things that haven’t happened yet; everything else is fine in my book.

It’s almost like there was a movie about this…

Weird how they don’t put the same resources into predicting white-collar crime.

This has been tried and tested for ages now. It’s not something an AI can “now” do. The data on which models like these are trained are extremely biased due to years of racial and class-based profiling. Plus, even disregarding that, don’t you think that if the AI consistently predicts crime to be present at certain areas, resulting in tons of patrol cars heading there super fast and always on time, the criminals won’t simply hit another place close by that the model hasn’t learned about? Then you’ll get to merrily retrain your already misrepresenting model so you can bust 10 guys before the whole bunch moves again.

You can detect suspicious behaviour, weapons, stolen cars and a ton of other things with AI that are super useful for law enforcement, but we can’t just take something as complex and human as crime and reduce it to essentially a fancy optimisation problem.
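The feedback-loop point above is easy to demonstrate with a toy simulation: two areas with identical true crime rates, a “model” retrained only on the crimes patrols happen to record, and patrols sent wherever the model points. Every number below is invented; only the dynamic matters.

```python
# Toy simulation of the feedback loop: two areas with the SAME true crime rate,
# but the "model" is retrained only on crimes that patrols happen to observe,
# and patrols are sent wherever the model points. All numbers are invented.
import numpy as np

rng = np.random.default_rng(42)
true_rate = np.array([10.0, 10.0])        # identical underlying crime in areas A and B
patrols = np.array([0.6, 0.4])            # slightly uneven starting patrol split
observed_total = np.zeros(2)

for week in range(20):
    # A crime only enters the data if a patrol is around to record it.
    observed = rng.poisson(true_rate * patrols)
    observed_total += observed
    # "Retrain": allocate next week's patrols in proportion to observed crime so far.
    if observed_total.sum() > 0:
        patrols = observed_total / observed_total.sum()

print(f"final patrol split   (A, B): {patrols.round(2)}")
print(f"recorded crime total (A, B): {observed_total.astype(int)}")
# Despite identical true rates, the data now "proves" one area is the hot spot.
```

That is essentially the PredPol criticism mentioned in another comment: the data ends up measuring where police looked, not where crime happened.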

… but what if the AI has one big nose and could smell crime before it happens!… he… nose the truth!!

ok, but can ai be used to create better living conditions so people don’t turn to crime in the first place?

And they looked at those movies as handbooks, not warnings against doing things like this.

There is a whole ass county in Florida that proves that this is a bad idea

Watch Dogs, anybody? 1984? Did anyone see the “human brain AI”? We’re in the worst timeline.

How the fuck would you ever be able to calculate the accuracy, unless you simply assume that everyone who actually had the intent of committing a crime would confess, and the last ten percent would be people who had no intention of committing a crime and hence would never confess to a crime they had no intention of committing…

GATTACA moment

That’s not the point… it’s to criminalize anyone they want.

There was a meme I saw a while ago that basically said this:

“Author of dystopian book/movie: ‘the moral of Don’t Build the Bad Thingamajig is that the Bad Thingamajig is bad’

Billionaire a few decades later: ‘we have now created the Bad Thingamajig from the hit book/movie Don’t Build the Bad Thingamajig’”

I hate everything about this

This TV show was already done. It was called “Person of Interest”.

Person of Interest about to become real

They used it for good in Person of Interest

Reminds me of that anime Psycho-Pass. Gotta keep your crime coefficient down.
