Reel Science

Did Minority Report predict our current reality?

In this Tom Cruise classic, police use sophisticated tech to stop crime before it happens.

by Tara Yarlagadda
Tom Cruise's character surrounded by sci-fi tech
20th Century Fox

Can you arrest someone for murder before they’ve even committed the crime? It might seem like a far-fetched notion, and when 2002’s Minority Report came out, it probably seemed like just that — a Hollywood sci-fi interpretation of policing gone rogue. But twenty years later, this futuristic Tom Cruise thriller feels chillingly relevant to our modern world.

In the movie — based on a short story by Philip K. Dick and set in the year 2054 — special humans known as “precogs” predict the future. Police officer John Anderton (Tom Cruise) pieces together the precogs’ visions in his “Precrime” unit to identify murder victims and stop perpetrators before the crime actually occurs.

We might not have mutant-human “precogs” who dispense visions of the future, but the movie’s premise parallels a very real — and hotly contested — technology police are using to attempt to stop crime before it happens: predictive policing.

“One of the problems of predictive policing is that it creates an illusion of greater certainty—and is, therefore, more likely to lead to miscarriages of justice,” futurist Andrew Curry tells Inverse.

Reel Science is an Inverse series that reveals the real (and fake) science behind your favorite movies and TV.

What is predictive policing?

The trailer for Minority Report (2002).

Precogs aside, predictive policing in real life isn’t all that dissimilar to Minority Report. It basically involves using computers and large amounts of data to predict where and when crimes are likely to occur.

In fact, the four general categories of predictive policing — predicting crimes, predicting offenders, predicting perpetrators’ identities, and predicting victims — match up pretty closely to the information the precogs provide in the movie: a timestamp of the murder and the identities of the murder victim and the perpetrator.

But in real life, it’s artificial intelligence, not precogs, that we rely on to interpret the datasets. Based on data about past crimes, algorithms can generate hotspots of likely crime — or, even more controversially, a profile of someone deemed more likely to commit a crime than the general population.
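To make the hotspot idea concrete, here is a deliberately simplified sketch: a few lines of toy Python (not any department’s or vendor’s actual software) that tally made-up past incidents on a grid of city blocks and flag the busiest cells. The grid size, coordinates, and cutoff are all invented for illustration.

```python
from collections import Counter

# Hypothetical historical incidents: (x, y) locations of past crimes
# within a city, in kilometers. Entirely made-up data for illustration.
incidents = [(1.2, 3.4), (1.3, 3.5), (1.1, 3.6), (5.0, 7.2), (1.25, 3.45), (8.9, 0.4)]

CELL_SIZE_KM = 0.5  # side length of each grid cell

def to_cell(x, y):
    """Map a coordinate to the grid cell that contains it."""
    return (int(x // CELL_SIZE_KM), int(y // CELL_SIZE_KM))

# Count how many past incidents fall into each cell.
counts = Counter(to_cell(x, y) for x, y in incidents)

# Flag the busiest cells as candidate "hotspots" for extra patrols.
TOP_K = 2
for cell, n in counts.most_common(TOP_K):
    print(f"cell {cell}: {n} past incidents -> candidate hotspot")
```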

For example, let’s say hypothetical Neighborhood A has a history of car break-ins. Police might use predictive policing to justify placing more patrol cars in this neighborhood versus Neighborhood B. Proponents of predictive policing suggest it’s a fairer way of predicting future crime hotspots than relying on fallible human memory. Good algorithms can also help us decide whether these break-ins are a pattern that is likely to persist over time rather than a one-off event.

“Rather than relying on police to ‘get a sense’ of problems or remember that there has been a string of car break-ins, a good algorithm can find these patterns,” Greg Ridgeway, chair of the University of Pennsylvania’s department of criminology, tells Inverse.
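A toy version of that pattern-finding could look something like the following. The weekly counts, windows, and threshold are all invented for illustration; the point is only that a simple rule can separate a sustained rise in break-ins from a single bad week.

```python
import statistics

# Hypothetical weekly counts of car break-ins in Neighborhood A.
# The last few weeks are elevated; is that a persistent pattern or a blip?
weekly_counts = [2, 1, 3, 2, 2, 1, 2, 3, 6, 7, 8, 6]

BASELINE_WEEKS = 8   # how much history defines "normal"
RECENT_WEEKS = 4     # window we are judging

baseline = weekly_counts[:BASELINE_WEEKS]
recent = weekly_counts[-RECENT_WEEKS:]

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

# Call it a persistent pattern only if *every* recent week is well above
# the historical mean, not just a single spike.
threshold = mean + 2 * stdev
persistent = all(count > threshold for count in recent)

print(f"baseline mean={mean:.1f}, threshold={threshold:.1f}")
print("persistent pattern" if persistent else "likely a one-off fluctuation")
```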

Is Minority Report plausible in real life?

Minority Report portrays a future where police comb through large amounts of data to predict crimes. It may not be that far off from our real world.

20th Century Fox

In Minority Report, Anderton’s program is an experimental one — but one with seemingly great success. In the five years since the program was implemented, murder rates have plummeted to nearly zero.

Why commit a murder if it’s nearly guaranteed the police are going to arrest you? The idea in the movie — and more broadly in real life — is that predictive policing could ultimately deter and reduce crime.

“The idea that AI technologies would predict crimes with the same degree of accuracy as in the film is mostly a sci-fi invention in my assessment,” Sven Nyholm, an associate professor of philosophy at Utrecht University and author of the book Humans and Robots: Ethics, Agency, and Anthropomorphism, tells Inverse.

However, Nyholm says it is possible that future AI technology could help police predict crimes with a much greater degree of accuracy than we can now.

“In other words, the future might not be like what we see in Minority Report. But it might involve much more precise crime prediction than what is currently possible,” Nyholm says.

That future may be closer than we think. Police departments across the U.S. have been regularly deploying predictive policing over the past decade. The LAPD was one of the first to experiment with algorithmic crime prediction, in 2008 — though other police departments in California later said the method was not as effective as proponents suggested.

Further, a recent study from researchers at the University of California used machine learning algorithms to predict the likelihood of re-arrest within three years after a prisoner’s release.
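The study’s exact features and model aren’t detailed here, but the general recipe is familiar machine learning: train a classifier on past cases, then score new ones. The sketch below fits a logistic regression on entirely synthetic data with two hypothetical features (age at release and number of prior arrests), purely to show the shape of the approach; nothing about it reflects the actual study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic, made-up training data: [age_at_release, prior_arrests]
n = 500
age = rng.uniform(18, 60, n)
priors = rng.poisson(2, n)
X = np.column_stack([age, priors])

# Fabricated outcome: re-arrested within three years (1) or not (0),
# loosely tied to the features purely so the example has signal.
logit = -1.5 - 0.04 * (age - 35) + 0.5 * priors
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(X, y)

# Estimated probability of re-arrest for one hypothetical person.
person = np.array([[25, 3]])  # 25 years old, 3 prior arrests
print(f"predicted re-arrest probability: {model.predict_proba(person)[0, 1]:.2f}")
```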

Should we be worried about predictive policing?

Police officer John Anderton (Tom Cruise) stands opposite Danny Witwer (Colin Farrell) — a Department of Justice investigator concerned about the overreach of the unit’s “Precrime” program.

20th Century Fox

Proponents of predictive policing say it helps police smartly deploy limited resources using empirically backed data.

“The police would be negligent if they had data and information on hand and did not use that information to be smarter about how to use their scarce resources,” Ridgeway says.

But civil rights advocates raise significant concerns about predictive policing, such as algorithms reinforcing racist biases about perpetrators and crime hotspots based on past data. The approach risks criminalizing young men who have not necessarily committed a crime — a reality that vaguely echoes Minority Report’s premise. Nyholm says it’s hard to create AI without such biases.

“Since you’re using historic data as the basis for your predictions, effectively you entrench crime patterns. Whatever biases are already in your policing system, they get reinforced,” Curry adds.
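Curry’s entrenchment worry can be shown with a tiny simulation (a toy of our own construction, not a model of any real system): two neighborhoods with identical underlying crime, where patrols follow recorded crime and recorded crime follows patrols. A small initial skew toward one neighborhood keeps growing.

```python
import random

random.seed(42)

TRUE_RATE = 10          # identical underlying crimes per week in both areas
TOTAL_PATROLS = 10      # patrols to split between the two areas
recorded = {"A": 12, "B": 8}  # a small historical skew to start from

for week in range(20):
    total = recorded["A"] + recorded["B"]
    # Allocate patrols in proportion to *recorded* crime so far.
    patrols_a = round(TOTAL_PATROLS * recorded["A"] / total)
    patrols_b = TOTAL_PATROLS - patrols_a

    # Each patrol detects a fraction of the (identical) true crime.
    for area, patrols in (("A", patrols_a), ("B", patrols_b)):
        detection_rate = min(1.0, 0.1 * patrols)
        detected = sum(random.random() < detection_rate for _ in range(TRUE_RATE))
        recorded[area] += detected

print(recorded)  # area A ends up with far more recorded crime despite equal true rates
```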

Curry also says there’s a lack of transparency regarding how these algorithms function.

“Often even the police forces who are using them don’t really know what the algorithms are up to,” Curry explains.

Nyholm calls AI technology like predictive policing a “double-edged sword,” potentially making society safer for some groups and less safe for others. The question is whether the costs of predictive policing outweigh its benefits.

Even if the specifics of Minority Report are a cinematic concoction, the basic idea of law enforcement wielding data to predict crime has come true in a way Philip K. Dick could hardly have foreseen when he published the short story more than sixty years ago.

“While the movie presents a scenario that is so extreme that it must be regarded as pure science fiction, the idea of governments using data about its citizens to try to prevent crime is by no means science fiction,” Nyholm concludes.

Minority Report is streaming now on Netflix.
