Published August 1, 2017

This CIA-funded tool predicts crime before it happens

A 2002 film adaptation of a 1956 Philip K. Dick short story warned us of the dangers of predictive policing. In ‘Minority Report,’ Tom Cruise played a cop in the Los Angeles Police Department’s pre-crime unit. The futuristic tale saw Cruise, an officer tasked with nabbing baddies before they committed crimes, ultimately question the morality of jailing people for offenses they hadn’t yet committed.

While Cruise’s character pulled back, law enforcement in the real world is pushing forward with the same type of technology.

Palantir, a CIA-backed startup co-founded in 2004 by PayPal billionaire Peter Thiel, is a little-known tool that’s changing the world right under our noses.

Once used in Iraq to predict roadside bombs from patterns in past attacks, Palantir is now being put to work here at home for everything from law enforcement to finance.

The tool currently resides in a nondescript building on a back street in Palo Alto, California. From the outside, you might not think much of it. Inside, the technology is protected by walls impenetrable to radio waves, phone signals, and internet connections. Its only means of entry is secured with advanced biometrics and pass codes held by dozens of independent parties whose identities are protected by blockchain technology.
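
The article doesn’t say how those pass codes are divided among the dozens of parties, but the standard cryptographic tool for that kind of multi-party custody is secret sharing, where no single holder can open the door alone. Below is a minimal sketch of Shamir’s scheme in Python; the numbers are illustrative, and nothing here reflects Palantir’s actual setup.

```python
# Illustrative sketch of Shamir's secret sharing: split a pass code among
# n parties so that any k of them can reconstruct it, while fewer than k
# learn nothing. A generic example, not Palantir's actual scheme.
import random

PRIME = 2**127 - 1  # field modulus; must exceed the secret

def split_secret(secret, n_shares, threshold):
    """Evaluate a random degree-(threshold-1) polynomial at n points."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def recover_secret(shares):
    """Lagrange-interpolate the polynomial at x=0 to recover the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# Any 3 of the 5 shares open the door; 2 or fewer reveal nothing.
shares = split_secret(secret=123456789, n_shares=5, threshold=3)
assert recover_secret(shares[:3]) == 123456789
assert recover_secret(shares[2:]) == 123456789
```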

According to Palantir, the building “must be built to be resistant to attempts to access the information within it. The network must be ‘airgapped’ from the public internet to prevent information leakage.”

The ‘eye in the sky’ — Palantir’s term, not mine — sifts through massive amounts of data, attempting to derive useful information from its contents. For all the data we collect in the US (and there’s a lot of it), we’re not all that good at figuring out what to do with it, aside from storing it and hoping future versions of ourselves will figure it out.

Its client list includes the CIA, the FBI, the NSA, the Centers for Disease Control and Prevention, the Marine Corps, the Air Force, Special Operations Command, West Point, and, for good measure, the IRS. If using predictive AI to discern what a future you might do based on past data creeps you out, it’s worth noting that 50 percent of Palantir’s client roster is in the public sector.

But it’s on the streets of Chicago and Los Angeles that Dick’s premonition of an Orwellian future is becoming reality. There, Palantir’s algorithms comb through previous crime data to create “hot spots” that law enforcement officials then use to determine which areas need a larger police presence.
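
Palantir hasn’t published how its hot-spot models work, but the core idea of deriving hot spots from historical incident data is simple enough to sketch: bin past incidents into a grid and flag the densest cells. A toy version in Python, with fabricated coordinates and an arbitrary threshold:

```python
# Toy hot-spot map: bin past incident locations into a grid and flag the
# cells with the most incidents. A generic illustration with invented
# data, not Palantir's actual model.
from collections import Counter

# (latitude, longitude) of past incidents -- invented sample data
incidents = [
    (41.881, -87.623), (41.882, -87.624), (41.881, -87.622),
    (41.900, -87.650), (41.881, -87.623), (41.975, -87.905),
]

CELL = 0.01  # grid resolution in degrees, roughly 1 km of latitude

def cell_of(lat, lon):
    """Snap a coordinate to the southwest corner of its grid cell."""
    return (round(lat // CELL * CELL, 3), round(lon // CELL * CELL, 3))

counts = Counter(cell_of(lat, lon) for lat, lon in incidents)

# A "hot spot" is simply any cell whose incident count crosses a threshold.
HOT_THRESHOLD = 3
hot_spots = [cell for cell, n in counts.items() if n >= HOT_THRESHOLD]
print(hot_spots)  # e.g. [(41.88, -87.63)] -- more patrols get sent here
```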

On its surface, divvying up police cruisers based on which neighborhoods are most crime-ridden doesn’t sound like all that terrible an idea. The trouble starts when you consider that in some parts of the country, being black and male is often reason enough for the police to swoop in, sometimes with deadly consequences.

And it’s the data being collected that’s cause for concern. It stands to reason that heavily policed areas will always produce more arrests, creating biases in the data set that could plague communities for years to come. Adding predictive algorithms that, in effect, send officers into an area with the preconceived notion that a crime is about to happen only makes the prophecy self-fulfilling.
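
That feedback loop is easy to demonstrate. In the toy simulation below, two neighborhoods have identical true crime rates, but patrols are allocated in proportion to past recorded arrests. Because more patrols produce more recorded crime, the skewed historical record sustains itself; every number here is invented for illustration.

```python
# Toy simulation of the feedback loop: two neighborhoods with IDENTICAL
# true crime rates, but patrols are assigned in proportion to past
# recorded arrests. More patrols observe more crime, so the skewed
# record sustains itself. All numbers are invented.
import random

random.seed(1)
TRUE_CRIME_RATE = 0.1   # identical in both neighborhoods
arrests = [10, 5]       # the historical record starts out skewed
PATROLS_PER_DAY = 20

for day in range(365):
    total = sum(arrests)
    # The "algorithm": send patrols where past data says the crime is.
    patrols = [round(PATROLS_PER_DAY * a / total) for a in arrests]
    for hood in (0, 1):
        # Each patrol records a crime with the same probability in both
        # neighborhoods; only the number of patrols differs.
        arrests[hood] += sum(
            random.random() < TRUE_CRIME_RATE for _ in range(patrols[hood])
        )

# The initial skew persists: neighborhood 0 still shows roughly twice the
# recorded crime, even though the underlying rates were always identical.
print(arrests)
```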

Two males in hoodies walking down the street who might not have earned a second glance on a normal day now suddenly fit the description for a home invasion that happened hours earlier.

It’s this level of militarization that puts law enforcement at odds with the communities it’s sworn to protect, and Palantir is only going to make it worse. Ana Muñiz, an activist and professor at UC Irvine, told LA Weekly:

Any time that a society’s military and domestic police become more similar, the lines blur. The military is supposed to defend the territory from external enemies, that’s not the mission of the police – they’re not supposed to look at the population as an external enemy.

Palantir can’t be pigeonholed as just a predictive AI for crime and terrorism, though. Its algorithms can be deployed on a variety of seemingly mundane data sets that, when combined, paint a comprehensive picture of each of our daily lives. Worse, even if we were to go offline and never transmit another bit of information for the rest of our days, the data needed to paint this picture already exists, sitting in storage and waiting to be utilized.
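
To see how seemingly mundane data sets combine, consider a toy example: a transit log and a purchase record reveal little on their own, but joined on a shared identifier they begin to sketch a daily routine. Everything below is fabricated.

```python
# Toy data fusion: two individually mundane data sets, joined on a shared
# identifier, start to reconstruct someone's daily routine. All records
# and field names here are fabricated.
transit_log = [
    {"card_id": "4451", "stop": "Clark/Lake", "time": "2017-07-31T08:02"},
    {"card_id": "4451", "stop": "Clark/Lake", "time": "2017-07-31T17:41"},
]
purchases = [
    {"card_id": "4451", "store": "pharmacy", "time": "2017-07-31T17:55"},
]

# Group every event by the shared identifier, then order it in time.
profiles = {}
for record in transit_log + purchases:
    profiles.setdefault(record["card_id"], []).append(record)
for events in profiles.values():
    events.sort(key=lambda r: r["time"])

for card_id, events in profiles.items():
    timeline = [(e["time"], e.get("stop") or e.get("store")) for e in events]
    print(card_id, timeline)
# 4451 now reads as a routine: commute in at 08:02, commute home at 17:41,
# pharmacy stop at 17:55 -- assembled entirely from "mundane" records.
```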

It’s not sci-fi anymore. Advanced tools like Palantir create valuable meaning out of unorganized bulk data, and it’s this meaning that everyone from government agencies to hackers to corporations like Facebook and Amazon wants to get their hands on.

For Palantir’s part, the software isn’t inherently evil. Like most things, it’s just a tool: a powerful one for sorting data and forming predictions from its contents. Palantir is a cog in a system where data is power, and the apparatus is simply a means of extracting it.

As the National Rifle Association once said: “Guns don’t kill people. People kill people.”

Like guns, Palantir seems to have a people problem.
