British Police Roll Out New ‘Precrime’ Software to Catch Would-Be Criminals

21st Century Wire says…

Here is the latest chapter in your Brave New World

The concept of ‘Precrime’ was first introduced in a 1956 short story by visionary science fiction author Philip K. Dick and was later adapted for the big screen in Steven Spielberg’s 2002 blockbuster Minority Report. The film follows Tom Cruise’s character, Precrime Chief John Anderton, as he tracks down and apprehends homicidal criminals before they actually commit their crimes.

Anderton is aided by a trio of captive psychics called “Precogs” – kept in saline flotation tanks deep inside Precrime Headquarters. The Precogs’ brains are hard-wired into a police supercomputer, through which Anderton and his colleagues spend their days sifting “previsions” of crimes the psychics have foreseen.

PRECOGS: Predictive crime fighting has jumped from science fiction to reality.

Amazingly, the new 2015 precrime systems being pioneered mainly in Great Britain and Europe are using what police are calling “Precobs” – customized precrime software applications tasked with predicting specific behaviors from specific, pre-profiled individuals.

Today we’re told this is all about “catching burglars” and “making our neighborhoods safer”, but that’s not really what these systems were designed for. The developers of this software have a much grander vision – one identical to what Philip K. Dick warned about in his fiction more than 50 years ago.

Below, we can see how enthusiastic US, European and British police forces are about allowing the computer to think for them. It’s more than a little disturbing, but sadly, this is where modern policing is headed. Remember the USA’s notorious anti-terror “No Fly List”? Yes, that’s right – the one which grew from a few thousand names in 2002 to a monster list of roughly 1.5 million names of US citizens in a few short years. According to the security agencies, being on the list meant the computer had decided you posed a public “security risk” and should therefore be denied access to air travel. It’s preposterous, but that did not stop them from using the lists for all sorts of political purposes, including harassing journalists. If today’s technocrats have their way, advanced A.I. (artificial intelligence) computers and software will be both compiling AND administering those lists.

Police claim that this new computer allows them to conduct more ‘targeted investigations’, but does it really? The burglary ring detailed below could just as easily have been tackled through traditional, or ‘human’, police detective work. Machines do not necessarily do a better job than humans, but that’s not really the point. What they really provide for 21st century police and security services is a machine to blame should anything ‘go wrong’. That need to absolve themselves of any human liability and responsibility is the No. 1 motivator for institutions chomping at the bit to adopt this type of technocratic technology.

Believe it or not, it’s here…


Pre-crime software recruited to track gang of thieves


Chris Baraniuk
New Scientist

Predictive policing is on the rise in the US, UK and Europe. The technique now faces one of its toughest challenges: the Felony Lane Gang.

THEY always choose the lane at the bank farthest from the CCTV cameras – that’s how the Felony Lane Gang got its name. With crimes committed in 34 states, they’ve withdrawn millions of dollars from banks using cheques and credit cards stolen from cars. A handful of individuals connected to the group have been arrested, but the ringleaders have remained at large for years. Can crime-predicting software finally stop them in their tracks?

That’s the hope of police in the US, who have begun using advanced software to analyse crime data in conjunction with emails, text messages, chat files and CCTV recordings acquired by law enforcement. The system, developed by Wynyard, a firm based in Auckland, New Zealand, could even look at social media in real time in an attempt to predict where the gang might strike next.

“We’re trying to get to the source of the mastermind behind the criminal activity, that’s why we’re setting up a database so everybody can provide the necessary information and help us get higher up the chain,” says Craig Blanton of the Marion County Sheriff’s Office in Indiana. Because Felony Lane Gang members move from state to state to stay one step ahead, the centralised database is primed to aggregate historical information on the group and search for patterns in their movements, Blanton says.

“We know where they’ve been, where they are currently and where they may go in the future,” he says. “I think had we not taken on this challenge, we along with the other 110 impacted agencies would be doing our own thing without better knowledge of how this group operates.”
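To give a rough sense of what such a shared database could do, here is a hypothetical Python sketch that groups reported incidents by suspect alias, orders them by date and counts state-to-state movements. The schema, alias matching and field names are illustrative assumptions, not a description of Wynyard’s actual system.

```python
# Hypothetical sketch: aggregate multi-agency incident reports and extract
# state-to-state movement patterns. Not Wynyard's system; assumptions only.
from dataclasses import dataclass
from datetime import date
from collections import defaultdict, Counter

@dataclass
class Incident:
    reported: date
    state: str              # e.g. "IN", "FL"
    suspect_alias: str      # however the contributing agency identifies them

def movement_transitions(incidents: list[Incident]) -> Counter:
    """Count state-to-state transitions per suspect, in date order."""
    by_suspect: dict[str, list[Incident]] = defaultdict(list)
    for inc in incidents:
        by_suspect[inc.suspect_alias].append(inc)

    transitions: Counter = Counter()
    for history in by_suspect.values():
        history.sort(key=lambda inc: inc.reported)
        for prev, nxt in zip(history, history[1:]):
            if prev.state != nxt.state:
                transitions[(prev.state, nxt.state)] += 1
    return transitions

# Usage: movement_transitions(all_incidents).most_common(5) would surface
# the group's most frequent inter-state moves.
```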

It’s not the only system that police forces have at their disposal. PredPol, which was developed by mathematician George Mohler at Santa Clara University in California, has been widely adopted in the US and the UK. The software analyses recorded crimes based on date, place and category of offence. It then generates daily suggestions for locations that should be patrolled by officers, depending on where it calculates criminal activity is most likely to occur.
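To illustrate the general flavour of this kind of place-based scoring, here is a minimal, hypothetical Python sketch that bins recorded crimes into grid cells and ranks the cells by a time-decayed count. PredPol’s actual model is proprietary and considerably more sophisticated; the cell size, half-life and record fields below are assumptions for illustration only.

```python
# Minimal hotspot-scoring sketch: NOT PredPol's algorithm, just the general
# idea of ranking patrol locations from dated, geocoded, categorised records.
from dataclasses import dataclass
from datetime import date
from collections import defaultdict

@dataclass
class CrimeRecord:          # assumed layout: date, place, category of offence
    when: date
    x: float                # easting in metres (any projected coordinate)
    y: float                # northing in metres
    category: str

CELL_SIZE = 250.0           # metres per grid cell (assumption)
HALF_LIFE_DAYS = 30.0       # recent crimes weigh more than old ones (assumption)

def cell_of(x: float, y: float) -> tuple[int, int]:
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))

def hotspot_scores(records: list[CrimeRecord], today: date) -> dict[tuple[int, int], float]:
    """Score each grid cell by a time-decayed count of recorded crimes."""
    scores: dict[tuple[int, int], float] = defaultdict(float)
    for r in records:
        age_days = (today - r.when).days
        scores[cell_of(r.x, r.y)] += 0.5 ** (age_days / HALF_LIFE_DAYS)
    return scores

def daily_patrol_suggestions(records: list[CrimeRecord], today: date, top_n: int = 5):
    """Return the top-N grid cells to patrol today, highest score first."""
    scores = hotspot_scores(records, today)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```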

Kent Police in the UK have been using PredPol for two years. A few months ago, officers were given a patrol location by the system and initially thought it strange – it wasn’t in one of the areas most commonly affected by street crime. They visited the location anyway and discovered a distressed woman and a child in public. The woman had been beaten up and the child sexually assaulted.

“The officers managed to take care of them and also managed to apprehend the offender, who was a known and wanted suspect,” says Mark Johnson, head of analysis at Kent police.

He says that when the statistical data for the area in question was analysed, officers realised that although it was not as prone to crime as other areas, similar offences had been recorded there in the past. The software did not predict a specific crime, but it predicted that something violent was likely to take place – and it was right.

Targeting which areas to patrol has had a significant effect. Johnson says that PredPol is one of the reasons why the annual number of recorded crimes in Kent has fallen from 140,000 to 100,000 since its implementation.

Part of the enthusiasm for this technology has come from officers burdened by tightening budgets, especially in the US, says David Roberts at the International Association of Chiefs of Police. “There’s been real pressure on law enforcement agencies to work smarter, to do more with less and be much more proactive in targeting their scarce resources,” he says.


David Wall, professor of criminology at the University of Durham, UK, thinks statistical technology can be highly beneficial but he warns that not all crimes can be predicted – yet.

“Anomalies can happen anywhere and these are not necessarily related to social circumstances, they tend to be related to circumstances that are unveiled at a particular moment in time,” he says. “The classic one is a domestic argument that gets out of hand and turns violent. It’s very hard to predict that.”

Predictive policing software packages are being adopted across mainland Europe, too. In Germany, researchers at the Institute for Pattern-based Prediction Techniques (IfmPt) in Oberhausen have developed a system called Precobs for tackling burglaries. It works by analysing data on the location, approximate date, modus operandi and stolen items from burglaries going back up to 10 years.

Based on this information, Precobs then predicts where burglaries are likely to happen next. The prediction is tightly defined: a radius of about 250 metres and a time window for the crime of between 24 hours and 7 days. Officers are then advised to focus their resources in a flagged area.
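As a rough illustration of the flagging behaviour described above, the hypothetical Python sketch below raises an alert covering a 250-metre radius and a 24-hour-to-7-day window when a new burglary matches a modus operandi already seen in the historical data. The trigger rule, field names and classes are assumptions for illustration, not IfmPt’s actual algorithm.

```python
# Hypothetical near-repeat-style flagging, loosely modelled on the behaviour
# the article describes for Precobs (about 250 m radius, 24 h to 7 day window).
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Burglary:
    when: datetime
    x: float                          # metres, projected coordinates
    y: float
    modus_operandi: str               # e.g. "window prised", "lock snapped"
    stolen_items: frozenset

@dataclass
class Alert:
    centre_x: float
    centre_y: float
    radius_m: float
    valid_from: datetime
    valid_until: datetime

RADIUS_M = 250.0
WINDOW_START = timedelta(hours=24)
WINDOW_END = timedelta(days=7)

def matches_known_pattern(incident: Burglary, history: list[Burglary]) -> bool:
    """Assumed trigger: the modus operandi already appears in up to 10 years of records."""
    return any(past.modus_operandi == incident.modus_operandi for past in history)

def maybe_raise_alert(incident: Burglary, history: list[Burglary]) -> Optional[Alert]:
    """Flag a 250 m area for the 24 h to 7 day window after a pattern match."""
    if not matches_known_pattern(incident, history):
        return None
    return Alert(
        centre_x=incident.x,
        centre_y=incident.y,
        radius_m=RADIUS_M,
        valid_from=incident.when + WINDOW_START,
        valid_until=incident.when + WINDOW_END,
    )
```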

Precobs has been trialled in the Swiss cantons of Basel-Landschaft, Zurich and Aargau as well as in a number of German cities including Munich and Nuremberg. Michael Schweer, head of analysis at IfmPt, says that the accuracy of predictions so far is about 80 to 85 per cent – meaning that burglaries happened in most of the areas the software predicted.

This allowed police to conduct more targeted investigations, and Schweer says that in Zurich the number of people arrested in connection with such crimes approximately doubled over the past 18 months. What’s more, in areas policed with the aid of the system, the overall number of burglaries has fallen by as much as 35 per cent, thanks to more arrests and a more effective police presence on the streets.

“All our customers on pilot projects [have said they] will take it forward to regular use,” says Schweer. He adds that at the end of the month, IfmPt will announce a personal app for police officers that will deliver a crime prediction to their smartphones or tablets. “The next step with this kind of instrument is to have the data implemented right on the spot,” he says.

In France, meanwhile, a system provided by Sûreté Globale has enjoyed success for several years targeting a variety of crimes, not just burglaries. Spokesman Sébastien Delestre says that police in Paris tested the approach to crack down on joyriding on New Year’s Eve five years ago. They arrested twice as many people as the year before, he says. The software is now being used in Lyon, Lille and several smaller towns.

Sending police to areas where crime hasn’t yet occurred may sound intrusive, and raises the spectre of a police force following conclusions based on data that could keep pointing them to poor or minority communities. But it might be an improvement compared with recent high-profile cases of police bias, such as that reported last week after an investigation into the shooting of Michael Brown by a police officer in Ferguson, Missouri, last year. Predictive policing may be better both for law enforcement officers and the citizens they are charged with protecting.

“There’s no question that you can have biases within data but there are some ways to detect that bias mathematically,” says John Morgan, former director of the Office of Science and Technology at the US National Institute of Justice. “These tools allow officers to make real decisions with, in my view, much less bias than they otherwise might.”

READ MORE POLICE STATE NEWS AT: 21st Century Wire Police State Files

 

 
