The Los Angeles Police Department fudged violent crime stats under the watch of Bill Bratton — cooking the books to keep down the number of serious assaults, a report said Thursday.
An estimated 14,000 cases from 2005 to 2012 are believed to have been altered in the effort to conceal the magnitude of crime in the city, according to a Los Angeles Times analysis.
Before his tenure as NYPD Commissioner began, Bratton served as chief of the LAPD from 2002 to 2009. Using the revolutionary CompStat system, which crunches crime numbers and determines where incidents are likely to occur, he helped lower crime in the city for seven consecutive years.
But data now shows that those numbers were significantly skewed by police, and that the number of violent offenses in L.A. was actually 7% higher than what authorities reported during that period, the Times reports.
Serious assault cases were also 16% higher.
Despite the errors, the Times says violent crime still declined from 2005 to 2012.
LAPD officials have confirmed the findings and said they are working to fix the problem.
“We know this can have a corrosive effect on the public’s trust of our reporting,” said Asst. Chief Michel Moore, who is in charge of the LAPD’s system for tracking crime.
“That’s why we are committed to eliminating as much of the error as possible,” he explained.
To uncover the misclassifications, the Times used a computer algorithm to analyze crime data from 2012 to 2013 that it had obtained during its investigation of the LAPD last year.
That report uncovered widespread errors in the way serious assaults were classified and forced LAPD Chief Charlie Beck to publicly acknowledge the wrongdoing.
After discovering that certain words identified a crime as a serious or a minor assault, the Times applied the same analysis to the data from 2005 to 2012, with reporters back-checking each incident for accuracy.
The analysis found that the LAPD had consistently distorted the numbers from year to year, dismissing serious crimes as minor offenses.
Many of these incidents in fact resulted in serious injuries — including a case in 2009 in which a man stabbed his girlfriend with a 6-inch kitchen knife, according to the Times.
While the boyfriend was later found guilty of assault with a deadly weapon, the attack was listed as a “simple assault” in the LAPD’s crime database.
Shortly after acknowledging the problems within the department last year, Chief Beck made several changes to the way crimes were recorded.
The LAPD launched a team of detectives, dubbed the Data Integrity Unit, which was tasked with improving the quality of police reporting.
In addition to retraining hundreds of officers in charge of classifying crimes, the group conducted unscheduled spot checks on crime reports from each of the department’s regional divisions.
They also used detailed diagrams called “decision trees” to give station officers a step-by-step lesson plan on how to classify crimes and approve officers’ reports, the Times reports.
But despite all this, an internal police audit released by the LAPD earlier this week found that officers were still fudging their reports last year, with aggravated assaults reported as 23% lower than they actually were.
Police officials later claimed that the data had been obtained before their reforms were in place, adding that they expect the errors to decline in the future.
Several current and retired LAPD officers have blamed the skewed numbers on intense pressure from division captains to meet crime reduction goals, according to the Times.
While Moore agrees that senior police officials can often be “condescending” and give “pressure-cooked” speeches during meetings, he insisted that placing the higher-ups in charge of crime trends was necessary.
“Is there pressure today? Absolutely,” he said. “We hold our people to high standards. Our issue is to do so respectfully and in a manner that provides people with their dignity.”