Gambling with Cameras.

Every one a winner!

A few tricks they use to make it sound as if the cameras are a big success.

 
How the safety camera partnerships claim success

The rules now say that a speed camera must be sited at a location where accident figures for the last three years confirm a risk due to drivers using excess speed. Sounds good, doesn't it? We'll only be putting cameras at accident black spots then, and we'll see the benefits clearly. But there are massive problems with this approach. Read on.

Scaring away the traffic

Drivers don't like speed cameras. Some go out of their way to avoid them and use alternative routes whenever possible. Others will choose a camera-free route in preference to a camera-infested one where similar alternatives exist. It's obvious that installing a camera will tend to scare some of the traffic away, and, all things being equal, this will provide a potential reduction in accidents at the camera site. But since the traffic tends to go elsewhere, so will the accidents. Often the alternative routes are more minor roads, which are more dangerous.

Other improvements

Sometimes a new speed camera is added to a problem area as part of a treatment scheme to reduce accidents at a genuine accident black spot. The treatment scheme might include new road markings, new signs and changes to a junction layout, as well as a new speed limit and a speed camera. The caution here is to ask which parts of the treatment scheme had which effect on any resulting accident reductions. In truth, we'll probably never know how much of any improvement results from which components of the treatment. They'll probably claim the entire improvement was due to the speed camera. We know that any such claim is highly unlikely to be true.

How big is a site?

Recent figures we saw for claimed speed camera accident reductions included a stretch of road 2 km long. Now it's easy to imagine a dangerous stretch of road 2 km long. But how much of that road does a single speed camera affect? The area actually surveyed is under 50 metres; allow 100m before the camera for deceleration and 100m after it, and regular drivers on the route will have been affected for just 250m, or 1/8th of the distance. Any accident within the site but out of range of the camera will very likely be entirely unaffected by it. But they have the random cluster effect to fall back on.
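
Just to lay the sums out (our own illustration of the 2 km claim above):

REM how much of a 2 km "site" one camera actually influences
zone = 50 + 100 + 100                      ' survey area plus braking zones before and after, in metres
PRINT zone; "m out of 2000m ="; zone / 2000 * 100; "%"     ' 250m, or 12.5% - about 1/8th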

Random clusters (regression to the mean)

Suppose we have a county with 100 potential speed camera sites, 1,000 accidents in the last three years and one camera to place. Now it's extremely unlikely that all these sites have recorded similar levels of accidents. There's always a certain degree of random clustering. We don't know the maths that governs random clustering, but fortunately it's very easy to model with a bit of BASIC and some random numbers (the program is at the bottom of this page). We made runs of 1,000 complete simulations and averaged the largest cluster from each run to get an "average size of largest cluster" figure. Here are the results:
 

Number of sites    Number of accidents    Average size of largest cluster    Normal average value    Advantage
10                 1,000                  115                                100                     13%
100                1,000                  18.6                               10                      46%
1,000              1,000                  5.53                               1                       82%
10                 10,000                 1,046                              1,000                   4.4%
100                10,000                 126                                100                     21%
1,000              10,000                 21.7                               10                      54%
10,000             10,000                 6.65                               1                       85%

So we go and place the camera at the site of the largest cluster. That's the trick. The figures for that untypical cluster are now planted in the history of the new camera. Over the next period, assuming all else is equal, the number of accidents at the site will tend to return to the average. Without the camera contributing anything at all to the change, and with perfectly normal statistics (using the original 100-site example), we will be able to truthfully claim a 46% reduction in "accidents at the camera site" for our next press release.
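
To see exactly where that 46% comes from, here's the sum laid out in the same BASIC as our program below (our own working, using the 100-site, 1,000-accident row of the table):

REM 100 sites sharing 1,000 accidents: the true long-run average is 10 per site
truemean = 1000 / 100
REM our simulation says the largest random cluster averages about 18.6 accidents
cluster = 18.6
REM when that site simply drifts back towards its true average, the claimed "reduction" is:
PRINT (cluster - truemean) / cluster * 100; "%"            ' prints roughly 46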

If you could place bets with odds like these, you would be onto a sure-fire winner.

It's been suggested that we could get the same effect by burying a bible at the site, or by putting a garden gnome at the roadside, and that's true if we site cameras and record results on this typical but hopelessly flawed basis.

The government call this effect "regression to the mean" and it is documented (click here). There's even a compensatory equation (click here). Do they ever use it? I suspect not.

External trends

Often over a survey period there will have been national trends in road accidents. So if a certain class of accident has gone down nationally by 5%, and they claim a 4% reduction at the camera site, you might reasonably guess that accidents at the camera site rose relative to the national trend. In practice, a small external trend reduction will probably just bolster the usual claims by a few percent.
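
As a quick worked figure (our own illustration, using the 5% and 4% above):

REM nationally this class of accident falls to 95% of its previous level
REM at the camera site it falls to 96% of its previous level
REM so relative to the national trend, the camera site has actually got slightly worse:
PRINT (0.96 / 0.95 - 1) * 100; "%"         ' roughly +1%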

Genuine accident black spots

There are undoubtedly a few accident black spots around the country where a speed camera could be justified as part of a genuine safety improvement scheme. Likely candidates might be junctions on fast roads, or before deceptive bends where too many drivers get it wrong. In these cases, the more visible the camera, the better the results. Huge clear signs: "Speed Camera Ahead" together with a reminder of an appropriate speed limit for the hazard area might well produce good road safety results. The camera would need to be sited on the approach to the hazard area, no more than 100m from the bend or junction. Such installations do exist, but are very much the exception. At a wild guess we'd estimate that considerably less than 5% of the cameras meet these sensible requirements properly.

Spin

The so-called safety camera partnerships employ PR men or spin doctors whose job it is to ensure that the messages reaching the public are positive. In many cases, the public seems to buy into the lies. Keep your eyes wide open for bogus claims. Almost all the claims are bogus and support policies which simply don't work to save lives.

Remember that the spin people are trained to tell you something clear and believable and that when they succeed they are doing what they are paid to do. If it didn't work they wouldn't be employed. They are employed specifically to present the facts creatively to make them sound better than they are. They have no moral or legal obligation to the truth.

Of course if accidents go down, they claim it's the cameras doing it. If accidents go up they say "we don't have enough cameras, drivers are still speeding, we must have more cameras".

Selective reporting

The spin doctors love to choose the best figures and ignore the worst. So if serious injuries are down by 5% (say from 1,000 to 950) and deaths are up by 20% (say from 100 to 120), they might just tell us about the serious injuries, or they might lump them together into that meaningless and misleading statistic "killed and seriously injured" (KSI). In this example KSI has dropped from 1,100 to 1,070, so they'll possibly claim a "3% drop in KSI".
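
Laid out as sums (simply restating the example above):

REM serious injuries down from 1,000 to 950; deaths up from 100 to 120
ksibefore = 1000 + 100
ksiafter = 950 + 120
PRINT (ksibefore - ksiafter) / ksibefore * 100; "%"        ' about 2.7%, spun as a "3% drop in KSI"
REM meanwhile deaths have risen by 20% - the figure they leave out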

But wait. What are we comparing with what when we quote a change in percent? The spin doctors can cherry-pick the baseline: a 3-year average, a 5-year average, the 1994 to 1998 average baseline, last year's figures, the 1990 figures. They pick whichever one tells the story that they want you to hear.

The only honest method is to show all the figures year by year. Always be suspicious if they use an average figure.

KSI (killed and seriously injured)

KSI is always misleading, for two important reasons. First, it carries the assumption of death onto a much larger group of serious injuries; as a rule of thumb, expect the "killed" figure to be about one tenth of the serious injury figure. Second, the phrase "serious injury" conjures up the image of a wheelchair case. But here's the definition of serious injury that's actually used:

"Serious injury: An injury for which a person is detained in hospital as an “in-patient”, or any of the following injuries whether or not they are detained in hospital: fractures, concussion, internal injuries, crushings, burns (excluding friction burns), severe cuts and lacerations, severe general shock requiring medical treatment and injuries causing death 30 or more days after the accident." (from RAGB notes (click here))

So in reality, many of these serious injuries will be fully healed in a week or two, and many more in three months.

That's not intended to lessen the human misery of the situation, and some people are seriously injured with life-changing consequences. But the reality is far less severe than it sounds. The reality of 1,000 KSI might often be 100 killed, 100 with permanent or long-term injuries and 800 walking wounded.

Gnomes cut crashes by 40%!

Since the "benefits" outlined here add up together, we thought we'd do a worked example for the benefits of planting our large and very very ugly imaginary garden gnome at the roadside. We chose a site that had five serious accidents over the last three years. We placed the gnome a year ago and there's been just one accident since. Out comes the calculator and we immediately deduce that the average for the last three years was 1.667 accidents. Looks like we're on a winner with our gnome.

Our headline is "Gnomes cut crashes by 40%!", and we embark on a large-scale gnome introduction scheme.

But it isn't the truth. Other effects have taken place:

  • The gnome was so large and very very ugly, that 10% of traffic has diverted to other routes.
  • General road traffic accidents in the wider area have reduced by 3% perhaps due to road engineering and vehicle safety improvements.
  • We do a regression to the mean test and discover that the true average number of accidents at the site is probably only about 0.8 per year. (We verify this by looking back a little further in the history of the site and discovering that in the previous 10 years, taken as a whole, there were 9 accidents.)
So, to compare like with like, we adjust the expected figure: with 10% less traffic and the 3% general reduction, we would have expected roughly 0.8 * 90% * 97% = 0.7 accidents at the site this year even without the gnome. We actually recorded 1, so on these figures the gnome appears to have increased the likelihood of an accident by about 43%. Of course, it's highly unlikely that figures based on a single accident have any statistical significance.
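
For the calculator-inclined, here are the gnome sums in BASIC (a sketch using the figures assumed above: a 0.8 per year baseline, 10% less traffic and a 3% general reduction):

REM true long-run baseline at the site, accidents per year
baseline = 0.8
REM what we'd expect this year with 10% less traffic and the 3% general fall
expected = baseline * 0.9 * 0.97
observed = 1
PRINT "expected:"; expected                                ' about 0.7
PRINT "apparent gnome effect:"; (observed - expected) / expected * 100; "%"     ' roughly +43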

But we've invested quite a bit in the gnomes, and we decide to keep quiet about it. We don't want to see the investment go to waste, and if we admit that the gnome appears to have increased accidents we might lose our job.

Somewhere deep in the police report for the accident at the gnome site is a statement from the driver of the crashed car saying: "Its eyes seemed to follow me, and I stared at it. When I looked back ahead, the traffic had stopped and I couldn't brake in time. Gnomes like that shouldn't be allowed at the roadside."

Keeping them honest

It might not be possible to keep them honest, but we should be asking the following questions:

  • What are the total accident figures for the whole county?
  • How do these figures compare to national trends?
  • Will we see the benefits from the cameras reflected in the national accident statistics? (we haven't yet after 10 years)
  • How many sites and how many accidents?
  • What are the traffic reductions at your camera sites?
  • What have you done to compensate for the effects of random clusters?
  • What other safety improvement treatments were applied at the same time as the camera(s)?
Together, these questions should uncover most of the basic lies.
Our little BASIC program

RANDOMIZE TIMER
CLS

numberofsites = 100
numberofaccidents = 1000

DIM sites(numberofsites)

expected = numberofaccidents / numberofsites

loopa:

max = 0

FOR n = 1 TO numberofsites                ' clear the array
  sites(n) = 0
NEXT n

FOR n = 1 TO numberofaccidents            ' scatter the accidents over the sites at random
  j = INT(RND * numberofsites) + 1
  sites(j) = sites(j) + 1
NEXT n

FOR n = 1 TO numberofsites                ' find the largest cluster
  IF sites(n) > max THEN max = sites(n)
NEXT n

count = count + 1
total = total + max
ave = total / count
PRINT count, max, ave, expected

IF count < 1000 THEN GOTO loopa           ' repeat 1,000 times to get a good average

 
Emails

I had a little pop at Strathclyde's safety camera partnership for making bizarre and unjustified claims, which they conveniently contradicted all by themselves. Here's what happened:

My email:

Subject: Advert
Date:  Mon, 16 Dec 2002 13:04:09 +0000
From: Paul Smith <psmith@business.email>
To: info@camerascutcrashes.com

I read your web site with interest and listened to the radio advert "Advertisement 3" on page:

http://www.camerascutcrashes.com/media/default.asp

[click here for the advert]

Now it seems to me that there's something dangerously amiss with your numbers.

The advert claims that out of 412 children killed or injured, 25 lady drivers were exceeding the speed limit. That's just 6%. If only 6% were exceeding the limit before the accident, how on earth can we expect cameras to reduce crashes by 64% (which is a claim on your first page)?

And I'd like to focus in on those 25 accidents too.

  • How many were also drunk?
  • How many were improperly registered, improperly licenced or improperly insured?
  • How many were stolen cars?
  • How many were Police vehicles?
  • How many of the remainder would have had the accident outcome altered substantially by reduced speed?
And what, exactly, are you doing about the 95%+ of these accidents that will not be influenced by your cameras?

Best Regards,
Paul Smith

Their reply:

Subject: Safety Camera Scheme
Date: Fri, 20 Dec 2002 09:25:39 -0000
From: "Traffic" <rppsd@strathclyde.police.uk>
To: <psmith@business.email>

I refer to your recent 'e' mail.

The point you miss in your comments is that the safety camera project is merely another tool to allow the police and local authorities to treat and tackle road safety issues.

The scheme reduces speed, crashes and casualties.

Is that such a bad thing?

Your criticism of the statistics is flawed due to your misunderstanding of the figures that are quoted.

The child figures refers to figures for a specific category of casualty
victims.

The 64% reduction refers to the average reduction experienced at all 28 speed sites in the Glasgow pilot area during the 2-year pilot project.

It IS wrong to try and correlate these figures.  That is not what the issues are intended to prove or achieve.

The 64% figure is intended to highlight the results attained in the pilot
area.  This can and WILL be rolled out to other areas.  If we were not to do this would be very wrong.

If we were to discover a cure for a serious illness that claimed many lives would we want it rolled out or just sit on it??!!

The child figures are intended to associate the fact that everyone is at
risk of speeding and that that can result in serious and often tragic
consequences.

Your last set of questions are irrelevant.

Cameras cut crashes and help to save lives.

These other aspects and areas are covered by other operational duties that are not effected or diminished by this scheme.

I trust the above is of assistance to you.

B Crawford
Sergeant


 
Comments (click here)

Comments on any of the above are welcome. We will be delighted to publish all suitable emails including those whose content we disagree with. Email comment.

Let's make speed cameras as unacceptable as drink driving

We have a strict editorial policy regarding factual content. If any fact anywhere on this web site can be shown to be incorrect we promise to remove it or correct it as soon as possible.