Acusensus highlights magnitude of seatbelt problem

If you don’t wear a seatbelt, you’re disproportionately likely to be killed in road collisions. Geoff Collins of Acusensus talks to Adam Hill about how AI will allow police to monitor and prevent this risky behaviour
March 8, 2023
Acusensus’ Geoff Collins speaks to a BBC crew next to the M1 in England

January 2023 was the 40th anniversary of the UK’s first seatbelt law coming into force, making it mandatory for drivers and front seat passengers to wear a seatbelt. The issue was highlighted that month when UK prime minister Rishi Sunak was fined by Lancashire Police for not wearing a seatbelt while filming a video for social media in the backseat. Cue lots of media rebukes about ‘safe seats’.

This high-profile blooper apart, compliance in front and back seats is pretty high: UK Department for Transport figures show that, in 2021 in England, Wales and Scotland, 94.8% of all drivers were observed using a seatbelt (although this had dropped from 96.5% in 2017). It’s important: road casualty statistics for 2020 show that 23% of car occupants killed were not wearing one – in other words, you are disproportionately likely to be killed in a collision if you are not using this most basic piece of kit.

Proof of concept

Australia-based road safety specialist Acusensus thinks it has an answer: its AI-driven Heads-Up camera system has been used in proof-of-concept work, funded by National Highways, with Aecom as delivery partner. The Heads-Up kit, mounted on a van for mobility and ease of use in these tests, helps to identify drivers who aren't wearing a seatbelt or are distracted by their mobile phone.

“It’s just showing: one, what can be done; and two, the magnitude of the issue that we're facing globally,” says Geoff Collins, general manager UK for Acusensus.

As well as seatbelt violations, Heads-Up will also give a clear image of mobile phone use

It has been tried out on England’s M1 motorway and in the south coast town of Brighton, in the county of Sussex: the latter location was not chosen at random - Jo Shiner, chief constable of Sussex Police, is also the UK National Police Chiefs' Council lead for roads policing. “We made a point of getting it down there so that she was able to have a look at it and understand what could be done,” explains Collins. “There's quite a difference between a brochure and the real world.”

With two black boxes and five cameras in total, Heads-Up takes several images. There is a large, contextual one which shows the entire vehicle including number plate. “And then there are other cameras which have a different angle of attack,” Collins, who comes from a machine vision background himself, explains. “You have a very, very sharp angle of attack, looking down through the windscreen. So if you're holding a phone low down, perhaps next to the steering wheel, it would pick it up. Then there's a shallow-angle camera coming in, looking over a longer distance, that would pick up behaviour from the side.”

Radar trigger

A radar is used to trigger image capture at the right location, creating a measure of the speed of the vehicle. If any images suggest the behaviour that enforcement authorities want to identify, the AI pulls the data through for potential review. AI has changed the game, Collins says: “You teach it what ‘normal good’ looks like; and you teach it what ‘abnormal bad’ looks like. And you have false positives and false negatives, true positives, true negatives, and it's the change cases where it learns and gets a bit better. For example, if an image comes up and the AI says: ‘I think somebody's not wearing seatbelts’, but they are wearing one, it goes back into the learning machine. So by the time you've got, let's say 100,000 images, you've trained it what good and bad look like. But it’s continued to learn from the difference points where it's maybe made a mistake.”
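To make that feedback loop concrete, here is a minimal sketch, assuming a hypothetical Python training pipeline; the class and function names are illustrative, not Acusensus code. The idea is simply that captures where the AI and the human reviewer disagree are fed back as new training examples.

```python
# Illustrative only: the review-and-retrain loop described above.
# All names are hypothetical, not Acusensus's implementation.
from dataclasses import dataclass

@dataclass
class Capture:
    image_id: str
    ai_says_violation: bool      # classifier's verdict, e.g. "no seatbelt"
    human_says_violation: bool   # outcome of human review

def update_training_set(captures, training_set):
    """Feed disagreements (false positives and false negatives) back for retraining."""
    for c in captures:
        if c.ai_says_violation != c.human_says_violation:
            # These "difference points" - cases the model got wrong - are the
            # most valuable new training examples.
            training_set.append((c.image_id, c.human_says_violation))
    return training_set
```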

In effect, the AI acts as a “very powerful pre-filter” which is always followed by human review of the data. The filtering is done locally in a GPU processor, linked to the camera, at the side of the road. Human review happens at some point in the next 48 hours. “Typically, it will be someone remote that goes to a secure cloud,” explains Collins.
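In outline, that workflow might look like the sketch below; only the stages - roadside filtering on a GPU, upload of flagged candidates to a secure cloud, human review within 48 hours - come from the interview, and every name in the code is hypothetical.

```python
# Illustrative sketch of the roadside pre-filter and remote review queue.
def roadside_prefilter(frame, classifier, threshold=0.5):
    """Run the AI locally on the roadside GPU; keep only likely violations."""
    score = classifier(frame)   # assumed probability of e.g. "no seatbelt"
    return score >= threshold   # below threshold: frame discarded, nothing uploaded

def queue_for_review(frames, classifier, secure_cloud):
    """Upload only AI-flagged candidates for human review within 48 hours."""
    for frame in frames:
        if roadside_prefilter(frame, classifier):
            secure_cloud.upload(frame)   # hypothetical secure-cloud client
```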

The image is key. “If you just take a bad image to start with, you'll probably make a bad decision,” he adds. “So there's a combination of very good optics, very good camera resolution and technology, very fast shutter speed, and particular wavelength light shone in a particular way, which then gives you the best possible domain, the best possible image. And then the AI will look at the image and think: ‘Based on my knowledge and understanding, this is what I think the outcome should be’. But unlike a human, it never gets bored: show it the same thing, time after time after time, it would make the same decision.”

Data integrity

The integrity of personal data is upheld at all times, Acusensus says. Images are only reviewed if there's a likely violation, and only once a violation has been validated by a human is any more data - e.g. vehicle make and model, and number plate - made available.
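A hedged illustration of that staged release follows; it assumes vehicle details are simply withheld until a reviewer confirms the violation, and all names are made up for the example.

```python
# Illustrative only: vehicle details are withheld until a human confirms
# the violation, per the staged-access policy described above.
def release_case_data(case):
    """Return only the fields that may be seen at the current stage."""
    if not case.get("ai_flagged"):
        return None                        # no likely violation: image never reviewed
    if not case.get("human_validated"):
        return {"image": case["image"]}    # review stage: image only
    # Validated violation: make, model and number plate may now be released.
    return {k: case[k] for k in ("image", "make", "model", "number_plate")}
```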

Geoff Collins

Collins emphasises that in the UK “we're in the exploration and demonstration phase with a view to becoming operational”. In Australia, Heads-Up is fully up and running, with sites installed, operated, controlled and monitored day-in, day-out.

The van won’t be the long-term delivery model in the UK: instead Heads-Up will be on trailers which can be relocated routinely every few days, with perhaps some fixed infrastructure on strategic roads where the traffic volume is higher. “I would be disappointed if we're not doing either or both of those things by the end of the year,” Collins concludes.
