A similar thing happened to a friend of mine about 20 years ago in New Zealand, but without algorithms. He had been warning for more than a year that his former girlfriend was dangerous and he wanted custody.
Then she suffocated their 6-year-old daughter with a pillow and burnt the house down.
Who designed the algorithm and who implemented it, and where is the accountability for those two actors? Ultimately they are the ones at fault here, not the “algorithm”.
The actual headline is a flawed narrative. It should read “police software written by X and deployed under Y’s purview caused death”. The story should be about that, not the falsehood that some nebulous “algorithm” was responsible.
Isn’t this how risk works: you assess 1000 people as medium risk, and 1 dies, because medium risk means 1/1000? Kind of like lottery tickets all having a low chance of winning the jackpot, yet someone usually does…
I don’t mean to talk down this tragedy. But this is just stats meeting reality, and human (mis)understanding of stats.
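As a back-of-the-envelope sketch (taking the 1/1000 figure above purely as an illustration, not anything the actual tool outputs):

    # Toy numbers only: assume "medium risk" meant a 1-in-1000 chance per person.
    p = 1 / 1000                             # hypothetical per-person risk
    n = 1000                                 # people assessed as medium risk
    expected_deaths = n * p                  # = 1.0
    prob_at_least_one = 1 - (1 - p) ** n     # ≈ 0.632
    print(expected_deaths, round(prob_at_least_one, 3))

So even if every individual assessment were correct, roughly one death among a thousand “medium risk” cases is exactly what the label predicts.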
Yes, it’s like people complaining that there was a 90% chance of rain and then it didn’t rain.
Of course in reality you either get killed by your partner or you don’t; every judgement except “negligible” and “extreme risk” will be wrong if you need absolute certainty. And if you want a false negative rate of literally 0, you need to return “extreme risk” for every single case.
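Put concretely, a minimal sketch with made-up scores and outcomes (nothing to do with the real tool): the only threshold that guarantees zero false negatives is the one that flags everybody.

    import numpy as np

    rng = np.random.default_rng(0)
    scores = rng.random(10_000)              # hypothetical risk scores in [0, 1)
    outcomes = rng.random(10_000) < 0.001    # rare true positives, ~0.1% base rate

    for threshold in (0.9, 0.5, 0.0):
        flagged = scores >= threshold        # everyone at or above the cut-off
        misses = outcomes & ~flagged         # false negatives at this threshold
        print(threshold, flagged.mean(), int(misses.sum()))

    # Only threshold 0.0, i.e. "extreme risk" for every single case, is
    # guaranteed to produce zero misses.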
[flagged]
Is it relevant to the story, according to you?
I think it's indirectly relevant in a disturbing way; if you had a perfect "algorithm"[1] to predict (say) recidivism (let's not quibble over how it works or exactly what risk factors it would assess, let's just posit it is very accurate) then that algorithm would be racist because its predictions would be correlated with the race of the prisoner.
So we can either have fair, colour-blind algorithms or accurate algorithms, but we can't have both (a toy sketch of why is below).
[1] the term "algorithm" is used in the article but it sounds more like a risk assessment framework.
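A minimal sketch of that tension, with invented base rates (not figures from the article): if the underlying rates differ between groups, a perfectly calibrated predictor's outputs track group membership, and forcing the outputs to be equal makes it miscalibrated for at least one group.

    # Hypothetical base rates, purely illustrative.
    base_rate = {"A": 0.10, "B": 0.30}

    # A "perfect" calibrated predictor just reports each person's true risk,
    # which in this toy setup is determined by group membership alone.
    def predicted_risk(group: str) -> float:
        return base_rate[group]

    print(predicted_risk("A"), predicted_risk("B"))   # 0.1 vs 0.3
    # Making the two outputs equal ("colour blind") means at least one
    # group's risk is systematically over- or under-stated.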
[flagged]
[flagged]
It's not relevant to people who don't live in fear of The Other, and it's not helpful to pander to people who do.
[flagged]
Yes, but the man was Nigerian [1], so he is never named, and no photos of him are shown, despite being the victim's ex-partner, as it is against the BBC guidelines to mention race when not "relevant" [2]. It's okay to state his gender, however - that they permit us to know.
[1] https://www.surinenglish.com/malaga/benalmadena-torremolinos...
[2] https://www.bbc.com/newsstyleguide/r/
Seems like it was just headlinese, plus being super-cautious about defamation before a conviction, especially in the UK, where libel law is more pro-plaintiff than in the U.S. FTA itself: "... Her partner had allegedly used his key to enter her flat and soon the house was on fire.
"While her children, mother and ex-partner all escaped, Lina didn't. Her 11-year-old son was widely reported as telling police it was his father who killed his mother."
“Crazy ex burns down house with woman and child in it” doesn’t cut it in 2025.
It'd be potentially defamatory before a conviction, so the British non-tabloid press is likely to be super-cautious.
[flagged]
Please stop.