|
In order to ensure that this thread meets TL standards and follows the proper guidelines, we ask that everyone please adhere to this mod note. Posts containing only tweets or articles add nothing to the discussion. Therefore, when providing a source, explain why you feel it is relevant and what it adds to the discussion. Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments will be actioned. All in all, please continue to enjoy posting in TL General and partake in discussions as much as you want! But please be respectful when posting or replying to someone. There is a clear difference between constructive criticism/discussion and just plain being rude and insulting. https://www.registertovote.service.gov.uk |
On May 16 2025 22:46 KwarK wrote: Murder prediction isn’t that complicated surely. Take a pattern with a few domestic violence callouts, battery, stalking, some time in prison, release, and a violation of a restraining order. He’s killing her. Everyone with the full fact pattern can see that. You have to wait for him to do something illegal but you know it’s going to happen.
A program that puts together the various records the different departments have and tells you which ones match the pattern of previous murders is a good idea. You wouldn’t sentence people for precrime off of it but you would treat police calls from the victim a little more urgently.
Tech can analyze large amounts of data and find patterns that aren’t as obvious as the example I used. Those patterns have predictive value. Predictive data can be used for crime prevention. And all of that can be done without encroaching upon civil liberties.
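Purely as an illustration of the kind of cross-department pattern matching described above, a weighted record scorer might look like the sketch below. Every record type, weight, and threshold here is invented for the example and not taken from any real system:

```python
# Hypothetical illustration: flag cases whose combined record history
# matches a pattern associated with prior escalation to homicide.
# Record types, weights, and the threshold are all invented.

RISK_WEIGHTS = {
    "domestic_violence_callout": 2,
    "battery_conviction": 3,
    "stalking_report": 2,
    "restraining_order_violation": 5,
}

def risk_score(records):
    """Sum the weights of every matching record across departments."""
    return sum(RISK_WEIGHTS.get(r, 0) for r in records)

def flag_for_priority(records, threshold=8):
    """True if the combined history crosses the (arbitrary) threshold."""
    return risk_score(records) >= threshold

history = [
    "domestic_violence_callout",
    "domestic_violence_callout",
    "battery_conviction",
    "restraining_order_violation",
]
print(risk_score(history))         # 12
print(flag_for_priority(history))  # True
```

The point of the sketch is the one from the post: no single record is damning, but the combination crosses a line that would justify treating the next 911 call from the victim more urgently.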
"A program that puts together the various records the different departments have and tells you which ones match the pattern of previous murders is a good idea."
I disagree with this. If you knew that someone was going to commit murder, wouldn't you feel obligated to prevent it? And in a somewhat more active way than hoping that the victim will have a chance to call 911? How exactly would you do that without encroaching on civil liberties? Also, should a person displaying such a pattern be allowed to work as a police officer, or a doctor?
You also say "You wouldn't sentence people for precrime off of it" and follow with "Predictive data can be used for crime prevention". Well, maybe they wouldn't be sentenced, but they would still be treated differently because of something they never did and may never do.
Not to mention that the mere existence of a database like that would be rather problematic. Would it be public? I would say no. In that case, what's to stop people from walking around saying: "I have a friend in the police and he told me that this dude is going to kill someone."
|
On May 17 2025 02:31 Razyda wrote: [...]
It wouldn't be public, but it would be linked to all the facial recognition cameras we have everywhere now. So if your information spits out the wrong numbers, you'll be tracked endlessly despite having done nothing wrong.
|
United States | 42265 Posts
On May 17 2025 02:57 Jockmcplop wrote: [...]
They’re going to track everyone endlessly despite having done nothing wrong anyway.
|
On May 17 2025 03:39 KwarK wrote: [...]
I don't think this is the correct attitude to the constant bites being taken out of civil liberties tbh.
Sure, this Minority Report thing isn't the huge deal it appears to be at a very brief first glance, but it's still yet more power for the cops, and that stuff always ends up getting abused, so it's just another way of abusing the public.
I just wish that for once we would get a government who could audit police powers and take away stuff that isn't necessary, or is never used, or is frequently abused.
|
Norway | 28602 Posts
Also not a huge fan tbh. I mean, it's not that I can't see the valid applications for it (like a woman calls 911 on the boyfriend who's likely to murder her, and they make sure they come asap instead of going "are you sure you're not just being hysterical right now"), but this is one of the slopes that actually feels slippery. Where's the cutoff for how likely you are to murder someone before some action is taken, and what action is taken? I mean, if you have a 40% chance of murdering someone according to The Algorithm, should you really be walking freely on the streets anyway? Hell, should you if there's a 10% chance? I'm not really sure that's how they're thinking about this, but even trying to make some type of prediction for whether someone will murder someone... it's worthless if you're not going to act on it, but how can you act on it?
Anyway, there are lots of things, especially surveillance-related, that I think are good if used by a benevolent government but that I am still skeptical towards implementing, because if we're one bad government away from some serious abuse stemming from them, then that's too much of a risk to take.
|
I don't see a problem with it until it affects the final prosecution of crimes. They've been using Wal-Mart-style algorithms in order to allocate the limited resources of law enforcement for maximum efficiency. The point where you are charged with a crime you have not committed is a pretty clear line to watch for crossing.
|
On May 17 2025 11:43 Sermokala wrote: [...]
I'm still fighting the absolutely futile battle against the surveillance society and its invasions of privacy tbh lol
|
The UK is a bit scarier than most places due to the sheer number of CCTV cameras you decided to install, but in general I'd worry more about Google and Meta gathering your data than the government doing so directly. Best case, they only use the data to sell you stuff and maybe pass it on to the government. Worst case, your profile is for sale to the highest bidder, e.g. Cambridge Analytica.
|
On May 17 2025 19:07 Acrofales wrote: [...]
I wonder what it would take to stop it from happening. It's a completely blatant violation of privacy, done in the name of capitalism with sinister government backing. I don't know that enough people care, or even that I should care (given that no one else does), but I do for some reason.
There are whole organizations dedicated purely to stopping governments from overstepping on this stuff. I can't remember a single example of any of them being successful.
|
This is so dystopian it can’t be real. Right??
|
On May 17 2025 19:22 Jockmcplop wrote: [...]
No clue about the UK, but Bits of Freedom pretty much singlehandedly got the Dutch government to return to pen-and-paper elections rather than using the voting machines it had bought, with a campaign showing multiple ways they could be hacked, surveilled, or tampered with. They also played a major role in torpedoing the CCIP legislation at a European level.
|
Nobody has mentioned the single biggest problem with this idea. If you need a hint for what that is, google racial disparities in crime statistics, and then imagine what happens when you try to program that into an AI hive mind that predicts your likelihood of committing crime. My experience on this forum tells me that even if some of you are okay with this idea on paper, none of you will be okay if the conclusions of the program end up being unsavory, regardless of how accurate they are.
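The concern about enforcement bias leaking into the data can be made concrete with a toy simulation. In the sketch below, both groups have identical true offending rates, but one is policed twice as heavily, so its offences are recorded twice as often; a "model" that learns from recorded rates reproduces the policing pattern, not the underlying behaviour. All numbers are invented for this illustration:

```python
import random

random.seed(0)

TRUE_OFFEND_RATE = 0.05           # identical for both groups by construction
DETECTION = {"A": 0.3, "B": 0.6}  # group B is policed twice as heavily

def simulate(n=100_000):
    """Generate recorded-offence counts under biased detection."""
    counts = {"A": [0, 0], "B": [0, 0]}  # [recorded offences, people]
    for _ in range(n):
        group = random.choice(["A", "B"])
        offended = random.random() < TRUE_OFFEND_RATE
        recorded = offended and random.random() < DETECTION[group]
        counts[group][0] += recorded
        counts[group][1] += 1
    return counts

# A naive "model" that just learns recorded rates per group inherits the
# enforcement bias even though true offending rates are equal.
counts = simulate()
learned_risk = {g: offences / people for g, (offences, people) in counts.items()}
print(learned_risk)  # group B's learned "risk" comes out roughly double group A's
```

This is the label-bias problem in one screen of code: the model is perfectly "accurate" about what gets recorded while being wrong about what actually happens.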
|
The possibility of getting "racist" results was like the first thing that came to my mind, but I think that's mostly because of how hard American politics seeps into Western media. Western European governments aren't perfect, but systemic racism doesn't seem to be that big of a deal on this side of the Atlantic.
|
Northern Ireland | 24451 Posts
On May 17 2025 19:07 Acrofales wrote: [...]
I didn’t even realise we were notably CCTV heavy, I thought that was just generally the trend.
It’s not something that massively bothers me, as generally it’s a camera pointed at me; there’s not someone actually watching me. If CCTV footage is being pored over, it’s after the fact and investigatory, and my presence is going to be incidental.
It may seem a somewhat arbitrary distinction to some, but I think it is quite a meaningful one.
Myself and colleagues had zero issues with security cameras being a thing in our place of work. Many felt uneasy when we were told they’d become fully manned.
Then the dynamic changes, is some busybody watching you and will they report non-serious infractions? Security cameras already covered the serious stuff like theft, or if an assault happened.
As it turned out not much functionally changed. It’s pretty prohibitively difficult to actually survey a couple of hundred folks moving in and out. Even if you really wanted to be an arse about it.
I totally understand some concerns, albeit I find it a little frustrating at the same time. Governments could do a lot of good with modern tech advances in various spheres, but it’s generally a no-go with the public out of privacy concerns. Reasonable concerns but then the public by and large just voluntarily cede privacy to corporate actors.
Perhaps not this pre-crime example, I’m not sure on this one. But things like preventative medical screening. I think you could build more robust transport infrastructure if you hoovered up real-time data and crunched it.
Just spitballing really but there are some pretty good use cases for big data in the public sector, folks just tend to be very, very reticent. Which I have no issue with, but it feels they don’t apply this reticence to other actors.
|
Northern Ireland | 24451 Posts
On May 17 2025 22:26 BlackJack wrote: [...]
From the article, to me anyway, it feels like people who already have a decent gauge on predictive factors are pulling in the data they think is relevant and building this thing.
Rather than just unleashing an AI to crunch all the data.
Prior convictions for domestic violence for example. It would stand to reason that someone who had such a conviction would be a person of interest here as it were. It’s less ‘sexy’ for the tabloids but murder via domestic violence is right up there in terms of overall murder rate.
I’m not one who is massively concerned with such modelling; the thing I’m somewhat confused by here is how this is intended to be used. I’m still unsure what the end goal and application are, and by the sounds of it much of this thread is in the same boat.
|
On May 17 2025 22:26 BlackJack wrote: [...]
There are other people in this thread who are going to be really mad if it finds no reason for racial bias, but a mathematical one rooted in socioeconomic conditions or whatever.
|
On May 17 2025 23:35 Billyboy wrote: [...]
Isn't your postal code one of the classic indicators of future outcomes? Any big data crunching will likely find that relationship quickly.
Living in a rich area you likely have successful parents and your peers are also from that background so you push the basic level up. Facilities, extracurriculars and so on are often better.
While living in a very poor area the school is bad, there are no jobs. Parents are likely less successful and less likely to push you and your peers.
With enough data access you could probably find strange and interesting patterns, though. Random example: getting a driver's license as soon as allowed, playing football while young, and spending more than average on alcohol could together be an extreme indicator, much stronger than any individual factor.
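That intuition, that a conjunction of individually weak signals can predict far better than any one of them alone, can be shown with a quick synthetic experiment (all features and probabilities below are invented for the sketch):

```python
import random

random.seed(1)

def sample():
    # Three independent weak features; the outcome is common only when
    # all three co-occur. Probabilities are invented for this sketch.
    a, b, c = (random.random() < 0.5 for _ in range(3))
    outcome = random.random() < (0.4 if (a and b and c) else 0.05)
    return a, b, c, outcome

data = [sample() for _ in range(50_000)]

def outcome_rate(pred):
    """Outcome frequency among rows where pred(features) holds."""
    hits = [outcome for *features, outcome in data if pred(features)]
    return sum(hits) / len(hits)

print(round(outcome_rate(lambda f: f[0]), 3))    # any single feature: weak signal
print(round(outcome_rate(lambda f: all(f)), 3))  # the conjunction: strong signal
```

Conditioning on one feature only modestly raises the outcome rate, while conditioning on all three raises it several-fold, which is exactly why big-data crunching tends to surface combinations rather than single indicators.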
|
Northern Ireland | 24451 Posts
On May 17 2025 23:35 Billyboy wrote: [...]
I don’t think that would make many on here mad.
We already know this, it’s precisely why many of us think trying to equalise socioeconomic conditions is so important to do.
|
United States | 42265 Posts
I mean if the algorithm comes up with the conclusion that crime is mostly caused by environmental factors resulting from a failure of the politicians to provide adequate social housing or whatever then I’m not going to argue. If the biggest predictive factor for criminality is a Tory government then we can use that information.
|
On May 17 2025 23:35 Billyboy wrote: [...]
Spoiler alert: there are also racial disparities in “socioeconomic conditions or whatever.”
It would be foolish to think people would be satisfied if the disparities by race that the machine spits out are because of environmental reasons as opposed to “melanin content.” Any discrepancy is going to be extremely problematic regardless of the variables that are causing it.
|