|
In order to ensure that this thread meets TL standards and follows the proper guidelines, we ask that everyone please adhere to this mod note. Posts containing only Tweets or articles add nothing to the discussion. Therefore, when providing a source, explain why you feel it is relevant and what it adds to the discussion. Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments will be actioned upon. All in all, please continue to enjoy posting in TL General and partake in discussions as much as you want! But please be respectful when posting or replying to someone. There is a clear difference between constructive criticism/discussion and just plain being rude and insulting. https://www.registertovote.service.gov.uk |
On May 17 2025 23:39 Yurie wrote:On May 17 2025 23:35 Billyboy wrote:On May 17 2025 22:26 BlackJack wrote: Nobody mentioned the single biggest problem with this idea. If you need a hint for what that is google racial disparities in crime statistics and then imagine what happens when you try to program that into an AI hive mind that predicts your likelihood for committing crime. My experience on this forum tells me even if some of you are okay with this idea on paper none of you will be okay if the conclusions of the program end up being unsavory, regardless of how accurate they are. There are other people in this thread that are going to be really mad if it finds no reason to have racial bias, but a mathematical one for socioeconomic conditions or whatever. Isn't your postal code one of the classic indicators of future outcomes? Any big data crunching will likely find that relationship quickly. Living in a rich area you likely have successful parents, and your peers are from the same background, so the baseline gets pushed up. Facilities, extracurriculars and so on are often better. In a very poor area the school is bad and there are no jobs; parents are likely less successful and less likely to push you, and the same goes for your peers. With enough data access you could probably find strange and interesting correlations though. Random example: got a driver's license as soon as allowed, played football while young and spent more than average on alcohol could be an extreme indicator, much stronger than any individual factor. Agreed.
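Yurie's point — that several individually weak signals can combine into a strong one — can be sketched with a toy logistic score. Everything below is synthetic: the feature names (`early_license`, etc.) and weights are hypothetical illustrations, not from any real model, and a real system would fit the weights from data rather than hardcode them.

```python
# Toy sketch: individually weak binary features combining into a strong signal.
# Synthetic, hypothetical weights; a real model would learn these from data.
import math

def predict(features, weights, bias=-3.0):
    """Logistic model: each present feature nudges the log-odds up a little."""
    z = bias + sum(weights[name] for name in features)
    return 1 / (1 + math.exp(-z))  # probability between 0 and 1

# Each weight alone barely moves the needle above the ~0.05 baseline...
weights = {"early_license": 0.8, "played_football": 0.7, "high_alcohol_spend": 0.9}

print(round(predict({"early_license"}, weights), 3))  # one weak signal
print(round(predict(set(weights), weights), 3))       # all three combined
```

With these made-up numbers, any single feature raises the predicted probability only slightly, while all three together multiply it several times over — the "much stronger than any individual thing" effect.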
On May 17 2025 23:42 WombaT wrote:On May 17 2025 23:35 Billyboy wrote:On May 17 2025 22:26 BlackJack wrote: Nobody mentioned the single biggest problem with this idea. If you need a hint for what that is google racial disparities in crime statistics and then imagine what happens when you try to program that into an AI hive mind that predicts your likelihood for committing crime. My experience on this forum tells me even if some of you are okay with this idea on paper none of you will be okay if the conclusions of the program end up being unsavory, regardless of how accurate they are. There are other people in this thread that are going to be really mad if it finds no reason to have racial bias, but a mathematical one for socioeconomic conditions or whatever. I don’t think that would make many on here mad. We already know this, it’s precisely why many of us think trying to equalise socioeconomic conditions is so important to do. Sensible people won't; people who think skin colour is what makes people good or bad will.
On May 17 2025 23:54 BlackJack wrote:On May 17 2025 23:35 Billyboy wrote:On May 17 2025 22:26 BlackJack wrote: Nobody mentioned the single biggest problem with this idea. If you need a hint for what that is google racial disparities in crime statistics and then imagine what happens when you try to program that into an AI hive mind that predicts your likelihood for committing crime. My experience on this forum tells me even if some of you are okay with this idea on paper none of you will be okay if the conclusions of the program end up being unsavory, regardless of how accurate they are. There are other people in this thread that are going to be really mad if it finds no reason to have racial bias, but a mathematical one for socioeconomic conditions or whatever. Spoiler alert - there are also racial disparities in “socioeconomic conditions or whatever.” It would be foolish to think people would be satisfied if the disparities in race that the machine spits out are because of environmental reasons as opposed to “melanin content.” Any discrepancy is going to be extremely problematic regardless of the variables that are causing it. You have spent way too much time reading Facebook memes and looking for reasons to be mad at the "left". People on the left don't deny that a higher percentage of, for example, Black Americans commit crime. They believe that it is not skin colour or genetics that is responsible.
If AI proves them right, they are not going to be mad.
|
Did you understand my post? If the AI hive mind uses the statistic that blacks commit more crime to predict they may be more dangerous and as a result maybe they should be policed differently, then no reasoning in the world for this conclusion is going to satisfy anyone on the left. Your delusion that the left would be okay with this so long as the conclusion is not based on black people being “genetically inferior” is pure fairytale.
|
Yeah I honestly don't know of people who reject that there is a racial difference in amount of crime committed. However, there are two caveats: many will argue that the real difference is likely to be smaller than what the difference is for punishment received (drug use is a well-documented example of this), and any real difference is more likely to be caused by environmental factors (not racist) than cultural factors (racist?) than biological factors (racist and this idea is basically entirely rejected by the left).
|
I think you're talking past each other.
BJ is saying: if the algorithm says that someone from Tottenham is more likely to commit a crime than someone from Notting Hill, then the police will intensify their presence in Tottenham, following Tottenham locals around, frisking them, etc., whereas Notting Hill residents go about their business unharassed and carefree.
Instead of addressing the latter, there are a lot of people saying the former is both obvious and not something to worry about. That isn't the main argument though.
That said, policing in Tottenham is already different than it is in Notting Hill. The cops don't need an algorithm for that. Will an algorithm make it worse? Maybe? Maybe not.
|
On May 18 2025 00:47 BlackJack wrote: Did you understand my post? If the AI hive mind uses the statistic that blacks commit more crime to predict they may be more dangerous and as a result maybe they should be policed differently, then no reasoning in the world for this conclusion is going to satisfy anyone on the left. Your delusion that the left would be okay with this so long as the conclusion is not based on black people being “genetically inferior” is pure fairytale. Yes I did, and your further explanation proved it. Your version of the "left" is not the real left; sure, there are loonies just like everywhere, but the vast majority are like...
On May 18 2025 00:58 Liquid`Drone wrote: Yeah I honestly don't know of people who reject that there is a racial difference in amount of crime committed. However, there are two caveats: many will argue that the real difference is likely to be smaller than what the difference is for punishment received (drug use is a well-documented example of this), and any real difference is more likely to be caused by environmental factors (not racist) than cultural factors (racist?) than biological factors (racist and this idea is basically entirely rejected by the left).
Agreed.
|
On May 17 2025 22:56 WombaT wrote:On May 17 2025 19:07 Acrofales wrote: UK is a bit scarier than most places due to the sheer amount of CCTV cameras you decided to install, but in general, I'd worry more about Google and Meta gathering your data than the government doing so directly. Best case they only use the data to sell you stuff and maybe pass it on to the government. Worst case, your profile is for sale to the highest bidder, e.g. Cambridge Analytica. I didn’t even realise we were notably CCTV heavy, I thought that was just generally the trend. It’s not something that massively bothers me as generally it’s a camera pointed at me, there’s not someone actually watching me. If CCTV footage is being pored over, it’s after the fact and investigatory, and my presence is going to be incidental. It may seem a somewhat arbitrary distinction to some, but I think it is quite a meaningful one. My colleagues and I had zero issues with security cameras being a thing in our place of work. Many felt uneasy when we were told they’d become fully manned. Then the dynamic changes: is some busybody watching you, and will they report non-serious infractions? Security cameras already covered the serious stuff like theft, or if an assault happened. As it turned out not much functionally changed. It’s pretty prohibitively difficult to actually survey a couple of hundred folks moving in and out. Even if you really wanted to be an arse about it. I totally understand some concerns, albeit I find it a little frustrating at the same time. Governments could do a lot of good with modern tech advances in various spheres, but it’s generally a no-go with the public out of privacy concerns. Reasonable concerns but then the public by and large just voluntarily cede privacy to corporate actors. Perhaps not this pre-crime example, I’m not sure on this one. But things like preventative medical screening.
I think you could build more robust transport infrastructure if you hoovered up real-time data and crunched it. Just spitballing really but there are some pretty good use cases for big data in the public sector, folks just tend to be very, very reticent. Which I have no issue with, but it feels they don’t apply this reticence to other actors.
Regarding CCTV - first I'll take the workplace. I would say there is a difference between a workplace and a public space; also, if I recall correctly and nothing has changed, I believe a company cannot review CCTV footage without the police. There was an accident at a place I worked at around 10 years ago, and the company had to ask the police to come and check the footage.
"It’s pretty prohibitively difficult to actually survey a couple of hundred folks moving in and out. Even if you really wanted to be an arse about it." - yes, for a human; add facial recognition software and AI, and it becomes effortless.
"I totally understand some concerns, albeit I find it a little frustrating at the same time. Governments could do a lot of good with modern tech advances in various spheres, but it’s generally a no-go with the public out of privacy concerns."
Of course they could; it is the will that is the issue. I mean, once they got nuclear power, power plants weren't exactly the first thing they used it for. I think you believe that government and citizens have the same goals; they, however, don't. Those two are separate entities.
"Reasonable concerns but then the public by and large just voluntarily cede privacy to corporate actors."
The bolded makes a massive difference here. What can corporate actors actually do to you, at the end of the day? (Let's exclude banks from those.) Take FB for example - you can decide that you don't want to have an account and that's fine; you are in no way, shape or form obliged to behave in accordance with FB policies. Even if you do decide to use FB and break the T&C, the most they can do is ban you. More so, if you are mistreated by them, odds are the government can actually protect you.
Now the government - whether you like it or not, you are bound by its policies; it is not exactly a choice. On top of that, if you decide to break government policies, you go to prison.
So you see, the comparison is rather unfair. One can't really do much to you; the other can do with you whatever it wants.
On May 17 2025 23:14 WombaT wrote:On May 17 2025 22:26 BlackJack wrote: Nobody mentioned the single biggest problem with this idea. If you need a hint for what that is google racial disparities in crime statistics and then imagine what happens when you try to program that into an AI hive mind that predicts your likelihood for committing crime. My experience on this forum tells me even if some of you are okay with this idea on paper none of you will be okay if the conclusions of the program end up being unsavory, regardless of how accurate they are. From the article, to me anyway it feels like people who’ve already got a decent gauge on predictive factors are pulling in data they think is relevant and building this thing, rather than just unleashing an AI to crunch all the data. Prior convictions for domestic violence for example. It would stand to reason that someone who had such a conviction would be a person of interest here as it were. It’s less ‘sexy’ for the tabloids but murder via domestic violence is right up there in terms of overall murder rate. I’m not one who is massively concerned with such modelling; the thing I’m somewhat confused by here is how this is intended to be used. I’m still unsure what the end goal and application here is, and by the sounds of it much of this thread is in the same boat.
Bolded - it doesn't matter. It either won't get used (then why do it in the first place), or it gets used (which is effectively punishing someone for something he didn't do). So, for example: the police will check on you every now and again. That's not very nice, is it, considering you didn't do anything? I mean, your neighbours and people from your work would probably have some talk about it. (On the plus side, your lunch would never go missing.)
On May 17 2025 22:26 BlackJack wrote: Nobody mentioned the single biggest problem with this idea. If you need a hint for what that is google racial disparities in crime statistics and then imagine what happens when you try to program that into an AI hive mind that predicts your likelihood for committing crime. My experience on this forum tells me even if some of you are okay with this idea on paper none of you will be okay if the conclusions of the program end up being unsavory, regardless of how accurate they are.
You think that's the problem now? Isn't a terrorist-prediction tool the logical follow-up? Bad times to be Irish or Muslim in the UK.
Edit: some typo.
|
On May 18 2025 08:33 Billyboy wrote:On May 18 2025 00:47 BlackJack wrote: Did you understand my post? If the AI hive mind uses the statistic that blacks commit more crime to predict they may be more dangerous and as a result maybe they should be policed differently, then no reasoning in the world for this conclusion is going to satisfy anyone on the left. Your delusion that the left would be okay with this so long as the conclusion is not based on black people being “genetically inferior” is pure fairytale. Yes I did, and your further explanation proved it. Your version of the "left" is not the real left; sure, there are loonies just like everywhere, but the vast majority are like...
Riiiiiight… people on the left would totally be okay with racial disparities in policing so long as the AI mastermind tells them there is good reason for it not based on skin color… lmao. Believe what you want.
|
On May 18 2025 10:15 BlackJack wrote:On May 18 2025 08:33 Billyboy wrote:On May 18 2025 00:47 BlackJack wrote: Did you understand my post? If the AI hive mind uses the statistic that blacks commit more crime to predict they may be more dangerous and as a result maybe they should be policed differently, then no reasoning in the world for this conclusion is going to satisfy anyone on the left. Your delusion that the left would be okay with this so long as the conclusion is not based on black people being “genetically inferior” is pure fairytale. Yes I did, and your further explanation proved it. Your version of the "left" is not the real left; sure, there are loonies just like everywhere, but the vast majority are like... Riiiiiight… people on the left would totally be okay with racial disparities in policing so long as the AI mastermind tells them there is good reason for it not based on skin color… lmao. Believe what you want. Why are we assuming the AI will say the answer is to discriminate against black people? Not a gotcha, but isn’t it more likely to say that we need more affordable housing?
|
On May 18 2025 10:15 BlackJack wrote:On May 18 2025 08:33 Billyboy wrote:On May 18 2025 00:47 BlackJack wrote: Did you understand my post? If the AI hive mind uses the statistic that blacks commit more crime to predict they may be more dangerous and as a result maybe they should be policed differently, then no reasoning in the world for this conclusion is going to satisfy anyone on the left. Your delusion that the left would be okay with this so long as the conclusion is not based on black people being “genetically inferior” is pure fairytale. Yes I did, and your further explanation proved it. Your version of the "left" is not the real left; sure, there are loonies just like everywhere, but the vast majority are like... Riiiiiight… people on the left would totally be okay with racial disparities in policing so long as the AI mastermind tells them there is good reason for it not based on skin color… lmao. Believe what you want.
Obviously they would, the AI said so:
https://www.theguardian.com/technology/2024/feb/22/google-pauses-ai-generated-images-of-people-after-ethnicity-criticism
|
On May 18 2025 10:28 KwarK wrote:On May 18 2025 10:15 BlackJack wrote:On May 18 2025 08:33 Billyboy wrote:On May 18 2025 00:47 BlackJack wrote: Did you understand my post? If the AI hive mind uses the statistic that blacks commit more crime to predict they may be more dangerous and as a result maybe they should be policed differently, then no reasoning in the world for this conclusion is going to satisfy anyone on the left. Your delusion that the left would be okay with this so long as the conclusion is not based on black people being “genetically inferior” is pure fairytale. Yes I did, and your further explanation proved it. Your version of the "left" is not the real left; sure, there are loonies just like everywhere, but the vast majority are like... Riiiiiight… people on the left would totally be okay with racial disparities in policing so long as the AI mastermind tells them there is good reason for it not based on skin color… lmao. Believe what you want. Why are we assuming the AI will say the answer is to discriminate against black people? Not a gotcha, but isn’t it more likely to say that we need more affordable housing?
I mean, you yourself brought up the idea of responding to calls differently based on this predictive technology:
On May 16 2025 22:46 KwarK wrote: You wouldn’t sentence people for precrime off of it but you would treat police calls from the victim a little more urgently.
Like do you honestly think the predictive technology would assess that an elderly white woman poses the same threat as a black male youth? Of course not. Of course you’re going to send more cavalry for the latter than the former if that’s the larger threat. It’s going to make all kinds of judgements based on race, religion, sex, etc.
|
On May 18 2025 11:26 BlackJack wrote: Like do you honestly think the predictive technology would assess that an elderly white woman poses the same threat as a black male youth? Of course not. Of course you’re going to send more cavalry for the latter than the former if that’s the larger threat. It’s going to make all kinds of judgements based on race, religion, sex, etc.
Entirely dependent on who codes it; the AI doesn't actually come up with it on its own. If it's coded to be discriminatory, then it will be discriminatory. If not, then it won't.
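One wrinkle worth noting in this exchange: a model can produce skewed outputs even when nobody codes it to discriminate, if it is trained on a feature that happens to correlate with group membership. A toy sketch of that proxy effect, entirely synthetic (the groups, postcodes and arrest rates below are made up to illustrate the mechanism, not drawn from any real data or system):

```python
# Toy sketch: a "group-blind" risk model can still produce group-skewed
# predictions via a correlated proxy (here, postcode). All data is synthetic.
import random

random.seed(0)

# Synthetic population: group B mostly lives in postcode 1, which in this
# toy world has a higher historical arrest rate (e.g. heavier policing).
people = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    postcode = 1 if random.random() < (0.8 if group == "B" else 0.2) else 0
    arrested = random.random() < (0.3 if postcode == 1 else 0.1)
    people.append((group, postcode, arrested))

# "Model": predicted risk is simply your postcode's historical arrest rate.
# It never sees group membership at all.
def arrest_rate(code):
    rows = [p for p in people if p[1] == code]
    return sum(p[2] for p in rows) / len(rows)

risk = {code: arrest_rate(code) for code in (0, 1)}

# Average predicted risk per group still differs, because postcode leaks it.
for g in ("A", "B"):
    rows = [p for p in people if p[0] == g]
    print(g, round(sum(risk[p[1]] for p in rows) / len(rows), 3))
```

Group B ends up with a visibly higher average predicted risk despite the model containing no group variable, which is why "just don't code it to be discriminatory" is harder than it sounds.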
|