US Politics Mega-thread - Page 8507

Read the rules in the OP before posting, please.

In order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a re-read to refresh your memory! The vast majority of you are contributing in a healthy way, keep it up!

NOTE: When providing a source, explain why you feel it is relevant and what purpose it adds to the discussion if it's not obvious.
Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments can result in a mod action.
KwarK
Profile Blog Joined July 2006
United States43737 Posts
Last Edited: 2017-08-20 17:55:37
August 20 2017 17:42 GMT
#170121
On August 21 2017 02:37 Falling wrote:
Well alright, so the 'real' communists are passive, deterministic observers, which still means anyone who is actively trying to cause communism is doing something fundamentally dangerous to society.

As for the second, as far as I'm aware they started with the abolition of property and ended with the state-run farm. That speaks more to the untenability of abolishing property rights than any lack of true belief. (But maybe I'm wrong on that front.)

I mean pretty much anyone who says "things would be better if I was in charge and could forcibly remake the world to conform to my vision of how it should be" is pretty fucking dangerous. Revolutionary communists are definitely in that camp. I think the same arguments against Nazis can be levied against revolutionary communists. Ultimately in both cases their participation in democratic institutions is only to subvert them. While there is an argument to be made for absolute rights to participation, I think there is also an argument to be made that if a party is attempting to gain power and influence within a democratic society by taking advantage of the rights afforded by that society in order to overthrow that society, their participation should be limited.

Nazis don't believe in democratic society, they'll happily take advantage of the rights it gives them as a way to gain momentum but their end goal is revolutionary, they intend to overthrow it and establish a totalitarian regime that can purge non whites. Same for Bolsheviks obviously, they'll participate in the Duma but only in as much as it'll help them abolish the Duma and replace it. There's certainly an argument to make against affording those movements the same democratic privileges as others. My main problem with that is the potential for abuse. As I said a few days ago, I'm not sure that there's any restriction I could come up with that wouldn't have been illegally and incorrectly used as a weapon against the civil rights movement.
Moderator | The angels have the phone box
Falling
Profile Blog Joined June 2009
Canada11450 Posts
August 20 2017 17:43 GMT
#170122
Oh hey! We agree on something
Moderator | "In Trump We Trust," says the Golden Goat of Mars Lago. Have faith and believe! Trump moves in mysterious ways. Like the wind he blows where he pleases...
Uldridge
Profile Blog Joined January 2011
Belgium5069 Posts
Last Edited: 2017-08-20 18:00:47
August 20 2017 17:53 GMT
#170123
Is hiring the biggest marketing bureau to subliminally indoctrinate everyone into your ideology the way to do it, then?

It's the only way man, I'm tellin ya!

Government won't ever do anything other than largely maintain the status quo (swaying a bit), and the general public is unaware/uncaring/incapable anyway.

On August 21 2017 02:42 KwarK wrote:
Nazis don't believe in democratic society, they'll happily take advantage of the rights it gives them as a way to gain momentum but their end goal is revolutionary, they intend to overthrow it and establish a totalitarian regime that can purge non whites. Same for Bolsheviks obviously, they'll participate in the Duma but only in as much as it'll help them abolish the Duma and replace it. There's certainly an argument to make against affording those movements the same democratic privileges as others. My main problem with that is the potential for abuse. As I said a few days ago, I'm not sure that there's any restriction I could come up with that wouldn't have been illegally and incorrectly used as a weapon against the civil rights movement.

In Belgium we have a far right party called "The Flemish Interest" (loosely translated). The other parties have an accord in which they rule out ever forming a coalition with that party because of its dangerous and inflammatory ideas. While it has changed its party policies in the meantime, it's still pretty much a certainty that the other parties won't ever work together with it.
https://en.wikipedia.org/wiki/Cordon_sanitaire#In_politics
Taxes are for Terrans
{CC}StealthBlue
Profile Blog Joined January 2003
United States41117 Posts
August 20 2017 18:03 GMT
#170124
President Donald Trump’s job approval ratings are sagging in three crucial states that helped him secure the presidency last year, according to NBC News/Marist polls released Sunday.

Trump’s approval is below 40 percent in Michigan (36 percent), Pennsylvania (35 percent) and Wisconsin (34 percent), according to the surveys, conducted in the four days after a violent white supremacist march last weekend in Charlottesville, Virginia. That time span includes Trump’s alternating responses to the episode, which drew widespread criticism from elected officials in Washington.

In all three states, Trump's disapproval ratings exceed 50 percent. The three states provided Trump 46 electoral votes in the 2016 election, arguably his margin of victory.

Voters in the three states — by margins ranging from 8 to 13 percentage points — told pollsters they would prefer that Democrats control Congress after the 2018 elections. Nearly two-thirds of those polled in each state say they’ve been embarrassed by Trump’s performance as president. Though they rate him about even on his handling of the economy, more than half say he’s weakened the country’s standing in the world.

In Michigan, about the same number of voters say they have a favorable view of Trump (34 percent) as they do about Kid Rock, a prospective Republican Senate candidate (36 percent) in the 2018 midterms.

Each of the three polls relied on a sample of about 800 registered voters, and each carries a margin of error of plus or minus 3.5 percentage points.
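(The quoted margin of error is consistent with the standard large-sample formula for a proportion at 95% confidence, MoE = z·√(p(1−p)/n), evaluated at the conservative worst case p = 0.5. A minimal sketch, where the ~800-voter sample size is the only figure taken from the article:)

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% confidence margin of error for a sample proportion.

    p=0.5 is the conservative worst case, which pollsters
    typically quote; z=1.96 is the 95% normal critical value.
    """
    return z * math.sqrt(p * (1 - p) / n)

# ~800 registered voters per state, as reported
print(round(margin_of_error(800) * 100, 1))  # 3.5 percentage points
```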


Source
"Smokey, this is not 'Nam, this is bowling. There are rules."
micronesia
Profile Blog Joined July 2006
United States24761 Posts
August 20 2017 18:17 GMT
#170125
What would it take for his approval rating to go below, I don't know, ten percent? I feel like if he walked around Washington stabbing random people nonstop for a month, killing thousands of innocents in the process, he would take a 1-2% dip in job approval ratings. Either it's not an effective measuring tool for anything useful or there are no actual standards for doing a non-terrible job in Washington.
Moderator | There are animal crackers for people and there are people crackers for animals.
LegalLord
Profile Blog Joined April 2013
United States13779 Posts
August 20 2017 18:18 GMT
#170126
On August 21 2017 03:17 micronesia wrote:
What would it take for his approval rating to go below, I don't know, ten percent? I feel like if he walked around Washington stabbing random people nonstop for a month, killing thousands of innocents in the process, he would take a 1-2% dip in job approval ratings. Either it's not an effective measuring tool for anything useful or there are no actual standards for doing a non-terrible job in Washington.

If he presided over a complete and utter collapse of the US economy along with everything that happened now, I think that would do it.
History will sooner or later sweep the European Union away without mercy.
{CC}StealthBlue
Profile Blog Joined January 2003
United States41117 Posts
Last Edited: 2017-08-20 18:21:37
August 20 2017 18:21 GMT
#170127
Some of the world’s leading robotics and artificial intelligence pioneers are calling on the United Nations to ban the development and use of killer robots.

Tesla’s Elon Musk and Google’s Mustafa Suleyman are leading a group of 116 specialists from across 26 countries who are calling for the ban on autonomous weapons.

The UN recently voted to begin formal discussions on such weapons which include drones, tanks and automated machine guns. Ahead of this, the group of founders of AI and robotics companies have sent an open letter to the UN calling for it to prevent the arms race that is currently under way for killer robots.

In their letter, the founders warn the review conference of the convention on conventional weapons that this arms race threatens to usher in the “third revolution in warfare” after gunpowder and nuclear arms.

The founders wrote: “Once developed, lethal autonomous weapons will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.

“We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”

Experts have previously warned that AI technology has reached a point where the deployment of autonomous weapons is feasible within years, rather than decades. While AI can be used to make the battlefield a safer place for military personnel, experts fear that offensive weapons that operate on their own would lower the threshold of going to battle and result in greater loss of human life.

The letter, launching at the opening of the International Joint Conference on Artificial Intelligence (IJCAI) in Melbourne on Monday, has the backing of high-profile figures in the robotics field and strongly stresses the need for urgent action, after the UN was forced to delay a meeting that was due to start Monday to review the issue.

The founders call for “morally wrong” lethal autonomous weapons systems to be added to the list of weapons banned under the UN’s convention on certain conventional weapons (CCW) brought into force in 1983, which includes chemical and intentionally blinding laser weapons.

Toby Walsh, Scientia professor of artificial intelligence at the University of New South Wales in Sydney, said: “Nearly every technology can be used for good and bad, and artificial intelligence is no different. It can help tackle many of the pressing problems facing society today: inequality and poverty, the challenges posed by climate change and the ongoing global financial crisis.

“However, the same technology can also be used in autonomous weapons to industrialise war. We need to make decisions today choosing which of these futures we want.”

Musk, one of the signatories of the open letter, has repeatedly warned of the need for proactive regulation of AI, calling it humanity's biggest existential threat, but while AI's destructive potential is considered by some to be vast, it is also thought to be distant.

Ryan Gariepy, the founder of Clearpath Robotics, said: "Unlike other potential manifestations of AI which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability."

This is not the first time the IJCAI, one of the world’s leading AI conferences, has been used as a platform to discuss lethal autonomous weapons systems. Two years ago the conference was used to launch an open letter signed by thousands of AI and robotics researchers including Musk and Stephen Hawking similarly calling for a ban, which helped push the UN into formal talks on the technologies.

The UK government opposed such a ban on lethal autonomous weapons in 2015, with the Foreign Office stating that “international humanitarian law already provides sufficient regulation for this area”. It said that the UK was not developing lethal autonomous weapons and that all weapons employed by UK armed forces would be “under human oversight and control”.

While the suggestion of killer robots conjures images from science fiction such as the Terminator's T-800 or Robocop's ED-209, lethal autonomous weapons are already in use. Samsung's SGR-A1 sentry gun, which is reportedly technically capable of firing autonomously, though it is disputed whether it is deployed as such, is in use along the South Korean side of the 4km-wide Korean Demilitarized Zone.

The fixed-place sentry gun, developed on behalf of the South Korean government, was the first of its kind with an autonomous system capable of performing surveillance, voice-recognition, tracking and firing with mounted machine gun or grenade launcher. But it is not the only autonomous weapon system in development, with prototypes available for land, air and sea combat.

The UK’s Taranis drone, in development by BAE Systems, is intended to be capable of carrying air-to-air and air-to-ground ordnance intercontinentally and incorporating full autonomy. The unmanned combat aerial vehicle, about the size of a BAE Hawk, the plane used by the Red Arrows, saw its first test flight in 2013 and is expected to be operational sometime after 2030 as part of the Royal Air Force’s Future Offensive Air System, destined to replace the human-piloted Tornado GR4 warplanes.

Russia, the US and other countries are currently developing robotic tanks that can either be remote controlled or operate autonomously. These projects range from autonomous versions of the Russian Uran-9 unmanned combat ground vehicle, to conventional tanks retrofitted with autonomous systems.

The US’s autonomous warship, the Sea Hunter built by Vigor Industrial, was launched in 2016 and while still in development is intended to have offensive capabilities including anti-submarine ordnance. Under the surface, Boeing’s autonomous submarine systems built on the Echo Voyager platform are also being considered for long-range deep-sea military use.


Source
"Smokey, this is not 'Nam, this is bowling. There are rules."
zlefin
Profile Blog Joined October 2012
United States7689 Posts
Last Edited: 2017-08-20 18:35:16
August 20 2017 18:24 GMT
#170128
On August 21 2017 03:17 micronesia wrote:
What would it take for his approval rating to go below, I don't know, ten percent? I feel like if he walked around Washington stabbing random people nonstop for a month, killing thousands of innocents in the process, he would take a 1-2% dip in job approval ratings. Either it's not an effective measuring tool for anything useful or there are no actual standards for doing a non-terrible job in Washington.

there are very few actual standards. humans' cognitive biases largely prevent them from accepting that the people on their side that they support are bad, especially in terms of approval ratings. or if necessary, they assert the "other" side is worse, again in order to maintain their viewpoint; or assert that their guy is bad as a person, but acceptable/good enough because they're accomplishing some desired goal in the job.
even when nixon resigned in disgrace he had quite a few supporters and was polling around 25% approval. (iirc he had >50% amongst republicans)

it has substantial issues as a measuring tool but isn't entirely worthless if you know how to read and interpret the numbers. it also of course depends on what you're trying to measure.

do you want any more info on it?
Great read: http://shorensteincenter.org/news-coverage-2016-general-election/ great book on democracy: http://press.princeton.edu/titles/10671.html zlefin is grumpier due to long term illness. Ignoring some users.
Uldridge
Profile Blog Joined January 2011
Belgium5069 Posts
Last Edited: 2017-08-20 18:26:32
August 20 2017 18:24 GMT
#170129
I'm relatively sure that terrorists don't have the resources (knowledge, materials and funds) to make an actual killer robot. Or we'd have seen one by now.
It is a smart appeal though; it'd be horrific to see a robot dog kill off the citizens/soldiers/rebels of the country the US decides to interfere with next.
However, on the other side of the coin, we're getting closer to a Robot Wars scenario where countries could duke it out on an impartially decided arena. Would definitely watch nations throw away billions of dollars to gain some sort of supremacy over one another.
Taxes are for Terrans
Plansix
Profile Blog Joined April 2011
United States60190 Posts
Last Edited: 2017-08-20 18:29:01
August 20 2017 18:26 GMT
#170130
Given the political difficulties created by drone strikes, I agree developing further autonomous weapons is a risky idea. There are endless ethical problems and those weapons will be deployed in mostly one-sided conflicts. Short range remote control weapons have a place in our modern military. But I never want to get to an era where the US is deploying AI driven tanks overseas to fight terrorism.

Also, the arms race to create weapons to disrupt control of these AI tanks might only make them useful against poor nations that cannot afford the R&D or those weapons.
I have the Honor to be your Obedient Servant, P.6
TL+ Member
Uldridge
Profile Blog Joined January 2011
Belgium5069 Posts
August 20 2017 18:28 GMT
#170131
On August 21 2017 03:26 Plansix wrote:
Given the political difficulties created drone strikes, I agree developing further autonomous weapons is a risky idea. There are endless ethical problems and those weapons will be deployed in mostly one sided conflicts. Short range remote control weapons have a place in our modern military. But I never want to get to an era where the US is deploying AI driven tanks over seas to fight terrorism.

Imagine the uproar when it accidentally blows up a school with students in attendance, or a hospital.
Somehow I believe the people responsible will still get away with just an apology.
Taxes are for Terrans
Plansix
Profile Blog Joined April 2011
United States60190 Posts
August 20 2017 18:31 GMT
#170132
On August 21 2017 03:28 Uldridge wrote:
On August 21 2017 03:26 Plansix wrote:
Given the political difficulties created drone strikes, I agree developing further autonomous weapons is a risky idea. There are endless ethical problems and those weapons will be deployed in mostly one sided conflicts. Short range remote control weapons have a place in our modern military. But I never want to get to an era where the US is deploying AI driven tanks over seas to fight terrorism.

Imagine the uproar when it blows up a school in attendance by accident; or a hospital.
Somehow I believe the people responsible will still get away with just an apology.

The optics of the whole thing are terrible. The richest country in the world builds AI tanks to ensure its citizens are not at risk while using them to attack poor nations that have terrorists. Just look how much shit we get for drone strikes now. Now we build Skynet lite, but deploy it to the Middle East only.
I have the Honor to be your Obedient Servant, P.6
TL+ Member
warding
Profile Joined August 2005
Portugal2395 Posts
August 20 2017 18:32 GMT
#170133
On August 21 2017 03:24 Uldridge wrote:
I'm relatively sure that terrorists don't have the resources (knowledge, materials and funds) to make an actual killer robot. Or we'd have seen one by now.
It is a smart appeal though, It'd be horrific to see a robot dog kill off the citizens/soldiers/rebels of the country the US decides to interfere with next.
However, on the other side of the coin, we're getting closer to a Robot Wars scenario where countries could duke it out on an impartially decided arena. Would definitely watch nations throw away billions of dollars to gain some sort of supremacy over one another.

Uranium and nuclear weapons are hard to get. Electronics are cheap and software is free to transfer. It isn't a stretch to imagine an enemy/rogue state or a criminal organization in a failed state developing autonomous weapons and feeding them to terrorist organizations to mess with Israel or the U.S.
Uldridge
Profile Blog Joined January 2011
Belgium5069 Posts
Last Edited: 2017-08-20 18:36:22
August 20 2017 18:32 GMT
#170134
What could go wrong!?

On August 21 2017 03:32 warding wrote:
Uranium and nuclear weapons are hard to get. Electronics is cheap and software is free to transfer. It isn't a stretch to imagine an enemy/rogue state or a criminal organization in a failed state developing autonomous weapons and feeding them to terrorist organizations to mess with israel or the U.S.

I doubt you understand what goes into making an AI. You don't just implement AI by pressing install, fusing some electronics together, and putting it in a husk.
You need mechanical, software and electrical engineers to fix that stuff. It's pretty difficult to make viable autonomous killing machines de novo.
Taxes are for Terrans
Plansix
Profile Blog Joined April 2011
United States60190 Posts
Last Edited: 2017-08-20 18:39:12
August 20 2017 18:36 GMT
#170135
On August 21 2017 03:32 Uldridge wrote:
What could go wrong!?

Suicide bombers are bad because you can only use them once. What if we developed software and a sensor package that could remove the human from the equation? And then we started a bidding war to see who could make it the cheapest. How could this go wrong?

On August 21 2017 03:32 Uldridge wrote:
What could go wrong!?

On August 21 2017 03:32 warding wrote:
Uranium and nuclear weapons are hard to get. Electronics is cheap and software is free to transfer. It isn't a stretch to imagine an enemy/rogue state or a criminal organization in a failed state developing autonomous weapons and feeding them to terrorist organizations to mess with israel or the U.S.

I doubt you understand what goes into making an AI. You don't just implement AI by pressing install and fuse some electronics together and put it in a husk.
You need mechanical, software and electrical engineers to fix that stuff. It's pretty difficult to make de novo able autonomous killing machines.


My grandfather used to run a company that did military R&D for optics (sights for guns on boats, or periscopes). They made those things to be used by people with a 6th grade reading level. If this is developed for the military, there is a good chance it will be brain dead easy to use. And they will make it to be mass produced in some way.
I have the Honor to be your Obedient Servant, P.6
TL+ Member
Uldridge
Profile Blog Joined January 2011
Belgium5069 Posts
Last Edited: 2017-08-20 18:42:44
August 20 2017 18:38 GMT
#170136
Yeah exactly, what could go wrong?! We still want those resources, right? Isn't a faceless purge once in a while good for everyone?
Ok I'll stop being so moronically cynical now.

I'm not arguing about a preset AI killing machine; I was responding to actually making a killer robot from scratch.
Of course, one that's hand-delivered (with an instruction manual attached) will be pretty easy to use.
Taxes are for Terrans
Gahlo
Profile Joined February 2010
United States35172 Posts
August 20 2017 18:42 GMT
#170137
On August 21 2017 03:36 Plansix wrote:
On August 21 2017 03:32 Uldridge wrote:
What could go wrong!?

Suicide bombers are bad because you can only use them once. What if we developed software and a sensor package that could remove the human from the equation? And then we started a bidding war to see who could make it the cheapest. How could this go wrong?

On August 21 2017 03:32 Uldridge wrote:
What could go wrong!?

On August 21 2017 03:32 warding wrote:
Uranium and nuclear weapons are hard to get. Electronics is cheap and software is free to transfer. It isn't a stretch to imagine an enemy/rogue state or a criminal organization in a failed state developing autonomous weapons and feeding them to terrorist organizations to mess with israel or the U.S.

I doubt you understand what goes into making an AI. You don't just implement AI by pressing install and fuse some electronics together and put it in a husk.
You need mechanical, software and electrical engineers to fix that stuff. It's pretty difficult to make de novo able autonomous killing machines.


My grandfather used to run a company that did military R&D for optics(sights for guns on boats or para-scopes). They made those things to be used by people with a 6th grade reading level. If this is developed for the military, there is a good chance it will be brain dead easy to use. And they will make it to be mass produced in some way.

Couldn't you already just strap a triggered bomb onto a drone from someplace like Best Buy and pilot it into position?
LegalLord
Profile Blog Joined April 2013
United States13779 Posts
August 20 2017 18:42 GMT
#170138
On August 21 2017 03:32 Uldridge wrote:
What could go wrong!?

On August 21 2017 03:32 warding wrote:
Uranium and nuclear weapons are hard to get. Electronics is cheap and software is free to transfer. It isn't a stretch to imagine an enemy/rogue state or a criminal organization in a failed state developing autonomous weapons and feeding them to terrorist organizations to mess with israel or the U.S.

I doubt you understand what goes into making an AI. You don't just implement AI by pressing install and fuse some electronics together and put it in a husk.
You need mechanical, software and electrical engineers to fix that stuff. It's pretty difficult to make de novo able autonomous killing machines.

The software on AI is really the most important part. The hardware is fairly straightforward; a few standard robotic parts put together in a generally, often universally, useful capacity. But how smart it tends to be is the real trick.

AI labs tend to have some standardized robots that they can use for a wide range of purposes, for example. The difference is always what software they load into it.
History will sooner or later sweep the European Union away without mercy.
Liquid`Drone
Profile Joined September 2002
Norway28778 Posts
August 20 2017 18:43 GMT
#170139
On August 21 2017 03:32 Uldridge wrote:
What could go wrong!?

On August 21 2017 03:32 warding wrote:
Uranium and nuclear weapons are hard to get. Electronics is cheap and software is free to transfer. It isn't a stretch to imagine an enemy/rogue state or a criminal organization in a failed state developing autonomous weapons and feeding them to terrorist organizations to mess with israel or the U.S.

I doubt you understand what goes into making an AI. You don't just implement AI by pressing install and fuse some electronics together and put it in a husk.
You need mechanical, software and electrical engineers to fix that stuff. It's pretty difficult to make de novo able autonomous killing machines.


being the first to develop a killer robot does indeed sound like an amazingly difficult task. But isn't the thing with electronics and programming that it's kinda easy to replicate? I mean missile technology does seem like a pretty damn well-guarded secret, but missiles also blow up, so you can't reverse engineer one.
Moderator
Plansix
Profile Blog Joined April 2011
United States60190 Posts
August 20 2017 18:43 GMT
#170140
On August 21 2017 03:42 Gahlo wrote:
On August 21 2017 03:36 Plansix wrote:
On August 21 2017 03:32 Uldridge wrote:
What could go wrong!?

Suicide bombers are bad because you can only use them once. What if we developed software and a sensor package that could remove the human from the equation? And then we started a bidding war to see who could make it the cheapest. How could this go wrong?

On August 21 2017 03:32 Uldridge wrote:
What could go wrong!?

On August 21 2017 03:32 warding wrote:
Uranium and nuclear weapons are hard to get. Electronics is cheap and software is free to transfer. It isn't a stretch to imagine an enemy/rogue state or a criminal organization in a failed state developing autonomous weapons and feeding them to terrorist organizations to mess with israel or the U.S.

I doubt you understand what goes into making an AI. You don't just implement AI by pressing install and fuse some electronics together and put it in a husk.
You need mechanical, software and electrical engineers to fix that stuff. It's pretty difficult to make de novo able autonomous killing machines.


My grandfather used to run a company that did military R&D for optics(sights for guns on boats or para-scopes). They made those things to be used by people with a 6th grade reading level. If this is developed for the military, there is a good chance it will be brain dead easy to use. And they will make it to be mass produced in some way.

Couldn't you already just strap a triggered bomb onto a drone from someplace like Best Buy and pilot into position?

It is only a matter of time before someone tries that. I am sort of surprised it hasn't happened yet.
I have the Honor to be your Obedient Servant, P.6
TL+ Member