The ethics of AI: a dozen Google employees resign over ethical concerns about autonomous weaponry

primesuspect Beepin n' Boopin Detroit, MI Icrontian

Google's Project Maven could advance us towards a world in which autonomous weapons exist, and a dozen or so Google employees have put their foot down about it and resigned in protest. This is a fascinating read that brings up a lot of complex ethical questions.

https://gizmodo.com/google-employees-resign-in-protest-against-pentagon-con-1825729300

Comments

  • Thrax 🐌 Austin, TX Icrontian
    edited May 2018

    I think it is inevitable that the military will use AI and ML to assess combat footage, analyze the battlefield, and maybe even engage hostiles. As a nation, America is always behind the ball when it comes to having moral and ethical discussions about the emerging nature of warfare. Military policy does not ask for permission, and only rarely begs for forgiveness.

    The day will come when a computer makes the decision to kill absent human intervention. Pandora will never go back into her box when that day comes.

    So while questions will be raised, they will be too late, and we will inevitably descend into an AI-fueled hellscape because there's money at the bottom of that pit.

  • CB Ƹ̵̡Ӝ̵̨̄Ʒ Der Millionendorf- Icrontian

    I never understand the 'I'm resigning because I don't agree with the direction we're going' mentality. If you feel that strongly about it, then resigning is the LAST thing that you should do, because then you give up all opportunity to change the direction or to help mitigate the disaster. All you do is take yourself away from the situation. That's not a 'protest', that's 'giving up'.

  • AlexDeGruven Wut? Meechigan Icrontian

    I think it makes sense in highly talent-driven areas. Like, the division will be hurting without that person and they would be extremely difficult to replace.

    Outside of that, though, for rank-and-file or management types it doesn't make much sense, and I agree.

  • Linc Owner Detroit Icrontian
    edited May 2018

    @CB said:
    I never understand the 'I'm resigning because I don't agree with the direction we're going' mentality. If you feel that strongly about it, then resigning is the LAST thing that you should do, because then you give up all opportunity to change the direction or to help mitigate the disaster. All you do is take yourself away from the situation. That's not a 'protest', that's 'giving up'.

    I strongly disagree. There are ethical choices to be made, and some jobs must be refused. The "it's inevitable, so why not participate to guide it" mentality is deeply flawed.

  • Linc Owner Detroit Icrontian

    If there is no line at which you quit rather than protest, you have no morality.

  • Linc Owner Detroit Icrontian
    edited May 2018

    @AlexDeGruven said:
    I think it makes sense in highly talent-driven areas. Like, the division will be hurting without that person and they would be extremely difficult to replace.

    Outside of that, though, for rank-and-file or management types it doesn't make much sense, and I agree.

    The idea that you need to be providing more valuable/expensive labor for your resignation to be appropriate & relevant is absurd.

  • AlexDeGruven Wut? Meechigan Icrontian

    I should clarify: as an outsider, that's what makes sense to me from a "What action could I take that would create the greatest possibility of change?" standpoint.

    If one's moral guidance pushes them out the door, regardless of their business value, then I have nothing against that whatsoever.

  • CB Ƹ̵̡Ӝ̵̨̄Ʒ Der Millionendorf- Icrontian

    @Linc said:
    If there is no line at which you quit rather than protest, you have no morality.

    Perhaps once the cause is lost? But this seems really early in the game, when the cause is not lost, and there is still positive change that could be implemented by people who are well established in the organization.

    Like once the company starts actually murdering people or helping governments to murder people, be like "Okay, I'm not helping with this anymore." But while we're still at the 'this technology could eventually be used to murder people' stage, what does leaving accomplish, except removing a possibly much-needed voice of dissent from the situation?

  • Linc Owner Detroit Icrontian
    edited May 2018

    @CB said:
    while we're still at the 'this technology could eventually be used to murder people' stage, what does leaving accomplish, except removing a possibly much-needed voice of dissent from the situation?

    There's no "could", it's already "will". A contract with the Pentagon is for killing people, full stop. Bodies shouldn't have to hit the ground for the line to be crossed.

    Leaving accomplishes taking your own hands off the weapon that will be used for death.

  • Linc Owner Detroit Icrontian
    edited May 2018

    I fundamentally don't think working for people that think that contract is OK makes you more persuasive than being outside the company and talking about what you saw. There's no booking a conference room to hash things out in that situation.

    When the people you work for make fundamentally immoral decisions, it is foolish to think it's your responsibility to change their mind after the fact.

    Honestly, I'm far more forgiving of someone who said "I really need this job to feed my family and don't think I can find another" than of folks who delude themselves into thinking they are going to be a moderating influence on the hand of Death so it kills fewer people.

  • Linc Owner Detroit Icrontian
    edited May 2018

    @AlexDeGruven said:
    I should clarify: as an outsider, that's what makes sense to me from a "What action could I take that would create the greatest possibility of change?" standpoint.

    I think it is a grave miscalculation to set the bar for quitting at "it needs to be the straw that breaks them".

    I think those engineers are exactly right that they need to each quit individually in the hope that the cumulative effect creates the change. We as individuals greatly exaggerate our own impact alone.

    We're talking about this right now because of the scale of the quitting, not because X person quit or because they are arguing in a conference room.

  • Mt_Goat Head Cheezy Knob Pflugerville (north of Austin) Icrontian

    BUT THE SAD BOTTOM LINE IS

    Pandora's box has already been opened for years! Or we would not be where we are today. And now even a whole division of employees quitting, or thousands not taking certain jobs on moral high ground, will not make said companies stutter on these projects, because for every one who quits or does not hire on, there will be many more in line for that job. And even if one or several companies withdraw from this type of work, there will be other, greedier companies scurrying like rats for a morsel of food that will take on the task. And in all reality, others may not be as conscientious. So it is truly a dilemma of damned if you do and damned if you don't. We could go on and on with this and that, but the future mold has already been cast. The only uncertainty is who will profit.

  • drasnor Starship Operator Hawthorne, CA Icrontian

    The way I read this is that there were some folks at Google that were surprised to learn that they worked for a defense contractor. My engineering education is in a dual-use field and I was fortunate that one of the senior professors took the time to lecture on the ethics of working in the defense industry. Distilled:

    • If all lives are weighted equally, the respect for persons viewpoint is most correct and engineering for defense is not ethical as defense either directly or indirectly causes harm to persons.
      ** Under this premise, conscience dictates that not participating is the correct course of action.

    • If the lives of friends are worth more than the lives of those who would do them harm, the utilitarian viewpoint is most correct and engineering for defense is ethical as defense maximizes the public good by allowing it to survive when threatened.
      ** Under this premise, conscience dictates that developing the technology before anyone else is correct because it is better for us to have this technology first than for our enemies to have it before us.

    These Google folks made their choice; I may not agree with their choice but I respect them for making it nonetheless.

  • edcentric near Milwaukee, Wisconsin Icrontian

    The part of this that I find so disingenuous is that people are paying attention because it is "the Google".
    People leave jobs or choose not to take them in hundreds of industries for personal moral reasons every year, but we pay no attention to that.
    Considering that the principal source of funding for electronics and computer science R&D has been the military since the 1930s, why were these people surprised that a project of theirs had military applications?
    No, they chose to leave because it became general knowledge and they saw a chance to make a statement. Genuine or not, we cannot judge, but they did it for visibility, not morals.

  • Linc Owner Detroit Icrontian

    @drasnor said:

    • If the lives of friends are worth more than the lives of those who would do them harm, the utilitarian viewpoint is most correct and engineering for defense is ethical as defense maximizes the public good by allowing it to survive when threatened.
      ** Under this premise, conscience dictates that developing the technology before anyone else is correct because it is better for us to have this technology first than for our enemies to have it before us.

    I'm generally skeptical of reductive philosophical arguments that have the effect of writing blank ethical checks for behavior.

  • primesuspect Beepin n' Boopin Detroit, MI Icrontian

    Yes, when you turn human lives into a flowchart, some things make more sense than others. Introduce empathy to the equation and it all falls apart.

  • drasnor Starship Operator Hawthorne, CA Icrontian

    @Linc said:
    I'm generally skeptical of reductive philosophical arguments that have the effect of writing blank ethical checks for behavior.

    This was the distilled version. It's not a blank ethical check anyway; the question was how someone could do something you find immoral, like making things that kill people, and this is an answer to that question. The ethical question is a healthy one to have, but it's not fair to the folks who work on defense technology to characterize them all as unethical and immoral for doing so.

    @primesuspect said:
    Yes, when you turn human lives into a flowchart, some things make more sense than others. Introduce empathy to the equation and it all falls apart.

    You mean, empathizing with strangers that mean harm? Yes, that's certainly possible though admittedly difficult. It would be much better if the people that run our various societies did a better job at making friends than enemies. There's no need to defend against friends.

    I'm an engineer, and flowcharts are a tool we use to solve ethical quandaries, among other problems. It's not a nuanced approach and I'm not a philosopher; the technique simplifies complex problems so that a solution can be identified, which gives it an inherent bias toward simple answers. Society needs philosophers who think about this stuff and can present a more compelling argument than "hurting people is bad", one that doesn't depend on wishful thinking.

  • RyanMM Ferndale, MI Icrontian

    Remember when Google's motto was "Do No Evil?"

    Now it's "We may write code for autonomous killing machines."

    Basically the same thing, AMIRITE

  • @RyanMM said:
    Remember when Google's motto was "Do No Evil?"

    Now it's "We may write code for autonomous killing machines."

    Basically the same thing, AMIRITE

    It was "Don't be Evil" actually... And to @drasnor point that is a gross oversimplification of a very complex issue.

  • BlueTattoo Boatbuilder Houston, TX Icrontian

    You may feel that a job is a job and that someone will make the weapons, so why not. That's true. Or you may be proud to build the next great weapon for your country. You have the choice to work or not work for a company that directly produces an important part of a product that is used for something with which you disagree.

    I personally have serious disagreements with how our military uses drones, and would not help build them, including software development. That alone probably wouldn’t keep me from working for the company in a different capacity. After all, I did work for contractors on the Space Shuttle program for 24 years. These companies all built weapons systems, but also communications satellites, commercial planes, radios, automotive parts, power tools, and many other products, good, neutral, and maybe, bad. I worked where I felt good about what I was doing.

    When I was young, I was less liberal and might have thought that working on a killer drone would be cool. I don’t know.

  • edcentric near Milwaukee, Wisconsin Icrontian

    and, these people didn't mind collecting (and storing for long term) tons of personal data on users so that they can show you selective search results and sell it for targeted advertising, but they balk at physical destruction?

  • keto Occupied. Or is it preoccupied? Icrontian

    Recently read a 'post-apocalypse' type novel where America was in a civil war about 50 years from now. One of the big dangers was the sky being full of armed drones that targeted humans, which the owners had lost control of and which were wandering around targeting random civilians. It wasn't that big of a stretch to see that as a possibility for the future.

  • Linc Owner Detroit Icrontian

    @drasnor said:
    You mean, empathizing with strangers that mean harm?

    The problem is we also stop empathizing with that stranger's neighbors, siblings, and cousins.

    Flowcharts depend on drawing circles around people to quantify them as either friends or enemies, which is what stokes the conflicts in the first place.

  • Cliff_Forster

    @edcentric said:
    and, these people didn't mind collecting (and storing for long term) tons of personal data on users so that they can show you selective search results and sell it for targeted advertising, but they balk at physical destruction?

    I detect sarcasm?

  • Linc Owner Detroit Icrontian

    ICE withdraws "Extreme Vetting" software RFP

    If you read between the lines, companies basically said they couldn't / wouldn't build that part of the system, so it got pulled.

    Every job is political, but few more so than software development.

  • Mt_Goat Head Cheezy Knob Pflugerville (north of Austin) Icrontian

    @keto said:
    Recently read a 'post-apocalypse' type novel where America was in a civil war about 50 years from now. One of the big dangers was the sky being full of armed drones that targeted humans, which the owners had lost control of and which were wandering around targeting random civilians. It wasn't that big of a stretch to see that as a possibility for the future.

    Not too far from Skynet either!

  • edcentric near Milwaukee, Wisconsin Icrontian

    @Cliff_Forster said:

    @edcentric said:
    and, these people didn't mind collecting (and storing for long term) tons of personal data on users so that they can show you selective search results and sell it for targeted advertising, but they balk at physical destruction?

    I detect sarcasm?

    Just a little (I still wish there was a font called 'sarcasm').
    They see no issue with violating personal privacy and disrespecting individuals. They are fine with selling access to purveyors of misinformation and sellers of scams, but physical target discrimination is too touchy for them?
    They are OK with ruining people's lives with software, just not hardware.
    Automated targeting is a long way off. Pulling the trigger (or pushing the button) is a form of control that people like.
    The analog in business is that you could automate the process of vetting and funding new projects, but controlling the money is what senior management does, even if they do a crappy job of it.
