• x00z@lemmy.world · 3 days ago

    Well, I might get disliked for this opinion, but in some cases it’s perfectly fine for a computer to make a management decision. However, this should also mean that the person in charge of said computer, or the one putting the computer’s decision into actual action, is the one who gets held responsible. It should also be questioned how responsible it is to even consider a computer’s management decisions in a specific field in the first place. What I’m saying is that there’s no black-and-white answer here.

  • csm10495@sh.itjust.works · 3 days ago

    I’ve thought about this with respect to AI and work. Every time I sit in a post mortem, it’s about human errors and process fixes.

    The day a post mortem ends with “well the AI did it so nothing we can do” is the day I look towards… with dread.

  • limer@lemmy.ml · 3 days ago

    I asked computer if I should read the article, it said no. Am I in an abusive relationship?

    That is ridiculous, clearly. I’ll use a mainstream search engine, tailor-made to my needs, to make sure that cannot happen.

  • melsaskca@lemmy.ca · 3 days ago

    A complete one-eighty nowadays: “As a highly paid ‘business’ exec I have no ideas… computer, tell me what to do.”

  • onnekas@sopuli.xyz · 3 days ago

    I generally agree.

    Imagine, however, that a machine objectively makes better decisions than any person. Should we then still trust the human’s decision just to have someone who is accountable?

    What is the worth of having someone who is accountable, anyway? Isn’t accountability just an incentive for humans not to fuck things up? It’s also nice for pointing fingers if things go bad, but is there actually any value in that?

    • petrol_sniff_king@lemmy.blahaj.zone · 3 days ago

      Imagine however, that a machine objectively makes the better decisions than any person.

      You can’t know if a decision is good or bad without a person to evaluate it. The situation you’re describing isn’t possible.

      the people who deploy a machine […] should be accountable for those actions.

      How is this meaningfully different from just having them make the decisions in the first place? Are they too stupid?

    • Maroon@lemmy.world · 3 days ago

      Imagine however, that a machine

      That’s hypothetical. In the real world, in human society, the humans who are part of corporations and profit by making/selling these computers must also bear the responsibility.

      • calcopiritus@lemmy.world · 3 days ago

        Tbf that leads to the problem of:

        Company/Individual makes program that is in no way meant for making management decision.

        Someone else comes and deploys that program to make management decisions.

        The ones that made that program couldn’t stop the ones that deployed it from deploying it.

        Even if the maker aimed to make a decision-making program and marketed it as such, whoever deployed it is ultimately responsible for it. As long as the maker doesn’t fake tests or certifications, of course; I’m sure that would violate many laws.

        • ZombiFrancis@sh.itjust.works · 3 days ago

          The premise is that a computer must never make a management decision. Making a program capable of management decisions already failed that premise. The deployment and use of that program to that end is built upon that failure.

      • onnekas@sopuli.xyz · 3 days ago

        I believe those who deploy the machines should be responsible in the first place. The corporations who make/sell those machines should be accountable if they deceptively and intentionally program those machines to act maliciously or in somebody else’s interest.

    • sleepundertheleaves@infosec.pub · 3 days ago (edited)

      Unfortunately, what’s actually happening is humans are being kept in the loop of AI decisions solely to take the blame if the AI screws up.

      So the CEOs who bought the AI, and the company that sold the AI, and the AI tool itself, all get to dodge responsibility for the AI’s failures by blaming a human worker.

      For example, this discussion of an AI generated summer reading guide that hallucinated a bunch of non-existent books:

      The freelance writer who authored this giant summer reading guide with all its lists had been tasked with doing the work of literally dozens of writers, editors and fact-checkers. We don’t know whether his boss told him he had to use AI, but there’s no way one writer could do all that work without AI.

      In other words, that writer’s job wasn’t to write the article. His job was to be the “human in the loop” for an AI that wrote the articles, but on a schedule and with a workload that precluded his being able to do a good job. It’s more true to say that his job was to be the AI’s “accountability sink” (in the memorable phrasing of Dan Davies): he was being paid to take the blame for the AI’s mistakes.

      https://doctorow.medium.com/https-pluralistic-net-2025-09-11-vulgar-thatcherism-there-is-an-alternative-f1428b42a8fd

  • Appoxo@lemmy.dbzer0.com · 4 days ago

    You are essentially saying,
    “Management is essential, replace the common workforce with AI.”

    Well…If I get fired, I will hold you accountable!

  • Captain Aggravated@sh.itjust.works · 4 days ago

    The computer can’t be held accountable, but the programmer and operator can.

    I could go on a whole thing about mission rules and command decisions here, but I’m sick of typing for the day.

  • sunbeam60@lemmy.one · 4 days ago

    This endless separation into “managers” and “not managers” is so unproductive. Everyone manages something. That’s why you’re employed.

    • kadu@scribe.disroot.org · 3 days ago

      Everyone manages something.

      Most workers manage something and create value. Managers only manage; remove them and nothing changes - usually things get more optimized, actually.

    • Pat_Riot@lemmy.today · 3 days ago

      Sounds like something a manager would say. Some of us produce, create value through our labor, while some sit their fat asses at a desk and only grace the production floor to make everybody’s day just a little more difficult. So you just get on back up there to the big house and let us handle things out here where you can’t hack it.

  • Credibly_Human@lemmy.world · 4 days ago

    I don’t think this is wise at all.

    It’s just people putting into words their wish to punish and assign blame, above their wish to be pragmatic.

    If software is better at something, there is no reason to be mad at that software.

    More than that, the idea that the software vendor could not be held liable is farcical. Of course they could be, or the company running said software. In fact, they’d probably get more shit than managers who regularly get away with ridiculous shit.

    I mean wage theft is the biggest form of theft for a reason, and none of the wage thieves are machines (or at least most aren’t).

        • cassandrafatigue@lemmy.dbzer0.com · 4 days ago (edited)

          You can’t reason someone out of a position they didn’t reason themselves into, and I cannot figure out how anyone comes to conclusions that incorrect. Just leaving a warning; my reply wasn’t for you.

          You’re right about wage theft being common. So that’s something.

          • Credibly_Human@lemmy.world · 4 days ago

            This is pure pseudo-intellectualism, because you literally have no argument or point.

            You have no reasoning and are projecting that onto me because you can’t explain this opinion your feelings have brought you to.

            • cassandrafatigue@lemmy.dbzer0.com · 4 days ago

              I’m not willing to argue with you. I’ve argued this with you¹ a thousand times, you are not rational. Everyone who reads your shit knows what I’m talking about. Ask them.

              ¹perhaps with a different name and face, but otherwise indistinguishable. It gets tedious.

              • Credibly_Human@lemmy.world · 3 days ago (edited)

                With the amount you’ve typed you could have easily typed a rationale. The truth is your opinions don’t hold weight and have no good rationale. That is all.

                • korazail@lemmy.myserv.one · 3 days ago

                  The burden of proof is on you. Show me one example of a company being held liable (really liable, not a settlement/fine for a fraction of the money they made) for a software mistake that hurt people.

                  The reality is that a company can make X dollars with software that makes mistakes, and then pay X/100 dollars when that software hurts people and goes to court. That’s not a punishment; that’s a cost of doing business. And the company pays that fine, and the humans who made those decisions are shielded from further repercussions.

                  When you said:

                  the idea that the software vendor could not be held liable is farcical

                  We need YOU to back that up. The rest of us have seen it never be accurate.

                  And it gets worse when the software vendor is a step removed: see Flock cameras making big mistakes. The software decided a car was stolen, but it was wrong. The police intimidated an innocent civilian because the software was wrong. Not only were the police not held accountable, Flock was never even in the picture.