• ForestGreenGhost@literature.cafe
          3 days ago

          No, I’m not saying that you’re not allowed to comment. I’m just saying that your takes are stupid and that you probably shouldn’t.

            • ForestGreenGhost@literature.cafe
              2 days ago

              I’m sorry that I called you stupid. That was wrong of me and you didn’t deserve that.

              If you’re interested, I could explain to you why your comment that I initially responded to was a false equivalence, and why claiming that I was stifling your free speech is nonsensical. Let’s talk it out and maybe both of us can walk away from this having learned something. :)

              • Tja@programming.dev
                2 days ago

                Sure, I’ll be happy to.

                My point is that chatbots, and other LLM applications, are useful tools that in isolated cases have caused harm: people becoming addicted, other damaging effects, and even deaths.

                The same can be said of many other things: parasocial relationships with celebrities, heavy machinery, aircraft, medicines with side effects, gyms, and a long list of others. People become obsessed or addicted, and in certain cases even die. Or the tool fails and kills them.

                The solution shouldn’t be to immediately ban them and accuse the CEO of murder (which has a super specific legal definition, btw), but to regulate, add guardrails, make them safer, and help the victims however they need. Sure, let’s investigate each death and see if there has been negligence, but pitchforks are not the solution.