• Infynis@midwest.social
    22 days ago

    This is what AI should actually be used for. Strengthening the algorithms that identify this content would reduce the load humans need to review, and hopefully make it more manageable.

  • bassomitron@lemmy.world
    22 days ago

    I hope they win their lawsuit. I listened to a Radiolab episode a few years ago about FB moderators. The stuff they have to see day in and day out sounds absolutely horrible. Pics and videos of extreme violence and child pornography sound like they’d give any normal person some major trauma.

      • PerogiBoi@lemmy.ca
        21 days ago

        Go on the Reddit front page or Twitter home page with a heart rate and blood pressure monitor and scroll for 10 minutes.

        You will find that in almost every instance, both measures go up. The whole point of social media is to agitate because agitation correlates with engagement which correlates with mucho ad dinero.

  • ArbitraryValue@sh.itjust.works
    22 days ago

    One time on Reddit, a mod of /r/askhistorians described some of the content of this sort that he had seen, and he wasn’t as dispassionate about it as this article is. Just his verbal description was disturbing and difficult to forget, so I can believe that these employees are traumatized.

    With that said, what about other careers that expose people to disturbing things? I used to know a pathologist who once told me that she had a bad day because two infants died during childbirth at the hospital where she worked. She had to autopsy them. I didn’t know her well at the time so of course I assumed that she was upset for the same reason that such direct exposure to the death of babies would upset most people, but I was wrong. She was upset because she had to work late.

    Why can pathologists do their job without being traumatized? Maybe the difference is that pathology isn’t something that a guy off the street just gets hired to do one day. The people who end up being pathologists usually have other options, and they choose pathology because it doesn’t particularly bother them for whatever reason. Meanwhile these moderators are immediately dumped into the deep end, so to speak, and they may not be financially secure enough to leave the job even after they experience what it is.

    Can content moderation be done without traumatizing people? It isn’t a high-skilled, well-paid job, so I don’t think filtering candidates the way that pathologists are filtered is practical. Not having content moderators also isn’t practical.

    (I’m using pathology as an example because that’s what I know a little about, but I think my statements are probably valid for other careers, like homicide detective, which also involve regular exposure to disturbing things.)