• eldavi@lemmy.ml · 1 day ago

    it’s not just ai or its training data; it’s the developers themselves too, including fediverse ones.

    the biggest non-ai/training-data examples i can think of come from before ai was ever a thing, like:

    • usps had difficulty validating addresses because the software they obtained assumed a euro-centric naming schema

    • airlines, health care providers, hotels, and state motor vehicle departments rejected registrations/reservations because trans people have no option to select their sex

    • health care providers misdosed patients because the software they used didn’t account for highly-athletic/bodybuilding people or people with chronic conditions
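
    the pattern behind all three examples can be sketched in code. this is a hypothetical validator (not the actual usps or airline software) showing how a developer’s own assumptions get baked into the rules, long before any training data is involved:

    ```python
    import re

    # Hypothetical name validator illustrating developer-baked bias:
    # it assumes every person has exactly two names made of ASCII letters,
    # which matches the developer's own mental model but rejects many
    # perfectly valid real-world names.
    NAME_RE = re.compile(r"^[A-Za-z]+ [A-Za-z]+$")

    def is_valid_name(name: str) -> bool:
        return bool(NAME_RE.match(name))

    # passes the developer's own case:
    print(is_valid_name("John Smith"))            # True

    # rejects valid names the developer never considered:
    print(is_valid_name("José García"))           # False: accented letters
    print(is_valid_name("Nguyễn Thị Minh Khai"))  # False: non-ASCII, more than two parts
    print(is_valid_name("Madonna"))               # False: mononym, no space
    ```

    no dataset taught that regex anything; the bias is entirely in the schema the developer chose.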

    there are SO MANY examples out there where the bias clearly comes from the developer instead of the training data, and there’s no way that any piefed developer is immune to their own biases or can even effectively mitigate them.