
Mutant Algorithms Are Coming for Your Education

(Bloomberg Opinion) — Bad algorithms have been causing a lot of trouble lately. One, designed to supplant canceled exam scores, blew the college prospects of untold numbers of students attending International Baccalaureate schools around the world. Then another, deployed in lieu of the U.K.’s high-stakes “A-level” exams, did the same for even more students, prompting Prime Minister Boris Johnson to call it a “mutant” and ultimately to fall back on human-assigned grades instead.


Actually, I would argue that pretty much all algorithms are mutants. People just haven’t noticed yet.

The foibles of algorithms usually go unseen and undiscussed, because people lack the information and power they need to recognize and address them. When, for example, computers issue scores that decide how much people pay for mortgage loans or insurance policies, or who gets a job, the victims of mistakes typically don’t know what’s going on. Often, nobody even tells them their score, let alone how it was calculated or how to compare it against a benchmark. This makes it difficult for them to identify one another, band together and complain — or to compel authorities to come in and fix things.

The situation with the student-grading algorithms was an exception. Tens of thousands of students were assessed at the same time. They had “ground truths” with which to compare their scores — for example, how they were doing before the assessment and what grades their teachers expected them to receive. They were well equipped to share and inspect their results for consistency and fairness — and to see, for example, bias against kids from disadvantaged neighborhoods and schools. And they had a powerful weapon — outraged parents — to push politicians and university administrators to discard the flawed results.
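To see why pooled results matter, consider a minimal sketch, written in Python with entirely hypothetical school types, grades, and numbers; it is not the actual grading model. The point is that a shared baseline, such as teacher-predicted grades, turns individual grievances into a visible group statistic.

```python
# A minimal sketch of the students' pooled comparison, not the actual
# grading model. School types and grades below are hypothetical.
from collections import defaultdict

students = [
    # (school_type, teacher_predicted_grade, algorithm_grade), 0-100 scale
    ("large_state_school", 72, 61),
    ("large_state_school", 68, 57),
    ("large_state_school", 75, 66),
    ("small_private_school", 70, 71),
    ("small_private_school", 64, 65),
]

gaps = defaultdict(list)
for school_type, predicted, assigned in students:
    gaps[school_type].append(assigned - predicted)

for school_type, deltas in gaps.items():
    mean_gap = sum(deltas) / len(deltas)
    print(f"{school_type}: mean change vs. teacher prediction {mean_gap:+.1f}")
# A consistently negative mean for one group is exactly the kind of
# bias that students and parents, comparing notes, were able to spot.
```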

So an unusually public scandal shed some light on how bad most algorithms really are. But it didn’t fix or eliminate them. On the contrary, as the coronavirus crisis and pre-existing trends force budget cuts, computers are likely to replace human judgment even more, particularly in American higher education. Many admissions and student-services offices already use algorithms, typically to supplement humans. Now that relationship is likely to flip, with a few humans overseeing a fleet of algorithms that essentially replace the bureaucracy. And there’s no reason to expect those algorithms to be better than Boris Johnson’s mutant.

Worse, institutions across the country and around the world tend to buy their algorithms from the same third-party data companies. So those algorithms’ weaknesses and biases — which tend to perpetuate and amplify the racial and gender biases baked into the historical data — have the potential to proliferate broadly through the entire process of admitting students, distributing financial aid, grading, recommending classes or majors and surveilling students for Covid-19 or other risks. This doesn’t bode well for kids who happen to be born in places with failing schools, poor health care, toxic environments and few successful college graduates.
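To make the mechanism concrete, here is a deliberately simplified sketch, with invented data and categories rather than any vendor’s product: if past students from under-resourced neighborhoods graduated less often because of the conditions around them, a model fit to that history hands the historical base rate to every future applicant as a score.

```python
# A deliberately simplified sketch of bias propagation. The data and
# categories are invented, and no real vendor model is this crude,
# but richer models can encode the same pattern through proxies.
from collections import Counter

history = [
    # (neighborhood, graduated) drawn from past cohorts
    ("well_resourced", True), ("well_resourced", True), ("well_resourced", False),
    ("under_resourced", True), ("under_resourced", False), ("under_resourced", False),
]

graduations, totals = Counter(), Counter()
for neighborhood, graduated in history:
    totals[neighborhood] += 1
    graduations[neighborhood] += graduated  # True counts as 1

def predicted_success(neighborhood: str) -> float:
    """Score a new applicant by the historical base rate of their
    neighborhood: the past gap becomes the prediction."""
    return graduations[neighborhood] / totals[neighborhood]

print(predicted_success("well_resourced"))   # ~0.67: inherits past advantage
print(predicted_success("under_resourced"))  # ~0.33: inherits past disadvantage
```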

The companies and institutions that deploy high-stakes algorithms tend to rely on plausible deniability — if they don’t know what’s going on in the black box, they can’t be held responsible. That’s not good enough. All algorithms should be seen as untrustworthy until proven otherwise. Until we as a society acknowledge this, and insist on the transparency required for the public to assess reliability and fairness, we’re not ready to use them.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Cathy O’Neil is a Bloomberg Opinion columnist. She is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. She founded ORCAA, an algorithmic auditing company, and is the author of “Weapons of Math Destruction.”

For more articles like this, please visit us at bloomberg.com/opinion

©2020 Bloomberg L.P.
