I know most of my followers on here are from the U.S. and Canada, so if you have *any* interest in algorithmic accountability, please look into the A-level test result debacle that is currently unfolding in the UK. https://www.theguardian.com/education/2020/aug/13/who-won-and-who-lost-when-a-levels-meet-the-algorithm
Algorithm-driven classification and prediction risk entrenching what Margaret Hu once called an "algorithmic Jim Crow."
(You can read more in a 2017 article by the same name. Scholarly paywalls apply. Msg me if you want access.) 2/x
This, combined with the fact that these test results impact minors (aka children), a group already disenfranchised from political influence by design, means that test-takers face an even steeper hill if they want to fight this. 3/x
As I've written elsewhere: because algorithms are enmeshed in political decision-making, these technologies claim to offer a vision of 'social good' that can compete with liberal democratic commitments. 4/x
Like, say, building an inclusive society where a child's lifelong earning potential and social mobility aren't predicted by their public school affiliation. 5/x
Just a reminder: the terms 'mathwashing' and 'data fundamentalism' are sometimes used to describe the popular, though incorrect, perception that algorithmic systems are objective. 6/x
(See: Fred Benenson 2018; Kate Crawford 2019)
While I think people are actively pushing back against the idea that "algorithms are objective because they use math," I wouldn't be surprised if some begin arguing that the algorithm can't discriminate against poor students because "math can't discriminate." 7/x
Make no mistake: we are actively engineering the society we want to build.
We can either be intentional about the values we want automated systems to amplify, or we can hum loudly and let our social biases leak into the technical pipeline. 8/x
All of which is to say: the Ofqual algorithm needs to undergo an algorithmic impact assessment (i.e., an algorithmic audit). Like, last year would have been a good time. 9/x
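To make "audit" a little more concrete: below is a minimal sketch of one check an impact assessment might include, comparing downgrade rates (teacher-assessed grade vs. awarded grade) across school types. The data, column names, and grade ordering are hypothetical and purely illustrative; this is not Ofqual's actual pipeline or methodology. 10/x

```python
# A minimal, hypothetical sketch of one disparate-impact check an
# algorithmic audit might run: do downgrade rates differ by school type?
# All data and column names here are invented for illustration.
import pandas as pd

results = pd.DataFrame({
    "school_type":   ["state", "state", "state", "private", "private", "private"],
    "teacher_grade": ["A", "B", "B", "A", "A", "B"],
    "awarded_grade": ["B", "B", "C", "A", "A", "B"],
})

# Map letter grades to a numeric order so we can compare them.
order = {"A": 4, "B": 3, "C": 2, "D": 1}

# Flag students whose awarded grade fell below the teacher assessment.
results["downgraded"] = (
    results["awarded_grade"].map(order) < results["teacher_grade"].map(order)
)

# Disparate-impact style comparison: mean downgrade rate per school type.
print(results.groupby("school_type")["downgraded"].mean())
```

A real audit would go far beyond a single summary table (error analysis, cohort sizes, appeals outcomes), but even a check this simple makes the "math can't discriminate" claim testable rather than rhetorical.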