During an in-class case study discussion about algorithms predicting financial risk, an obvious realization hit me. If, as these companies boast, we really can predict nearly to a tee who is susceptible to defaulting on their loans, why would we use that information to deny loans rather than to provide social support and interventions to this target population? The reason is surely tied to the fact that this point in history is particularly characterized by the criminalization and pathologization of poverty. In reality, nobody wants to be poor. Nobody wants to default on their loans. Instead of moralistically blaming people for their missed payments, what would it look like to study and dismantle the barriers to timely payments?
Though I may have been out on my own limb in that regard, we did come together as a class regarding other interactions of the technological, the social, and the political. We ultimately decided that regardless of the inherent profit motive of individual corporations, products must be held to rigorous ethical standards before their algorithms are sent to market. Most importantly, these standards must be set by public officials without a primary interest in the companies' profits, and ideally established by people trained in the social sciences and humanities who could revise the ethical standards regularly.
On a related note, the keynote for the Open Learning Consortium Accelerate 2018 conference revolved around the following video released by the Institute for the Future:
Though I had to miss the keynote because of classes, upon first watching this video, my body was teeming with mixed feelings. The video itself felt very promising, exciting, even inviting. The music, the animations, and the content all spoke to the ideas that those attending an open learning conference would likely be drawn to. The concepts of expanding our understanding of who is a learner, who is a teacher, and what counts as formal education were central to the idea of the edublock.
In navigating my confusion around the topic, Dr. Sample introduced me to the term “silicon snake oil.” The phrase adapts the old practice of peddling snake oil as a cure-all for physical ailments: in Silicon Valley, the snake oil is a flashy new technical product proffered as the answer to all social and political problems. For me, the main takeaway of this conference and this video came from my friend and colleague Annie Sadler: “Students are not a problem to be solved by your next ed-tech product.” What would happen if ed-tech as an industry were less focused on finding the next snake oil solution to students’ problems and more focused on actually working through social barriers to accessible and meaningful educational experiences? I imagine that I’ll be sitting with that question for quite some time to come.