Agree, and while folks have brought this up time and time again, a few CS Ethics folks (myself included) actually argue that in addition to this, ethics should be embedded *throughout* all STEM courses, not just treated as a one-time / one-off course that's taken separately.
A lot of STEM/CS programs do require some sort of ethics course, but it's often too general, too abstract, and too far removed from the particulars of what they study. It ends up being a course that a lot of students don't take seriously (and sometimes for good reason).
Having worked for a few years in the CS Education space, it's been really interesting to see how these sorts of conversations have evolved. There are definitely pros and cons to the one-course vs embedded ethics approaches!
For example, a lot of CS professors themselves just don't have the training, materials, resources, or interest to teach ethics in their courses. That's itself a kind of political choice -- many CS professors genuinely believe that CS systems are morally neutral, for example.
In my last job, I was tasked with writing an accessible AI curriculum from scratch. One of my goals was to embed ethics throughout all the topics, not just as a separate lecture of its own. This turned out *extremely* well.
Examples: When we got to the chapter on the importance of training data, it was the perfect time to talk about data privacy and bias. When we got to neural networks, it was a perfect time to talk about "black-box" algorithms and accountability.
What I realized while developing my curriculum with ethics at its core is that the technical examples help motivate the ethical questions. It makes them extremely concrete and allows students to understand the ethical implications on a practical and technical level.
In developing this work, I looked at a ton of existing CS ethics courses, by the way. If you haven't seen it before, the spreadsheet that @cfiesler started is *the* document to look at, with currently 273 courses listed from universities across the world.
For any of the folks who saw OP's tweet, or are reading this thread but unfamiliar with the work in CS ethics, the spreadsheet linked in the article above is something I highly recommend checking out. Esp. if you're a professor hoping to start your own CS ethics course!
Btw, the curriculum I developed was specifically for AI (though we covered CS fundamentals, just enough to work on some AI things), and it was mainly for *high schoolers*. I fully believe we can teach CS (& more generally, STEM) ethics to high school students! And we should!
In fact, one of the "Big Ideas" in the AI for K-12 guidelines is Ethics & Societal Impact -- the guideline treats ethics as something that should be present across all AI teaching, not just as a standalone topic. K-12 educators believe in this approach.
Here are the 5 "Big Ideas" in the AI for K-12 guidelines, in PDF poster form. Notice "Societal Impact" at the foundation, linked to other Big Ideas. (Disclaimer: I'm an advisory board member of the initiative, who... honestly needs to do so much more 😅)
Anyways, my main point being: we *can* teach ethics embedded within technical CS/STEM courses, it doesn't make the content any less technical, and in fact, it makes the issues even more compelling and grounded. It works better and has a deeper impact, imo.
And additionally, we *can* teach STEM ethics before college/university. I've had the great honor of working with high school students across North America, and the social implications & ethics discussions aren't just understandable, they're memorable for the students.
Importantly, for those who haven't heard of it, there's also a field called STS (Science, Technology, and Society studies) that focuses on how society and culture affect our technologies & how our technologies affect society and culture in return. Lots comes from STS!
Also, taking an ethics course is decently different from discussing tech ethics (or ethics specific to any STEM field). Like, there's a huge difference between talking about Kant's categorical imperative on its own and discussing it in relation to Asimov's Laws of Robotics.
Another reason to embed ethics within STEM courses is that it allows for specificity. STEM courses carry specific domain knowledge, while an ethics course from the Philosophy department might actually push STEM folks away from thinking about ethics in their own work.
Anyways, as a final note, when I say "ethics" above, I really mean ethics + social implications + STS + critical race theory + gender studies + so much more, because honestly, all of that should be discussed in STEM courses.
It's valid and valuable to discuss things like ethical systems (utilitarianism, virtue ethics, deontology, etc.), but that shouldn't exclude key topics like race and gender (which we know have huge life-or-death impacts in fields like medicine and AI).
But again, this makes things incredibly difficult from a logistics perspective. Is it possible for STEM teachers across the world to teach all of this in addition to the core content they're already trying to fit into their course? It's a non-trivial obstacle.
At least with a standalone ethics course, you can have specific "CS ethics" or "Medical Ethics" courses that rely on a textbook or pre-established standalone courses. Trying to change *every* STEM class that's taught to embed ethics? Hugely difficult (for the teachers).
But my hope is: if we discuss ethics as a core, *technical* skill that's foundational to STEM (and not distinct from it), we'll train future generations to think and teach with it already fully embedded into the field. And I'm extremely hopeful, seeing the work of STEM educators today.
Ok, I know I missed some stuff with this thread, but I also can't keep it going forever 😅 Just know that for any STEM field, there are some great educators doing ethics/society/critical work (often women and non-binary folks of color leading the charge) -- look for them!!
Also! Now that I'm finally here... within CS/AI specifically, there's been a huge shift over the last year to focus on *justice* and not just *ethics*, which is a necessary reframing to address technology and its role in things such as systemic oppression.
This is more on the radical side of things apparently, but as part of all this, we gotta talk about social justice, we gotta talk about tech and labor, we gotta talk about **power**. A lot of existing stuff misses these, and imo, *that's* the issue.
Folks to look for: Dr. Safiya Noble, Dr. Shannon Vallor, Meredith Whittaker, Dr. Mar Hicks, Dr. Casey Fiesler, Dr. Cathy O'Neil, Dr. Sarah Roberts, Dr. Meredith Broussard, Dr. Ruha Benjamin, Joy Buolamwini, Dr. Madeleine Clare Elish, Dr. Genevieve Bell, Dr. Arvind Narayanan, ...
There are so many!!! Great work from AI Now, Data & Society, FAccT*, NYU's Critical Race & Digital Studies, UCLA's C2i2, The Markup, Algorithmic Justice League, and just hundreds of folks on here who I wouldn't be able to finish naming if I tried.
That's it, that's the thread. It's the middle of the night, so unfortunately not too many folks will be able to see this thread, but the "STEM/CS needs required ethics courses" tweets surface once a month, and I finally wanted to say a lil bit this time around. 😅
Here are some book recommendations that have informed how I think about social implications / ethics / critical race theory with regard to CS/AI, and I also highly recommend the forthcoming anthology, "Your Computer Is On Fire," which will be *essential*.
And instead of watching "The Social Dilemma" on Netflix, you *absolutely* must watch "Coded Bias", which covers some of the most crucial questions in AI today, around facial detection & recognition, surveillance, oppression, and so much more.
I've been giving talks and teaching in this space since 2016, and I'm happy to talk about the education stuff as well as the technical stuff! I have a background in CS, AI, and STS & am a teacher at heart. Happy to answer any questions or expand on any topics the best I can!