Ethics for Programmers, not Philosophers
Mateo Montoya
Feb 25
Moving ethics out of the ivory tower and into cyberspace

Lessons from a post-industrial world: specialize, specialize, specialize.
In this frenzy to specialize, young professionals are encouraged to focus on niche problems in discrete fields.
This expertise, we are told, earns us a seat at the table whether in academia or industry.
Increasingly, we are seeing the detrimental effects of this trend on professional practice.
The separation of ethics from practice is one that is quietly accepted.
Philosophy and related departments discuss ethics in a professional setting, systematizing ethical frameworks that draw from myriad intellectual sources.
We are told that we can choose from deontology, consequentialism, or virtue ethics — each carries its own intellectual history which should be studied in order to get at its nuance.
Indeed, committed philosophers often eschew general terms such as these, limiting themselves to the study of one or, if they are daring, two previous philosophers, teasing out the stakes and implications of their arguments.
The study of ethics, then, rightfully consumes the effort and develops the expertise of these students who preserve and invigorate the tradition.
This important practice, however, largely fails to reach the audience required to make ethics a part of the everyday.
Similarly, in the world of technical education, specialization also runs supreme.
Students of computer science, statistics, or the emerging field of data science take a few introductory courses which give them breadth, and then are expected to funnel into certain specialities: machine learning, network security, data analytics, etc.
Put your nose to the grindstone, focus on one subject, and there will be a high-paying job at the end of the tunnel.
Ethics is relegated to perhaps a single class requirement which gives an overview of the various broad ethical frameworks, and attempts to get students to reflect on their fields in those views.
What’s the problem?
Even in these courses, I believe that ethics is too detached from practice to make an impact on a practitioner’s consciousness and, well, practice.
It is nice to think of what should happen, but when that knowledge is not synchronized and embedded within technical education, it can be hard to imagine how to apply it — or what the benefits of learning ethics even are.
What happens when ethics is divorced from practice?
In the early 2000s, Northpointe (now Equivant) created COMPAS, a polarizing algorithm that claims to predict the recidivism risk of inmates and of defendants at bail hearings.
It was, and still is, widely used to influence decisions in courtrooms — with journalistic watchdogs such as ProPublica reporting that it exhibits concerning racial bias.
In 2014, the Facebook Emotional Contagion Experiment was published in PNAS, a respected scientific journal: Cornell researchers analyzed data from a Facebook experiment that filtered positive or negative content out of users’ feeds to determine whether the manipulation affected the emotional tone of their subsequent posts.
Without following any of the requirements of human subjects research guidelines, private social media companies continue to rapidly expand psychological testing on unknowing and unconsenting subjects.
In 2016, Cambridge Analytica and others obtained the personal data of hundreds of thousands of Facebook users, targeting political ads at select groups and, in so doing, potentially swaying the results of a U.S. presidential election.
Much has been made of Russia’s role in the 2016 election, but just as much should be made of an American platform that profits on targeted advertisements and is complicit in the practice of willful misinformation.
These well-known cases are symptoms of a much wider and deeper problem: ethics does not play a sufficient part in the calculus of large tech companies, to the detriment of citizens and, ultimately, of the companies themselves.
More concerning than the companies’ failure to meaningfully change their practices after substantial public and private pressure is that people’s outrage, and their desire to change circumstances, seem to have no real outlet.
There is no legitimate alternative to Facebook, to Instagram, to Snapchat, or to any of the big name companies that make their profits from user data (Facebook makes over $25 per user per quarter in the US).
Compounding this, US law lacks any real teeth to tackle the privacy issues now being discussed, and the US Congress seems unable to comprehend how the Internet, let alone Facebook, functions.
In short, people are beginning to feel the heat of large ethical issues that have been discussed in universities for at least a few years now, but we are seeing little change outside of the ivory tower.
The response, then, is to dive into these issues with greater intensity, greater urgency, and with a larger audience.
Ethics in 2019 is just as important for the programmer as it is for the philosopher, if not more so.
To answer the call, we must consider the role of programmer, as programmer, to include ethical questions.
We must provide programmers with an ethical toolkit to draw from, and have them practice applying ethical lenses to their projects and technical work.
We must create more opportunities and spaces for students and eager professionals to take risks and explore more ethical possibilities for tech platform profit schemes, study designs, and start-ups.
We must combat narratives and anxieties that ethics are not profitable by treating budding ethical ideas in the technical world with the reverence, respect, and investment required to get them off the ground.
We must demand that existing companies offer ethical alternatives, with fiscal trade-offs where required.
The privilege of existing within the protective walls of the university also comes with the responsibility to develop audacious ideas, construct positive incentive structures in the world, and challenge the current arc of the technical ecosystem that today bends towards exploitation and alienation.