Ethical Development: The responsibility of software developers in society
“May you live in interesting times”, or so goes the supposedly ancient Chinese curse. It certainly seems applicable to the past few years, arguably even decades. Technology marches on apace, creating a razor-sharp double-edged sword that is capable of holding governments to account, yet also capable of slicing into our personal privacy and stifling dissent. Social media has moved on from an innocent way of staying in touch with remote friends, encouraging collaboration, tolerance and discussion, and has been weaponised to influence opinions, support corrupt organisations, and manipulate the Overton Window. We now have wannabe dictators attempting to force online platforms to treat speculation and lies the same as undisputed truths. We even have leaders of hugely influential companies abdicating their social responsibility to act against bad actors, allowing their platforms to be taken over by fascist and racist agendas, presumably to protect bottom-line profit. It is now possible, more than ever, for governments to harvest information on their citizens and profile them - something which we know historically is a Very Bad Idea™ if it falls into the wrong hands[1].
Yet one thing binds all of this together. One fundamental thing is needed to realise all of this potential, for both good and evil: software. And software is written by people.
This might be an unpopular opinion, but I truly believe software developers need to raise their heads above their monitors and take a long hard look at the effect their craft is having on society. Some already are - there have been notable, vocal resignations and acts of non-cooperation inside Facebook recently, following Zuckerberg’s refusal to shut down fake news (incidentally, refusing to take down deliberate, malicious fake news is not “arbitration of truth”. It is blatant cowardice, in my opinion. But the nature of fake news and what to do about it is a discussion for another day, not here. However, do look up Popper's Paradox of Tolerance). Google was recently forced by its own employees to stop developing weaponised technology. But this behaviour needs to be more widespread. It has to be, in order to maintain a liberal, open, welcoming society based on mutual respect and equality.
As software devs, we all need to become more aware of the wider impact of what we do, and how it affects everyone in society. Unfortunately there is one BIG problem. For as long as I can remember, software developers in general have suffered from a degree of naivety. There is an innate innocence. It is almost as if we can create hugely sophisticated software and solve ridiculously complicated problems, yet have a massive blind spot for how they could be used. We trust everyone to play by the rules. For example, a lesson from history:
In the beginning (seriously! Late '60s) Donald Davies, helped by Roger Scantlebury and others, developed a technique called 'packet switching', the backbone of modern internet protocols. It was designed to be resistant to attack: if sections of the network were, say, blown up (this was the height of the Cold War, remember - nuclear strikes were considered a distinct possibility), traffic would route around the damage and recover. Much, much later the inventors realised that this was both its strength and its weakness, and that the architecture really needed stronger authentication, but by then it was ubiquitous and too difficult to change. Why was that mistake made? "I knew everyone on the early internet. They would not misbehave because the community would not allow it to happen"[2].
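That "route around the damage" property is easy to see in miniature. Here is a toy sketch (my own illustration, not Davies' actual design - real packet switching involves routing protocols, not a single breadth-first search): model the network as a graph, find a path, then knock out a node and watch the traffic find another way round.

```python
from collections import deque

def shortest_path(graph, start, goal):
    """Breadth-first search: return one shortest path from start to goal, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# A toy mesh network: each key lists its directly connected neighbours.
network = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

print(shortest_path(network, "A", "E"))   # → ['A', 'B', 'D', 'E']

# Now "blow up" node B and route around the damage.
damaged = {n: [x for x in nbrs if x != "B"]
           for n, nbrs in network.items() if n != "B"}
print(shortest_path(damaged, "A", "E"))   # → ['A', 'C', 'D', 'E']
```

Note what the sketch also shows: nothing in it checks *who* is sending the packets. The resilience comes for free; the authentication never existed.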
This pattern of naive trust has carried on into the modern-day internet. Another notable incident happened around 1994. Usenet (the predecessor to modern-day social media) was self-policing, with the more experienced users helping new users learn correct "netiquette". Then a lawyer, Laurence Canter, realised that he could advertise his services across all groups at once. And so spam was born....
And do I really have to mention Facebook and the Cambridge Analytica scandal?
Developers are undoubtedly fantastic at writing software and solving problems, especially when it is an intellectual challenge. But generally we do not consider the potential, unexpected uses for our solutions. Or if we do, we don't act on it, making us complicit in the damage. It might seem fun to break into, say, an iPhone. Or collect all the data about everyone. It might even seem superficially necessary for all kinds of possible reasons. But what happens when organised crime gets hold of the ability and grabs your banking login? Or when the government uses the software to decide whether you are speaking to the "wrong" people? Or you are refused medical insurance because data analysis suggests you might be a bad risk? It is these side effects that can matter far more at a human and societal level.
Paraphrasing Dr Malcolm in Jurassic Park:
“We are so preoccupied with whether we can, we don’t stop to think whether we should.”

So how can we become more aware?
One idea that has been bubbling under for years is a binding Code of Ethics for software development; an ethical oath very much like the Hippocratic Oath that defines the moral framework for the conduct of doctors and other healthcare professionals. I am no stranger to the concept - my career started in electrical and electronic engineering, which led me to join the IEE (now the IET), which sets well-defined expectations on how to behave professionally. As such, I have chosen to be voluntarily bound by a code of behaviour and ethics for decades (my professional registration as a Chartered Engineer depends on it), and it forms a strong part of my values. I strongly believe ethics help define me as a professional software engineer. As a result, there have been a number of projects I have consciously chosen not to work on.
So what could a Code of Ethics look like? Here’s a pretty good one from the IET magazine from back in March 2012:
[Image: Code of Ethics, IET Magazine, Engineering & Technology, March 2012]
The IET, my current professional institute, has Rules of Conduct covering these matters, including:
29. Persons in any category of membership shall at all times uphold the dignity and reputation of their profession, act with fairness and integrity towards everyone with whom their work is connected, and towards other members and safeguard the public interest in matters of health, safety, the environment and otherwise.
The Engineering Council (through which I am certified) and the Royal Academy of Engineering also define a Code of Ethics, including the following:
• be alert to the ways in which their work and behaviour might affect others and respect the privacy, rights and reputations of other parties and individuals
If you are BCS registered, they have one too...
I think we can all agree that being guided by principles such as these would be a good idea, and would help our industry become better respected. Not to mention making the world a better place. Whether we need to make it a legal requirement for practising software development is a discussion for another day.
Applying these principles is not just a nice-to-have. For me, and many others, they are a fundamental part of being a professional engineer, a necessary check and measure that differentiates a professional from a hired hand who blindly does anything asked. Who would you rather designed and built a road bridge? A civil engineer who is required to advise you correctly or lose her effective licence to operate, or a yes-man who just nods along, does what he's told and cashes the cheques?
I would encourage everybody to read these Codes of Conduct, and reflect on how they apply to what you are working on right now. Would an ethical code such as these make you uncomfortable about anything you are doing, or working on at the moment? Why? How can what you are currently doing affect wider society? Can it be used or manipulated in unexpected ways to produce undesirable results?
What are you going to do about it as a professional engineer/craftsperson?
[1] Before WW2, the Netherlands kept detailed population registers, which included religious affiliation. Needless to say, when the Nazis invaded, the subsequent persecution was made much easier by these records: they eliminated a far higher percentage of the Jewish population there than in comparable, presumably less well-documented, countries.
[2] Aleks in Wonderland, Radio 4, interview with Roger Scantlebury.