New Ethics Code Urges Tech Firms and Coders To Avoid Harming Society

Selling a new Web-connected thermostat or other wired gizmo to consumers without a plan to deliver the necessary security patches is not only bad business—it’s unethical. So is failing to challenge a law or tech company rule that governs work on technology products, if that rule causes unjustifiable harms to people or the environment.

The Association for Computing Machinery (ACM), a New York-based trade group founded in 1947, last updated its ethics code in 1992, around the time when today’s twenty-something tech startup founders were being born. Those babies entered a world where the commercial Internet was also in its infancy, with few home computers connected to a network. Yet to arrive were social media, e-commerce, widespread GPS tracking, rampant network hacking, bots, trolls, artificial intelligence, and the proliferation of wired cameras on store fronts, house entryways, and family cars.

It was time for ethics to catch up, says Don Gotterbarn, one of the leaders of an ACM committee that surveyed the international association’s nearly 100,000 members to formulate a longer and more detailed set of principles for computer professionals. While the commercial uses of computer programming began with simple business efficiency tools like making interest calculations for banks, Gotterbarn says, software and digital devices now pervade financial, social, and political activity—profoundly changing society.

“Code is involved in every aspect of your life,” Gotterbarn says.

It would be natural to suppose that the ACM began its ethics code revamp in response to the turbulent issues that have been swirling around the tech sector in recent years. Not only did cyberattacks continue against big corporate targets, but concerns also emerged about the easy hackability of connected devices that are flooding into U.S. homes, from smartphone-operated surveillance systems to dolls that converse with kids. Individual companies came under criticism as well. Uber was accused of turning a blind eye to sexual harassment, of mistreating its drivers, and of purloining the intellectual property of Waymo, an autonomous car unit of Google’s parent company Alphabet. Then Facebook was hauled before Congress to explain how the personal information of millions of its users ended up in the hands of consulting firm Cambridge Analytica, which assisted Donald Trump’s presidential campaign—now under investigation for alleged coordination with Russian operatives to influence the U.S. election outcome.

The ACM, however, didn’t start on its new ethics code to address those specific incidents, Gotterbarn says, but rather to keep up with the rapid pace of technological change in areas such as cybersecurity and artificial intelligence. The long-term aim was to influence programmers and tech company leaders to make a routine practice of recognizing the societal impact of advancing technology, and to mitigate its harms. To that end, some of the more specific sections of the new ethics code cover cutting-edge technologies such as machine learning.

The new ACM principles urge technologists to take “extraordinary” care to flag and reduce possible risks in machine learning systems, in which software learns from experience and modifies its own behavior as it carries out tasks—without needing to be re-programmed by a human being.

“A system for which future risks cannot be reliably predicted requires frequent reassessment of risk as the system evolves in use, or it should not be deployed,” the ACM code advises.

The same careful monitoring is urged for products that become broadly integrated into essential human activities, such as travel, government, and healthcare. That ethical responsibility grows as the societal influence of the technology increases, the ACM guidelines say.

“When appropriate standards of care do not exist, computing professionals have a duty to ensure they are developed,” the ethics code states.

The code’s guidelines on protecting the privacy of personal data are more detailed than ACM’s privacy provisions in 1992, and they intentionally track with the European Union’s stringent General Data Protection Regulation (GDPR), Gotterbarn says. (The E.U. began enforcing the GDPR rules in May.)

For example, the code says that tech companies should collect only the minimum amount of personal information necessary, protect it from unauthorized use, and give users a choice about those permitted uses. Transparent policies should disclose what data is collected, provide users the opportunity to give informed consent, and allow them to review, correct, and delete their personal data held by companies, according to the new ethics code.
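The collection practices described above—gathering only the minimum personal data needed and refusing uses the user never agreed to—can be sketched in code. This is an illustrative sketch only, with hypothetical names (`collect_user_data`, `ALLOWED_FIELDS`); it is not drawn from the ACM code or any real compliance library:

```python
# Hypothetical sketch of two privacy principles the code describes:
# data minimization and informed consent, enforced at collection time.

ALLOWED_FIELDS = {"email", "display_name"}  # the minimum the service needs

def collect_user_data(submitted: dict, consented_uses: set, requested_use: str) -> dict:
    """Keep only whitelisted fields; refuse uses the user never consented to."""
    if requested_use not in consented_uses:
        raise PermissionError(f"user did not consent to use: {requested_use}")
    # Data minimization: drop anything outside the whitelist.
    return {k: v for k, v in submitted.items() if k in ALLOWED_FIELDS}

record = collect_user_data(
    {"email": "a@example.com", "display_name": "Ada", "location": "somewhere"},
    consented_uses={"account"},
    requested_use="account",
)
# "location" is discarded; only the whitelisted fields are stored.
```

A real system would also need the review, correction, and deletion rights the code calls for, which this fragment does not attempt to show.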

Similarly, the new code has expanded its safeguards against workplace bias and abuse. Back in 1992, the ACM’s ethics code contained strictures against discrimination on the basis of gender, race, and other categories. But today’s language includes a broader range of protected groups. For example, it now covers an individual’s self-definition relating to gender.

“Prejudicial discrimination on the basis of age, color, disability, ethnicity, family status, gender identity, labor union membership, military status, nationality, race, religion or belief, sex, sexual orientation, or any other inappropriate factor is an explicit violation of the Code,” the guidelines advise.

Sexual harassment and bullying in the workplace are also called out as unethical forms of discrimination.

Other provisions require tech companies and their staffers to make technology accessible to people hampered by physical disabilities or other constraints; to make sure their products are shielded by robust security measures; and even to consider becoming whistleblowers if leaders fail to eliminate or minimize unacceptable harms that are caused by computer systems.

ACM’s new code of ethics is more detailed than the standards laid out by IEEE, another large technology trade group based in New York. IEEE has both a Code of Conduct and a Code of Ethics that members pledge to follow. Both cover topics touched on by the ACM code, as well as provisions on avoiding conflicts of interest and bribery.

What about “fake news”?

Some of the thorniest ethical questions facing tech companies and the public right now are not specifically addressed in detail in the new ACM code. For example, how should a company react if its social communication channels are being used to spread “fake news”?

Gotterbarn points to several general principles in the current ethics code for guidance. First, the code …