Engineers and Ethics in Software

A serial tech entrepreneur in Silicon Valley once asked me to design a “social stockade” for his financial services customers. It would lock people out of their social media accounts and broadcast to their friends – via tweets and Facebook shares – when they hadn’t paid a loan. He pitched it to prospective employees as meaningful work that would reduce the cost of loans for the needy.

I was horrified that his product was being built and that many others would likely take the role I was turning down. And he was hardly the first to pitch his “innovation” as providing only good.

Every software engineer I’ve worked with has had a strong sense of personal values and ethics, but the organizations we work for can take actions that are at odds with these. I’d like to highlight a few of the key challenges you’ll face and offer advice for living by your personal values. Most importantly, it’s critical that you think about the impact of your work and consciously set your personal values in advance of inevitable future challenges.

Part of the reason for the mismatch is that software engineers are especially sensitive to the loss of privacy, the centralization of power, and the destruction of open standards and software – all of which can be highly profitable. We envision a world where we are better off with technology (the PC, the dawn of the Internet, cryptocurrencies), not a technological dystopia.

Personal Dilemmas

Engineering ethics are much discussed when lives are directly at risk (aeronautics, construction, automobiles, defense) – but far less consistently in software. Yet small changes in many important software products (text messaging, a newsfeed, search result rankings) can mean the difference between life and death.

Even when they don’t, they dramatically impact the welfare of individual users and our society. This includes everything from dark patterns to the addictive behavior encouraged by our products to the bias encoded into AI algorithms.

Here are examples of personal dilemmas faced by software engineers:

  • Designing a social echo chamber: You design and then code up the world’s best algorithm for determining what content a user wants to consume. It greatly increases time in app – and with it, revenue. It often makes your users insanely happy: they see more pictures of their family and connect more with their friends. But it also surrounds them in an echo chamber of news shared by their friends, one that highlights selective facts and distorts their view of the world (partially inspired by Facebook; a minimal sketch of such a ranker follows this list)
  • Disabling security: Your team of smart, passionate PhDs has powerful technology for comparing and matching images. After a number of false starts on the business side, you engineer a way to generate revenue for your company by serving relevant ads to consumers. But to get there, you have to override existing ads by bypassing your users’ browser certificate system with your own self-signed certificate (inspired by Superfish)
  • Encouraging compulsive user behavior: You’re an engineer at a gaming company building sophisticated games. Your team is composed of brilliant artists, graphics engineers, and storytellers who you love working with. Most of your revenue comes from a small subset of whales – either children or users whose behavior is consistent with compulsive gambling. You then use a range of techniques to coax even more money from these whales, damaging lives and relationships along the way (inspired by Tap Fish and video slot machines)
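
To make the first example concrete, here is a minimal, hypothetical sketch of an engagement-maximizing feed ranker. All names, fields, and weights are illustrative assumptions – not any company’s actual code or algorithm:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_is_friend: bool   # posts from friends get engaged with more
    agrees_with_user: float  # 0..1: similarity to the user's past likes
    novelty: float           # 0..1: how unfamiliar the content is

def predicted_engagement(post: Post) -> float:
    """Toy model: predicted engagement tracks familiarity and agreement."""
    score = 0.0
    if post.author_is_friend:
        score += 2.0                       # friends' posts get more clicks
    score += 3.0 * post.agrees_with_user   # agreeable content gets clicked
    score -= 1.0 * post.novelty            # unfamiliar content gets skipped
    return score

def rank_feed(posts: list[Post]) -> list[Post]:
    # The "world's best algorithm" reduces to a sort by predicted engagement.
    return sorted(posts, key=predicted_engagement, reverse=True)
```

Each line is a defensible engineering choice, yet the resulting feed is an echo chamber by construction: the ethical weight lives in the objective being optimized, not in any single commit.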

These examples are meant to provoke discussion – not to call out organizations. Even though all of the examples above have negative effects, your personal work may be helping your team, and it may be something your immediate users “value”.

Some examples above will face universal opprobrium among hackers (Superfish), while others will generally be cheered (Facebook). In other cases, organizations will face strong anger due to their privacy choices, but many engineers still aspire to work at them. In past generations, we could blame the “suits” – though today, more engineers wear them.

To me, the engineer is the person most likely to trip the ethics circuit breaker in questionable circumstances. They generally have a strong sense of right and wrong, have substantial power in many tech organizations, and don’t always have P&L responsibility. They have an independent streak and are willing to disagree, as they don’t need to fit in. And most didn’t pursue engineering to get rich, and are willing to trade money for other benefits.

Our world is replete with examples of engineers who put personal principles above other considerations. Google’s founding engineers expressed their famous “don’t be evil” mantra – so that people would have permission to consciously ask “Is this evil?” in their day-to-day work.1 Edward Snowden weighed his personal ethics against the goals of his organization, and made sacrifices for his principles. And ignoring such outsize examples, I’ve heard multiple times about a “problem” engineer being the reason a company wasn’t able to engage in morally questionable or illegal activity.

It may seem overly simple to lay these questions at the feet of a single individual or function. Most of the debates I see engineers struggle with should frankly be set at the management, governmental, or societal level. Still, society may be too hard or slow to influence. And many companies’ interests often focus on financial returns to the exclusion of much else.

Personal Value Challenges

I’ll make three observations:

Problematic Problems

By far the biggest mistake we make is not paying attention to the impact of our work (see one engineer’s reflection). Personal challenges will be dressed up as exciting engineering problems, with uneven effects on our users and society. Don’t let an exciting technical problem override thinking about the impact.

You’ll often be shielded from the negative parts of your work: your company will highlight only the benefits, often giving you an inspiring but false mission. Your team will value your work, and your coworkers will generally be good people who, it seems, could never cause harm. And the harm doesn’t come from malevolence. Rather, the effects of their work are far-reaching, and they’re selectively focusing on its benefits from afar.

Just like the abstraction in our codebases, you’ll be decoupled from your users and the effects that your software has on them. You’ll often work at a very detailed level, without seeing the bigger picture. It’ll take effort – and often time – to understand the effects. And when critics call your team/product out, you’ll have to reflect first rather than get defensive.

For most of us, our work will always have both benefits and downsides – and it’s important to reflect on both. Just as you wouldn’t deploy code without understanding its impact, make sure to reflect on the complex changes your work causes for yourself, your team, and your community. And make adjustments when, after “deployment”, you realize your assumptions were wrong.

How would you react if you (or your family) were the one having to deal with the negative effects? How comfortable are you with the tradeoffs you’re making?

Shades of Gray

Real-world ethics entails shades of gray, not clean distinctions between good and bad. Many schools teach engineering ethics and values using clear life-or-death cases like the Space Shuttle Challenger and the Ford Pinto.

For software developers today, a more common example may be the danger of adding highly addictive elements to your game or social media product – even when your users “love” it.

Given how much of engineering is about tradeoffs and understanding complex systems, engineers are uniquely well suited to handle these gray questions. Like much of what we do, though, it takes time and reflection.

Value Distortion

Your personal values will get distorted over your career.

Ethical quandaries get more challenging when you have more to lose and thus less incentive to speak up. Think about how your incentives and principles might suddenly change if you got a promotion or took on new financial responsibilities at home.

Values are also contagious: the people you surround yourself with will subtly influence yours. By choosing to work at any organization, you’re setting your values for years to come.

When many people are involved, there’s a tragedy of the commons, as each person defers responsibility to someone else. And if you won’t undertake an act that goes against your principles, competitors will – a genuinely hard problem, but also an excuse for all manner of behavior.

Setting Your Personal Values

Personal values are personal – so rather than suggest what they should be, I’ll offer advice on the process of setting them:

First, realize that you have power – and responsibility. Our power derives from how valued engineers are in technology and from the important roles we play on our teams. Our responsibility comes from the fact that we create these products – and that we understand them better than anyone else. Just because you’re not the CEO or a team lead doesn’t mean you have no responsibility.

This isn’t to say that everything depends on us alone; frankly, much needs to change in societal norms and management. But we’re one important check.

At NASA, engineering decisions are critically important – and employees will think deeply about the impact their mistakes will have. In many of the great technology companies, decisions may be even more important, even though it may not seem that way. It’s incumbent on you to reflect on the potential changes your work causes.

If you’re junior, it may be harder to have your voice heard – but because you’ve been in the organization for such a short time, you may also be the one most likely to see problematic problems.

Second, spend time thinking about your personal values. List them out – and reflect on how important each one is to you. Why? Which ones are you willing to change or give up, given other benefits? Are you ok with that? Based on the jobs available, which principles are you likely to break?

Crucially, you need to tell people around you about them, and have them hold you accountable (this is a form of the Ulysses pact). This provides a check in new circumstances. Repeat the exercise every few years to see how they’ve changed, as they inevitably will. And to get a sense of what’s to come, talk to your engineering friends about what tradeoffs they’ve had to make.

Third, don’t be ok with the status quo in yourself or those around you. Engineers have a strong sense of morals and are comfortable doing uncomfortable things like speaking up, even when it comes at a profound personal cost.

Think critically whenever you hear something from management, and understand the incentives they face. Have a thoughtful discussion with friends who you think are straying from their personal values. Don’t assume that principles only come once you’re financially secure.

As a counterpoint, many values should not be held dogmatically: if you’re struggling financially, it may be hard to follow through on personal principles. At the same time, you have choices about your personal burn rate that can reduce the likelihood you’ll need to do certain work in the future.

There is value in remaining at an organization that is struggling with its values, as you may be able to influence the dialogue. But if you are in a truly problematic organization or team, you should leave. You should also privately tell engineers inside and outside the company about your concerns – which is especially powerful in an industry where attracting top talent is critical even for the most successful companies.

What personal values are important to you? What issues have you faced in your career related to these personal values? What advice would you have for other engineers?


All opinions expressed are solely my own. Thanks to the following for feedback and warm-spirited debates:

  • Paul Buchheit
  • Josh Cincinnati
  • Siddhartha Dalal
  • Timnit Gebru
  • Eddie Kim
  • Ben Lerner
  • Deyan Vitanov
  • Tony Xiao
  1. Thanks to Paul Buchheit for providing context around the value of “Don’t be evil” in the early days of Google. In his words, “I wanted something that, once you put it in there, would be hard to take out.”