Signs of Triviality

Opinions, mostly my own, on the importance of being and other things.

Ethical Obligations in Internet Operations

October 13th, 2015

The following is (more or less) the talk I gave today at Velocity NY 2015. The slides are also available here; the results of my survey leading up to this talk are available here. O'Reilly also has a video of the session available here.


Thank you for coming! I'm very excited to be here and to get to talk about a topic that I think is really important: drug abuse by medical practitioners. No, sorry, actually, it's about ethics in game journalism. Ugh, no, wrong again. Actually, it is about ethics, but in Internet Operations. But you already knew this, since you came to this talk. Thank you! Let's start said talk.

This talk is different. It contains no memes. That was difficult. This talk also does not include the words "docker", "containers", "introvert", "imposter syndrome", "dunning-kruger", "blameless post-mortem", or "microservices". My apologies.

This talk is different. You don't get a long list of my various and impressive accomplishments and professional affiliations. I'm not speaking on behalf of any current, former, or future employer.

"I'm from the Internet, and I'm here to help."

The only thing I'll quickly tell you about me is that you can reach me on Twitter via @jschauma, and I'll be happy to follow up on any discussions or comments you might throw at me. If you're not on Twitter, that's ok, too. You can email me, if you like.

This talk is different. There will be no "one more thing". I'm aware that I am not Steve Jobs, which is one of the reasons why I do not own a black turtleneck.

Another thing I'm aware of is that most of you are likely to only pay attention at the beginning of the talk and then stop listening and stare intently at your phones or laptop screens, seeing what you might be missing out on in the other talks. But because I think this topic is important, I'm going to quickly give the most pertinent point, a summary -- I refuse to call it a "tl;dr" -- right away:

We are stewards of our users' data.
We are obligated to act in the public interest.

This is the one thing I want you to take away from this talk: we are stewards of our users' data; we should strive to act in the public interest.

There, you can leave now. Well, I'll try to give you a bit of context over the next 20 minutes or so, but that really is the essence. I have no lessons to teach you, and the subject matter requires you to make up your own mind instead of being spoon-fed solutions. Again, I apologize: I know that's work.

You will find that this talk bounces back and forth between two topics: the definition of our profession, and the quest for strong guidelines that help us navigate ethical dilemmas. These two topics are intricately entwined: without understanding our profession, we can't define any useful ground rules to guide us. But I'm getting ahead of myself.

Oh, and one more thing: Everybody lies.

Which is not necessarily unethical. Ethics are tricky. In Ethics, intent counts, but the ends don't justify the means. Especially in a profession that is not well defined. Or on TV.

Take 'House' for example. Misanthropic, drug addicted, constantly flouting any available rules, regulations, laws. It's almost as if he worked in Infosec. Anyway, entirely unethical. Great entertainment, shitty role model.

First, do no harm. This is what every med student is taught, what every hospital show on TV references. It is perhaps the most recognizable morally binding principle of any profession.

It seems like a good idea, too. If you practice a profession in which you are responsible for the lives of others, having a simple ground rule like this should help practitioners of said profession make difficult decisions.

There are other professions which have legally binding Codes of Ethics, and violations of these rules can lead to severe consequences, including legal repercussions or a loss of a license to practice said profession. The Hippocratic Oath, the American Bar Association's Model Rules of Professional Conduct, or the American Society of Civil Engineers' Code of Ethics are examples of such self-regulating or self-policing organizational definitions.

Our profession... is different. It is entirely uncontrolled, unlicensed, unregulated: anybody can call themselves a "software engineer" or "systems architect".

This is reflected in how we select employees: requirements are fluid, formal credentials optional. To be honest, we don't really know exactly what our profession is.

Leading up to this talk, I posted a survey with a few questions relating to 'Ethics in Internet Operations', and the results have been very interesting and encouraging for me. Show of hands: who here in the audience filled out this survey? Well, anyway, thanks to all who did.

So I asked, among other things: "Does your profession have a Code of Ethics?" The majority of people responding did not consider their profession "well defined"; half of you said that your profession either doesn't have a Code of Ethics or that you simply don't know.

But there are plenty of professional organizations, complete with Codes of Ethics for their members. It just seems that the majority of WebOps, SysAdmins, SREs, or software engineers have never heard of them.

Ok, another show of hands: who here is a member of a professional organization? (DevOps Anonymous does not count) ACM, IEEE, Usenix, perhaps?

Again, the majority of people who responded to the survey were not members. And that is despite the fact that the survey was promoted on the mailing lists of exactly those organizations. It's also worth noting that the survey -- much like this audience -- is not fully representative of all practitioners in our profession (or professions, as the case may be).

Our profession is not well-defined. But not having a well-defined job is actually quite wonderful. It allows for people from all walks of life to join us in building the internet and all the amazing things around it. It lowers the barrier to entry, and avoids elitist money walls. System Administration, for example, has long eschewed a formal definition as much as a career path or common, required education. In a field that reinvents itself every couple of years, having an expensive, formal, lengthy education and licensing process would be fatal.

And yet, without any formally required expertise, common body of knowledge, without a formal definition of our profession, we are the ones who are running this show. This is amazing.

We run the DNS.

We run the peering points. (This graph comes from the New York International Internet Exchange, a peering point housed here in NYC at Telehouse, near the Highline.) We run the cables on the ocean floors as shown in today's keynote.

We build the OS and the apps running on the various personal tracking devices voluntarily carried by most people.

We run the infrastructure and the services that allow the mapping of relationships between human beings (and probably a fair share of cats) to an amazing degree. (This graph from Facebook.)

How cool is that?

Here's what else we do: we run the companies that collect all of our users' data. We build software that directly or indirectly influences people's lives. We run the cloud.

And we actually know that there's no such thing as 'The Cloud', only other people's computers.

And of course we know that all these things we build are constantly under attack. We know that this house of cards we're building is this close to tumbling down. We know that script kiddies and nation state attackers alike are after our users' data.

Which is why we also build tools to defend against the constant attacks. We build and run services for users whose lives depend on all of us not fucking things up. And the software we create, the services we maintain, are neither good nor evil; software is -- by and large -- neutral, and can be used for good or bad.

And yet, somehow, we keep thinking that it's a good idea to put computers and the internet on other things. Like...

...refrigerators... (shout-out to @internetofshit)

...or cars. Which are then hacked.

...or guns. "These apps interact with embedded wifi servers..." - what could possibly go wrong?

When we do these things, are we acting as good stewards of our users' data? Are we acting in the public interest? Are we ensuring that we are not putting our users or the infrastructure of the internet at risk? Are we ensuring that we first do no harm?

We've been building Skynet for a long time now. The road to Skynet is paved with civilian unicorns. Did you know that Cyberdyne's original marketing slogan was "Make the world a better place"?

Many companies have 'core values', 'corporate mottos', or morale-boosting slogans and mission statements. But note that 'primum non nocere' does not mean "first, disrupt urban dogfood delivery". It does not mean "first, increase shareholder value", either.

"First do no harm" is powerful because it explicitly includes the possibility of unintentional harm as a side-effect. "Don't be evil", which many of you may have heard as an example of a corporate motto, is significantly more dismissive here. For starters, "evil" is subjective.

The funny thing about most companies is that at the end of the day they remain beholden to outside pressure to act a certain way. Companies are organisms with a self-preservation interest, and no matter how noble their slogans, at some point ethical conflicts are inevitable.

In my own survey, almost 70% of you reported having encountered ethical conflicts in your career, and the examples you gave ran the gamut from snooping on others' emails to being ordered to cook the numbers, to enforcing NSLs, to, and this is a quote, "building Skynet and Big Brother".

"This is a daily thing."

So most of you are intimately familiar with ethical dilemmas, yet we have no guiding principles beyond our own personal moral compasses.

You may consider consulting internally with HR or your legal department, but, and it's easy to forget this, HR and Legal do not have your interests at heart, nor do they primarily care about your users. Their job is to ensure that the company is not liable.

Legal draws a line for you. A line that tells you what you must not do. And as a result, when people ask Legal for advice, they often take the response as a recommendation to tiptoe as close to the line as possible. But ethics is not black and white. Something that's legal may well be unethical.

When you tiptoe all the way up to the line, interpreting rules or laws by the letter, then you may find yourself in a situation like this:

Remember Volkswagen? Nice cars. Great mileage. Clean diesel.

Think about this: at some point, somebody at VW went to the engineers and said "please have the software detect when emission tests are being performed, and then lower the emissions".
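
To appreciate how mundane this looks from the engineer's chair: the logic in question is just a few lines of code. Here's a purely hypothetical sketch in Python -- the actual VW code was never published, and every name, signal, and threshold below is made up for illustration:

    # Hypothetical "defeat device" sketch -- NOT the actual VW code.
    # All names, signals, and thresholds are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Sensors:
        wheel_speed: float     # km/h
        steering_angle: float  # degrees off center
        lateral_accel: float   # m/s^2

    def choose_engine_mode(s: Sensors) -> str:
        # A dynamometer test is easy to detect: the wheels spin, but the
        # steering wheel never moves and the car never corners.
        on_test_stand = (s.wheel_speed > 0
                         and s.steering_angle == 0
                         and s.lateral_accel == 0)
        # Run clean only when someone is watching.
        return "low_emissions" if on_test_stand else "performance"

    print(choose_engine_mode(Sensors(50.0, 0.0, 0.0)))  # low_emissions
    print(choose_engine_mode(Sensors(50.0, 4.5, 0.3)))  # performance

That's it. There's no skull and crossbones in that diff; it's just another if-statement.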

This is the letter-of-the-law interpretation of passing emissions tests. If you're a lowly software engineer at Volkswagen, and your manager orders you to write that code to pass the goddamn tests, do you think HR is going to be on your side when you speak up?

This is why whistleblowing takes so much courage, and why I think our industry would do well to institute a standing position in every company with the explicit task of watching how the company behaves internally. A watchdog, a public ombudsman, a customer advocate. Somebody who fights for the users.

We need Tron.

We need a formal position explicitly looking out for the public interest. A position to question the direction business wants to take, to ask whether or not something is in the best interest of the users, without fearing retribution or having to think about politics and the power plays involved.

And how cool would it be to have a position where people ask: "Who's that?" and you can answer "That's Tron. He fights for the users."

We have to fight for the users, because our users trust us. This is an explicit responsibility beyond our immediate job duties and reaching further than our company goals. Software eats the world, and data keeps on paying the bill. Users entrust us with their data -- some of it consciously, but most of it unconsciously. (Nobody reads the ToS. I don't, and I'm paranoid.)

Users trust us. Sadly, more often than not we betray their trust by collecting more data than we need, by sharing it with others ("legal said that's ok"), or by not properly protecting it.

Remember how I said that we are stewards of our users' data? The word 'steward' is derived from the Old English 'stigweard', a guardian of the sty. I like this, because it pretty much tells us that data is pig manure, and we're the ones shoveling it from one corner of the sty to another.

Data is a liability. You want less of it, not more. And there we frequently find a conflict of interest between your company's "mission" and the responsibility we have to our users. When the VCs come home to roost, our growth hackers and monetization teams need something to show for it.

The data we collect will outlive any intended use case or goodwill by your company; your company's values and mottos, even if they could never be compromised, do not apply to the profession at large. I'd much rather we didn't have all this pig manure to deal with in the first place.

And data can hurt us. The data we collect can have real-life impact on actual people if it is not protected appropriately. Now it is easy to wag your finger at Ashley Madison and its users, but it's worth remembering that the data leaked from the Ashley Madison hack has the potential to ruin people's lives; several suicides have already been linked to this data having been made public.

The data we collect is dangerous. We are responsible for how it's used, how it's protected. The more data we collect, the greater our responsibility. One year of free credit reports does not make up for identity thieves ruining your credit.

Here's an interesting thing I've come across in preparing for this talk: while our profession is not well-defined, there is a well-defined consensus as to what a Code of Ethics should address. Here's what I've found to be the very first, the most important guideline across a significant number of organizations, the prime directive if you will:

Don't be a dick.

Alright, that's excellent advice, but most organizations try to reach a bit further.

The ACM Code of Ethics begins: "Contribute to society and human well-being."

(ISC)2's Code of Ethics: "Protect society, the commonwealth, and the infrastructure."

IEEE Code of Ethics: "to accept responsibility in making decisions consistent with the safety, health, and welfare of the public"

Australian Computer Society: "You will place the interests of the public above those of personal, business or sectional interests."

This is the primary, the most important guideline various independent professional organizations have agreed upon. And I think that's a pretty good one. At least as good as Wheaton's Law.

I particularly like that it explicitly acknowledges the conflict of interest we may encounter, stressing that it is our responsibility to place the public interest above those of our immediate stakeholders (or shareholders).

We are privileged. We are privileged to have the knowledge and means to influence the world. And this privilege comes with the implicit responsibility to protect the public interest and to place it above all other interests.

The medical profession has the Hippocratic Oath; while not legally binding, violations may lead to malpractice claims. The American Bar Association's Model Rules of Professional Conduct are equally non-binding, and yet we all know the TV dramas in which lawyers face disbarment over ethical violations.

I have not once heard of anybody in Internet Operations or software engineering losing their ability to work -- not just a job -- based on these grounds.

Our profession may be undefined (and certainly seems infinitely complex), but I don't think that should keep us from agreeing on common ethical guidelines. We already do agree, it seems to me. What we're missing is bringing these principles to the forefront, to make them known, understood, appreciated by all of us. To teach them to our young 'uns.

Doing the right thing isn't always easy, but doing the easy thing isn't always right. I hope that with an increased awareness of our privileged position it will become easier and more common to acknowledge our responsibilities beyond those to whomever currently pays the bills.

We are stewards of our users' data.

We are obligated to act in the public interest.

You all can help further this goal. Be Tron. Fight for the users.

Thanks for your time.

October 13th, 2015

