Signs of Triviality

Opinions, mostly my own, on the importance of being and other things.

All Is Not Lost (But We Need Your Help)

January 21, 2014

The following is, to some degree, anyway, the content of a talk I gave on January 20th, 2014, at the New York City @OpenITP Techno Activism 3rd Monday. I work in Information Security and I live in New York; I'm usually rather cynical (but I repeat myself). Here, I tried to scale that down a bit in an effort to encourage others to be a bit more optimistic.

The talk was recorded by ISOC NY; you can find the video here. If you don't have the time to read this page or watch the video, you can quickly skim through the slides here:


I have two daughters, one of them old enough to take to musicals or movies. In the last couple of months, we took her to see "Annie" and "Matilda", and after subsequently hearing the soundtracks a few hundred thousand times, I noticed a stark contrast in the messages the two stories send.

Little Orphan Annie is clearly a story of the Great Depression; "It's the Hard-Knock Life" is its defining number. The moral appears to be to keep a stiff upper lip through all the misfortune, to stick out your chin and grin, and to hang on 'til tomorrow, come what may. Eventually, Daddy Warbucks is gonna bail you out and take care of you.

Matilda, on the other hand, is very different. One of the main themes in the book, stressed further in the musical, is the need to stand up for yourself, to not accept things as given, and the idea that even if you're little, you can change your story. I find this rather uplifting, and particularly apropos of some of the news stories we've seen in the last few months.

Social media and privacy appear to be at odds with one another. The only viable business model for startups appears to be:

  1. Offer some free service.
  2. Collect data about users.
  3. Shove advertising in their faces.
  4. Make it rain.

This is not always explicitly outlined in quite these terms, but it appears to happen more or less automatically and perhaps surreptitiously. Investors provide buckets of money based on the possibility of "monetizing" "big data". There really isn't much "disruption" going on, and the effects of rich, spoiled, white kids building applications to address the needs of other rich, spoiled, white kids have been elaborated upon elsewhere.

We do have a conflict of interest, but I believe that the last few months have made a significant difference in influencing the direction in which we channel our energies. The world is changed.

On June 4th, 2013, we followed the same mundane headlines as we had before. On June 5th, 2013, this story broke:

[Image: June 5th headline]

The world is changed. But it was changed for the better. In the coming months, we heard revelation after revelation about how our privacy is violated, disregarded, and dismissed. The scale of the secret dragnet spying of one nation's intelligence agency on its own citizens, its allies, and the rest of the world is perhaps matched only by the laughable choice of program names and the almost farcical PowerPoint slides: PRISM, MUSCULAR, XKEYSCORE, ..., RAGEMASTER, HOWLERMONKEY, DROPOUTJEEP. The list goes on. And on. On Friday, we learned more details about project DISHFIRE and listened to a predictably disappointing set of "reforms" being suggested by the president.

Given this news, how can I say that the world is changed for the better? Recognized security experts appear to tell even advanced users that the bottom line is to give up and accept the fact that if the Mossad is part of your threat model, "YOU'RE STILL GONNA BE MOSSAD'ED UPON". Replace "Mossad" with "NSA" -- the argument remains the same. But while that article is certainly funny and makes a number of valid points (including the one about properly identifying your threat model), this particular aspect is a dangerous over-simplification that sends the wrong message: just hold still, there's nothing you can do, you might as well enjoy it.

[Image: Matilda]

But the world is changed. It was changed for the better, because now we know. June 4th, 2013, was the darker time. Now we can stand up, we can fight back.

If you sit around and let them get on top, you
Might as well be saying you think that it's OK,

And that's not right.

The world is changed. All is not lost, because it turns out that Newton's Laws of Motion apply in cyberspace as well. (Why yes, I did say "cyber" -- drink!) For every action, there is an equal and opposite reaction. People will fight back. Many already do.

All is not lost. Companies are strengthening their encryption efforts. Cynics will point out that they are only doing so now that the spotlight is on them, and that they should have done so all along. But be that as it may, the motivation here does not influence the end result, which is better security for their customers. There's visible progress.

But we need your help.

Privacy is not about "hiding" things. It is about being able to say "That is none of your business." And that makes privacy your business. Privacy is all of our business. We are in this together, and we need your help. These companies are businesses, as are almost all of the other players in this game. How do businesses work? Businesses are in the money-making business. They like that business. And they want to stay in that business.

Given the necessarily limited resources available to almost all businesses, they will inevitably favor projects that are likely to increase revenue, and stop funding or supporting those that do not. This happens even in companies that do give a fuck about privacy. So we need your help.

As customers, especially those who are in the privileged situation of actually understanding some of the privacy and security implications, you have a responsibility to request the features that are important to you. Lean on these companies to implement privacy-preserving features; demand security features, end-to-end encryption, transparency. And take your business elsewhere if companies continue to ignore these requests. This is necessary because businesses are increasingly metrics-driven, and they attempt to measure the impact of all features.

[Image: Dilbert on metrics]

Unfortunately, this approach can become a trap, especially when it comes to privacy or security. Since we're metrics-driven, we require that anything we do be measurable. As a result, we are often influenced by both observational and confirmation biases in our metrics: we measure that which we believe to be important, and we assign greater value to that which is measured. But not everything that is important can be measured -- metrics in information security, for example, would often require you to prove a negative in order to show measurable success.

It's important to relate what you're doing to actual humans, not just to abstract numbers of users. Consider end-to-end encryption of private messages (such as on Twitter, Facebook, Apple's iMessage, text messages, ...): from a privacy perspective, most of us don't care much that those messages are available to the companies (and, under certain circumstances, to certain governments or criminals). Only a very small number of people are directly affected by this exposure, but for them the impact can be immense: the lack of such protection makes your product unusable. Yet any research purely based on numbers would necessarily conclude that the feature in question does not make "business sense" -- too few of your customers would benefit directly, and nobody's asking for it.
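
To make "end-to-end" concrete: in such a scheme, the service in the middle only ever relays public keys and ciphertext, because the private keys live exclusively on the users' devices. Here's a minimal sketch in Python using the PyNaCl library -- my choice purely for illustration; none of the services above are claimed to use this exact construction:

    from nacl.public import Box, PrivateKey

    # Each user generates a key pair on their own device; the private
    # key never leaves that device.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts a message to Bob using her private key and Bob's
    # public key; PyNaCl picks a random nonce and prepends it to the
    # ciphertext.
    wire_msg = Box(alice_key, bob_key.public_key).encrypt(
        b"meet me at the usual place")

    # 'wire_msg' is all the service -- or anybody intercepting the
    # traffic -- ever gets to see.

    # Bob decrypts with his private key and Alice's public key.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(wire_msg)
    assert plaintext == b"meet me at the usual place"

The service can still drop or delay messages, but it cannot read them -- and that is precisely the property worth demanding.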

[Image: Mole with flies]

Also note that if you're actually successful and your product becomes used by a lot of people, measuring your impact may lead you to appeal to the lowest common denominator and to only push or fund what is likely to appease the majority. Aside from having your artisanal, bespoke Elite-Hipster membership card revoked, you run the danger of watering down your product and tailoring it to the "average" user. (This is a variation of the Tyranny of the Masses.)

90% of everything is crud, and a million flies can't be wrong.

Steve Jobs famously said: "People don't know what they want until you show it to them." I would suggest that sometimes people don't know what they want even if you show it to them. Understanding online privacy and security is difficult, and "normal" people appear to be psychologically inclined to choose options different from those we geeks would expect. Also: humans just plain suck at evaluating actual risk.

Security software is developed by security nerds, targeting a group of users who can make a reasonable decision when faced with a host-key mismatch or a certificate error. That is a very different set of people from the ones actually using your product. Meanwhile, products shipped to your general user base are frequently developed without the input of anybody with a security background. It is no surprise that the former, if usable by non-experts at all, often fail or frustrate their users, and that the latter may end up exposing users to significant and frequently unobvious risks.

But security nerds easily fall into the trap of blaming the user. What sort of idiot wouldn't know that their host key has changed and that they need to update that one line in their ~/.ssh/known_hosts file before they connect to the target host, but only after having verified -- out of band -- the validity of the new fingerprint?
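
For what it's worth, that out-of-band verification is entirely doable: OpenSSH's SHA256 fingerprint is just the unpadded base64 encoding of the SHA256 digest of the raw key blob. A sketch in Python, assuming you obtained the server's public key file and a trusted fingerprint through some other channel (console access, a phone call to the admin, ...); the file name and fingerprint value below are hypothetical:

    import base64
    import hashlib

    def ssh_fingerprint(pubkey_line):
        """Compute the OpenSSH-style SHA256 fingerprint of a public key
        from a .pub or known_hosts line: '[host] keytype base64-blob'."""
        # The key blob is the one base64 field, which starts with "AAAA".
        blob = next(f for f in pubkey_line.split() if f.startswith("AAAA"))
        digest = hashlib.sha256(base64.b64decode(blob)).digest()
        # OpenSSH prints the digest base64-encoded, without '=' padding.
        return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

    # The fingerprint you obtained out of band, e.g. by running
    # 'ssh-keygen -lf /etc/ssh/ssh_host_ed25519_key.pub' on the server's
    # console (hypothetical value):
    trusted = "SHA256:hDmwpxJbpWio4sLTTV0yqmSCTJOB+z9gdFI0z9MVXOk"

    with open("new_host_key.pub") as f:  # hypothetical file name
        if ssh_fingerprint(f.read().strip()) == trusted:
            print("Fingerprint matches; safe to update ~/.ssh/known_hosts.")
        else:
            print("Mismatch -- do NOT connect.")

(OpenSSH's own tooling handles the bookkeeping, too: 'ssh-keygen -R' removes a stale known_hosts entry, and 'ssh-keygen -lf' prints exactly this fingerprint.)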

[Image: Comic Book Guy]

Over here in what I jokingly refer to as the "real world", it is the security nerds' responsibility to seek out help. It doesn't really matter what you are working on, you need help. If you are a product developer, talk to your friendly security team early on in the development process; if you're working on security, talk to your developers and sysadmins when you are planning on implementing new restrictions or features. Regardless of what you're working on, talk to some freaking user interface experts.

One of the biggest challenges privacy and security tools face is the way in which they interact with their users, because they have the rather difficult task of making something very, very complex easy and transparent. Failure to choose the right options or defaults can have disastrous consequences: falsely believing that you ordered a pair of shoes when you didn't results in some annoyance and having to place a new order; falsely believing that you sent a message nobody but the intended recipient could read when it could, in fact, have been intercepted or sent to the wrong person can land you in jail. Or worse.

As meaningful social bonds are limited by Dunbar's number, we tend to divide groups into "us" and "them": your team versus the others, your company versus the others, your industry versus the others. But it's important to remember that the "others" are, for the most part, anyway, neither inherently evil nor made up exclusively of bozos. It is all too easy to make such generalizations, and doing so lets us dismiss any chance of hope or improvement.

[Image: Chart of how we see ourselves]

And that isn't helping. Privacy is not just the business of one single team within a company. You need to build a culture of privacy, of protecting your users, of defending their rights, and of improving the industry and society at large. To do that, you need collaboration.

This collaboration between teams that traditionally had little in common with one another has, over the last few years, become one of the hallmarks of the DevOps movement, in which developers work closely with operations people instead of tossing deployment snapshots over the wall for others to run on their systems. As a culture that encourages sharing and collaboration, we could do worse than to adopt some of those same principles in the Information Security world.

In order to produce more secure, more trustworthy products, we don't need more security nerds -- we need a security-aware culture. And the best way to build a security-aware culture is to build a learning culture. In our case, this could entail:

  • Allow and encourage your team members to join other teams for some time in order to understand the way they work and what their pain points are, especially with a focus on the interactions with your team. "Before you criticize somebody, walk a mile in their shoes" -- then you're a mile away, and you have their shoes! This also allows everybody to see the "other" team as actual people, which makes interactions ever so much more pleasant.
  • Be an enabling part of the overall effort. It is near impossible to increase security throughout your systems against the will of the users. You must not get in the way of them getting their job done -- if you do, they will find ways around any restrictions you come up with. What's worse, they will learn to avoid you. Instead of saying "No, you can't do that", say "Let me find a way for you to do this safely." Say "yes" more than you say "no".
  • If you want your users to follow certain recommendations, they need to understand why they should do so -- so teach them. The best part of this responsibility is that while you teach others, you are going to learn more about the topic yourself. (I've been teaching for over ten years, and yet every semester I learn something new about the topics at hand.) Likewise, encourage others to teach you, allowing you to learn, for example, how the product flows and what the data needs are.
  • Seek out communities outside of your area of expertise. Visit or speak at meet-ups, usergroups, or conferences that are outside of your filter bubble.

Here's the best part: Knowledge is not a zero-sum game. You sharing your knowledge does not mean that you have less of it (quite the contrary, actually). And this holds across the industry: information security is a non-competitive factor in social media. All participants are facing the same threat actors, and sharing of information and intelligence here should be the norm.

Even small acts can have a big impact. Don't be shy about standing up and asking what your company is doing to make things better, to protect your users, to help others, to support non-profit organizations like the EFF or OpenITP.

Courage is contagious -- which is why all is not lost. But we do need your help.


Just because you find that life's not fair, it
Doesn't mean that you just have to grin and bear it.
If you always take it on the chin and wear it,
You might as well be saying you think that it's OK,
And that's not right.
And if it's not right, you have to put it right.

But nobody else is gonna put it right for me.
Nobody but me is gonna change my story.
Sometimes you have to be a little bit... naughty.

January 21, 2014

