
Microsoft, the NSA, the Backfire Effect and how we all make bad decisions


A couple of weeks ago, I read a blog post on the Wall Street Journal commenting on remarks made by Brad Smith, Microsoft’s top legal counsel. His remarks were in response to the latest revelations that the NSA sometimes sniffs network traffic between data centers:

Microsoft’s top lawyer compared the National Security Agency to elite hackers, and said the technology giant will encrypt customer information traveling between its data centers, according to a company blog post published Wednesday night.

That makes Microsoft the latest Internet company – following Google, Facebook and Yahoo – to say it is encrypting internal traffic in response to NSA snooping efforts. The agency sometimes siphons off customer information traveling on rented fiber-optic cables between U.S. company data centers, former U.S. officials have said.

Brad Smith, Microsoft’s general counsel, said the NSA is circumventing the legal process if those assertions are accurate. Smith, of course, does not mention the NSA by name, but clearly alludes to them.

“If true, these efforts threaten to seriously undermine confidence in the security and privacy of online communications,” Smith said in the blog post. “Indeed, government snooping potentially now constitutes an ‘advanced persistent threat,’ alongside sophisticated malware and cyberattacks.”

In other words, Microsoft is not okay with unauthorized government collection of user data.



But a more interesting article is one in Wired entitled Clash of the Titans! Inside Microsoft’s Battle to Foil the NSA. The title sounds like a spy novel, and in it Wired talks with Microsoft Technical Fellow Mark Russinovich, one of the lead architects of Azure.

I have never met Russinovich, but I have heard his name and seen it in various articles and possibly on email threads. But the part I want to get to is Russinovich’s opinion on whether Microsoft collaborates with the US government to create back doors into its systems:

Amid the Snowden revelations, many pundits have also wondered whether the Microsoft brain trust — the people who run the company — have actively worked with the NSA to provide access to data. More than a decade ago, privacy geeks questioned Microsoft’s relationship with the agency when a researcher discovered a variable called “_NSAKEY” buried in the Windows operating system. More recently, Snowden’s leaked documents reportedly show that Microsoft cooperated with the FBI to make sure the government — including the NSA — could access Outlook.com e-mail.

But Russinovich says the NSAKEY controversy was a red herring, and he believes that Microsoft would only be hurting itself if it cozied up to the NSA. “I can’t say for sure that that hasn’t happened, but I will say that I’m really skeptical that it could. The risk to the business is monumental,” he says. “Without trust, there is no cloud. You’re asking customers to give you their data to manage, and if they don’t trust you, there’s no way they’re going to give it to you. You can screw up trust really easily. You can screw it up just by showing incompetence. But if you show intentional undermining of trust, your business is done.”

The way I interpret these comments is that Microsoft never knowingly puts back doors into its software and hands them to any government. That he can’t say for sure means there may be some secret program he is not aware of, but such a program would be confined to a very small group of people, and it would be difficult to keep secret given the amount of scrutiny code receives internally.

That’s my view, too, but I’m just a ham-and-egger here within the company. I’m not that far up the chain.


But this is not what I want to focus on, either. Instead, I want to look at a psychological phenomenon known as The Backfire Effect.

Many of us here are familiar with Confirmation Bias: the tendency to seek out information we agree with and ignore information we don’t. For example, if you’re a staunch Republican, you probably watch Fox News and read right-wing blogs. If you’re a die-hard, left-wing liberal, you probably watch Rachel Maddow and read The Huffington Post.

Confirmation Bias has been studied and confirmed many times over, and it’s not just politics. It is psychologically painful to be on one side of an issue and read or listen to the opposing side. Try it yourself sometime – if you’re a political left-winger, watch Fox News’ editorials for 20 minutes without changing the channel. If you’re a political right-winger, watch Rachel Maddow for 20 minutes without tuning out. You will struggle to reach the end of those 20 minutes, and it will feel like such a relief when you flip back to what you already agree with.

The Backfire Effect is related to Confirmation Bias. It occurs when you are given material that contradicts what you currently believe: you discard it, and the contradiction ends up actually reinforcing your existing position. The new information doesn’t change your beliefs; it makes you more secure in what you thought before.

From You Are Not So Smart:

In 2006, Brendan Nyhan and Jason Reifler at The University of Michigan and Georgia State University created fake newspaper articles about polarizing political issues. The articles were written in a way which would confirm a widespread misconception about certain ideas in American politics. As soon as a person read a fake article, researchers then handed over a true article which corrected the first. For instance, one article suggested the United States found weapons of mass destruction in Iraq. The next said the U.S. never found them, which was the truth. Those opposed to the war or who had strong liberal leanings tended to disagree with the original article and accept the second.

Those who supported the war and leaned more toward the conservative camp tended to agree with the first article and strongly disagree with the second. These reactions shouldn’t surprise you. What should give you pause though is how conservatives felt about the correction. After reading that there were no WMDs, they reported being even more certain than before there actually were WMDs and their original beliefs were correct.

They repeated the experiment with other wedge issues like stem cell research and tax reform, and once again, they found corrections tended to increase the strength of the participants’ misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired.

Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you.

 

When you read a negative comment, when someone dumps on what you love, when your beliefs are challenged, you pore over the data, picking it apart, searching for weakness. The cognitive dissonance locks up the gears of your mind until you deal with it. In the process you form more neural connections, build new memories and put out effort – once you finally move on, your original convictions are stronger than ever.

[Comic via XKCD.]

If you’re reading this, I hope you don’t feel too smug. I do this all the time. And so do you.

And that brings me back to the article in Wired. The gist of the article is this:

  • Microsoft was surprised by the scope of data collection by the US government
  • Microsoft is planning to encrypt all of its data, including traffic between its data centers
  • Microsoft does not insert any back doors into its software

Let’s now head to the comments on the article. An example of the Backfire Effect would be this: “Microsoft says they don’t insert back doors. Well, the fact that they deny it proves that they do it! Why else would they deny it?”

Do we see any examples like this in the comments? Yes, we do!

“Smokescreen. Microsoft regularly hands over encryption keys to governments such as India, Pakistan, UAE, China (and others), so they can monitor Skype and other programs.

As usual, follow the money. This is nothing more than a sophisticated PR campaign by the mega-corps”

And this:

yeah right, after MS being the first one to hop on the NSA bandwagon we now have to believe that they are fighting them, lipstick on the pig. I don't believe anything from a company who's business model was always about monopolizing and using their customers at any cost.

And this:

what's in it for Microsoft? you ask
GOVERNMENT CONTRACTS MONEY$$$$$$

 

And this:

Microsoft? The same company that altered Skype so that all calls go through a server that they control instead of directly between the two callers so it would be easy for the government to spy on them?

Yeah, this sounds like a puff piece of PR crap.

 

And this:

This M$ fluff piece is up there with 60 Minutes. Sad and tired, Wired.


Example after example of people discarding what the article said and reiterating what they previously believed. This is a textbook example of the Backfire Effect. And here’s the thing – the more informed a person is about a subject, the more biased they are towards their own beliefs.

That’s part of the problem with an Internet-connected world full of social media and news articles. Aren’t we supposed to live in an information utopia where we can learn everything, where right beliefs are only a few clicks away?

Yes, we do live there. But our brains are not wired that way. For you see, millions of years of evolution have programmed us to protect our beliefs and shield our sense of self from conflicting evidence. Rather than using the Internet to correct ourselves, we use it to reinforce what we already believe. We quickly run to the sources that make our brains feel good, and we express it online despite what anyone else says. From You Are Not So Smart:

When our bathroom scale delivers bad news, we hop off and then on again, just to make sure we didn’t misread the display or put too much pressure on one foot. [tzink – I do this] When our scale delivers good news, we smile and head for the shower. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn’t, we subtly tip the scales in our favor.

- Psychologist Dan Gilbert in The New York Times

That is not to say Microsoft does or does not put in back doors (I don’t know, but like Russinovich, I doubt it).

But what I do know is this – I will interpret the evidence in a way that I already agree with. And so will you.

 


