
[Netsec] FW: CRYPTO-GRAM, May 15, 2011

-----Original Message-----
From: Bruce Schneier <schneier@SCHNEIER.COM>
Reply-To: Bruce Schneier <schneier@SCHNEIER.COM>
Date: Sat, 14 May 2011 21:04:28 -0400
Subject: CRYPTO-GRAM, May 15, 2011

>                  CRYPTO-GRAM
>                  May 15, 2011
>               by Bruce Schneier
>       Chief Security Technology Officer, BT
>              schneier@schneier.com
>             http://www.schneier.com
>A free monthly newsletter providing summaries, analyses, insights, and
>commentaries on security: computer and otherwise.
>For back issues, or to subscribe, visit
><http://www.schneier.com/crypto-gram.html>.
>You can read this issue on the web at
><http://www.schneier.com/crypto-gram-1105.html>.  These same essays and
>news items appear in the "Schneier on Security" blog at
><http://www.schneier.com/blog>, along with a lively comment section.  An
>RSS feed is available.
>** *** ***** ******* *********** *************
>In this issue:
>      Status Report: "The Dishonest Minority"
>      RFID Tags Protecting Hotel Towels
>      News
>      Hijacking the Coreflood Botnet
>      Schneier News
>      Interviews with Me About the Sony Hack
>      Drugging People and Then Robbing Them
>** *** ***** ******* *********** *************
>      Status Report: "The Dishonest Minority"
>Three months ago, I announced that I was writing a book on why security
>exists in human societies.  This is basically the book's thesis statement:
>     All complex systems contain parasites.  In any system of
>     cooperative behavior, an uncooperative strategy will be effective
>     -- and the system will tolerate the uncooperatives -- as long as
>     they're not too numerous or too effective. Thus, as a species
>     evolves cooperative behavior, it also evolves a dishonest minority
>     that takes advantage of the honest majority.  If individuals
>     within a species have the ability to switch strategies, the
>     dishonest minority will never be reduced to zero.  As a result,
>     the species simultaneously evolves two things: 1) security systems
>     to protect itself from this dishonest minority, and 2) deception
>     systems to successfully be parasitic.
>     Humans evolved along this path.  The basic mechanism can be
>     modeled simply.  It is in our collective group interest for
>     everyone to cooperate. It is in any given individual's short-term
>     self-interest not to cooperate: to defect, in game theory terms.
>     But if everyone defects, society falls apart.  To ensure
>     widespread cooperation and minimal defection, we collectively
>     implement a variety of societal security systems.
>     Two of these systems evolved in prehistory: morals and reputation.
>     Two others evolved as our social groups became larger and more
>     formal: laws and technical security systems.  What these security
>     systems do, effectively, is give individuals incentives to act in
>     the group interest.  But none of these systems, with the possible
>     exception of some fanciful science-fiction technologies, can ever
>     bring that dishonest minority down to zero.
>     In complex modern societies, many complications intrude on this
>     simple model of societal security. Decisions to cooperate or
>     defect are often made by groups of people -- governments,
>     corporations, and so on -- and there are important differences
>     because of dynamics inside and outside the groups. Much of our
>     societal security is delegated -- to the police, for example --
>     and becomes institutionalized; the dynamics of this are also
>     important.
>     Power struggles over who controls the mechanisms of societal
>     security are inherent: "group interest" rapidly devolves to "the
>     king's interest."  Societal security can become a tool for those
>     in power to remain in power, with the definition of "honest
>     majority" being simply the people who follow the rules.
>     The term "dishonest minority" is not a moral judgment; it simply
>     describes the minority who does not follow societal norms.  Since
>     many societal norms are in fact immoral, sometimes the dishonest
>     minority serves as a catalyst for social change.  Societies
>     without a reservoir of people who don't follow the rules lack an
>     important mechanism for societal evolution.  Vibrant societies
>     need a dishonest minority; if society makes its dishonest minority
>     too small, it stifles dissent as well as common crime.
>At this point, I have most of a first draft: 75,000 words.  The
>tentative title is still "The Dishonest Minority: Security and its Role
>in Modern Society."  I have signed a contract with Wiley to deliver a
>final manuscript in November for February 2012 publication.  Writing a
>book is a process of exploration for me, and the final book will
>certainly be a little different -- and maybe even very different -- from
>what I wrote above.  But that's where I am today.
>And it's why my other writings -- and the issues of Crypto-Gram --
>continue to be sparse.
>Lots of comments -- over 200 -- on the blog post.  Please comment there;
>I want the feedback.
>** *** ***** ******* *********** *************
>      RFID Tags Protecting Hotel Towels
>The stealing of hotel towels isn't a big problem in the scheme of world
>problems, but it can be expensive for hotels.  Sure, we have moral
>prohibitions against stealing -- that'll prevent most people from
>stealing the towels.  Many hotels put their name or logo on the towels.
>That works as a reputational societal security system; most people
>don't want their friends to see obviously stolen hotel towels in their
>bathrooms.  Sometimes, though, this has the opposite effect: making
>towels and other items into souvenirs of the hotel and thus more
>desirable to steal.  It's against the law to steal hotel towels, of
>course, but with the exception of large-scale thefts, the crime will
>never be prosecuted.  (This might be different in third world countries.
>In 2010, someone was sentenced to three months in jail for stealing
>two towels from a Nigerian hotel.)  The result is that more towels are
>stolen than hotels want.  And for expensive resort hotels, those towels
>are expensive to replace.
>The only thing left for hotels to do is take security into their own
>hands.  One system that has become increasingly common is to set prices
>for towels and other items -- this is particularly common with bathrobes
>-- and charge the guest for them if they disappear from the rooms.  This
>works with some things, but it's too easy for the hotel to lose track of
>how many towels a guest has in his room, especially if piles of them are
>available at the pool.
>A more recent system, still not widespread, is to embed washable RFID
>chips into the towels and track them that way.  The one data point I
>have for this is an anonymous Hawaii hotel that claims they've reduced
>towel theft from 4,000 a month to 750, saving $16,000 in replacement
>costs monthly.
>Assuming the RFID tags are relatively inexpensive and don't wear out too
>quickly, that's a pretty good security trade-off.
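>The claimed figures can be sanity-checked with simple arithmetic.  The
>per-towel cost below is derived from the quoted numbers, not stated in
>the article:

```python
# Back-of-the-envelope check of the quoted Hawaii-hotel figures.
stolen_before = 4000     # towels stolen per month, pre-RFID
stolen_after = 750       # towels stolen per month, post-RFID
monthly_savings = 16000  # claimed replacement-cost savings, in dollars

towels_saved = stolen_before - stolen_after      # 3250 towels/month
cost_per_towel = monthly_savings / towels_saved  # implied replacement cost

print(f"Towels saved per month: {towels_saved}")
print(f"Implied replacement cost per towel: ${cost_per_towel:.2f}")
```

>So the hotel values each towel at roughly $5.  For the trade-off to
>work, the cost of a tag, amortized over the towel's laundering lifetime,
>must stay well below that figure.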
>Blog entry URL:
>Stealing hotel items:
>Nigerian case:
>or http://tinyurl.com/3z7p98w
>RFID chips in towels:
>or http://tinyurl.com/6bp4lkr
>** *** ***** ******* *********** *************
>      News
>WikiLeaks cable about Chinese hacking of U.S. networks:
>Increasingly, chains of evidence include software steps.  It's not just
>the RIAA suing people -- and getting it wrong -- based on automatic
>systems to detect and identify file sharers.  It's forensic programs
>used to collect and analyze data from computers and smart phones.  It's
>audit logs saved and stored by ISPs and websites.  It's location data
>from cell phones.  It's e-mails and IMs and comments posted to social
>networking sites.  It's tallies from digital voting machines.  It's
>images and meta-data from surveillance cameras.  The list goes on and
>on.  We in the security field know the risks associated with trusting
>digital data, but this evidence is routinely assumed by courts to be
>accurate.  Sergey Bratus is starting to look at this problem.  His
>paper, written with Ashlyn Lembree and Anna Shubina, is "Software on the
>Witness Stand: What Should It Take for Us to Trust It?"
>Interesting blog post on the security costs for the $50B Air Force
>bomber program -- estimated to be $8B.  This isn't all computer
>security, but the original article specifically calls out Chinese
>computer espionage as a primary threat.
>A criminal gang is stealing truckloads of food.  It's a professional
>operation.  The group knew how wholesale foodstuff trucking worked.
>They set up a bogus trucking company.  They bid for jobs, collected the
>trailers, and disappeared.  Presumably they knew how to fence the goods.
>The CIA has just declassified six documents about World War I security
>techniques.  (The media is reporting they're CIA documents, but the CIA
>didn't exist before 1947.)  Lots of stuff about secret writing and
>pre-computer tradecraft.
>or http://tinyurl.com/6h5e6zg
>Hard-drive steganography through fragmentation:
>or http://tinyurl.com/4xz4vc5
>or http://tinyurl.com/3cyhves
>As I've written before, I run an open wi-fi network.  After the stories
>of people being arrested and their homes being invaded based on other
>people using their networks to download child porn, I rethought that
>position -- and decided I *still* want to run an open wireless network.
>or http://tinyurl.com/3nvokkh
>The EFF is calling for an open wireless movement.
>It's standard sociological theory that a group experiences social
>solidarity in response to external conflict.  This paper studies the
>phenomenon in the United States after the 9/11 terrorist attacks.
>or http://tinyurl.com/3oxwkm5
>or http://tinyurl.com/3moz2en
>Good paper:  "Loving the Cyber Bomb? The Dangers of Threat Inflation in
>Cybersecurity Policy," by Jerry Brito and Tate Watkins.
>or http://tinyurl.com/3dcahg3
>or http://tinyurl.com/3pdmlou
>Also worth reading is an earlier paper by Sean Lawson: "Beyond Cyber-Doom."
>"ReallyVirtual" tweeted the bin Laden assassination without realizing it.
>The Nikon image authentication has been cracked.
>or http://tinyurl.com/4yv49pw
>Canon's system is just as bad, by the way.
>Fifteen years ago, I co-authored a paper on the problem.  The idea was
>to use a hash chain to better deal with the possibility of a secret-key
>compromise.
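>The hash-chain idea can be sketched in a few lines.  This is a generic
>illustration of hash chaining for tamper evidence, not the actual
>construction from the paper:

```python
import hashlib

def chain(entries, seed=b"public-anchor"):
    """Link each entry to all previous ones via a running hash.

    Every chain value depends on every earlier entry, so altering
    one record changes all subsequent links: a forger who steals
    the current secret key must still rewrite the whole chain
    forward from the point of tampering.
    """
    h = seed
    links = []
    for entry in entries:
        h = hashlib.sha256(h + entry).digest()
        links.append(h)
    return links

original = chain([b"img001", b"img002", b"img003"])
tampered = chain([b"img001", b"FORGED", b"img003"])
print(original[2] != tampered[2])  # True: the change propagates forward
```

>Changing any entry changes every later link, which is what makes such a
>chain useful against after-the-fact forgery of earlier records.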
>According to this article, students are no longer learning how to write
>in cursive.  And, if they are learning it, they're forgetting how.
>Certainly the ubiquity of keyboards is leading to a decrease in writing
>by hand.  Relevant to security, the article claims that this is making
>signatures easier to forge.  I'm skeptical.  Everyone has a scrawl of
>some sort; mine has been completely illegible for years.  But I don't
>see document forgery as a big risk; far bigger are the automatic
>authentication systems that don't have anything to do with traditional
>handwritten signatures.
>Unintended security consequences of the new Pyrex recipe: because it's
>no longer useful in cooking crack cocaine, drug makers now have to steal
>better stuff from laboratories.
>or http://tinyurl.com/6967a22
>"Operation Pumpkin":  Wouldn't it have been great if this were not a
>joke: the security contingency in place if Kate Middleton tried to run
>away just before the wedding.
>Bin Laden's death causes spike in suspicious package reports.  It's not
>that the risk is greater, it's that the fear is greater.
>Exactly how did they confirm it was bin Laden's body?
>or http://tinyurl.com/3vrate8
>Here's a clever Web app that locates your stolen camera by searching the
>EXIF data on public photo databases for your camera's serial number.
>Forged memory: a scary development in rootkits.
>or http://tinyurl.com/3dpxsyk
>New vulnerability in online payment system: the connection between the
>merchant site and PayPal.
>or http://tinyurl.com/3q3j4ob
>In online hacking, we've moved to the world of "steal everything."  As
>both data storage and data processing become cheaper, more and more
>data is collected and stored.  An unanticipated effect of this is that
>more and more data can be stolen and used.  As the article says, data
>minimization is the most effective security tool against this sort of
>thing.  But -- of course -- it's not in the database owner's interest to
>limit the data it collects; it's in the interests of those whom the data
>is about.
>Medieval tally stick discovered in Germany.  Note the security built
>into this primitive contract system.  Neither side can cheat -- alter
>the notches -- because if they do, the two sides won't match.
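>The stick's security property is easy to model: it is a shared record
>split between the two parties, so any unilateral change shows up on
>comparison.  (The representation below is a hypothetical sketch.)

```python
# Model the split tally stick: both halves carry identical notch
# records, so a unilateral alteration is detectable at settlement.

def split_tally(notches):
    """Both parties take away an identical copy of the notch pattern."""
    return list(notches), list(notches)

lender_half, borrower_half = split_tally([1, 1, 1, 1, 1])  # 5 units owed

# The borrower shaves a notch off his half to shrink the debt:
borrower_half.pop()

# At settlement the halves are laid side by side; they no longer match.
print(lender_half == borrower_half)  # False: the cheat is caught
```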
>"Resilience of the Internet Interconnection Ecosystem," by Richard
>Clayton -- worth reading.
>or http://tinyurl.com/69fcyql
>or http://tinyurl.com/3kkzdmq
>or http://tinyurl.com/3fmskr7
>FBI surveillance tools:
>** *** ***** ******* *********** *************
>      Hijacking the Coreflood Botnet
>Earlier this month, the FBI seized control of the Coreflood botnet and
>shut it down:  "According to the filing, ISC, under law enforcement
>supervision, planned to replace the servers with servers that it
>controlled, then collect the IP addresses of all infected machines
>communicating with the criminal servers, and send a remote 'stop'
>command to infected machines to disable the Coreflood malware operating
>on them."
>This is a big deal; it's the first time the FBI has done something like
>this.  My guess is that we're going to see a lot more of this sort of
>thing in the future; it's the obvious solution for botnets.
>Not that the approach is without risks:  "'Even if we could absolutely
>be sure that all of the infected Coreflood botnet machines were running
>the exact code that we reverse-engineered and convinced ourselves that
>we understood,' said Chris Palmer, technology director for the
>Electronic Frontier Foundation, 'this would still be an extremely
>sketchy action to take. It's other people's computers and you don't know
>what's going to happen for sure. You might blow up some important
>machine.'"
>I just don't see this argument convincing very many people.  Leaving
>Coreflood in place could blow up some important machine.  And leaving
>Coreflood in place not only puts the infected computers at risk; it puts
>the whole Internet at risk.  Minimizing the collateral damage is
>important, but this feels like a place where the interest of the
>Internet as a whole trumps the interest of those affected by shutting
>down Coreflood.
>The problem as I see it is the slippery slope.  Because next, the RIAA
>is going to want to remotely disable computers they feel are engaged in
>illegal file sharing.  And the FBI is going to want to remotely disable
>computers they feel are encouraging terrorism.  And so on.  It's
>important to have serious legal controls on this sort of counterattack.
>or http://tinyurl.com/63qupg8
>or http://tinyurl.com/3koydsp
>** *** ***** ******* *********** *************
>      Schneier News
>Last year I spoke at a regional TED event: TEDxPSU.  The talk
>is now on the TED website.
>** *** ***** ******* *********** *************
>      Interviews with Me About the Sony Hack
>These two interviews are what I get for giving interviews when I'm in a
>bad mood. For the record, I think Sony did a terrible job with its
>customers' security. I also think that most companies do a terrible job
>with customers' security, simply because there isn't a financial
>incentive to do better. And that most of us are pretty secure, despite
>that.
>One of my biggest complaints with these stories is how little actual
>information we have. We often don't know if any data was actually
>stolen, only that hackers had access to it. We rarely know how the data
>was accessed: what sort of vulnerability was used by the hackers. We
>rarely know the motivations of the hackers: were they criminals, spies,
>kids, or someone else? We rarely know if the data is actually used for
>any nefarious purposes; it's generally impossible to connect a data
>breach with a corresponding fraud incident. Given all of that, it's
>impossible to say anything useful or definitive about the attack. But
>the press always wants definitive statements.
>** *** ***** ******* *********** *************
>      Drugging People and Then Robbing Them
>This is a pretty scary criminal tactic from Turkey.  Burglars dress up
>as doctors, and ring doorbells handing out pills under some pretense or
>another.  They're actually powerful sedatives, and when people take them
>they pass out, and the burglars can ransack the house.
>According to the article, when the police tried the same trick with
>placebos, they got an 86% compliance rate.
>Kind of like a real-world version of those fake anti-virus programs that
>actually contain malware.
>or http://tinyurl.com/3flomba
>** *** ***** ******* *********** *************
>Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing
>summaries, analyses, insights, and commentaries on security: computer
>and otherwise.  You can subscribe, unsubscribe, or change your address
>on the Web at <http://www.schneier.com/crypto-gram.html>.  Back issues
>are also available at that URL.
>Please feel free to forward CRYPTO-GRAM, in whole or in part, to
>colleagues and friends who will find it valuable.  Permission is also
>granted to reprint CRYPTO-GRAM, as long as it is reprinted in its
>entirety.
>CRYPTO-GRAM is written by Bruce Schneier.  Schneier is the author of the
>best sellers "Schneier on Security," "Beyond Fear," "Secrets and Lies,"
>and "Applied Cryptography," and an inventor of the Blowfish, Twofish,
>Threefish, Helix, Phelix, and Skein algorithms.  He is the Chief
>Security Technology Officer of BT BCSG, and is on the Board of Directors
>of the Electronic Privacy Information Center (EPIC).  He is a frequent
>writer and lecturer on security topics.  See <http://www.schneier.com>.
>Crypto-Gram is a personal newsletter.  Opinions expressed are not
>necessarily those of BT.
>Copyright (c) 2011 by Bruce Schneier.
>** *** ***** ******* *********** *************