Hiding in the crowd: Privacy Through Plausible Deniability in P2P Systems


Jordi Duch

Organizing professor


Northwestern University, Evanston, Illinois


27-03-2009 12:00


Peer-to-peer computing has enabled a wide range of new and important Internet applications, ranging from large-scale data distribution to video streaming and telephony. The approach provides scalability, reliability and high performance by taking advantage of large numbers of cooperative, interconnected hosts. While much of the strength of the P2P model lies in the many connections among participating nodes, these same connections offer multiple opportunities for eavesdropping. With P2P networks increasingly under surveillance from private and government organizations, there is an urgent need for privacy-enhancing systems that are both effective and practical. A number of efforts attempt to conceal connection data with private, trusted networks and variable levels of encryption. Although effective at restricting access to the content exchanged over a given connection, many existing approaches leave the existence of the connection itself visible. In our tech report, we show that these connections erode user privacy in a way that is ignored by most distributed systems and transparent to end users. This work focuses on the BitTorrent file-sharing network, where peers connect solely on the basis of common and concurrent interest in the same content, rather than on friendship, common language or geographic proximity. Using connection patterns gathered from BitTorrent users, we study the existence of communities: collections of peers significantly more likely to connect to each other than to other random peers. We show that strong communities naturally form in BitTorrent, with users inside a typical community being 5 to 25 times more likely to connect to each other than to users outside. Historically, this ability to classify users has been abused by third parties in ways that violate individual privacy.
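The community-strength measure above can be illustrated with a small sketch. This is a toy model, not the study's data or method: two hypothetical interest communities in which same-community peer pairs connect with higher probability, and the resulting ratio of intra- to inter-community connection likelihood is computed.

```python
import random
from itertools import combinations

random.seed(0)

# Toy swarm (all numbers illustrative): two communities of 10 peers each.
peers = [f"p{i}" for i in range(20)]
community = {p: ("A" if i < 10 else "B") for i, p in enumerate(peers)}

# Same-community pairs connect far more often than cross-community pairs.
edges = set()
for u, v in combinations(peers, 2):
    p_connect = 0.6 if community[u] == community[v] else 0.1
    if random.random() < p_connect:
        edges.add(frozenset((u, v)))

# Connection likelihood for same-community vs. cross-community pairs.
intra_pairs = [frozenset(e) for e in combinations(peers, 2)
               if len({community[p] for p in e}) == 1]
inter_pairs = [frozenset(e) for e in combinations(peers, 2)
               if len({community[p] for p in e}) == 2]
intra_density = sum(1 for e in intra_pairs if e in edges) / len(intra_pairs)
inter_density = sum(1 for e in inter_pairs if e in edges) / len(inter_pairs)

print(f"same-community pairs are {intra_density / inter_density:.1f}x "
      "more likely to be connected")
```

With the connection probabilities chosen here the ratio lands in the same 5-25x range the talk reports for real BitTorrent communities, but the parameters are assumptions made for illustration only.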
We show how these strong communities enable a guilt-by-association attack, where an entire community of users can be classified by monitoring just one of its members. Our study demonstrates that, through a single observation point, an attacker trying to identify such communities can reveal 50% of the network using only knowledge of a peer's neighbors and their neighbors (i.e., peers up to two hops away). Further, an attacker monitoring only 1% of the network can correctly assign users to their communities of interest more than 86% of the time. To address this threat, we propose a new privacy-preserving layer for P2P systems that obfuscates user-generated network behavior. We show that a user can achieve plausible deniability by simply adding a small percentage (between 25% and 50%) of additional random connections that are statistically indistinguishable from natural ones. Based on this result, we designed SwarmScreen, a system that generates such connections by participating in randomly selected torrents without appearing suspicious.
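The obfuscation idea can be sketched in a few lines. This is a hypothetical toy model, not SwarmScreen itself: peers add a fixed fraction of uniformly random extra connections, and because those connections carry no community signal, the intra/inter connection ratio an attacker could exploit shrinks.

```python
import random
from itertools import combinations

random.seed(2)

# Toy swarm (illustrative parameters): two communities of 10 peers each.
peers = [f"p{i}" for i in range(20)]
community = {p: ("A" if i < 10 else "B") for i, p in enumerate(peers)}

# "Natural" connections: strongly biased toward same-community pairs.
edges = set()
for u, v in combinations(peers, 2):
    p_connect = 0.6 if community[u] == community[v] else 0.1
    if random.random() < p_connect:
        edges.add(frozenset((u, v)))

def intra_inter_ratio(edge_set):
    """Ratio of same-community to cross-community connection density."""
    intra = sum(1 for e in edge_set if len({community[p] for p in e}) == 1)
    inter = len(edge_set) - intra
    n_intra_pairs = 2 * (10 * 9 // 2)  # pairs within A plus pairs within B
    n_inter_pairs = 10 * 10            # cross-community pairs
    return (intra / n_intra_pairs) / (inter / n_inter_pairs)

before = intra_inter_ratio(edges)

# Obfuscation: add ~50% extra connections to uniformly random peers,
# mimicking the community-blind connections the abstract describes.
target = int(len(edges) * 1.5)
obfuscated = set(edges)
while len(obfuscated) < target:
    obfuscated.add(frozenset(random.sample(peers, 2)))

after = intra_inter_ratio(obfuscated)
print(f"intra/inter ratio: before={before:.1f}, after={after:.1f}")
```

The random extra edges dilute the community structure, so the ratio drops toward 1; in the real system the added connections must also be statistically indistinguishable from natural ones, which this sketch does not attempt to model.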


Lab 231