European researchers explored the potential for self-regulation on peer-to-peer (P2P) networks. Their empirical studies revealed a lower incidence of exchange of unlawful content (e.g. child pornography and pirated music) on networks in which users had an ongoing relationship and were identifiable to each other and to system administrators. Some notes on their paper, published free online at First Monday, follow.
The authors review sociological and anecdotal literature, including writings of Rheingold, Hornsby, Castells and Van Dijk. They examined two distinct networks: an "open," public network in which users had a substantial expectation of anonymity, and a "private" network with access limited to members of one organization, in which anonymity was not assured. They gathered information about the types of content available on each network, examined online discussion forum messages between users, and polled users about their attitudes toward content and sharing norms.
They report data from both networks on user attitudes and mores regarding file sharing. However, the validity of their questionnaire data is cast into doubt by the poor response rate (less than 9%) from the "open" network users, who were surveyed with a Web-based instrument. The response rate from the "private" network, whose users were given questionnaires in person, was far higher (78%). The data derived from reading discussion forum threads may also be of debatable statistical significance, but was offered mostly in interpretation of the other data.
From these data, they found similar attitudes and mores in both networks, including a generally "laissez faire" attitude toward all but the most extreme forms of content (e.g. child pornography and bestiality), with somewhat greater conservatism among members of the "private" network. The authors reported evidence that the norm promoting sharing was strong, while norms favoring restraint and peer control were weaker.
They reported a greater tendency among users of the "private" network to accept and conform to social pressure from other users. From the study report: "One reason for this was that anonymity was limited and sanctions could consequently be direct. Users had to be very brave or very foolish to ignore serious warnings from their peers. In fact one respondent reported an incident in which a user had discovered child pornography on the network. This incident was dealt with through collective action by a number of users and system administrators who found out the owner of the computer and posted his identity on the mailing list. After this incident child pornography was no longer encountered."
They contrasted this with their interpretation of interviews, conducted online, with users of the "open" network: "when we interviewed users of the open WinMX network who seemed to be breaching existing norms, either by downloading while not sharing or by offering offending material. Their reactions * * * , combined with the data about material shared * * *, strongly suggest that in open networks the effects of informal social sanctions are far more limited."
They concluded that "Although there were not many examples of explicit sanctioning in the closed network, extreme forms of deviant behaviour, such as child pornography, were suppressed quite effectively through informal social control."
They did not find the same suppression effect in the "open" network, and cited several explanations: the practical limitations on real sanctions in a "virtual" world, the limited use of messaging and blocking, and the observation that "worldwide p2p systems provide anonymity and thus prevent even the naming and shaming of deviants."
The authors cite several recent studies, but I was disappointed to see no reference to Robert Axelrod's 1984 book, "The Evolution of Cooperation," reporting his seminal empirical studies of cooperation in the durable, iterated prisoner's dilemma, a context that seems directly comparable to a "private" network such as that studied by Svensson and Bannister. See an earlier note on Axelrod, The Evolution of Cooperation (1984).
Axelrod found that "the shadow of the future" is a powerful motivation for individuals (both human and non-human) to develop collaborative behavior that may conflict with their immediate self-interest. If those in an encounter (such as a P2P network) expect to meet again, they are more likely to cooperate than when dealing with a "one time only" counterpart. "The future can therefore cast a shadow back upon the present," wrote Axelrod, "and thereby affect the current strategic situation." Because they are founded on the mathematics of game theory, Axelrod's findings do not depend on the affinity or consciousness of the collaborating parties.
Among the ways Axelrod suggested for fostering the emergence of cooperation was to enlarge the "shadow of the future" by making interactions between players more frequent and more durable. This can be done by keeping others away (exclusive clubs are one example), by establishing hierarchies and bureaucracies that concentrate interaction among specialists, and by decomposing issues into smaller, more frequent encounters rather than a few large ones. Another way is to improve recognition capabilities through reliable identification of players, enabling them to verify who has cooperated or defected in the past and to act accordingly.
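Axelrod's dynamic can be illustrated with a short simulation. The sketch below is not Axelrod's original tournament code; it assumes the standard prisoner's dilemma payoffs (temptation 5, reward 3, punishment 1, sucker 0) and two illustrative strategies, always-defect and tit-for-tat, to show how repeated, identifiable encounters make cooperation pay.

```python
# Minimal iterated prisoner's dilemma sketch (illustrative payoffs and
# strategies; not Axelrod's actual tournament code).

PAYOFF = {  # (my move, their move) -> my payoff; standard T > R > P > S
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def always_defect(my_history, their_history):
    """Defect unconditionally, as a one-shot player would."""
    return "D"

def tit_for_tat(my_history, their_history):
    """Cooperate first, then mirror the opponent's previous move."""
    return their_history[-1] if their_history else "C"

def play(strategy_a, strategy_b, rounds):
    """Play an iterated game and return the two cumulative scores."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Over ten rounds, two tit-for-tat players earn 3 per round (30 each),
# while two defectors earn only 1 per round (10 each).
print(play(tit_for_tat, tit_for_tat, 10))      # (30, 30)
print(play(always_defect, always_defect, 10))  # (10, 10)
```

The one-shot logic still shows through: a defector exploits tit-for-tat on the first round (5 vs. 0), but retaliation in every later round caps its total well below what mutual cooperation earns, which is the "shadow of the future" at work.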
These characteristics appear to be likely features of the "private" network examined by Svensson and Bannister, and appear to be absent from the "open" network. The absence of a reference to either Axelrod or to Dawkins is puzzling, and one upon which the authors may choose to comment if they read this note.
The full paper is available free online: "Pirates, sharks and moral crusaders: Social control in peer–to–peer networks," by Jörgen S. Svensson and Frank Bannister, First Monday, volume 9, number 6 (June 2004).