Friday, February 29, 2008
German Constitutional Court recognises a new right of "Confidentiality and Integrity of Computer Systems"
On 27 February the German Constitutional Court issued what's being described as a landmark ruling which recognises a new fundamental right of privacy, confidentiality and integrity in computer systems. The case was brought to challenge a law which, amongst other things, permitted government agencies to hack into computer systems, for example by using a Trojan horse to monitor suspects' internet use. The reasoning of the Court was based on its finding that computer systems will often contain information presenting a complete picture of a person's most private life:
[Computer systems] alone or in their technical interconnectedness can contain personal data of the affected person in a scope and multiplicity such that access to the system makes it possible to get insight into relevant parts of the conduct of life of a person or even gather a meaningful picture of the personality.
Ralf Bendrath has a detailed analysis of the decision and its background. Meanwhile, the IPKat suggests that this may have implications for the use of privacy-invasive DRM and for disclosure of information held by ISPs in civil cases.
Thursday, February 28, 2008
'Cause I'm the Taxman: Facebook and the Revenue
Now my advice for those who die,
Declare the pennies on your eyes.
'Cause I’m the taxman,
Yeah, I’m the taxman. - The Beatles
There's been a good deal of media coverage of the revelation by Evert Bopp that the Revenue is gathering information from Facebook and other social networking sites as part of its audits of individuals. There has been a tendency to present this as a privacy issue, leading to discussion of whether information on social networking sites should be treated as essentially in the public domain. This seems to me, however, to be the wrong way of looking at this question, not least because a definition of privacy remains elusive. Leaving privacy per se aside, are there other reasons why this sort of material should not be used?
There are, for me, at least two reasons. First, this material is often unreliable. As one Irish blogger demonstrated recently, it's quite easy to fake profiles in the name of others and to do so in a convincing way (Google cache). Consequently government agencies should be slow to use information derived in this way. Where they do so they should inform the individual concerned and offer an opportunity for that person to correct or challenge the material. (Something which would in any event be required by the Data Protection Rules.)
Secondly, and perhaps more importantly, this may lead to irrelevant criteria being used in a way which harms individuals. The legitimacy of bureaucracy is based, at least in part, on the impersonal application of general rules. Bureaucrats are not allowed to take other factors - such as the sexual orientation of the individual - into account, and indeed are expressly prohibited from inquiring about these factors. But where social networking profiles are being searched, this principle is likely to be undermined. For example, suppose that Blogger X is openly out on their blog. That is no business of the Revenue (for example) in dealing with them. But if an official is influenced by what their search turns up, we may find Blogger X being discriminated against in a way which would otherwise have been unlikely.
Daniel Solove has considered some of the issues arising from what he describes as the "self exposure problem" in his fascinating new book The Future of Reputation: Gossip, Rumor and Privacy on the Internet - the full text of which is now available online under a non-commercial CC licence. It's required reading for anyone interested in this area.
Wednesday, February 27, 2008
An overview of ISP Voluntary / Mandatory Filtering
Irene Graham of Electronic Frontiers Australia has compiled an invaluable overview of ISP level filtering systems as part of the EFA campaign against mandatory filtering in Australia. What's most striking about her survey is that unlike much previous work which focused on countries such as China or Saudi Arabia, she looks at the systems put in place in various democracies (including Canada, the United Kingdom and Finland) but still finds the same problems - a lack of democratic legitimacy, opaque systems, overblocking, and indications of function creep.
Full Disclosure and the Law - a European Survey
Full disclosure - the practice of making security vulnerabilities public - is an area of uncertain legality. The companies whose products are shown to be insecure would like to suppress this information. In addition, new laws criminalising so-called hacking tools have caused security researchers to worry that simply possessing the tools of their trade or publishing their research may expose them to criminal liability. Legal certainty isn't helped by the fact that the laws on this point differ greatly from jurisdiction to jurisdiction. Federico Biancuzzi has now produced a very helpful survey of European laws in this area by interviewing lawyers (including myself) from twelve EU countries on their national laws. Most seem to agree that the law is unsettled. But some common themes do emerge. In particular, full disclosure is not being regulated by any specific law - instead, the consequences of full disclosure tend to be considered in a rather ad hoc way under a variety of different legal regimes. In addition, civil liability (imposed by general copyright law or by specific contractual or licensing restrictions) appears to be just as much a deterrent to research and publication as newer laws criminalising hacking tools.
Wednesday, February 13, 2008
Sabam v. Tiscali (Scarlet) - English translation now available
The recent Belgian decision in SABAM v. Tiscali (Scarlet) appears to be the first time a court in Europe has considered whether ISPs can be required to monitor or filter the activities of their users in order to stop filesharing on peer-to-peer networks. The Cardozo Arts & Entertainment Law Journal has now provided an English translation of the decision. The decision deserves to be read in full, but here are some of the most important passages:
the issue of future potential encryption cannot today be an obstacle to injunctive measures since this one is currently and technically possible and capable of producing a result, as it is in the case before this court; that the internet sector is constantly evolving; that in crafting injunctive relief, the judge cannot consider speculations about potential future technical developments, especially if these might also be subject to parallel adaptations concerning blocking and filtering measures
the average cost of implementing these measures does not appear excessive; that, according to the expert, this estimated cost over a 3 year period (the time of amortization) and on the basis of the number of users on the order of 150,000 persons should not exceed 0.5 each month for each user
these measures could also have as secondary consequence to block certain authorized exchanges; that this circumstance that an injunctive measure affects a group of information [exchanges], of which some are not infringing (such as film, book, CD. . ..) does not prevent, nevertheless, it [the court] from enforcing the injunction
SA Scarlet Extended disputes, nonetheless, this court’s power to order an injunction by arguing that:
* the technical measures requested would lead to impose upon it [Scarlet] a general monitoring obligation for the totality of all “peer-to-peer” traffic, which would constitute an on-going obligation contrary to the legislation on electronic commerce (Directive 2000/31 ...),
* the installation of filtering measures may lead to the loss of the safe harbor from liability for mere conduit activities that technical intermediaries enjoy by virtue of Article 12 of Directive 2000/31,
* the technical measures requested in so far as they lead to “installing in a permanent and systematic way listening devices” will violate fundamental rights and, in particular, the rights to privacy, confidentiality of correspondence, and freedom of expression;
Directive 2000/31 of 8 June 2000, related to certain legal aspects of information society services, and in particular electronic commerce in the internal market, states, in its Article15, that “. . .Member states shall not impose a general obligation on providers . . . to monitor the information which they transmit or store” ...
Article 15, which is part of Section 4 of the Directive related to “Liability of intermediary service providers,” aims to prevent a national judge from imposing liability for breach by the service provider of a general monitoring obligation due only to the presence on its networks of illegal material ... this provision that thus governs the issue of provider liability is, however, exclusively addressed to the judge of liability and has no impact on the present litigation since injunctive relief does not require any prior finding of negligence by the intermediary
Scarlet wrongfully considers that this injunction would result in its loss of the safe harbor from liability contained in Article 12 of Directive 2000/31 ... that benefits a provider of mere conduit or access to the internet conditioned upon it neither selecting nor modifying the information being transmitted;
That in accordance with “whereas” clause 45 of Directive 2000/31, “the limitations of the liability of intermediary service providers established in this Directive do not affect the possibility of injunctions of different kinds; such injunctions can in particular consist of orders by court . . . requiring the termination or prevention of any infringement, including the removal of illegal information or the disabling of access to it.”
That the only fact that the filtering technical instrument would not filter some infringing works belonging to the SABAM repertoire does not imply in any way that those works would have been selected by Scarlet; that indeed the fact that one does not succeed in blocking some content does not imply that this content has been selected by the intermediary as long as this intermediary does not target the information to be provided to his clients; the filtering measure is purely technical and automatic, the intermediary having no role in the filtering;
That, furthermore, even assuming that Scarlet would lose the benefit exemption of liability, it does not necessarily follow that it would be found liable; it would still have to be proven that it was negligent; that such litigation would nevertheless fall within the sole competence of a judge of liability;
filtering and blocking software applications do not as such process any personal information; that, like anti-virus or anti-spam software, they are simple technical instruments which today do not involve any activity implicating identification of internet user
Friday, February 08, 2008
Government databases - Why "the innocent have nothing to fear" simply isn't true
The Times has a very sad story:
A pensioner was killed after a couple used a policeman friend to trace him and then attacked his home in a dispute over a supermarket parking space, a jury was told yesterday.
Bernard Gilbert, 79, died of a heart attack after a brick was thrown through his window.
The former Rolls-Royce worker became a target when he shouted at Zoe Forbes, 26, because she parked her car in a space he had earmarked for himself at a branch of Asda, Nottingham Crown Court was told.
Mrs Forbes was upset and called her husband Mark, who told her to note down Mr Gilbert’s numberplate. He then asked a policeman friend to check Mr Gilbert’s address on the police national computer, using the car registration number.
Mr Forbes sent his wife a text message reading: “We’ll smash his car to bits and then his hire car and then whatever he gets after that until he dies.”
The couple deny manslaughter.
Samizdata puts it well: "The innocent have nothing to fear - so long as they have not annoyed anyone who knows a copper who can be persuaded to look up an address."