Thursday, 31 December 2009

Live music - how UK government can *really* help musicians

Never mind the wrong-headedness of certain bits of the Digital Economy Bill. If the UK government truly wanted to help the creative industries and live music in the UK, they'd support Lord Clement-Jones's private member's bill to amend the Licensing Act 2003. Leaving aside studio bands for now, live performance is how aspiring musicians really develop their craft and hone their skills, even in this digital age.

Dry as that may sound, it would make a difference. Since the Licensing Act 2003 came into force towards the end of 2005, a licence has been needed for the performance of any live music in any UK venue. That means time, red tape and money. As a result, it has become much more difficult for musicians to get gigs and for owners to put on live music in their venues - and that includes colleges and hospitals as well as pubs and clubs, and classical music as well as pop, rock and RnB.

Many pro musicians I know tried to campaign against that Act, to no avail (campaigners included the Musicians’ Union, Equity, British Music Rights, the Arts Council, even the English Folk Dance and Song Society, just to indicate the breadth of musical interests affected - morris dancing got a special exemption!).

Again, it's musicians at the start of their careers and smaller business owners who have been hit hardest - see the Wikipedia note. Circus acts and comedians are exempt - so why discriminate against live music? "Temporary event notices" can be used for occasional events, but they too cost time, money (fees) and admin.

In his letter to the Guardian, Lord Clement-Jones pointed out:

"A case in 1899 (Brearley v Morley) established that a pub landlord could let customers use a piano on his premises without an entertainment licence. Today, such a landlord could face criminal prosecution where the maximum penalty is a £20,000 fine and six months in prison."

And now Lord Clement-Jones has brought forward a Live Music Bill (PDF) as a private member's bill (i.e. without government support, and with only a small chance of becoming law).

His Bill, which received its first reading in November 2009 (a formality with no debate), would change the Licensing Act to exempt live music from the licensing requirements in the case of:

  • small venues - premises that take no more than 200 people (including shows for an audience of up to 200 performed at schools, colleges & hospitals during which alcohol isn't sold), or
  • "two in a bar" rule - where unamplified or minimally amplified music is performed by only 1 or 2 people.

The exemption would be conditional, so that the licence for premises that sell alcohol could still be reviewed in respect of live music, with a proper hearing if local residents complain.

For more background on this Bill, see the Lib Dem news release, Live Music Forum note from June 2009, and Lord Clement-Jones's full June 2009 speech about his intentions and the background.

It seems no coincidence that yesterday - perhaps prodded by the introduction of the Live Music Bill - the Department for Culture, Media and Sport (DCMS) came out with a consultation on a proposal to exempt small live music events from the requirements of the Licensing Act 2003. The full text (PDF) - "Proposal to exempt small live music events for audiences of not more than 100 people from the requirements of the Licensing Act 2003" - includes the text of a draft Legislative Reform Order in Appendix C. See also the DCMS news release.

Disappointingly, but not surprisingly, the government's proposals are much narrower than Lord Clement-Jones's - allowing for an audience of 100 rather than 200, not providing for small unamplified performances at all, cutting performance times off at 11 pm instead of midnight, and making the exemption revocable rather than permanent. It also requires performances to take place wholly inside a building, which seems fair enough - although if the building's in the middle of a field with no one within earshot, that shouldn't be necessary.

The DCMS do acknowledge (see 1.6 of the consultation paper) that there would be cost savings and benefits (see 1.7) for: licensed premises such as clubs and pubs, unlicensed premises such as cafes, restaurants, scout huts, record shops, etc., and individuals who wish to stage small live music events; musicians - particularly those starting out in the business - who will benefit from the greater availability of venues; and the wider public and communities, who will benefit from the increased opportunity to hear live music. See also the impact assessment.

I say the government should forget the consultation and just support the Live Music Bill. The only improvement I'd make to the Bill would be to extend the definition of hospitals to include old folks' homes too. And, if necessary, concede on 11pm rather than midnight and widen the ability for residents to complain about noise.

The Bill gets its second reading on 15 Jan 2010. If you have views, do contact your MP or a member of the Lords about the Bill, or respond to the DCMS consultation - email licensingconsultation@culture.gsi.gov.uk. The deadline for responses is 26 March 2010.

©WH. This work is licensed under a Creative Commons Attribution Non-Commercial Share-Alike England 2.0 Licence. Please attribute to WH, Tech and Law, and link to the original blog post page. Moral rights asserted.

Wednesday, 30 December 2009

Government IT strategy - what's needed? Competition - contribute your views

IdealGov and the Centre for Technology Policy Research are running a competition to develop an ideal government IT strategy - one "which everyone wins. Everyone who contributes is invited to a party". Even better (or not, depending on your fondness for parties!), "The Conservative, Labour and Lib Dem Parties alike have all agreed to have the results presented and to consider them in the context of their own thinking around the development of more effective technology policies."

I can't see an exact end date for entries (typical lawyer, imprecision makes me nervous), but it runs for 6 weeks and, as the blog post announcing it was dated 18 Dec 2009, I'm guessing the deadline is close of business UK time on Friday 22 Jan 2010. There.

If interested, you can contribute to the wiki before the deadline, currently under the following headings (including data protection law issues):
- governance of public-sector IT
- technical architecture which supports the real-world intention
- procurement of technology and tech-based services
- design that works for front line staff and users
- basis for participative public services
- public data
- personal data
- trust, dignity & legality under human rights & DP law
- political engagement, openness and trust in the political process
- and above all saving vast, vast amounts of money

My view is, just get the basics right. Which seems to be a battle in itself judging by the example I give below.

With major plans like a (ID card, big whatever, fill in the blank here) for the whole of the (NHS, Home Office, country, fill in the blank here), start small. Make sure it's done in a scalable way, so you can grow in manageable stages from there; but start with a small pilot in one office, hospital or location, and make sure that at every stage, including planning, you involve the people who will be using the tech day to day in real life - both government staff and the public. It's not rocket science.

If a tender sounds too good/cheap to be true, it probably is (lure 'em in with an enticingly low quote, then - oops - it'll take longer and cost a lot more now; but the client doesn't want to back out or cut their losses, do they, so they feel they're stuck with it, and so they are).

Sometimes, losses just have to be cut, rather than keep throwing good money after bad. Consider some of the government IT projects that we've been lumbered with - if someone had tried to initiate or run a project like that in a business environment, they'd have been fired or the project would have been canned very early on. I don't know if part of the problem is the "other people's money" (i.e. taxpayers') syndrome. And I don't think I'm alone in thinking that those who've benefited the most from UK government technology projects have been, very clearly, IT consultants and IT companies - rather than the British public.

Anyway, to my example. Now the ICO have at least fixed the full-document download link issue with their new consultation responses system, good on them. But I was despondent to find some basic deficiencies with the Home Office's Press Office website just today:

  • The "More press releases" link at the bottom of the main Press Office page only takes you to a "Press release search" page, not a list of recent press releases.

  • All very well, but the press release search doesn't work - try searching for anything; nothing happens. (The advanced search does work, at least.) Before you ask, I did try several different browsers - I regularly use Firefox and Opera as well as Chrome and Internet Explorer.

  • The left menu link to the Press Office RSS news feed goes to a page headed "What is RSS?". At the bottom of that page a "Subscribe" box gives the correct feed URL http://press.homeoffice.gov.uk/press-releases/pressreleases.xml in the link text, but when you click or right-click the link itself, it goes to the wrong address http://press.homeoffice.gov.uk/rss/index.html# - which is the URL of the same webpage, not the feed address! Check out the URL in the status bar in the screenshot below.

  • If you try subscribing to the feed, click on a news item (any one!), and you get this:

  • Hint - adding ".html" to the end of the URL makes it work, in some - but sadly not all - cases.
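
For the technically inclined, that workaround could even be scripted. Here's a purely hypothetical sketch (the function name, the heuristic and the example item URL are mine, not the Home Office's): it appends ".html" to a feed item URL whose final path segment lacks a file extension - which is exactly the manual fix described above, and which, as noted, only helps in some cases.

```python
# Hypothetical sketch of the ".html" workaround described above: append
# ".html" to feed item URLs whose final path segment has no file extension.
# The function name and heuristic are my own assumptions, not official.
from urllib.parse import urlparse

def fix_item_url(url: str) -> str:
    """Append '.html' where the last path segment lacks an extension."""
    last_segment = urlparse(url).path.rsplit("/", 1)[-1]
    if last_segment and "." not in last_segment:
        return url + ".html"
    return url

# A broken feed item link gets the suffix; the feed URL itself is left alone.
print(fix_item_url("http://press.homeoffice.gov.uk/press-releases/some-item"))
# → http://press.homeoffice.gov.uk/press-releases/some-item.html
```

Of course, the real fix is for the Home Office to publish working links in the feed in the first place.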

Yes, never mind fancy Web 2.0 stuff and trendy YouTube videos, just working government websites would be rather helpful to the public, for starters.

(Very small cough - the RSS link at the bottom of the page on the CTPR site is wrong too, though the one at the top is fine. It's probably me being a details merchant, again. But my point about the Home Office Press pages basic search and links from feed to individual items not working still holds!)

©WH. This work is licensed under a Creative Commons Attribution Non-Commercial Share-Alike England 2.0 Licence. Please attribute to WH, Tech and Law, and link to the original blog post page. Moral rights asserted.

Monday, 21 December 2009

EU healthcare - interoperable eHealth - legal framework

Following the recent OASIS standards for interoperable e-health data authorisation, access and information exchange I thought I'd also mention a document which doesn't seem to have received much attention.

An EU-commissioned Study on the Legal Framework for Interoperable eHealth in Europe SMART 2007/0059 was issued on 15 September 2009. It "mainly provided a better insight in the relevant legal framework of the Member States and emphasized the complexity of the issues".

To quote from the summary, its aim was (emphasis added):

"to identify and analyse the legal and regulatory framework for electronic health services in the EU Member States and for cross-border services when provided via eHealth applications, in particular in the areas of electronic health records, telemedicine and e-prescription. The report contains the analysis and assessment of the information collected in the Member States and draws some conclusions and recommendations. The study shows that it is absolutely necessary to invest in further legal study in this field. More in-depth legal analysis is, for instance, urgently needed with regard to the upcoming national legislation with regard to electronic health records. Better insight in the current legal discussions on this topic in the Member States should feed the discussion on a European scale and prevent additional fragmentation. The same effort is without any doubt also needed in the areas of telemedicine and ePrescription."

As well as summarising healthcare systems in EU member states, the legal and regulatory framework for the healthcare profession, the processing of personal health data (and data protection in that context) and patients' rights (including human rights), it also - as you'd expect from the summary - looks at the regulatory frameworks for e-health in member states relating to electronic health records (or electronic patient records), telemedicine and electronic prescriptions.

See more on EU ehealth studies.

©WH. This work is licensed under a Creative Commons Attribution Non-Commercial Share-Alike England 2.0 Licence. Please attribute to WH, Tech and Law, and link to the original blog post page. Moral rights asserted.

Saturday, 19 December 2009

OASIS - Healthcare Data Security & Privacy Authorization & Access Control Standards approved

The web services open standards group OASIS, an industry body which has done much good work agreeing and promulgating technological standards, has just approved two new healthcare-related standards for health information interoperability, i.e. access to healthcare data across different organisations. Both were approved as of 1 November 2009:

  1. the Cross-Enterprise Security and Privacy Authorization (XSPA) Profile of the Security Assertion Markup Language (SAML) for Healthcare, version 1.0
    • a framework designed to provide access control interoperability in the healthcare environment via cross-enterprise security and privacy authorization (XSPA), using SAML assertions with common semantics and vocabularies in specified exchanges
    • aimed at satisfying requirements for information-centric security within the healthcare community; will enable hospitals and other service providers to validate requests for information access, allowing user attributes to be matched against the security policies related to user location, role, purpose of use, data sensitivity, and other relevant factors
    • includes a privacy policy that enforces patient preferences, consent directives and other privacy conditions (object masking, object filtering, user, role, purpose, etc.)
  2. the XSPA Profile of the eXtensible Access Control Markup Language (XACML) for Healthcare, version 1.0
    • a cross-enterprise security and privacy profile that describes how to use XACML to provide a mechanism to exchange security and privacy policies, evaluate consent directives and determine authorizations in an interoperable manner
    • i.e. describes mechanisms for authenticating, administering, and enforcing authorisation policies which control access to protected information residing within or across enterprise boundaries, thus promoting interoperability within the healthcare community by providing common semantics and vocabularies for policy enforcement.
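
To give a flavour of what "common semantics and vocabularies" for policy enforcement means in practice, here's a deliberately simplified, hypothetical sketch. Real XSPA exchanges use SAML assertions and XACML policies (XML documents), not Python dictionaries, and the attribute names below are invented examples rather than the official vocabulary - this just illustrates the underlying idea of evaluating a requester's attributes (role, purpose of use, consent) against a policy:

```python
# Deliberately simplified illustration of attribute-based access control in
# the spirit of XSPA. Attribute names and values are invented examples,
# not the official XSPA/XACML vocabulary.

def permit(request: dict, policy: dict) -> bool:
    """Permit access only if every attribute the policy constrains
    is present in the request with an allowed value."""
    return all(request.get(attr) in allowed for attr, allowed in policy.items())

policy = {
    "role": {"physician", "nurse"},      # who may access
    "purpose_of_use": {"treatment"},     # why access is sought
    "patient_consent": {True},           # a consent directive must exist
}

# A treating physician with patient consent is permitted...
print(permit({"role": "physician", "purpose_of_use": "treatment",
              "patient_consent": True}, policy))   # → True
# ...but a marketing request without consent is refused.
print(permit({"role": "marketer", "purpose_of_use": "marketing",
              "patient_consent": False}, policy))  # → False
```

The point of the standards is that both sides of such an exchange agree in advance on what the attributes mean, so the check can be made automatically across organisational boundaries.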

For non-technical lawyers - "security policies" and "privacy policies" here are used not so much in the sense people normally understand those terms, but rather as machine-readable representations of the underlying policies or rules in the traditional sense, which can then be automatically checked and enforced through technology.

These new standards set out a framework and means for exchanging data securely and consistently with applicable privacy policies, but (as with the ISTPA Privacy Management Reference Model) they still need to be implemented in real systems before they see use.

No doubt the members of OASIS, who include IBM, Sun Microsystems, AOL, Boeing, Booz Allen Hamilton, CA, Cisco, EMC, HP, Intel, Jericho Systems, Neustar, Nokia, Oracle, Red Hat, SAP, Skyworth TTG, U.S. Veterans Health Administration and others, will be amongst the first to do so.

©WH. This work is licensed under a Creative Commons Attribution Non-Commercial Share-Alike England 2.0 Licence. Please attribute to WH, Tech and Law, and link to the original blog post page. Moral rights asserted.

Friday, 18 December 2009

Google London recruiting lawyer

I see, even in these times of cutbacks and mass redundancies in UK and US law firms, that Google London are looking for another associate legal counsel.

Interesting - and a sign perhaps that things are still good at Google? It's for a transactional lawyer too, not a litigator.

But they want "experience drafting and negotiating contracts for technology and media clients" - so that rules me out then, although I could have ticked all the other boxes.

No commission or finders' fee necessary - if you heard about the job from me and get it, you can always treat me to a cuppa sometime…

©WH. This work is licensed under a Creative Commons Attribution Non-Commercial Share-Alike England 2.0 Licence. Please attribute to WH, Tech and Law, and link to the original blog post page. Moral rights asserted.

Thursday, 17 December 2009

France - Sarkozy's party in copyright breach music video

It's ironic and not a little amusing that this excruciating (and excruciatingly funny) "lip dub" video featuring lip synching politicians from the UMP party of French President Nicolas Sarkozy was produced and released without obtaining copyright permission for the use of the song, so they'll now have to cough up! (It seems the copyright owner had refused consent when they tried to clear the rights, but they went ahead anyway.)

Sarkozy was of course behind France's controversial three strikes law for disconnecting copyright infringers from the internet.

I make no comment on the quality of the lip synching or the dancing. You'll have to decide for yourself whether you'd rather have had your internet access cut off than watch it!

Via Techdirt.

©WH. This work is licensed under a Creative Commons Attribution Non-Commercial Share-Alike England 2.0 Licence. Please attribute to WH, Tech and Law, and link to the original blog post page. Moral rights asserted.

Digital Britain - December 2009 update released

The UK Department for Culture, Media & Sport (DCMS) have issued Digital Britain - Implementation Update (PDF), December 2009.

This reports on the progress in implementing the recommendations in the June 2009 Digital Britain White Paper - not just the Digital Economy Bill (which got only a brief mention) but the many other projects to take forward the various measures e.g. digital inclusion and access to public data (on which see, if you've not come across it yet, the Smarter Government site which includes sections on 1.3 Radically opening up data and promoting transparency and 2.3 Harnessing the power of comparative data).

See generally the Digital Britain sub site.

©WH. This work is licensed under a Creative Commons Attribution Non-Commercial Share-Alike England 2.0 Licence. Please attribute to WH, Tech and Law, and link to the original blog post page. Moral rights asserted.

Wednesday, 16 December 2009

"I've got nothing to hide", rebutted - summary of Solove's paper

I only came across Prof Daniel Solove and his work relatively recently.

For anyone who's not yet read his brilliant analyses, here's a summary of his superb 2007 paper 'I've Got Nothing to Hide' and Other Misunderstandings of Privacy which pulls apart the "I've got nothing to hide" argument commonly put forward as a reason to justify and legitimise invasion of privacy. The paper was mentioned during the discussion after Bruce Schneier's talk on the future of privacy.

This is my own take on it so it's not necessarily in the same order as in his paper, but I hope and believe it correctly reflects his masterly dissection of the issues. I wholeheartedly recommend that everyone interested in this area should read his original paper.

  1. The right to "privacy" is usually seen as a kind of right to "secrecy" - the right to hide bad things, to hide things about you which are negative (or embarrassing, or that you just plain think are no one else's business to know).
  2. The most seemingly compelling form of the "I've got nothing to hide from the government" argument is put as one involving a necessary trade-off between invading individual privacy and protecting national security - in which case of course most people will agree that national security must win out.
  3. But privacy is not an issue of individual interests vs. society's interests. Privacy, freedom from friction and others' intrusiveness, is itself a social value, a form of social control based on society's norms; and its protection is not a question of individual right vs. "greater social good", but a question of one social interest vs. other social interests. The true issue is, how can we strike the right balance between different social goods?
  4. Privacy is very hard to define as a single all-encompassing concept. Attempts to do so have resulted in views of privacy which are either too vague and broad, or too narrow - and this harms rather than helps analyses of the position.
  5. Privacy should be seen as a set of related issues, and privacy problems as a web or cluster of related problems - not necessarily connected by one common element, but which resemble each other. Rather than asking "Is this a 'privacy' breach or isn't it?", we should ask: "Is this a problem that should be protected against because it causes harm to something valuable to society?"
  6. Solove defines a privacy problem or privacy violation as occurring when an activity (whether by individuals, government or business) causes harm by disrupting the socially valuable activities of others, e.g. by chilling free speech/association or by resulting in adverse power imbalances in society such as excessive executive power.
  7. He proposed a taxonomy of privacy to model the problems of privacy harms:
    • Information Collection
      • Surveillance
      • Interrogation
    • Information Processing
      • Aggregation
      • Identification
      • Insecurity
      • Secondary Use
      • Exclusion
    • Information Dissemination
      • Breach of Confidentiality
      • Disclosure
      • Exposure
      • Increased Accessibility
      • Blackmail
      • Appropriation
      • Distortion
    • Invasion
      • Intrusion
      • Decisional Interference
  8. Many people approach the privacy problems arising from the collection and use of personal data from the angle of Orwell's 1984 Big Brother metaphor, which focuses on the harms of surveillance (social control, inhibiting free speech etc) - but in fact most collected personal data isn't sensitive in itself, its collection wouldn't inhibit free speech, and most people wouldn't care very much about its being collected. Hence, many people will say, "Go ahead, I've nothing to hide".
  9. In fact, the better metaphor in relation to collection/use of personal data is Kafka's The Trial, about a faceless bureaucracy that uses personal data to make important decisions about people who are excluded from having any control or even say about the use of their data for purposes unknown.
  10. The problem isn't so much data collection but information processing, which alters the power relationship between citizen and state and also creates, not necessarily inhibition or chilling, but a feeling of helplessness and powerlessness.
  11. Data aggregation, by combining seemingly non-sensitive separate bits of information, may well reveal additional and possibly even sensitive information; so, without knowing exactly what information is deduced by the data mining software, we can't say definitively that it won't reveal any data that we'd want to hide.
  12. Initiatives such as the US National Security Agency's data collection and data mining, even if they don't uncover any information that people might want to hide, still cause privacy problems because they result in the Kafkaesque problems of bureaucracy (rather than surveillance) - "suffocating powerlessness and vulnerability… indifference, errors, abuses, frustrations, lack of transparency and accountability".
  13. Data mining and profiling also tries to predict future behaviour. If you are matched to a particular profile, that means that they (or the software) think that you are likely to follow a particular pattern of behaviour in future. But how can you deny something you've not done yet? "Having nothing to hide will not always dispel predictions of future activity". Are you happy to be judged as being a dodgy person in some way (whether to national security, or for insurance purposes) because of your "profile", which you may not even know about let alone have any control over?
  14. Data mining causes "exclusion" problems. You don't know what data is being held about you (or sometimes even that data is being held about you), let alone have the power to correct any errors in it (e.g. that you're a bad credit risk, when in fact it's someone else who has the same name as you or used to live at your address).
  15. Thus the problem is not about whether the data collected is or is not something people want to hide; it's really about the structure and power of government: how government treats citizens, and the power imbalance between citizen and the executive branch of government [i.e. a separation of powers and checks and balances issue, fundamental to democracy].
  16. Similarly with the secondary use of data, where data collected for one purpose is then used for a different unrelated purpose without the person's consent. The potential uses may be endless, but people just can't properly evaluate the risks of the government (or whoever) having their data, because there are no limits, transparency or accountability. Again it's a power imbalance issue.
  17. Secondary use may also involve breaches of confidentiality or contract - e.g. some US airlines gave passenger records to government agencies without passenger consent, in breach of their privacy policies. There is a social interest generally in ensuring that promises are kept and that trust in business/customer relationships is maintained, and specifically in businesses/government keeping within any stated limits on the way they use personal data. If government/businesses can use personal data in any way they choose, then stated limits are meaningless and consumers are powerless. Again, the power imbalance is a structural harm.
  18. Privacy problems are mostly thought boring as generally they "lack dead bodies". Usually the threat to privacy is not from single obviously egregious actions, but "by a slow series of relatively minor acts which gradually begin to add up". Solove draws an analogy with some types of environmental harms which involve gradual pollution from lots of sources, rather than one big spill.
  19. So, in a nutshell, it's not a question of balancing privacy against security, it's not a question of whether government should or should not be allowed to engage in surveillance or data collection/analyses activities - the true question is: should the unelected executive branch of government be allowed to do these things without adequate (or any) judicial oversight (e.g. a requirement to obtain warrants) or data minimisation?
  20. Most democratic societies would hopefully answer "No" to that question. And while Solove's analysis is primarily in relation to information gathering by government, it applies equally, if not more so, to data collection and data mining by businesses in relation to their customers and others.
It's very worthwhile finding the time to read the paper if you're interested in these issues. The link again - 'I've Got Nothing to Hide' and Other Misunderstandings of Privacy.
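
Item 11's point about aggregation deserves a concrete (and entirely invented) illustration. Each dataset below looks harmless on its own - a public register with no health data, and a "de-identified" health dataset with no names - yet joining them on shared quasi-identifiers re-identifies the patient:

```python
# Invented toy data illustrating aggregation (point 11 above): joining two
# individually innocuous datasets on quasi-identifiers reveals sensitive facts.

public_register = [
    {"name": "A. Smith", "postcode": "AB1 2CD", "birth_year": 1970},
    {"name": "B. Jones", "postcode": "EF3 4GH", "birth_year": 1985},
]

# "Anonymised" health records: names stripped, quasi-identifiers retained.
health_records = [
    {"postcode": "AB1 2CD", "birth_year": 1970, "diagnosis": "diabetes"},
]

reidentified = [
    (person["name"], record["diagnosis"])
    for person in public_register
    for record in health_records
    if (person["postcode"], person["birth_year"])
       == (record["postcode"], record["birth_year"])
]
print(reidentified)  # → [('A. Smith', 'diabetes')]
```

Neither dataset asked anyone to "hide" anything; the harm only emerges from the combination - which is exactly why "I've got nothing to hide" misses the point.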

©WH. This work is licensed under a Creative Commons Attribution Non-Commercial Share-Alike England 2.0 Licence. Please attribute to WH, Tech and Law, and link to the original blog post page. Moral rights asserted.

Bruce Schneier on the future of privacy & security v. privacy, 4 December 2009


Open Rights Group: Bruce Schneier Security Talk (Q&A)
from Open Rights Group on Vimeo.

To techies, and probably some others, Bruce Schneier is a security god. Not only does he really know his stuff, but he has a gift for communicating with great clarity.

He gave an excellent talk for the UK Open Rights Group on Friday 4 December 2009, on 'The Future of Privacy: Rethinking Security Trade-offs'.

Much of what he said will be familiar to those who read his security blog. In fact it was very similar to a previous blog post by him on the future of privacy.

The video of the talk, above, was posted on the ORG blog (linked to on his own blog) - with a giveaway of 4 copies of his book to new ORG joiners, see the ORG blog post for details.

For those who prefer text to video, below are some other selective highlights from the talk. They're from my notes (and I've not re-watched the video), so they're not necessarily verbatim or indeed even in order, and of course any mistakes are mine alone.

Anecdotes and quotes

Some good quotes from the talk (slightly paraphrased):

In the US, data which it's illegal for the government to collect, they buy from corporations; data which corporations can't get, they get from the government for free.

Data is the pollution problem for the information society. All processes produce it and it stays around and festers and we have to deal with it somehow.

And this I think is true - third party individuals such as friends, not just businesses, may disclose your personal data:

The hard thing to deal with is data about you legitimately captured and then posted by someone else.

My favourite anecdotes from this talk:

Someone from the US Department of Homeland Security asked him, don't you want to know who's sitting next to you on the airplane?

He said: "No, I don't want to know who he is; I just want to know that he's not going to blow up the plane. And if he is going to blow up the plane, I don't care who he is either!"

In other words, it's not about identity, it's about intention. He thinks simply trying to map identity to intention makes little sense. (See also his blog post of his testimony about ID cards - others like the Economist think chipped cards are a particularly bad idea - and see his blog post about airline security generally.)

On the "I've got nothing to hide" argument (discussed further below):

On live radio someone raised the "I don't care about my privacy, I've nothing to hide" argument. His answer: "What's your salary?"

The questioner refused to reply…

More highlights

Data trails are inevitable; data storage & processing gets ever cheaper

We leave digital footprints everywhere. It's not a question of malice on the part of governments or businesses; data is just a natural byproduct of computer and communications technology, of the information society, of the things we do.

Computers including mobile phones create transaction records of everything we do. And our data, these records, have value, e.g. for marketing - not always lots, but some.

Data is also a byproduct of information society socialisation - the way we socialise now produces data; all socialisation systems that 20 years ago were face to face or by voice are now mediated by computers in such a way that they produce data. More and more corporations and IT systems are mediating and producing data.

In the information society, most of your data is not controlled by you; phone records, credit card transactions etc, all that data is about you but is not controlled by you.

Data storage costs are dropping to free, and similarly with data processing. It is becoming easier and cheaper to keep all the data rather than work out what to keep and what to throw away. Data collection, storage and processing costs are continuing to drop.

Surveillance is becoming wholesale, and privacy is deliberately de-valued

With data collected it's possible to do surveillance backwards in time, not just forwards. Data can be valuable later.

Data processing (including data mining) is cheap enough that it's possible to do wholesale surveillance, to follow every car, every one. All behavioural marketing is based on the notion of following everyone so that marketing can be targeted more effectively.

There's a lot of noise and error but for applications like marketing it's good enough to provide just a bit more edge over the competition. Less so for law enforcement.

We have these systems because the data is there. We have surveillance systems because it's useful to government and corporations.

People are seduced by convenience into making bad privacy trade-offs; they only think about privacy when it's salient to them. It's not that they don't understand the risks, but that they only focus on one thing at a time. Anti-privacy forces prey on these tendencies: they do their best to make privacy less salient, and to make it a bit annoying to get to privacy settings.

(Note - see further this on privacy salience:

"sites appear to craft two different messages for two different populations. Most users care about privacy but don’t think about it in day-to-day life. Sites take care to avoid mentioning privacy to them, because even mentioning privacy positively will cause them to be more cautious about sharing data. This phenomenon is known as “privacy salience” and it makes sites tread very carefully around privacy, because users must be comfortable sharing data for the site to be fun." [And, I'd add, for the site to make money!])

Technology, the great disruptor

Cameras are everywhere but today we can still see them. 10 years from now they'll be too small to see.

Identification is only temporary; once automatic face recognition is working, there'll be no need for ID to know who you are.

Automatic voice recognition and keyword recognition are coming; will you be able to turn your cellphone microphone off, or find that it gets turned on remotely?

Once "life recorders" become common, it'll be thought suspicious not to wear them. Brain scan technology can tell if you've seen something before e.g. a terrorist training camp; the technology can only get better.

This era heralds the death of ephemeral conversation. Many people have been caught out by emails etc they thought were chat or had been deleted.

Laws & other infrastructure need care - they'll last for longer than we think

Laws written for one technology tend not to apply to later technologies, although legislators are trying to make laws technology-invariant.

Decisions on privacy and anonymity made today based on the security concerns of the moment will become infrastructure, and we'll be stuck with them for 10 or 20 years.

Defaults matter

Watch the defaults (as Facebook has learned). People do defaults.

To make people have less privacy, just change the defaults, don't change what's possible. Microsoft has also learned about defaults; finally the defaults for Windows are good.
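The defaults point can be made concrete with a small, purely hypothetical settings sketch (the field names are invented for illustration). Whatever a service makes *possible*, what most users actually experience is whatever the defaults say, because most never touch the settings page:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # What is *possible* never changes - every field can be toggled.
    # What most users actually *get* is decided by these defaults alone.
    profile_public: bool = False       # privacy-protective (opt-in) default
    share_with_apps: bool = False
    searchable_by_email: bool = False

# The typical user never opens the settings page, so for them
# the default IS the de facto policy:
typical_user = PrivacySettings()

# A service wanting "less privacy" need only flip the defaults above;
# the options offered to users stay exactly the same.
```

Schneier's observation is exactly this: changing the three `= False` values to `= True` changes nothing about what users *can* do, yet changes everything about what most users *will* end up with.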

(Side note - all this is of course particularly relevant given the recent controversy about Facebook's new privacy practices and privacy defaults, which were then changed again in quick order after detractors pointed out that, while claiming to be more privacy-friendly, they were actually the opposite - see e.g. comments by EFF, Light Blue Touchpaper, EPIC, Reuters, Blogzilla, Out-Law, while Ben Laurie pointedly suggested the best option is not to use Facebook at all!

Out-Law have also noted a recent report by security experts Sophos, who found that many Facebook users readily accept friend requests from strangers, allowing the "friend" to access vast quantities of info about the user - access rights that are down to Facebook's defaults.

Improvements in network analysis techniques will also help businesses like Facebook find out truly relevant connections between users, irrespective of defaults, which is again very valuable information, and could potentially include sensitive data like political beliefs:

"all networks share a remarkable property: their nodes can be classified into groups with the nodes connecting to each other depending on their group membership. In a social network, for example, people can be grouped by age, occupation, political orientation and so on. The method proceeds by averaging all possible groupings of the nodes, giving each grouping a weight that reflects its explanatory power.")
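The method in that quote - averaging over all possible groupings, weighted by explanatory power - is more sophisticated than anything sketchable in a few lines, but a much simpler classic technique, label propagation, illustrates the underlying point: group structure falls out of connection patterns alone, with no profile data needed. Everything below (the toy graph, the deterministic tie-breaking rule) is invented for illustration:

```python
from collections import Counter

def label_propagation(edges, max_iters=20):
    """Group nodes by connection pattern alone: every node repeatedly
    adopts the most common label among its neighbours (ties broken by
    the largest label, so the run is deterministic)."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    labels = {n: n for n in adj}              # start with unique labels
    for _ in range(max_iters):
        new_labels = {}
        for n in adj:                         # synchronous update
            counts = Counter(labels[m] for m in adj[n])
            top = max(counts.values())
            new_labels[n] = max(l for l, c in counts.items() if c == top)
        if new_labels == labels:              # converged
            break
        labels = new_labels
    return labels

# Two tightly-knit cliques joined by a single "bridge" edge (3, 4):
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3),
         (3, 4),
         (4, 5), (4, 6), (4, 7), (5, 6), (5, 7), (6, 7)]
groups = label_propagation(edges)
# Nodes 0-3 end up sharing one label, nodes 4-7 another -
# the "friend" graph alone has revealed the two communities.
```

If the two cliques were, say, supporters of two political parties, a network operator could infer a new user's likely affiliation purely from whom they connect to - which is the sensitive-data concern the quote raises.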

Security & privacy; privacy & openness

Security vs privacy is a false dichotomy. It makes no sense. Only identity-based security hurts privacy. Door locks, tall fences, those protect both security and privacy.

The real dichotomy is not security vs privacy, but liberty vs control. That explains why privacy and openness differ depending on whom we are talking about.

Privacy tends to increase your power, openness tends to reduce it.

Between government and people, there is a power imbalance - government tends to have the power. (I.e. it's a question of balance of powers or separation of powers, of checks and balances, concepts essential to democracy.)

Open government reduces government power, and so is an equaliser. Privacy for the people increases the power of the people and decreases the power imbalance. Forced openness in people increases the imbalance. So we want government to be open, and individuals to be private.

Knowing each other's secrets doesn't work because of the power imbalance. If a policeman stops you and you show him your ID and he shows you his ID, it's not the same! If a doctor asks you to take off your clothes you can't say "You first". There is a fundamental asymmetry in those sorts of interactions, and privacy reduces or increases the asymmetry.

Real security is liberty plus privacy. Privacy is part of security. Individual privacy is part of what we want. Open government is part of what we want.

Some types of security do require us to give up our privacy, e.g. police investigations, sharing medical information, identification in a corporate setting.

In areas where security is opposed to privacy, where we are forced to violate privacy and give up power, we need oversight and auditing to maintain security (Quis custodiet ipsos custodes?). We willingly give power to the police to intrude into intimate aspects of our lives to solve crime, but there is a process in place to regulate it: a warrant system whereby a neutral third party determines probable cause and how to minimise the privacy invasion - and that by and large works. Similarly, data minimisation helps the police to do their jobs while preserving rights.

The "death of privacy" is over-rated. It is not inevitable. Just because the technology exists it doesn't follow that it will be used in such a way that it invades privacy. It's a balancing act. Technology changes the balance, but it doesn't make it go away.

We can accept the new balance, or deliberately reset it (e.g. the US video rental privacy laws - overly narrow, but at least it's an attempt to reset the balance).

He favours measures such as access to your own data and what is held about you, inhibiting third party data collection and use, limiting secondary use, erasing data, opt in not opt out, prosecuting criminal users of data.

What next? (including points from the discussion afterwards)

Defending privacy is very difficult because it's very contextual, it tends to be a social construct which by definition will involve a third party. Health data is private, but you know it because your doctor told you. We want our friends to see our data on Facebook, yet we want it to be private.

The "I've got nothing to hide" argument for giving up privacy is too glib. Privacy is part of dignity, it's a basic human need. (Also see the anecdote above, and his brilliant post on why privacy matters as a basic human need and deserves protection). A paper dealing specifically with the "I've got nothing to hide" fallacy was also mentioned - see my separate summary of Daniel Solove's excellent paper "'I've got nothing to hide' and other misunderstandings of privacy".

In the US, a common refrain is "Give up your privacy - or terrorists will kill you!". So of course people agree to give up their privacy.

What about education? (e.g. so people know to beware of the defaults). He thinks ritualising it through "education" and leaflets etc won't work, people need to learn it through their environment, awareness needs to be raised e.g. through scandals. (Although it was noted that the only scandal which seems to have registered with people, in the UK at least, was the child benefit CDs data losses involving the personal details of all families in the UK with a child under 16. Most other scandals seem to be considered par for the course now.)

The end result will be less privacy simply because computers naturally spit out data - unless we deliberately make them not do it.

We're not going to engineer our way out of this. Lots of good work has been done on privacy enhancing technologies (PETs) e.g. David Chaum's digital cash, but without economic incentives they will never be used.

We have to push for laws, a legal substrate that will encourage the use of PETs. It's like e-voting: good systems are not used because lesser systems are cheaper. (Note: standard e-voting machines have been hacked.)

What will it take to change the laws? To change the laws properly, it will take a new generation of people who grew up with these technologies and understand what privacy means in the information age. That may not happen for 2 or 3 generations.

©WH. This work is licensed under a Creative Commons Attribution Non-Commercial Share-Alike England 2.0 Licence. Please attribute to WH, Tech and Law, and link to the original blog post page. Moral rights asserted.

Tuesday, 15 December 2009

ISTPA Privacy Management Reference Model 2.0 released - for privacy management systems

The International Security, Trust & Privacy Alliance have just (December 2009) released their Privacy Management Reference Model 2.0 (formerly known as the Privacy Framework version 1.1, originally published in May 2002).

What's it about? Privacy laws and regulations still differ across the world, often significantly. It's not easy to comply with all of them - and in fact complying fully with one country's set of requirements may technically require breaching another's! IT systems and services could help in this regard, indeed without using technological means it would probably be impossible for a cross-border business of any size to check that it was properly compliant with data protection and privacy laws. (Although technology is of course not the be all and end all; people, processes and practices matter too.)

So the ISTPA Reference Model, as the paper puts it, is intended to be:

"A framework for resolving privacy policy requirements into operational privacy services and functions" from an information technology viewpoint

or, from the old ISTPA FAQs:

"to provide [an] analytical starting point and basis for developing products and services that support current and evolving privacy regulations and business policies, both international and domestic… As legislative, regulatory and market requirements for privacy protection progress, it is essential that trusted and reliable solutions be developed and deployed that meet those requirements", and the Framework was intended as a "resource for constructing trusted and reliable solutions for privacy protection".

The original aims and their evolution are best summarised by quoting from the preface:

"privacy requirements (typically expressed as fair information practices or privacy principles) provide little insight into how to actually implement them, presenting frustrations for policymakers who expect business systems to manage privacy rules and design challenges for IT architects and solution developers who have few models to guide their work.

The ISTPA Privacy Management Reference Model was developed to aid in the design and implementation of operational privacy management systems. When we vetted the original Reference Model, we confirmed that its 10 privacy Services represented a robust set of operational functions capable of supporting any set of privacy requirements…

Today we see accelerated attention to systemic privacy risk and increased expectations of auditable privacy compliance, stemming not only from legislative and regulatory mandates, but also reflecting the business realities of our information-rich IT environment. Today, increased cross-border information flows, networked information processing, use of federated systems, application outsourcing, social networks, ubiquitous devices and cloud computing bring greater challenges and management complexity to privacy risk management.

To address these issues, the ISTPA has completed a series of studies and in-depth exercises aimed at producing an updated revision of the Reference Model. As a starting point and with the understanding that privacy requirements are expressed in different forms (practices, principles, legislation, regulations, and policies), the ISTPA undertook a research project in 2005-2007, analyzing representative global privacy requirements and testing the Reference Model against those requirements.

The results of this analysis were captured in the ISTPA “Analysis of Privacy Principles: An Operational Study,” published in 2007. Twelve representative international privacy instruments (law, regulations, major statements of privacy principles) were reviewed and core privacy requirements were derived from each instrument. We learned through this process that, while similar words are often used (e.g., notice, consent, etc.), there are significant and subtle differences in their intended meaning and application. Finally, these requirements were grouped together to create a composite set, (shown below in section “Operationally-Focused Privacy Requirements”)…

The findings of this Analysis were then applied to the revision process for the ISTPA Reference Model Services and underlying Functions. As a result of this assessment, we determined that the original Services do provide a robust and comprehensive set of privacy functions to support privacy requirements. Furthermore, this assessment provided a deeper visibility into each Service and its applicability to the nuances of international privacy legislation. This led us to make a number of changes and updates to the Reference Model document.

The ISTPA Privacy Management Reference Model v2.0 is the culmination of this work and has been versioned v2.0 to reflect the fact that the original “framework” has been re-formulated into a “Reference Model” for the implementation of privacy management systems."

The Analysis of Privacy Principles, for anyone who's not come across the report yet, looked at a broad spectrum of instruments and legislation from across the world including the OECD Privacy Guidelines, the EU Data Protection Directive, various US laws & regulations and the APEC Privacy Framework. Trying to comply with this patchwork of rules is of course a longstanding difficulty for multinational enterprises. Nevertheless ISTPA has managed to derive requirements from their study of the disparate instruments, tried to reconcile and distil them, and translated them into practical operational requirements - a commendable effort.

I gather from John Sabo of CA (and president of ISTPA), who kindly told me about the release of the new reference model, that the ISTPA site is being worked on currently, so the links to members of ISTPA, FAQs etc are not yet back up. But when it is, for anyone who's not read them yet, it's worth checking out the other documents on the site too.

Adoption of the Reference Model is another matter. It would be good to see it being increasingly used and built on, but that will take time - and, probably, more legislation. But that's the subject of a whole other blog post…


Friday, 11 December 2009

Gowers copyright exceptions - IPO consult on draft legislation

The UK Intellectual Property Office want to go ahead with draft legislation (draft SI The Copyright (Permitted Acts) (Amendment) Regulations 2010) following the 2006 Gowers report recommendations on adding or expanding copyright exceptions to improve access to and use of copyright works. They seek views by 31 March 2010.

The consultation paper is 100 pages long so I haven't had a chance to read it fully yet and am just flagging its release, but, from a quick skim, unfortunately some key exceptions that would have improved the situation for UK users of copyright works are not going to happen:

  1. Format shifting e.g. ripping your paid-for CDs to MP3 - nothing, let's leave it to the EU.
  2. Parody, caricature, pastiche - no new exception.
  3. DRMs - no change.

But there's a bit of easing up for research & private study, and copying by educational establishments or by librarians, archivists or curators.

See further the IPO's Gowers timetable.

Sources:


Wednesday, 9 December 2009

ICO Personal Information Online - Code of Practice - consultation

The UK Information Commissioner have announced a consultation on a draft code of practice -

"which will provide organisations with a practical and common sense approach to protecting individuals’ privacy online. The new draft guidance explains how the law applies and calls on organisations to give people the right degree of choice and control over their personal information, for instance by giving them clear privacy choices or making it easier for people to erase their personal information at the end of a browsing session."

The consultation begins on 9 December 2009 and ends on 5 March 2010.

What makes life unnecessarily difficult for those wanting to read the consultation document is that the ICO are using a new "consultation portal" where you can respond online, but the link to read and comment on the consultation document only leads to a list of links to the contents -

They really should be providing a single page HTML option (or single document PDF option) so that people can print out the entire document in one go. Many of us are going to want to do that so we can read it on the Tube etc, or even just in the office. It's too tedious to click and print 13 sections separately.

I can understand their wanting to separate comments section by section, but why not give us the option to print the whole thing to read?

There's a Search box but it doesn't work (I tried it in Internet Explorer, Firefox, Opera and Chrome). Please give me a single PDF or webpage I can search -

There's a "What do these do?" link at the bottom which, when clicked, pops up info about event feeds - yet there isn't an RSS or Atom feed link, which would be one of the most useful things for people wanting to keep up to date with comments being added about the consultation document (one feed for the whole thing and one for each section, perhaps). I feel newsfeeds would be much more useful to those interested in this than the StumbleUpon etc links, anyway.

As you can tell, I don't think this consultation portal is as good as it could be, if the aim is to encourage consultation responses.

As for the substantive content, I'll post my views if I find the time to print out all 13 sections! Probably over the Christmas break.


"Surveillance & Society" journal - 2009 issue online

The latest issue of Surveillance & Society, "The international, interdisciplinary, open access, peer-reviewed journal of Surveillance Studies", is online with full text free access, including articles such as Discrimination by Design: Predictive Data Mining as Security Practice in the United States’s ‘War on Terrorism’ by Keith Guzik.

From the back issues this seems to be more sociology than law (see e.g. themed issues such as Health, Medicine and Surveillance), but again I've added the journal to my list of free online legal journals on technology law.


New "Policy and Internet" journal online

The first issue is online of Policy and the Internet, "the first multi-disciplinary academic journal to investigate the policy implications of the Internet", edited at the Oxford Internet Institute, published by BE Press and funded by the Policy Studies Organization. I've added it to my list of free online legal journals on technology law (it's not on pure law, granted, but policy is close).

There is free "guest access" to the journal by filling in a form - "Those without subscriptions can access any article by filling out a short form that allows us to inform their library of their interest in reading our journals. When libraries are convinced of sufficient interest in the journal, they subscribe. Afterwards, access for all faculty, staff, and students at that institution is immediate and there are no more forms to fill out." Via OII Blog, which says:

"The first issue includes Helen Margetts (the Editor) laying out the scope of the relationship between the Internet and public policy, J.P. Singh discussing the Internet and global governance, Barbie Clarke investigating children’s use of social networking sites and Stuart Shulman analysing a large-scale data set of citizens’ electronic policy interventions – and how policy-makers deal with them."

Contents of Vol 1 issue 1:


Tuesday, 8 December 2009

Retention of fingerprints and DNA samples - House of Commons Library standard note

The House of Commons Library's topical briefings for MPs are made available on the internet, and one of their recently published "standard notes" is of interest - SN/HA/4049 Retention of fingerprints and DNA data.

This note is a nice, relatively concise, neutral summary of the up-to-date position in the UK on biometrics data retention. It covers the various consultations and the proposed changes now in clauses 2 to 20 of the Crime and Security Bill (which had its first reading in the House of Commons on 19 November 2009), with some statistics; a table in the Appendix (taken from the Home Office summary of responses) showing the consultation proposals vs the revised proposals on DNA retention, DNA profile retention and fingerprint retention; and even a "general guidance for individuals" section linking to the Reclaim Your DNA website and other sites.

For more posts from this blog regarding DNA retention, see http://blog.tech-and-law.com/search/label/dna


Monday, 7 December 2009

Data protection, privacy - accountability, self-regulation - Galway paper

I wanted to mention the Galway accountability report, which came out in October. This post is somewhat after the event, but the paper doesn't seem to have received as much attention as I thought it would, and I decided to post the links after James Michael (Associate Senior Research Fellow, IALS and Editor of Privacy Laws & Business International) mentioned the paper in his recent talk on “Will Privacy Law in the 21st Century be European, American or International?”

The Galway Project

The Galway Accountability Project on Commonly Accepted Elements of Privacy Accountability was convened earlier in 2009 by The Centre for Information Policy Leadership (set up by US law firm Hunton & Williams LLP) and the Irish Office of the Data Protection Commissioner, and according to the Hunton & Williams press release was also co-sponsored by the OECD (and funded by corporate participants).

Accountability is increasingly seen as perhaps the best way forward in terms of striking the right balance between individual privacy rights and corporate data protection & privacy compliance burdens.

The Galway project was intended:

"to develop a white paper articulating essential, commonly-accepted elements required of a company to establish and demonstrate accountability for its information processes".

Participants in the project deliberations included representatives from the European Data Protection Supervisor, UK Information Commissioner and other EU member state data protection authorities, the Canadian Office of the Privacy Commissioner, the US FTC, the OECD, technology corporations such as Google, Hewlett-Packard, IBM, Intel, Microsoft, Oracle, Salesforce.com, academics from e.g. MIT, and privacy advocates Privacy International.

The paper was released in October 2009. While it points out that the participants do not necessarily endorse its contents, given the expertise of those who took part in the debates this report is clearly worthy of note.

Accountability - the essential elements

The Galway paper describes the elements of an accountability-based approach and how it differs from other current approaches, and suggests that the 5 essential elements of accountability are:

  1. Organisation commitment to accountability and adoption of internal policies consistent with external criteria;
  2. Mechanisms to put privacy policies into effect, including tools, training and education;
  3. Systems for internal, ongoing oversight and assurance reviews and external verification;
  4. Transparency and mechanisms for individual participation; and
  5. Means for remediation and external enforcement.

It suggests that the key public policy issues include:

  1. How does accountability work in currently existing legal regimes?
  2. What is the role of third-party accountability agents?
  3. How do regulators and accountability agents measure accountability?
  4. How is the credibility of enforcement bodies and third-party accountability programmes established?
  5. What are the special considerations that apply to small- and medium-sized enterprises that wish to demonstrate accountability, and how can they be addressed?

In his speech, James Michael said he saw this project for the implementation of data protection principles in the private sector as the latest bridge between the EU "legalistic" approach and the US self-regulatory approach.

See:


Sunday, 6 December 2009

EU - transfer of financial messages data to the USA - interim agreement

Background: the global financial messaging intermediary SWIFT, used behind the scenes by anyone who transfers funds between banks, had a data centre in the USA. SWIFT secretly gave the US government access to banking customers' data mirrored in SWIFT's US data centre as part of the US's Terrorist Finance Tracking Program but there was a big fuss when it became known, with various EU member states saying that this was in breach of EU privacy and data protection laws. [Edit: see also the EU Article 29 Working Party's views on the SWIFT situation.]

As a result SWIFT decided to set up a data centre in Switzerland, which, once it is operational in early 2010, will deprive the USA of the information it was getting from the US data centre. So, since July 2009 the US have been discussing with the EU how to get the info anyway, and they recently concluded an interim agreement to allow it.

Here are links to some of the original documents.

Press release on 2979th Council meeting, 30 November and 1 December 2009 - main results including (p.12) on EU-US agreement on financial messaging data for counter-terrorism investigations (adopting the Decision referenced below; emphasis added):

"The agreement aims to continue to allow the US Department of the Treasury to receive European financial messaging data for counter-terrorism investigations, while ensuring an adequate level of data protection. Requests by the US have to be verified by the competent authority of the relevant EU member states, they have to substantiate the necessity for the data and they have to be tailored as narrowly as possible. The agreement also provides for a joint review procedure, redress possibilities as well as a suspension clause.

The agreement is temporary. It will be provisionally applied as from 1 February 2010 and expire on 31 October 2010, at the latest. The European Parliament must consent to the formal conclusion of this temporary agreement in the coming months.

Any long-term agreement for the time after 31 October 2010 must be negotiated and concluded under the rules of the Treaty of Lisbon. These provide that the European Parliament must be fully informed at all stages of the negotiations and must give its consent to the formal conclusion of an agreement.

Concerning that follow-up agreement for the time after 31 October 2010, a Council declaration calls upon the Commission to submit as soon as possible, and at any rate no later than February 2010, a recommendation to the Council for the negotiation of a long-term agreement. It also states that the current agreement is without prejudice to any provisions in that long-term agreement.

In a second declaration, the Council and the Commission commit themselves to the Lisbon rules, i.e. to inform the Parliament immediately and fully at all stages during negotiations.

The negotiations on the provisional agreement adopted today, started in July 2009 and responded to a decision by one of the major providers of international financial payment messaging services to store its European financial messaging data no longer in a database located in the US, but only in Europe."

The background note for the meeting notes that:

"A report by the former French investigating judge Jean-Louis Bruguière, commissioned by the Commission, concluded in December 2008 that the TFTP had generated considerable intelligence value also to the EU member states.

SWIFT is a Belgium-based company which operates a worldwide messaging system used to transmit, inter alia, bank transaction information. It has been estimated that SWIFT handles 80% of the worldwide traffic for electronic value transfers."

COUNCIL DECISION on the signing, on behalf of the European Union, of the Agreement between the European Union and the United States of America on the processing and transfer of Financial Messaging Data from the European Union to the United States for purposes of the Terrorist Finance Tracking Program (TFTP 16110/09), as at 27 November 2009 but adopted at the meeting above - it contains the text of the EU/US Agreement and a declaration that EU member states are to implement the agreement provisionally.

FAQ - in an Information Note by the Council's General Secretariat on EU-US agreement on the processing and transfer of financial messaging data for purposes of the US Terrorist Finance Tracking Programme (TFTP) - Questions and Answers, November 2009

And also of related interest see, on information exchange and data sharing for more general law enforcement purposes, my blog post on Reports by the High Level Contact Group (HLCG) on information sharing and privacy and personal data protection.


Privacy, data protection - EU-USA data transfer - law enforcement - principles for information sharing

At the end of 2006 the EU & US set up an informal high level advisory group, the High Level Contact Group, to discuss privacy and personal data in the context of exchanging information for law enforcement purposes (note: wider than just terrorism).

The HLCG recently submitted their final report to the EU-US Justice and Home Affairs Ministerial Troika Meeting (of 28 October 2009), with agreed principles which would apply to information exchanges for law enforcement purposes - but this doesn't seem to have received much attention.

The details are in Reports by the High Level Contact Group (HLCG) on information sharing and privacy and personal data protection, 23 November 2009 (annexing the Final Report, Principles on Privacy and Personal Data Protection for Law Enforcement Purposes for which common language has been developed (common principles), Addendum to the final report and Annex to the Addendum, phew!), but the agreed principles are as follows:

  1. Purpose Specification/Purpose Limitation;
  2. Integrity/Data Quality;
  3. Relevant and Necessary/Proportionality;
  4. Information Security;
  5. Special Categories of Personal Information (sensitive data);
  6. Accountability;
  7. Independent and Effective Oversight - in order to maintain accountability;
  8. Individual Access and Rectification;
  9. Transparency and Notice;
  10. Redress [Both the US and the EU maintained a reservation on this principle. Both sides agreed that the key to it is providing the data subject with an effective remedy through any redress process, but they disagree on the necessary scope of judicial redress];
  11. Automated Individual Decisions - "Decisions producing significant adverse actions concerning the relevant interests of the individual may not be based solely on the automated processing of personal information without human involvement unless provided for by domestic law and with appropriate safeguards in place, including the possibility to obtain human intervention";
  12. Restrictions on Onward Transfers to Third Countries.

The HLCG's recommendation is now to seek a binding international agreement addressing all these issues.

It's interesting, by the way, how "law enforcement purposes" means different things in the EU and the US in these reports:

EU - "use for the prevention, detection, investigation, or prosecution of any criminal offense [sic]".

US - "use for the prevention, detection, suppression, investigation, or prosecution of any criminal offense [sic] or violation of law related to border enforcement, public security, and national security, as well as for noncriminal judicial or administrative proceedings related directly to such offenses or violations".


Thursday, 3 December 2009

Privacy v. free speech: judge Sir David Eady's views

Judge Sir David Eady has decided most of the UK privacy cases in the last decade, as Lorna Brazell of UK IP law firm Bird & Bird (I love their URL!) noted in a recent article for the Society for Computers & Law journal.

As Ms Brazell also pointed out, while assigning the same judge to hear all UK privacy cases may lead to greater legal certainty, the downside is that UK laws on privacy have been, and are still being, shaped by one person's subjective feel rather than being subject to wider discussion.

But it is what it is: court cases are allocated however the courts see fit. If you're interested in UK privacy law, and especially if you're litigating privacy issues in the UK, you can't afford not to pay careful heed to Sir David Eady's views.

Sir David recently made a speech at the JUSTICE 1 December 2009 conference Free speech v privacy - assessing the latest developments in media law & human rights. [Edit] Here is the text of the speech in full: "Privacy and the press: Where are we now?"

This speech has been picked up, with slightly different slants, e.g. by:

Media lawyer Mark Stephens was also quoted in the Guardian article as saying:

"The problem is that the common law is meant to be a commonality of judicial voices," said Stephens. "There is a system flaw in that we have historically concentrated libel and now privacy law into the hands of only a handful of judges – because of the dearth of cases that has meant we have effectively had Eady doing them full-time.

"I don't necessarily think Eady has been wrong, but having one person responsible for a whole area of judicial output is unhealthy – it is likely to cause difficulties in any area of law."

It is also worth reading other speeches by Sir David, which don't seem to have been reported as widely as his JUSTICE speech (so the Guardian wasn't quite right to say that his December speech was his first since last year). Here are the links:
