An interesting “expert meeting” on the Internet of Things last week. We were joined by DG JUST, who had prepared a draft paper for comment titled “A comprehensive approach on personal data protection in the European Union”. It updates the 1995 Data Protection Directive (Directive 95/46/EC of the European Parliament) and reviews data protection in general, but it is being influenced by the Internet of Things work in response to a changing world where “new ways of collecting personal data have become increasingly elaborated and less easily detectable”. The review is particularly looking at the following issues:

  • Addressing the impact of new technologies
  • Enhancing the internal market dimension of data protection
  • Addressing globalisation and improving international data transfers
  • Providing a stronger institutional arrangement for the effective enforcement of data protection rules
  • Improving the coherence of the data protection legal framework

The discussion in the morning (and most of the break-outs) seemed to surface two different schools of thought. This is best illustrated by one of the discussion points around the theme of “Enhancing control over one’s own data” – also referred to as “silencing the chips” and the “right to be forgotten”. The “kiki” types spoke of the difficulties in the practical implementation of such regulations and highlighted how such an approach would constrain innovation. The “bouba” types stressed the need to protect people and educate them about the possible abuse they could be subjected to. The answer, as usual, is not at either extreme.

Going into the meeting I was definitely in the “kiki” camp, and I still think the genie is already out of the bottle on this one: it will be near impossible to implement some of the “silencing of the chips” proposals being made, since doing so would simply wipe out the business model for actually deploying these technologies. What I did realise, however, was how many assumptions I had been making about the safety nets I believed would be in place to support these technologies. For example, I had assumed a state system would exist to stop people abusing my data if that happened, and I had assumed there would be “consumer groups” who would “keep an eye on the street” to discourage people from trying to abuse me.

So I departed the day with more questions than when I arrived – I guess a useful day at work. But nonetheless, many questions remain unanswered. I need to figure out a way to take this to the ECTP (European Construction Technology Platform) cohort to get their input, but I also list some questions below and would love to hear your opinions.

The “silencing of the chips”:

  • At what level do we de-activate personal information?
  • Are there different levels of privacy for different identities / contexts?
  • Why do we want to silence the chips?
  • What kind of abuse is anticipated?

The “right to be forgotten”:

  • What are the logistics of deleting personal information on demand?
  • These devices are very simple, low-power objects – is it practical to include the kind of data protection management being proposed? Should we focus on dealing with abuse rather than on preventative measures?
  • Sometimes when we mine data we only retrospectively realise its value – when do we make the value judgement as to whether it should be deleted?
  • The current worst cases of abuse are probably imprisonment for your ideas – will the deletion of data really help with this?
  • When all the chips and readers are being manufactured outside Europe, is this a moot point? Is this kind of policing helpful, or is it based on an outdated process?