Internet security and Big Data considered by panel of experts during seminar hosted by Oxford Brookes University

Wednesday, 05 February 2014

Internet Security seminar

In the third of a series of three seminars on ‘Networks and Society’ organised by Brunel University and the Centre for Global Politics, Economy and Society at Oxford Brookes University, the theme was ‘Data-bases and surveillance’.

The proceedings were opened by Professor Barrie Axford (Oxford Brookes), who reminded participants that while everyone knows that the Internet has changed how we live, are governed and conduct business, a new, perhaps less visible technological trend promises or threatens greater transformation. It is known as “big data”. Big data is not coterminous with the Internet, although the Web makes it much easier to collect and share data. Big data is about more than just communication: we can learn from a large body of information things that we could not grasp, or even conceive, when we had access to and used only smaller amounts. Big data is also characterized by the ability to turn into data many aspects of the world and everyday life that have never been quantified before. Some call this process “datafication” and, unlovely as that term is, it does service. As we all know, even friendships and “likes” can be datafied, most obviously on Facebook and other social media platforms.

Lest this appear rather benign, or merely descriptive, it is increasingly clear that citizens are, or may be, vulnerable to the increasingly routine use of big data, and thus in need of protection. Big data in the guise of Big Brother sounds rather clichéd, but is far from impossible. In all countries, but particularly in non-democratic ones, big data can exacerbate the existing asymmetry of power between the state and the people and between powerful corporate interests and citizens. This asymmetry could well become so great that it leads to what has been described as big-data authoritarianism.

He ended by saying that the seminar affords insights into the aims and mechanics of big data construction and use, into the costs and benefits so visited and the propriety - the democratic propriety - of everything being “datafied”.

In his paper, Stephen Coulson stressed that the UK Government in fact has a long track record of successfully implementing Big Data projects. This record extends back at least to 1086 with the compilation of the Domesday Book. More recently, the first regular national census in 1801 was launched with the aim of collecting reliable data for social and defence planning, as well as to inform the private sector by making the information available to life insurance companies. That the information in the Domesday Book and the 1801 census is still being actively used today is testament to the success of both these projects. It may be no coincidence that both projects occurred before the advent of IT and the term Big Data. Delays and lack of clarity of purpose in current Big Data projects have created the impression that the concept of using large data sets for predictive modelling is at best inaccurate and at worst a danger. Attempts to investigate the truth of this impression by studying current Big Data projects are complicated by the fact that the projects that use Big Data, such as human genome sequencing and NHS Patient Records, tend to be large, high-profile projects. The size and duration of these projects (both examples are still running) make it difficult to assess the future benefits of their work and their impacts on the perceptions (sense of threat, predisposition to resist) and behaviour of citizens.

Alex Finnen gave a detailed comparative analysis of the vicissitudes of implementing systems of population movement control by way of biometric ID cards, digital passports and other state-of-the-art population management technologies. His paper outlined the current state of affairs from the point of view of managers attempting to use ICTs effectively in these areas. Most population management control mechanisms are subject to rigorous national and international legislative control, but in order to work effectively in a mobile world they require international agreements on the formats in which data is to be stored and presented. Nowhere is this more so than in Europe, where it is possible for a citizen to travel by car through five or six different states in one day. Such agreements are not in place and are not likely to be in the near future. While this remains the case, the United Kingdom and the EU have a brief window in which to introduce appropriate legislation to manage these still nascent technologies. His basic premise is that “big brother is not here yet” in global terms, but that we should use this window of opportunity to develop the national and international legislative framework for the day when it is a real global prospect.
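By way of illustration (this example is mine, not from the paper), one place where such a format agreement already operates is the machine-readable zone (MRZ) printed in passports under ICAO Doc 9303: any border system can verify the same fields because the layout and its check digits are standardised. A minimal sketch of the published 9303 check-digit scheme, in Python; the sample field at the end is a hypothetical value:

```python
# Minimal sketch of the check-digit scheme from ICAO Doc 9303, the
# standard behind the machine-readable zone (MRZ) on passports.

def mrz_value(ch: str) -> int:
    """Map an MRZ character to its numeric value:
    digits map to themselves, A-Z to 10-35, the filler '<' to 0."""
    if ch.isdigit():
        return int(ch)
    if ch.isalpha():
        return ord(ch.upper()) - ord("A") + 10
    return 0  # '<' filler character

def mrz_check_digit(field: str) -> int:
    """Weight each character by the repeating pattern 7, 3, 1
    and return the sum modulo 10."""
    weights = (7, 3, 1)
    total = sum(mrz_value(c) * weights[i % 3] for i, c in enumerate(field))
    return total % 10

# A date-of-birth field in YYMMDD form (hypothetical value):
print(mrz_check_digit("740812"))  # -> 2
```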

Karl Harrison gave a close-grained portrayal of the ways in which the capacity to gather large quantities of data from numerous types of mobile devices has become a commonplace of major crime investigation in the UK. Such ‘high-tech’ sources of intelligence have become progressively more established and systematised within major crime investigation. He argued that there is an implicit challenge for police in recognising the distinction between ‘data’ and ‘intelligence’ in the context of the interrogation of mobile devices; this is not dissimilar to the conflation that persists between concepts of forensic intelligence and evidence, and the tendency to regard only certain specific forensic evidence types as suitable providers of intelligence (most specifically PACE DNA samples). As a consequence, complex enquiries that might be led in some part by forensic intelligence are sometimes hamstrung by a syndrome of tunnel vision that directly equates the term ‘intelligence’ with biometric identification.

Jonathan Joseph spoke to the wider and contested theme of ‘resilience’, applied to a host of policy areas and bearing on the ways in which systems and actors react to, manage and ‘bounce back’ from external shocks. A resilience approach to development, security and disaster protection focuses on risk awareness, preparedness and assessment. Using EU policy on monitoring crisis situations in the Horn of Africa, he revealed how the resilience discourse allows an actor such as the EU to portray itself in a way that is consistent with its image as a facilitator of better governance and normative forms of power. It also allows the EU to project itself as a strong actor in a complex and uncertain global environment. His position is that while resilience encourages a view of the bigger picture as more complex, uncertain and yet inter-dependent, this in fact legitimates more mundane practices at the micro level that relate to the monitoring and evaluation of the performance of individuals and institutions, including forms of data gathering and storage. Here, under the resilience motif, routine surveillance of individuals and populations can be parlayed into a cost-effective way of dealing with interdependency crises.

In the final paper of the day, Anthony Barnett turned directly to the potential and actual consequences for democracy carried in the routinisation of big data generation and usage. Big data is a wonderfully powerful tool in many areas, in the sciences in particular, but also in surveillance, because it changes the way the intelligence services work and the way citizens - and journalists - must relate to them. Where there were once specific targets, everyone is now under automatic surveillance, the information ready to be accessed by an agent if so desired. Where suspects were once placed under surveillance by human judgment, in future computers looking for patterns of behaviour will flag most suspects for the intelligence services, and most of these flags will be false positives. The copious metadata we leave by living normal 21st-century lives means we can be mapped in the smallest detail with a minimum of effort. Activists, journalists and their sources will need to be much more careful and courageous than ever before, and a self-censorship mentality is a likely outcome. These changes should be troubling even to those who place much trust in the state’s intentions: given the huge size of the intelligence services in Britain and its allied countries, and the information sharing among them, leaks for profit and other motives are inevitable. How are we to deal with this brave new world?
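The “mostly false positives” point is, in essence, base-rate arithmetic: when genuine targets are a tiny fraction of the population, even a very accurate pattern-matcher flags far more innocent people than real suspects. The sketch below works this through with hypothetical figures of my own choosing; none of the numbers come from the paper:

```python
# Base-rate arithmetic behind the "mostly false positives" claim.
# All figures are hypothetical, chosen only to illustrate the effect.

population = 60_000_000      # roughly UK-scale population
true_suspects = 3_000        # assumed number of genuine targets
sensitivity = 0.99           # assumed: flags 99% of genuine targets
false_positive_rate = 0.001  # assumed: flags 0.1% of innocent people

flagged_real = true_suspects * sensitivity
flagged_innocent = (population - true_suspects) * false_positive_rate

precision = flagged_real / (flagged_real + flagged_innocent)
print(f"People flagged: {flagged_real + flagged_innocent:,.0f}")
print(f"Share of flags that are genuine: {precision:.1%}")
# Even with these generous accuracy assumptions, about 95% of the
# people flagged are false positives.
```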

The day closed with a lively discussion on the propriety of big data generation and usage; on its liberating and oppressive effects. In this, and throughout the day, a number of themes and issue-areas were apparent:

  1. In what could otherwise be presented as a managerial or simply technical solution to, for example, crime investigation, population movement or consumer marketing, underlying principles are at stake; not least the legitimate extent and appropriateness of state and private-corporate surveillance or monitoring in a democratic polity and society.
  2. At heart we are dealing with the changing nature of power in contemporary societies and whether the digital revolution is entrenching or fundamentally altering past configurations.
  3. So, rather than trade only in discussions on the technical capacity of states and other actors to carry out such surveillance / monitoring, or on their relative efficiency when doing so, we should be examining the objectives of data gathering in the first place.
  4. Insistence on clarity of purpose is essential to ensure democratic accountability, along with the expectation that the institutions charged with oversight of intelligence and other kinds of data gathering are fit for purpose.
  5. Clarity of purpose also bears on the sensitive area of data sharing. Where this occurs – as it frequently does – lists of “customers” for shared data must be publicly available and their intended uses made known.
  6. At the same time, it is necessary to distinguish between data willingly and knowingly surrendered by individuals and data – perhaps metadata – gathered from individuals and populations ‘behind their backs’, as it were, as a result of their going online to further a line of inquiry or personal connection.
  7. This is a tricky area for students of democracy weighted, as it is, with matters of trust and / or a sense that in a highly digitized and interconnected world, citizens and consumers may be relatively uncaring – rather than ill-informed – about the possible dangers of identifying and revealing themselves online.
  8. As such, the different motivations underlying citizen/consumer resistance to data gathering have to be canvassed and understood. The world is not ordered, or is not just ordered, by principled or ideological cyber-enthusiasts or sceptics.

Details of the outputs from this and the first two seminars in the series will be advertised on these pages in the coming months, along with notices about future events.
  • The panel of expert speakers comprised: Dr Stephen Coulson (Buckingham University and Apsley Analytics), Dr Alex Finnen (Ministry of Defence), Dr Karl Harrison (Cranfield), Professor Jonathan Joseph (Sheffield) and Anthony Barnett (openDemocracy).