
6. CONCLUSION

6.1 FURTHER PERSPECTIVES: COVID-19, SURVEILLANCE AND CITIZEN PRIVACY

The right to privacy of data subjects has been a central topic throughout our thesis, as our findings and corresponding analysis have shown that the Danish public sector is subject to an extensive amount of TPT. We wish to briefly broaden the perspective of our thesis, as we are, at the time of writing, in the midst of the COVID-19 crisis, which allows us to draw on peripheral but interesting perspectives of relevance to our study.

In this section we therefore discuss COVID-19 and its implications for the digitalized Danish state. A crisis typically arrives unexpectedly, is defining for society, and demands quick decisions. These rapid reactions and decisions may, however, change future societal dynamics. 9/11 is an example of a crisis that led to such a change: in its aftermath, a shift was seen in the United States from privacy and democratic rights towards state security. The NSA's surveillance powers were expanded shortly after, and the large tech companies entered into hidden partnerships with intelligence agencies, thereby abandoning the initial ambition of protecting the privacy of the individual (Lyon, 2001; Flyverbom, 2020). Shoshana Zuboff argues that the digital surveillance carried out in the name of "national security" has given rise to an unregulated field of commercial tracking, known as surveillance capitalism (Zuboff, 2019).

The corona crisis is a different kind of crisis but calls for the same considerations. Fear and insecurity should not be used as a tool for implementing surveillance that would not otherwise have been accepted. An immediate choice for the Danish state could have been to monitor whether Danes were staying home, gathering in groups, etc.

"Governments around the world are in the process of implementing analogue as well as highly technological solutions, with the purpose of containing the infection" (Persz, 2020).

At least 25 countries are planning to collect data from mobile devices, apps, etc., which might compromise privacy (Persz, 2020). The Danish state, along with a large part of the Western world, nevertheless chose not to monitor digitally, citing arguments of safety and privacy. Google sought to introduce a "corona app" that would allow the spread of infection to be tracked and mapped worldwide, which would have meant giving away even more data. Owing to Denmark's stronger privacy guidelines, the Agency of Digitisation declined to implement the app in Denmark, as a similar Danish app already existed that relies on Bluetooth and therefore does not track geolocation data. This is interesting in relation to our findings, which are not tied to any crisis but still concern possibly compromised privacy: TPSs, and hence TPT, are already in place, and we found no indication of governance. The arguments of safety and privacy therefore contradict our findings.
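To illustrate the privacy distinction drawn above, the sketch below shows, in Python, how a Bluetooth-based proximity design can in principle work without collecting geolocation data: devices only exchange short-lived, randomly derived identifiers and store them locally. This is a hypothetical, simplified illustration, not the actual implementation of the Danish app or of any vendor framework; the names (rolling_identifier, Phone, ROTATION_SECONDS) and the rotation interval are our own illustrative assumptions.

    import os
    import hashlib
    import time

    # Hypothetical sketch of a decentralized, Bluetooth-style proximity design.
    # Devices broadcast short-lived identifiers derived from a local secret;
    # nearby devices store only those identifiers, never any location data.

    ROTATION_SECONDS = 15 * 60  # assumed rotation interval for the sketch


    def rolling_identifier(device_secret: bytes, now: float) -> bytes:
        # Derive a short-lived identifier from the local secret and the current
        # time window, so observers cannot link broadcasts across windows.
        window = int(now // ROTATION_SECONDS)
        return hashlib.sha256(device_secret + window.to_bytes(8, "big")).digest()[:16]


    class Phone:
        def __init__(self) -> None:
            self.secret = os.urandom(32)   # never leaves the device
            self.heard = []                # identifiers observed via Bluetooth

        def broadcast(self, now: float) -> bytes:
            return rolling_identifier(self.secret, now)

        def receive(self, identifier: bytes) -> None:
            # Only the anonymous identifier is stored: no coordinates, no
            # account information, nothing that maps to a place.
            self.heard.append(identifier)


    if __name__ == "__main__":
        alice, bob = Phone(), Phone()
        now = time.time()
        # Simulated proximity: the phones exchange identifiers when "near" each other.
        bob.receive(alice.broadcast(now))
        alice.receive(bob.broadcast(now))
        print(f"Bob has heard {len(bob.heard)} identifier(s); none contain location data.")

The design choice the sketch is meant to highlight is that the contact list lives on the device and consists only of rotating pseudonyms, which is why such an app can plausibly be described as not tracking geolocation data, in contrast to a solution built on mapping infections geographically.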

The question raised by Flyverbom (2020) in the context of a crisis is: "If technological solutions can help us out of the worst crisis in history, should we then use them at any cost?" Several experts, such as Mikkel Flyverbom, professor and member of the Danish Data Ethics Council, Stine Bosse, head of the TechDK Commission, and Nanna Bonde Thylstrup, PhD and associate professor in Communication and Digital Media, argue that we should be careful not to implement new technologies too rapidly, as doing so might compromise privacy (Persz, 2020; Flyverbom, 2020).

The TechDK Commission has composed guidelines to help navigate the corona crisis, aiming to safeguard privacy and democracy and to ensure that tech organizations do not gain more power at the cost of citizen privacy while citizens are vulnerable. Stine Bosse argues in Persz (2020) that we should not be afraid to discuss the downsides of surveillance in times like these, as we need transparency about where data is stored, who controls it, and the legal basis for its collection and use.

The issue with introducing new technologies to accommodate a crisis such as the corona crisis is what Nanna Bonde Thylstrup calls "function creep": the tendency for technologies developed for one purpose to be used in other contexts. Thylstrup further argues that the great interest governmental and private institutions show in the development of these technologies could be problematic: "When such actors come into play, and often in collaboration with government institutions, there is a real risk of abuse in the form of function creep" (Nanna Bonde Thylstrup in Persz, 2020). This is consistent with Omobowale et al. (2010), who argue that public-private conflict often results from differing incentives, i.e. the public need for competences to become more efficient versus the private incentive of increased profit.

Thylstrup further argues that Denmark lacks a proper democratic discussion of how technologies are to be used, and that we rarely have such discussions. She argues that this is because we have not been "raised" to think in those terms: Denmark has a history of trust in the state, whereas Germany has a more evident history of mistrust, which has prepared it better for such situations (Persz, 2020). Trust in the state in Denmark could and should therefore be treated with care.

The Danish response reflects digital responsibility and constitutes a response to tech companies and the complexity of letting such organizations manage critical societal functions. It is, however, interesting to view the stance and actions of the Agency of Digitisation in relation to our findings, as these contradict the stronger privacy guidelines of not subjecting citizens to tracking. We showed extensive TPT on a large number of public pages and found little to no indication of any governance of the use of TPSs.

Privacy, freedom, and trust in the state must be sustained in order to keep running society on its current model of well-functioning e-government. It is therefore essential that we embrace technology at our own pace and do not view technology as a train leaving without us. We should not adopt technologies that seem evident in the moment without considering the future consequences. The lack of governance in our findings could therefore be a result of adopting services and technologies too rapidly, which in the end might compromise privacy.