On Friday 4 November, ‘power’ was discussed in all its forms at this year’s Aarhus Symposium – a major conference on management held for the sixth year running at Aarhus University. Under the theme ‘The Art of Power’, the programme covered geopolitical power, digital power, industrial power and human power and included speakers such as architect Bjarke Ingels, Birgitte Bonnesen, Jimmy Maymann and Morten Albæk. However, this year’s panel discussion was specifically dedicated to ‘cyber power’.
The participants in the discussion were Digital Society’s Anja Bechmann, privacy researcher and digital sociologist at Aarhus University, together with Thomas Lund-Sørensen, Director of the Centre for Cyber Security (CFCS), and Kim Schlyter, Cyber Risk Leader at Deloitte in Denmark and the Nordics.
If you were not one of the lucky ones to get a ticket for the popular conference, you can read about the interesting panel discussion here.
Agreement: The situation is bad – for various reasons
The three panellists quickly agree that the cyber power situation is very bad. Their reasons differ, however, since the nature of their work gives each a different focus: the companies’ reason for concern (Kim Schlyter), the state’s (Thomas Lund-Sørensen) and the average citizen’s (Anja Bechmann).
Thomas Lund-Sørensen points out that we should see digitisation as a good thing and generally only be concerned about crime:
“We should be concerned about those who previously stole PIN codes by looking over people’s shoulders. But we should not be concerned about the companies; they deliver a product free of charge in exchange for data”.
Kim Schlyter sees it as a problem that companies do not necessarily store and process data in a secure manner, even though they believe that they are:
“Companies think that data is ‘secure’, but it’s actually only ‘compliant’ with legislation. We’re constantly being hacked, which increases the need for companies to handle data securely.”
Anja Bechmann has a very different concern. She sees it as a fundamental problem that data is regarded as a commodity and not as part of the individual:
“When we interact, for example on social media, we produce data by virtue of our interaction. So the data is part of our behaviour and thereby part of our identity. But it is bought and sold as if it were a product.”
Data in the wrong hands – what do we do?
A major point of discussion was what individuals should do if their data falls into the wrong hands.
Anja Bechmann highlighted the problem that ordinary people are not entitled to advice or financial compensation if identity-revealing data is stolen from them, as identity theft is not covered by legislation:
“The costs of rebuilding your identity are not covered. Only the financial loss in connection with transactions is covered, as the banks are considered the victim.”
“In Scandinavia and the EU, we generally place enormous confidence in the state, which we trust to protect us, but in the event of identity theft, it doesn’t,” explains Anja Bechmann.
Kim Schlyter points out that identity theft does not stem only from the hacking of personal data, but also from what started out as ‘phishing’ and is now also referred to as ‘smishing’, where people receive masses of text messages on their mobile phones from unknown senders:
“People can do nothing more than protect themselves. Where would they go?” Kim Schlyter asks and continues:
“The authorities could be the ones offering help, but that would be at the expense of our personal freedom.”
This restriction of our personal freedom is precisely what Anja Bechmann regards as a fundamental problem with giving the state, the police and companies access to more data:
“Just because data can be accessed doesn’t mean that it has to be. The police, and companies with court orders, shouldn’t be allowed to systematically access personal data belonging to thousands of citizens simply because it’s technically feasible. They should have a good argument for how the data will contribute to the investigation or prosecution.”
In response to why the state does not offer more help to citizens in the event of identity theft, Thomas Lund-Sørensen explains that the state’s ability to act depends on resources – not money, but competent people:
“There’s a massive need for skilled people to help deal with matters related to identity theft,” says Thomas Lund-Sørensen, looking out over the students in the hall witnessing the discussion.
Need for data market regulations
There was widespread agreement among the three experts in the panel discussion that regulations need to be implemented to prevent data abuse.
“The data industry is bigger than the darknet – i.e. the porn and drug markets combined. Data is bought and sold illegally for bitcoins. Regulations are necessary, in the same way as they have been in the financial sector,” says Kim Schlyter.
“The problem is, however, that data brokers are located outside Denmark, so Danish regulations will not be able to protect Danes effectively,” says Anja Bechmann.
In that connection, she is curious to see the effect of the EU’s new data protection regulation, which includes stricter requirements for informed consent from the user and for companies’ stated purpose for the data. Companies that do not comply risk fines of up to four per cent of their global turnover.
The debaters’ best advice
Finally, the three debaters are asked to give their best advice on how users can enhance their digital security.
Kim Schlyter offers a specific piece of advice, encouraging people to create long, unique passwords.
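Schlyter’s advice can be illustrated with a short sketch. This is a minimal example using Python’s standard `secrets` module; the character set, lengths and word list are illustrative assumptions, not something Schlyter specified:

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Generate a long random password from letters, digits and symbols.

    secrets.choice draws from a cryptographically secure source,
    unlike the random module.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def generate_passphrase(words: list, n_words: int = 6) -> str:
    """Generate a memorable passphrase by joining randomly chosen words."""
    return "-".join(secrets.choice(words) for _ in range(n_words))

if __name__ == "__main__":
    print(generate_password())
    print(generate_passphrase(["alpha", "bravo", "charlie", "delta", "echo"]))
```

The key point, in line with the advice, is that each account gets its own long, randomly generated password – something a password manager can do automatically.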
Thomas Lund-Sørensen advises people to keep their software updated and to consider what information they grant access to, especially on their mobile phones:
“The Internet is developing, and unfortunately, we’ll have to be more suspicious and focus on protecting our devices, among other things by encrypting data, which I would advise people to do”.
Anja Bechmann, on the other hand, does not have any specific advice. Her research shows that users always choose to behave in a way that is natural and convenient for them. She recommends that the Danish state respond to that behaviour:
“Users will not stop sharing their data without reading the long declarations of consent, nor will they become better at remembering to update their software. My best advice is to look at the users and start introducing regulations on their terms”.
“What is it exactly we’re protecting?” Anja Bechmann asks in conclusion.