Professor, Filiorum – Kindergarten Research Center, University of Stavanger
We need a visionary response in the fight against technology that is used by all, but controlled by very few.
This is a discussion post. Opinions in the text are the responsibility of the writer.
The Norwegian Data Protection Authority's break with Facebook should be only the beginning. What follows should be strategic and policy approaches that can deliver secure, research-based technology.
The Norwegian Data Protection Authority recently assessed the privacy implications of being on Facebook. Its management decided that the authority should neither have a Facebook page nor communicate via Facebook.
The assessment is welcome, although it is surprising that it comes only now, after many years of data misuse. Recall that in 2018 it was revealed that the data analytics company Cambridge Analytica had accessed private information on more than 80 million Facebook users, information that was used to influence the Trump election campaign.
As the Technology Council suggests, more public actors are likely to close their Facebook pages. But what happens next is unclear.
What about other platforms?
Facebook has spent several years expanding its communications empire with Instagram, Messenger, and WhatsApp, and has developed powerful algorithms to tailor content precisely to different user groups.
Will the break with Facebook be followed by withdrawal from other social media in Norway?
How should public actors communicate with their users without such platforms?
And what about other large platforms and apps that are widely used but have not been assessed by the Norwegian Data Protection Authority?
Designed for business benefit
Social media claim to be about communication, but their design is optimized for business benefit. Technology researchers have warned about this problem for more than a decade.
We are increasingly frustrated and confused by the lack of a government strategy. The consequences of this absence will be greatest for the most vulnerable groups that society must protect, such as young children.
Technology is used more and more in young children's education. Yet there is no legislation that prevents Google or Facebook from misusing their data for commercial purposes.
There is little research evidence that the learning platforms and apps used at home and at school actually support children's education. If anything, the opposite: recent studies show that the most popular applications marketed as "educational" do not support children's learning.
Two main strategies
The analysis and conclusions of the Norwegian Data Protection Authority should not be treated as mere news. They should mark the beginning of a strategic policy approach to create secure, research-based technology solutions now. Two main strategies can steer development in the right direction.
First, we need to reduce the distance between technology developers, buyers, and users. Quality can be raised through direct collaboration between technology producers and teachers, families, and children. That collaboration should feed directly into technology development, not just evaluate existing solutions.
Researchers do what they can to support research-based technology. But the road from developing technologies in universities to launching them on the market is too long.
Second, the government should demand greater scientific rigor from technology manufacturers. Manufacturers selling in Norway should be able to present clear evidence that good research stands behind claims that their platform is educational and safe.
Such evidence cannot be produced simply by having a researcher attend the annual board meeting. It is obtained by requiring technology manufacturers to test their technologies carefully before schools buy them.
In the United States, the government uses four levels of evidence to decide whether to provide funds to schools and districts for purchasing technology.
The United States is not a good example when it comes to protecting personal data. The EU's privacy regulation, the GDPR, on the other hand, contains good data protection rules. The UK's new regulation protecting children's digital rights is a good example of how rules can be tightened to protect children.
Examples from other countries are not ideal, but can be used to steer the discussion towards constructive solutions.
In Norway, we have adopted a great deal of US technology, but we have not adapted the content and services around it to the Norwegian context. This gap represents a unique opportunity for the innovation sector, and for the new government to show its digital ambition.
Not using Facebook is a good reaction. But we need a visionary response in the fight against technology used by all but controlled by very few.