RAID’s “rich dialogue” unlocks global opportunities in tech governance
At RAID (Regulation of Artificial Intelligence, Internet and Data) H2 on October 10th, policymakers, regulators and industry united online to work towards systems of “New Tech Governance for the New Globalization”.
The biannual conference highlighted the need for industry, regulators and governments to work together to unlock a range of opportunities for society, trade and economic growth.
In his Keynote Address Didier Reynders, Justice Commissioner, European Commission said: “With tech regulation, we don’t want to stifle innovation and competition. On the contrary, we want to make sure that big tech companies operating in the European Union cannot abuse their market power. New players should be able to enter the market and compete on a level playing field. At the same time, users need to be sure that the online environment is safe; that individuals’ fundamental rights and values are respected.
“Our focus is the same, whether it relates to data technologies such as AI or infrastructure: it’s about fairness, transparency and trust, because an absence of these will hurt innovation, trade and investment.”
In her Keynote Address Gabriela Ramos, Assistant Director-General for Social and Human Sciences, UNESCO said: “We know AI has immense potential to improve the lives and wellbeing of people. But we must recognise that AI, for now, does not benefit everyone. How could it, when most AI technology is concentrated in the hands of a few companies in a small number of countries, and when these companies have chosen the path of self-regulation?”
On the opening panel discussion, Jean-Pierre Raffarin, Former Prime Minister, France and Chairman, Fondation Prospective et Innovation, emphasized the need for international alignment. “A balance must be found between sovereignty and cooperation. The first step for great advances in governance and the new globalization is the search for peace.”
Ventsislav Karadjov, Deputy Chair, European Data Protection Board (EDPB) highlighted the need to prioritise common values. “The world is becoming increasingly interconnected in the light of technological innovation. We need a harmonised legal framework for technology,” he said.
Chairing the opening panel, Filip Van Elsen, Partner and Global Co-Head of Technology at Allen & Overy said: “The topic of the new globalization and how tech policy can support global growth and avoid protectionism is very important for business. The Biden administration has issued an executive order, which aims to build more trust and stability in transborder data flows. But at the same time we are seeing a rise in data localisation laws. So clearly there are conflicting trends at play.”
Thibaut Kleiner, Director for Policy Strategy and Outreach, DG Connect, European Commission said: “Europe is there for business. We want trade to increase, and to develop global standards. We need to address the fear that governments have about data flows being something they cannot control. The idea is not to stop data flows, not to localise data, but to create what we call in the EU a single market for data. But there may be times where for national security reasons you want to control data.”
Dr. Emilija Stojmenova Duh, Minister for Digital Transformation, Government of Slovenia said: “What is required is a balanced regulatory approach that safeguards fundamental human rights without stifling innovation. Our goal is to create a safe internet, preventing hate speech on one side while providing freedom of expression on the other.”
In the “Deep Dive” session on Consent as a Legal Basis, Cecilia Álvarez, Director of Privacy Policy Engagement, EMEA, Meta said: “As privacy practitioners, we sometimes forget that privacy laws do not govern all of our human interactions, and we should not ignore the rules and the rationale behind the legal institutions that GDPR borrows from, or expects interaction with. This applies both to the legal basis of consent and contractual necessity.”
On the panel Developing Trust in a Changing Internet, Nick Seeber, Partner & Internet Regulation Lead at Deloitte said: “There is a broad lack of trust in the digital space. Consumers and business users are confused and looking for governments and platform companies to take a lead in creating trust.”
He pointed out three ways to overcome this issue: firstly, making it easier for ordinary users to resolve trust disputes quickly and at low cost, as the EU Digital Services Act (DSA) will enable for content; secondly, “the professionalization of the trust function – every company is going to have an equivalent of a chief trust and safety officer”; and finally, increasing the level of democracy in trust policy development.
“The ability to use digital platforms to allow ordinary users to participate in policy development would increase legitimacy; it would also show transparency to regulators; and would expose the complexity. The digital democratisation of trust is a very exciting development.”
Christel Schaldemose MEP, who led on the European Parliament’s passing of the DSA, acknowledged that the EU might have been too slow to regulate platforms in the past, but the DSA now strikes a balance between regulating and demanding that platforms take responsibility. The Act will also be fine-tuned in the future. “The metaverse might be a reality; we will need to be faster than we have been so far,” she said.
Leonardo Cervera Navas, Director, European Data Protection Supervisor (EDPS) acknowledged the great possibilities that web 3 and the metaverse bring for the future, but warned of new risks, from profiling, security issues and identity management to deep fakes. “These risks pose a threat to the sustainability of the metaverse. If actors do not address them properly and urgently, lots of business opportunities will be lost,” he said.
Maeve Hanna, Partner at Allen & Overy highlighted that discussion about metaverse ethics is already underway, but questions remain about who can be responsible for moderating content. “Individuals have a role, but we need to look beyond users to platforms and governments. Tech companies are best placed to understand their tech, so they have to be part of this discussion.”
Norberto Andrade, Director, AI Policy at Meta, focused on “the role of sandboxes as mechanisms to reduce uncertainty and enhance an evidence-based approach to policy making… and how regulatory sandboxes are creating momentum to think systematically about policy experimentation.”
Speaking on Digital Safeguarding for All, Thomas Van der Valk, Privacy Policy Manager EMEA, Meta said: “Meta is investing a lot in making sure that all users, in particular youth, are protected on our services. For young people it is incredibly important to be able to go online in a safe environment. There are no single perfect solutions to getting the balance between safety and privacy; that’s why we’ve been working on a multi-layered approach to safeguarding minors that involves age verification, parental supervision and an age-appropriate experience. We have a whole suite of tools and safeguards that do just that.”
Joanna Conway, Partner & Internet Regulation (Legal), Deloitte said: “From a legal and regulatory perspective, we are seeing a wave of regulation and laws coming in globally that touch on making online space safer.
“It’s really important to understand that if a platform’s using AI because of the sheer volume of content it needs to detect and analyse, sometimes it will get it wrong. There’s a balance required there – so occasionally it gets it wrong, but also how much is the tool getting right? Should we be so critical of the tool? Are we actually better off overall for having the tool? That’s what the regulations are going to be good at. The DSA and the Online Safety Bill require a holistic risk assessment and a look at what mitigation is put in place and an assessment of how effective it is. It won’t be perfect and there will be trade-offs.”
Commissioner Kristin N. Johnson, U.S. Commodity Futures Trading Commission stressed the need during the “current crypto winter” for regulators to provide stability. “I feel my job is to protect the markets from the possibility that ‘winter is coming’. It’s important to ensure regulatory infrastructure to protect against the possibility of crisis in one firm leading to crisis across the industry.”
“This is such a rich dialogue,” she added. “It allows us to think collaboratively about how to approach new technology.”
Prof. Joachim Wuermeling, Executive Board Member, Bundesbank said: “My takeaway from the RAID conference is that questions related to digitalisation reach far beyond the mandate of financial supervisors, including data protection and competition, for instance. Thus, regulators need to cooperate more closely across sectoral borders.”
Katherine M. Harman-Stokes, Acting Director, Office of Privacy and Civil Liberties, United States Department of Justice said: “We have to be able to provide for public safety and national security, while protecting the privacy people expect in their personal data. There are a number of international initiatives reflecting the growing awareness of convergence in how democratic societies strike the balance between privacy and public safety, which should provide a trusted and sustainable basis for international data transfers.”
At the end of the panel, she said: “This has been very informative. I am optimistic that we will continue to develop privacy frameworks based on the same commonalities that will move us toward convergence.”