What to expect from the US, EU and China’s new AI laws
This article surveys the emerging regulatory landscape for artificial intelligence (AI) in the United States, the European Union, and China, highlighting the key provisions of each region’s approach and outlining the potential impact on AI development and deployment.
US, EU and China’s new laws
In the United States, the Biden administration has issued an executive order calling for increased transparency and accountability in AI development. The order requires companies to assess their AI systems for potential cybersecurity vulnerabilities and provide information about the data used to train and test them.
The European Union has taken a more comprehensive approach to AI regulation with the passage of the AI Act. This law categorizes AI systems by risk level and imposes stricter requirements on those deemed higher risk. For instance, social scoring systems are prohibited outright, and real-time remote biometric identification (such as facial recognition) in publicly accessible spaces is banned subject to narrow exceptions.
China has also been actively regulating AI, with a focus on preventing misuse and ensuring alignment with national objectives. Chinese regulations emphasize the need for AI systems to be explainable, transparent, and trustworthy. They also address concerns about deepfakes and the ethical use of AI in recommendation systems.
Take Home
Regulatory efforts reflect national contexts: the US’s concern with cyber-defence, China’s tight state control over its private sector, and the EU’s and the UK’s attempts to balance support for innovation with risk mitigation. Despite these differences, the world’s frameworks face similar challenges in promoting ethical, safe and trustworthy AI.
Some definitions of key terminology are vague and reflect the input of a small group of influential stakeholders, while the general public has been underrepresented in the process.
Policymakers also need to be wary of tech companies’ significant political capital. It is vital to involve these companies in regulatory discussions, but it would be naive to trust such powerful lobbyists to police themselves.
