The US government on Monday announced it would further restrict artificial intelligence chip and technology exports, divvying up the world to keep advanced computing power in the US and among its allies while finding more ways to block China’s access.
The new regulations will cap the number of AI chips that can be exported to most countries and allow unlimited access to US AI technology for America’s closest allies, while also maintaining a block on exports to China, Russia, Iran and North Korea.
The lengthy new rules unveiled in the final days of outgoing President Joe Biden’s administration go beyond China and are aimed at helping the US keep its dominant status in AI by controlling the technology around the world.
“The US leads AI now — both AI development and AI chip design, and it’s critical that we keep it that way,” US commerce secretary Gina Raimondo said.
The regulations cap a four-year Biden administration effort to hobble China’s access to advanced chips that can enhance its military capabilities. They also seek to maintain US leadership in AI by closing loopholes and adding new guardrails to control the flow of chips and the global development of AI.
While it is unclear how President-elect Donald Trump’s incoming administration will enforce the new rules, the two administrations share similar views on the competitive threat from China. The regulation is set to take effect 120 days from publication, giving the Trump administration time to weigh in.
New limits will be placed on advanced graphics processing units (GPUs), which are used to power data centres needed to train AI models. Most are made by Nvidia, while AMD also sells AI chips.
Major cloud service providers such as Microsoft, Google and Amazon will be able to seek global authorisations to build data centres, a powerful part of the new rules that will exempt their projects from the country quotas on AI chips.
Stringent conditions
To obtain a stamp of approval, authorised companies must abide by stringent conditions and restrictions, including security requirements, reporting demands and a plan or track record of respecting human rights.
Until now, the Biden administration had imposed sweeping restrictions on China’s access to advanced chips and the equipment to produce them, updating the controls annually to tighten restrictions and capture countries at risk of diverting the technology to China.
Because the rules alter the landscape for AI chips and data centres around the world, powerful industry voices criticised the plan even before it was published.
Nvidia on Monday called the rule “sweeping overreach” and said the White House would be clamping down on “technology that is already available in mainstream gaming PCs and consumer hardware”. Software giant Oracle argued earlier this month the rules would hand “most of the global AI and GPU market to our Chinese competitors”.
The rules impose worldwide licensing requirements on certain chips, with exceptions, and also set controls for what are known as “model weights” of the most advanced “closed-weight” AI models. Model weights help determine decision making in machine learning and are generally the most valuable elements of an AI model.
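For readers unfamiliar with the term, the toy Python sketch below shows what “weights” are mechanically: numerical parameters that a training loop nudges until the model’s outputs match its training data. It is an invented, minimal example for illustration only and bears no relation to the frontier models the rule covers.

```python
# Toy illustration of "model weights": the numerical parameters a training loop
# adjusts so the model's outputs better match its training data. This tiny
# linear model is invented for illustration and has nothing to do with the
# frontier models covered by the rules.

# Training data: inputs x and the outputs y the model should reproduce (y = 2x).
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

weight = 0.0           # the model's single weight before training
learning_rate = 0.05

for _ in range(200):
    for x, y in data:
        prediction = weight * x               # the model's current guess
        error = prediction - y                # how far off the guess is
        weight -= learning_rate * error * x   # nudge the weight to shrink the error

print(round(weight, 3))  # converges toward 2.0: the learned value is the "weight"
```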
The regulation divides the world into three tiers. About 18 countries, including Japan, Britain, South Korea and the Netherlands, will essentially be exempt from the rules. Some 120 other countries, including South Africa, Singapore, Israel, Saudi Arabia and the United Arab Emirates, will face country caps. And arms-embargoed countries such as Russia, China and Iran will be barred from receiving the technology altogether.
In addition, US-headquartered providers likely to receive global authorisations, such as AWS and Microsoft, will be allowed to deploy only 50% of their total AI computing power outside the US, no more than 25% outside the tier-1 countries, and no more than 7% in a single non-tier-1 country.
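As a rough illustration of how those thresholds interact, the short Python sketch below checks a hypothetical deployment plan against the reported 50%, 25% and 7% limits. It is not an official compliance tool; the function and the abbreviated tier-1 list are placeholders invented for this example.

```python
# Minimal sketch, not an official compliance tool: every name here is invented
# for illustration, and TIER1 is only a partial sample of the 18 exempt
# destinations named later in the article.

TIER1 = {"US", "Japan", "Britain", "Netherlands", "South Korea"}

def check_deployment(compute_share_by_country: dict[str, float]) -> list[str]:
    """Return the reported limits a hypothetical deployment plan would exceed.

    The input maps each country to the share (0-1) of the provider's total AI
    computing power deployed there; shares should sum to 1.0.
    """
    violations = []
    outside_us = sum(s for c, s in compute_share_by_country.items() if c != "US")
    outside_tier1 = sum(s for c, s in compute_share_by_country.items() if c not in TIER1)
    if outside_us > 0.50:
        violations.append("over 50% of compute outside the US")
    if outside_tier1 > 0.25:
        violations.append("over 25% of compute outside tier-1 countries")
    for country, share in compute_share_by_country.items():
        if country not in TIER1 and share > 0.07:
            violations.append(f"over 7% of compute in {country}")
    return violations

# Example: 60% at home, 25% in a tier-1 ally, 15% in one capped country.
print(check_deployment({"US": 0.60, "Japan": 0.25, "UAE": 0.15}))
# -> ['over 7% of compute in UAE']
```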
AI has the potential to increase access to healthcare, education and food, among other benefits, but also can help develop biological and other weapons, support cyberattacks, and assist with surveillance and other human rights abuses.
“The US has to be prepared for rapid increases in AI’s capability in the coming years, which could have a transformative impact on the economy and on our national security,” US national security adviser Jake Sullivan said.
How the restrictions will work
Which chips are restricted?
The rule restricts the export of GPUs. Although known for their role in gaming, the ability of GPUs to process different pieces of data simultaneously has made them valuable for training and running AI models. OpenAI’s ChatGPT, for example, is trained and improved on tens of thousands of GPUs. The number of GPUs needed for an AI model depends on how advanced the GPU is, how much data is being used to train the model, the size of the model itself and the time the developer wants to spend training it.
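To make that relationship concrete, the back-of-the-envelope Python sketch below combines a widely used rule of thumb, roughly six floating-point operations per model parameter per training token, with assumed GPU throughput, utilisation and schedule to estimate a chip count. None of these figures come from the article; they are illustrative assumptions only.

```python
# Back-of-the-envelope sketch under stated assumptions; not from the article.
def gpus_needed(params: float, tokens: float,
                gpu_flops: float = 989e12,   # assumed peak dense FP16 throughput of one H100
                utilisation: float = 0.4,    # assumed fraction of peak actually sustained
                days: float = 90) -> float:
    """Estimate how many GPUs a training run needs."""
    train_flops = 6 * params * tokens                         # rule-of-thumb total training compute
    flops_per_gpu = gpu_flops * utilisation * days * 86_400   # one GPU's output over the whole run
    return train_flops / flops_per_gpu

# Example: a 70-billion-parameter model trained on 2 trillion tokens over ~90 days.
print(round(gpus_needed(70e9, 2e12)))  # roughly 270-280 GPUs under these assumptions
```

Shrinking the schedule, adding training data or growing the model pushes the GPU count up proportionally, which is why training frontier models requires the large clusters the rule targets.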
What is the US doing?
To control global access to AI, the US is expanding restrictions on advanced GPUs needed to build the clusters used to train advanced AI models. The limits on GPUs for most countries in the new rule are set by compute power, to account for differences in individual chips.
Total processing performance (TPP) is a metric used to measure the computational power of a chip. Under the regulation, countries with caps on compute power are restricted to a total of 790 million TPP through to 2027. The cap translates into the equivalent of nearly 50 000 Nvidia H100 GPUs, according to Divyansh Kaushik, an AI expert at Beacon Global Strategies, a Washington-based advisory firm.
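The translation from TPP to chips is simple arithmetic. The sketch below assumes a TPP of roughly 15 800 for a single H100, a commonly cited estimate that does not appear in the article, and divides the reported country cap by it.

```python
# Back-of-the-envelope check of the figure quoted above. The per-chip TPP for an
# H100 is an assumption (a commonly cited estimate), not a value from the article.
COUNTRY_CAP_TPP = 790_000_000   # reported cap for capped countries through 2027
H100_TPP = 15_832               # assumed TPP of a single Nvidia H100

print(f"{COUNTRY_CAP_TPP / H100_TPP:,.0f} H100-equivalent GPUs")  # ~49,900, i.e. nearly 50 000
```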
“Fifty thousand H100s is an enormous amount of power — enough to fuel cutting-edge research, run entire AI companies or support the most demanding AI applications on the planet,” he said. Those could include running a global-scale chatbot service or managing advanced real-time systems like fraud detection or personalised recommendations for massive companies such as Amazon or Netflix, Kaushik added.
But the caps do not reflect the true limit on the number of H100 chips in a country. Companies such as Amazon Web Services and Microsoft that meet the requirements for special authorisations — also known as “Universal Verified End User” status — are exempt from the caps.
National authorisations also are available to companies headquartered in any destination that is not a “country of concern”. Those with national Verified End User status have caps of roughly 320 000 advanced GPUs over the next two years. “The country caps are specifically designed to encourage companies to secure Verified End User status,” Kaushik said. That status gives US authorities greater visibility into who is using the chips and helps to prevent GPUs from being smuggled into China.
Are there other exceptions to the licensing?
Yes. If a buyer orders small quantities of GPUs — the equivalent of up to some 1 700 H100 chips — the chips will not count toward the caps and will require only government notification, not a licence. Most chip orders fall below the limit, especially those placed by universities, medical institutions and research organisations, the US said. This exception is designed to accelerate low-risk shipments of US chips globally. There are also exceptions for gaming GPUs.
Which places can get unlimited AI chips?
Eighteen destinations are exempt from country caps on advanced GPUs, according to a senior administration official. They are Australia, Belgium, Britain, Canada, Denmark, Finland, France, Germany, Ireland, Italy, Japan, the Netherlands, New Zealand, Norway, South Korea, Spain, Sweden and Taiwan plus the US.
What is being done with ‘model weights’?
Another item being controlled by the US is known as “model weights”. AI models are trained to produce meaningful material by being fed large quantities of data. At the same time, algorithms evaluate the outputs to improve the model’s performance. The algorithms adjust numerical parameters that weigh the results of certain operations more than others to better complete tasks. Those parameters are model weights. The rule sets security standards to protect the weights of advanced “closed-weight”, or non-public, models.

Overall, Kaushik said, the restrictions are aimed at ensuring the most advanced AI is developed and deployed in trusted and secure environments. — Karen Freifeld, (c) 2025 Reuters