The United States government also wants more control over AI. To do so, it is reviving a law from 1950.
Ever since the launch of ChatGPT in late 2022, AI hype has been in full swing, and many companies are coming up with creative AI products. European regulators have already decided through the AI Act that some scenarios, such as biometric identification, are completely off-limits for artificial intelligence. Additionally, certain safety mechanisms have been introduced into systems such as ChatGPT. Regulators in the United States also want more control over the AI market, although the country has not introduced any new legislation to do so.
Defense Production Act
The US is already investigating big tech companies' investments in AI startups. That investigation will help the FTC build a clearer picture of the relationships between big tech companies and AI providers. President Biden's administration also wants more information about developments in the AI landscape. To this end, the government turned to the Defense Production Act, a law enacted at the start of the Korean War in 1950. The law allows the president to demand information from AI companies.
AI companies don't have to hand over all their information to the government, but they do have to report when they are actively training AI models. The government wants every project that uses a certain amount of computing power to be reported. If training an AI model requires more than a septillion floating-point operations (FLOPs), it must be reported. For AI models that work with DNA, the threshold is 1,000 times lower, so the rule is stricter. Even so, that is still a sextillion FLOPs: a 1 followed by 21 zeros.
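The arithmetic behind these thresholds can be sanity-checked in a few lines. This is only an illustration of the numbers named above (short-scale naming, where a septillion is 10^24 and a sextillion is 10^21); the variable names are our own, not anything from the regulation itself.

```python
# Reporting thresholds as described in the text (short-scale names).
GENERAL_THRESHOLD = 10**24              # one septillion floating-point operations
DNA_THRESHOLD = GENERAL_THRESHOLD // 1_000  # DNA models: threshold is 1,000x lower

# The DNA threshold works out to a sextillion: a 1 followed by 21 zeros.
assert DNA_THRESHOLD == 10**21
assert len(str(DNA_THRESHOLD)) - 1 == 21  # count the zeros after the leading 1
```

A training run exceeding these operation counts would fall under the reporting obligation; anything below them would not.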
FLOPS fail?
Those thresholds may well shift, and for good reasons. First, it is unclear exactly how much computing power OpenAI and Google use to train their models, which makes it very difficult to set the limit correctly: GPT-4 is estimated not to have required that much computing power. Additionally, AI development has become more efficient over time, requiring less computing power to achieve the same results. The government may later lower the limit to account for this.
Reporting obligation
AI companies must not only report their own activities to the government. The US Commerce Department has already confirmed that AI companies must also report customer activity, including how much computing power customers use to train AI models. Additionally, AI providers must ensure that customers properly identify themselves; companies must verify that identity or risk fines. The rules mainly affect foreign companies and were created to protect the US cloud from Chinese actors.
Although the Biden administration started later than the EU, the US seems to be moving faster. No new laws need to be written to impose obligations on AI companies. Biden need only confirm with his signature that the development of AI is a matter of national security; after that, the president can compel companies to share information. Thanks to the 1950 law, this will be possible from next week.