As AI brings ethical and national security issues to the table, one new rule is hoping to keep sensitive software out of the hands of adversaries.
The Trump administration is attempting to make it more difficult to export certain types of sensitive artificial intelligence (AI) software. The new rule went into effect on January 6, 2020, and governs certain types of geospatial imagery software.
Companies will have to apply for a license to export this technology abroad (with the exception of Canada). The administration is hoping to curb the spread of software that could aid foreign militaries in their own quest for bigger, better AI.
The rule covers software found in things like drones, sensors, and satellites: anything that could automate the process of identifying targets. The administration is also considering pursuing international regulations to encourage a level playing field for these AI exports.
Industry Relief Amid Security Concerns
The news came after a lot of speculation about much broader and more restrictive crackdowns on software exports. For now, the industry is mostly pleased with the final rule, having feared more significant regulations that would have had onerous effects on advancement across the board.
The new rule is largely bipartisan and a response to a growing concern that technology is outpacing existing commerce regulations, opening the door to substantial security concerns. The rule provides guidance, for now, for companies building and selling this sensitive software.
As other countries gain access to this type of software, fears of security exploits, along with copying and counterfeiting concerns, have been at the forefront of regulatory talks.
The Downsides and the Benefits of Regulation
For some, the new regulations could spell greater difficulty using AI for good purposes. Tasks like monitoring the effects of climate change or studying urban sprawl may become harder under licensing and compliance requirements.
However, the industry fully expected much stricter export regulations, including the possibility of banning AI exports altogether, so the current iteration of the rule came as something of a relief.
It’s difficult to stop the spread of information across an entire field. Still, the administration hopes to slow the rate of exchange, giving the United States time to address security concerns and build software that maintains an economic advantage. For now, the new regulation is US-specific, and we’ll soon see whether international governing bodies enact similar measures.