A group of artificial intelligence researchers from nearly 30 countries is boycotting one of South Korea's most prestigious universities over a recent partnership with an "ethically dubious" arms manufacturer, formed with the stated purpose of designing and manufacturing "autonomous weapons systems".

The Korea Advanced Institute of Science and Technology (KAIST) and its partner Hanwha Systems, one of South Korea's largest arms manufacturers, are pushing back against the boycott, saying they have no intention of developing "killer robots", even though the project's description clearly states its goals, per the Guardian.

"There are plenty of great things you can do with AI that save lives, including in a military context, but to openly declare the goal is to develop autonomous weapons and have a partner like this sparks huge concern," said Toby Walsh, the organiser of the boycott and a professor at the University of New South Wales.

"This is a very respected university partnering with a very ethically dubious partner that continues to violate international norms."

What's worse, the scientists say, is Hanwha's history of manufacturing and selling cluster munitions and other arms that are banned in more than 120 countries under an international treaty, the Convention on Cluster Munitions, which South Korea, the US, Russia and China have not signed.

Walsh, an Australian professor, became aware of the project through a Korea Times article that described KAIST as "joining the global competition to develop autonomous arms". He said he promptly wrote to the university asking for more information, but never received a response.

Participants in the boycott have promised not to visit KAIST or host or collaborate with any of its faculty "over fears it could accelerate the arms race to develop autonomous weapons."

KAIST opened the controversial research center on Feb. 20. At the time, university leaders said it would "provide a strong foundation for developing national defense technology."

The announcement of the initiative, which has since been deleted, said it would focus on "AI-based command and decision systems, composite navigation algorithms for mega-scale unmanned undersea vehicles, AI-based smart aircraft training systems, and AI-based smart object tracking and recognition technology."

However, for all their effort, the boycotters may already be too late to prevent the creation of killer robots, though the group is still agitating for governments to promise to ban the manufacture, use and distribution of these weapons.

South Korea's Dodaam Systems already manufactures a fully autonomous "combat robot", a stationary turret capable of detecting targets up to 3km away. Its customers include the United Arab Emirates and Qatar, and the system has been tested on the highly militarised border with North Korea, though company executives told the BBC in 2015 that "self-imposed restrictions" require a human to deliver a lethal attack.

The Taranis military drone built by the UK's BAE Systems can technically operate entirely autonomously, according to Walsh, who said killer robots make everyone less safe, even in a dangerous neighbourhood.

"Developing autonomous weapons would make the security situation on the Korean peninsula worse, not better," he said.

"If these weapons get made anywhere, eventually they would certainly turn up in North Korea and they would have no qualms about using them against the South."

The idea that governments should do more to prevent, or at least regulate, increasingly advanced smart weapons is gaining traction around the world. Last year, Elon Musk surprised his Twitter followers by conjuring up an image of robots walking down streets murdering people, while Putin once jokingly mused, "how long until the robots eat us?"