Experts have previously expressed fears that artificial intelligence (AI) could escalate conflicts around the world through "slaughterbots" that can kill without any human intervention.

Last week, OpenAI quietly lifted the ban, removing from its usage policy the language that prohibited the use of its models for "activities with a high risk of physical harm, including weapons development, military, and warfare."

"Our policy does not allow our tools to be used to harm people, develop weapons, conduct surveillance, harm others, or destroy property. However, there are national security use cases that are compatible with our mission."

The spokesperson said it had not been clear whether the previous policy's blanket "military" prohibition covered beneficial use cases, adding that the purpose of the update was to provide clarity and to make such discussions possible.

Last year, 60 countries, including the US and China, signed a "call to action" to limit the use of AI for military purposes.

The military use of AI by "Big Tech" companies has caused controversy before.

In 2018, thousands of Google employees protested against a Pentagon contract (Project Maven) to use the company's AI tools to analyze drone surveillance footage.

Microsoft employees also protested against a $480 million contract to supply the US Army with augmented reality headsets.

In 2017, tech leaders including Elon Musk called on the UN to ban lethal autonomous weapons, just as it had previously banned chemical weapons and blinding laser weapons.

They warned that autonomous weapons could bring about a "third revolution in warfare," after gunpowder and nuclear weapons.

Experts warn that once the "Pandora's box" of fully autonomous weapons is opened, it may be impossible to close it again.

Former MI6 agent and author Carlton King says that AI could soon be controlling pilotless attack aircraft.

The advantages of using machine learning to pilot attack aircraft would be significant for military leaders, King said.

"As soon as you start giving machine learning to an autonomous robot, you start to lose control of it. After a while it's tempting to say let the robot do everything," he warned.

Noting that US and UK drones are currently flown by human pilots, King said military leaders may be inclined to take humans out of the equation: "It is clear that there will be a move, if not already under way, to remove the pilot on the ground, because his reactions will not be fast enough, and to put it in the hands of an artificial intelligence whose reactions are much faster and which can make the decision to shoot or not to shoot."