ADF unveils robotic and autonomous weapons amid calls for stronger ethical guidelines


As governments around the world turn to artificial intelligence and autonomous weapons in conflict, human rights advocates and scientists are calling for stronger frameworks to guide their use.

Last month the Australian Defence Force unveiled a suite of weapons at the Land Autonomous Systems and Teaming Demonstration at the Puckapunyal Army Base in northern Victoria.

Among them were drones that fly themselves, robotic combat vehicles, uncrewed tanks, and a robot dog that can be used to clear landmines.

The weapons are not fully autonomous and require a level of human interaction.

Lieutenant Colonel Jake Penley and Lieutenant Colonel Adam Hepworth (right) at the Puckapunyal Army Base. (ABC Shepparton: Charmaine Manuel)

Robotic and Autonomous Systems director Lieutenant Colonel Adam Hepworth said the Australian Army saw a range of uses for AI in administrative settings and “on the coalface of war”.

He said it was important to “maintain our human custodianship of decision-making”.

“Every system that we have on board has to go through a legal review process to meet all of our domestic and our international obligations under international law,” Lieutenant Colonel Hepworth said.

But some leading voices in science and human rights say more needs to be done to set up stronger frameworks governing their use.

Toby Walsh says artificial intelligence is a “double-edged sword”. (Supplied: TU Berlin/Press/Christian Kielmann)

‘A very dark, very dangerous place’

The ABC sent a catalogue of the weapons on display at Puckapunyal to Toby Walsh, the chief scientist at the University of New South Wales’s AI Institute.

“I was quite impressed, quite nervous in terms of seeing the breadth of capabilities being demonstrated,” he said.

Robotic dogs can be used to clear mines. (ABC Shepparton: Charmaine Manuel)

Professor Walsh said the use of artificial intelligence and autonomous tools was a “double-edged sword”.

“There’s some very positive applications that you can think of — for example, a mine-clearing robot,” he said.

“No-one should ever have to risk life or limb clearing a minefield again.

“That’s a perfect job for a robot.

“If it goes wrong, the robot gets blown up and you go and buy yourself a new robot.

“But equally, there are places where, I think, handing over that decision making to algorithms is going to take us to a very dark, very dangerous place.

“This is a space in which technology is evolving very rapidly.

“The military is adopting it incredibly fast.”

An OWL-B loitering munition on display at Puckapunyal. (ABC Shepparton: Charmaine Manuel)

Professor Walsh said the software used in the weapons was easy to steal, copy or hack.

“It’s actually just a matter of changing the code and something which was asking an operator to confirm the target selection [or] the target choice could be turned into a fully autonomous weapon,” he said. 

“It is frustrating that the Australian Defence Forces say, ‘Well, no need to worry, nothing to think about here,’ when unfortunately the technologies are going to be available to other people where it is everything to worry about.

“We need to keep some meaningful control by humans over what these machines are doing.”

Lorraine Finlay says the use of autonomous weapons poses challenges to the principles set by international humanitarian law. (Supplied: Australian Human Rights Commission)

Machines blind to value of human life

Human Rights Commissioner Lorraine Finlay said the use of autonomous weapons posed challenges to principles set by international humanitarian law.

She said the system of review set out under the Geneva Conventions was flawed because autonomous weapons are designed to learn from each mission, meaning the technology keeps evolving after it has been reviewed.

Optionally crewed combat vehicles on display at Puckapunyal. (Supplied: Australian Defence Force)

“In particular there are concerns about whether machines can really understand proportionality, because they don’t understand the intrinsic value of a human life,” Ms Finlay said.

“Simply saying there’s a human somewhere in the loop isn’t enough — it needs to be clear exactly where in the loop they are, what their authority is, and are they the ones making the critical decisions or is that being left to the machine?”

Target drones displayed at the demonstration at Puckapunyal. (Supplied: Australian Defence Force)

There are no regulations relating specifically to lethal autonomous weapons, and Ms Finlay's key recommendation to the government is that there should be.

In November last year, Australia voted in favour of a resolution at the United Nations that urged the international community to consider the challenges posed by autonomous weapons systems.

“So previously Australia’s taken the position that it was premature to regulate … but we’re hopeful that this most recent resolution shows a shift in that position and a recognition that the time is now to actually address these issues and make sure that those safeguards are put in place,” Ms Finlay said.


