The US Army changed how it describes AI-powered firing by tanks — Quartz

March 17, 2019
in Machine Learning

Credit: Google News

The US Defense Department has revised its description of an initiative designed to use artificial intelligence to give tanks the ability to identify and engage targets on their own.

The change came after Quartz published details of the US Army’s ATLAS program as revealed in a solicitation to vendors and academics. ATLAS, which stands for “Advanced Targeting and Lethality Automated System,” aims to use artificial intelligence and machine learning to give ground-combat vehicles autonomous targeting capabilities that are at least three times faster than a human being.

In Quartz’s Feb. 26 article, the Army said it is not planning to replace soldiers with machines but seeks to augment their abilities. ATLAS is primarily designed to increase the amount of response time tank gunners get in combat, Paul Scharre, director of the Technology and National Security Program at the Center for a New American Security, a bipartisan think tank in Washington, DC, told Quartz.

Yet, Stuart Russell, a professor of computer science at UC Berkeley, said even this was a step too far. “It looks very much as if we are heading into an arms race where the current ban on full lethal autonomy”—a US military policy that mandates some level of human interaction when actually making the decision to fire—”will be dropped as soon as it’s politically convenient to do so,” said Russell, an AI expert.

The updated language later added by the Army to the solicitation states:

All development and use of autonomous and semi-autonomous functions in weapon systems, including manned and unmanned platforms, remain subject to the guidelines in the Department of Defense (DoD) Directive 3000.09, which was updated in 2017. Nothing in this notice should be understood to represent a change in DoD policy towards autonomy in weapon systems. All uses of machine learning and artificial intelligence in this program will be evaluated to ensure that they are consistent with DoD legal and ethical standards.

According to Defense One, the Army will also be drafting new talking points to use when discussing ATLAS.

The machines are only partially taking over

US military leaders appeared March 12 before the Senate Armed Services Committee to discuss the state of the Pentagon’s AI initiatives. They emphasized that ethical guidelines concerning AI use have been developed. Lt. Gen. Jack Shanahan, who runs the Defense Department’s AI center, used the word “ethics” or “ethical” four times during his prepared testimony.

An Army spokesman responded to a request for further details on the new ATLAS description and talking points but has not yet provided any.

No “human in the loop” requirement

ATLAS will require a soldier to throw a switch before firing, the Army told specialist website Breaking Defense, which published a March 4 follow-up on Quartz’s reporting that continued into a four-part series about the ethics surrounding autonomous weaponry.

Defense Department directive 3000.09 instructs everyone along the official chain of command that “autonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.”

However, as Scharre told Breaking Defense, “The US Defense Department policy on autonomy in weapons doesn’t say that the DoD has to keep the human in the loop. It doesn’t say that. That’s a common misconception.” (“The Directive does not use the phrase ‘human in the loop,’ so we recommend not indicating that DoD has established requirements using that term,” a Pentagon spokesperson told Breaking Defense.)

Even mechanical firing systems can be made to operate on their own, Russell told the site, further warning of “automation bias” or “artificial stupidity.” This refers to instances in which technology reduces humans to button-pushers who blindly follow a robot’s commands.

Further, directive 3000.09 says the deputy secretary of defense can waive its restrictions—after a mandatory legal review—in times of “urgent military operational need.”

Worries of a “firestorm”

Military language is “at once abstrusely technical and sloppy,” wrote Breaking Defense’s Sydney Freedberg, and the Army’s definition of “lethality” can be quite different from a civilian’s. There were “people in the Pentagon…who were aware of how this all sounded,” well before the Quartz article was ever written, he reported. Within hours of the original solicitation going online, the head of the Pentagon’s Joint Artificial Intelligence Center expressed concerns over what he feared would be a “firestorm” of negative news coverage when it was spotted, Freedberg wrote.

Scharre describes the current crop of autonomous weaponry, such as ATLAS, as akin to blind-spot monitors on cars, and says they would reduce the chances of missing an intended target.

Still, critics of AI-assisted weaponry (who include Elon Musk) fear the lack of concrete, universally accepted guidelines. They say only a total ban will prevent eventual catastrophe.

As Article 36, a UK-based NGO that works to “prevent the unintended, unnecessary or unacceptable harm caused by certain weapons,” states on its website: “Action by states is needed now to develop an understanding over what is unacceptable when it comes to the use of autonomous weapons systems, and to prohibit these activities and development through an international treaty.”
