- Military leaders argue that AI will play an important role in future wars.
- There has been a shift in how industry collaborates with the Department of Defense on AI and autonomy.
- AI in military technology must operate within ethical frameworks, said Doug Philippone of Snowpoint Ventures.
No one wants "killer robots," so ensuring that artificial intelligence systems don't go rogue is the "cost of doing business" in military technology, the founder of a venture capital firm said during a Wednesday discussion on AI technology on the battlefield.
"You have to be able to have AI that can work within an ethical framework, period," said Doug Philippone, co-founder of Snowpoint Ventures, a venture capital firm that merges tech talent with defense problems, at the Reagan Institute's National Security Innovation Base Summit.
"I don't think anybody is, you know, trying to have killer robots that run off on their own," he said.
Philippone said that companies in the military tech space that are worth investing in should "think about these problems and work within this ethical environment." These aren't limits on development, he said; they're requirements.
Autonomous machines tend to stir a certain degree of apprehension, particularly when such technology is applied to the DOD's "kill chain." While military leaders argue the systems are essential for future warfare, they also raise ethical concerns about what machine autonomy could ultimately mean.
Times are changing
The defense tech space appears to be undergoing a major shift in perspective. Last month, Google reversed course on an earlier pledge against developing AI weapons, drawing criticism from some employees. The decision seemed to reflect a growing willingness among tech companies to work with the Department of Defense on these technologies.
Across Silicon Valley, "there has been a massive cultural shift from 'no, we don't want to think about defending America' to 'let's get in the fight,'" said Thomas Robinson, a director at Domino Data Lab, an AI solutions company based in London.
He said at Wednesday's event that there is "just a palpable difference" from even a few years ago.
There has been a sharp rise in smaller, more agile defense tech companies, such as Anduril, working in areas like uncrewed systems and autonomy, fueling a view among some defense tech leaders that the new Trump administration could create new DOD contracts potentially worth hundreds of millions, if not billions, of dollars.
Part of this cultural shift has raised concerns about the "revolving door" of military officials heading into venture capital and tech after retirement, creating potential conflicts of interest.
Former Air Force Secretary Frank Kendall emphasized AI and, during his tenure, flew in the AI-piloted X-62 VISTA. Air Force photo by Richard Gonzales
US military leaders have increasingly prioritized developing AI capabilities in recent years, with some arguing that whichever side dominates this technological space will win future conflicts.
Last year, then-Air Force Secretary Frank Kendall said the United States was locked in a technological arms race with China. AI is crucial, he said, and "China is advancing aggressively."
The Air Force has experimented with AI-flown dogfighting jets, among other AI-enabled tools, as have other parts of the US military and American allies. "We are going to be in a world where decisions will not be made at human speed," Kendall said in January. "They will be made at machine speed."
Some areas of armed conflict, including cyber warfare and electronic warfare, are likely to be dominated by AI technologies that evaluate events occurring at unimaginably fast speeds and unimaginably small scales.
AI with guardrails
That makes AI a top investment priority. During Wednesday's discussion, US Rep. Ro Khanna of California voiced support for a proposal from former Democratic presidential candidate Michael Bloomberg, who has called for 15% of the Pentagon's massive budget to go toward advanced and emerging technology.
As the nominee for Secretary of Defense, Pete Hegseth committed to prioritizing new technologies, writing that "the Department of Defense budget must focus on lethality and innovation." He said that "technology is changing the battlefield."
But ethical considerations remain essential. Last year, for example, senior Pentagon officials discussed the guardrails in place to calm fears that the department is "building killer robots in the basement."
Understanding exactly how an AI tool's algorithms work will be important for ethical battlefield implementation, Philippone noted, as will understanding the quality of the data it takes in; otherwise, it's "garbage in, garbage out."
"Whether it's Tyson chicken or it's the Department of the Navy, you want to be able to say 'this problem is important,'" he said. "What is the data?"
"You understand how it flows through the algorithms, and then you understand the output in a way that is verifiable, so that you can understand how you got there," he said. "And then you codify those rules."
Philippone said the proprietary opacity of some AI companies is "BS" and a "black box approach." Companies should instead aim for a more transparent approach to artificial intelligence, he said.
"I call it the glass box," he said. Understanding a system's inner workings can help prevent hacks, he said: "It's really important from an ethics standpoint, and to really understand your decision process within your organization."
"If you can't audit it," he said, "that leaves you vulnerable."