• 3 Posts
  • 1.1K Comments
Joined 1 year ago
Cake day: July 6th, 2023

  • You could. This type of gun is not intended primarily for use against people (although this particular gun might be modified to serve the role of a sniper rifle). It’s for shooting aircraft and lightly armored vehicles. By that I don’t mean cars; I mean armored personnel carriers. The bullets would go right through a building’s walls.

    I can’t quickly find a photo of this gun’s 12.7 mm bullet doing its thing, but here’s what the very similar American .50 cal bullet does to six-inch-thick concrete:


  • My guess is that they didn’t answer your question because they had strict instructions not to stray from the script on this topic. Saying the wrong thing could lead to a big PR problem, so I don’t expect that people working in this field would be willing to have a candid public discussion even about topics to which they have given a lot of thought. I do expect that they have given a lot of thought to how accurately an AI can obey orders, out of practical (if not ethical) concerns.

    I mean, I am currently willing to say “the AIs will almost definitely kill civilians but we should build them anyway” because I don’t work in defense. Even so, I’m a little nervous saying it, because one day I might want to work in that field. My friends who do work in defense have told me that the people who granted their clearances did investigate their online presence. (My background is in computational biochemistry, but I look at what’s going on in AI and feel like nothing else is important in comparison.)

    As for cold comfort: I think autonomous weapons are inevitable in the same way the atom bomb was inevitable. Even if no one wants to see them used, everyone wants to have them because their enemies will. However, I don’t see a present need for strategic (as opposed to tactical) automation. A computer would have an advantage in battlefield control, but strategy plays out over hours, days, or years, so a human’s more reliable reasoning matters more in that domain.

    Once a computer can reason better than a human can, that’s the end of the world as we know it. It’s also inevitable, like the atom bomb.


  • The federal government uses the term “terminate” rather than “revoke” to describe the decision not to extend TPS, but even the article the OP posted (which is very critical of Trump’s plan) interprets what he said as “not extend”.

    Now Trump plans to forcibly uproot this group of roughly 18,000 people who pay taxes, own homes, have jobs, and support their families. But that’s only the beginning: Up to 2.7 million people could lose protection from deportation if Trump allows immigration programs such as Temporary Protected Status, DACA, and humanitarian parole to lapse during a second term, according to Forbes.


  • The interviewer was the one who used the word “revoke”, but Trump does seem like the kind of person who might attempt to end the TPS designation early rather than wait for it to simply expire a year into his term. Such an attempt would have very little chance of success: decisions to terminate (as opposed to revoke early) TPS during Trump’s past presidency are still working their way through the courts (see Ramos, et al. v. Nielsen, et al.) and are not in effect.


  • Trump and the interviewer are talking about Temporary Protected Status, which is temporary.

    A TPS designation can be made for 6, 12, or 18 months at a time. At least 60 days prior to the expiration of TPS, the Secretary [of Homeland Security] must decide whether to extend or terminate a designation based on the conditions in the foreign country.

    Source.

    TPS eligibility for people from Haiti lasts until February 3, 2026, unless it is extended. If, during a Trump presidency, the federal government does not extend TPS for Haiti, it would be acting well within its established authority.




  • Despite media speculation, Israel is not currently planning to strike Iran’s nuclear facilities, according to four Israeli officials, even though Israel sees Iran’s efforts to create a nuclear weapons program as an existential threat. Targeting nuclear sites, many of which are deep underground, would be hard without U.S. support. President Biden said Wednesday that he would not support an attack by Israel on Iranian nuclear sites.

    I wonder what the strategy here is, given that the USA also wants to prevent Iran from having nuclear weapons. Is the implication here that the USA will not enable an attack on Iranian nuclear facilities as long as Iran doesn’t actually try to build a bomb? How confident are Israel and the USA that Iran can’t build a bomb in secret? Is there a way Iran could retaliate against an attack on its nuclear facilities but not against an attack on other major targets?




  • a Ghost Robotics Vision 60 Quadrupedal-Unmanned Ground Vehicle, or Q-UGV, armed with what appears to be an AR-15/M16-pattern rifle on rotating turret undergoing “rehearsals” at the Red Sands Integrated Experimentation Center in Saudi Arabia

    They’re not being used in combat.

    That aside, I appear to be the only one here who thinks this is a great idea. AI can make mistakes, but the goal isn’t perfection; it’s just to make fewer mistakes than a human soldier does. (Or at least fewer mistakes than a bomb does, which is a really low bar.)

    Plus, automation can address the problem Western countries have with unconventional warfare: Western armies are much less willing to have soldiers die than their opponents are. Sufficiently determined guerrillas who can tolerate high casualties can inflict slow but steady losses on Western armies until the Western will to fight is exhausted. If robots can take the place of human infantry, the advantage shifts back from guerrillas to countries with high-tech manufacturing capability.