[HN Gopher] DARPA Project Reveals One Person Can Control Dozens ...
       ___________________________________________________________________
        
       DARPA Project Reveals One Person Can Control Dozens of Robots
        
       Author : bookofjoe
       Score  : 61 points
       Date   : 2025-02-28 19:30 UTC (3 days ago)
        
 (HTM) web link (spectrum.ieee.org)
 (TXT) w3m dump (spectrum.ieee.org)
        
       | teeray wrote:
       | > U.S. Defense Advanced Research Projects Agency (DARPA), experts
       | show that humans can single-handedly and effectively manage a
        | heterogeneous swarm of more than 100 autonomous ground and aerial
       | vehicles, while feeling overwhelmed only for brief periods of
       | time
       | 
       | This will surprise nobody who has watched professional Starcraft
       | players.
        
         | KumaBear wrote:
          | Watching professional Starcraft players makes you question
          | whether they are human. Their control of vast quantities of
          | units and platoons is unreal at times.
        
           | crooked-v wrote:
           | The real limiter is (unironically) the quality of the drone
           | pathfinding.
        
         | h2zizzle wrote:
         | No Newtype powers required.
        
         | datadrivenangel wrote:
          | Good unit AI in an RTS allows for amazing results, and there
          | is so much more control/automation that most RTS games could
          | offer.
        
         | easterncalculus wrote:
         | It makes me wonder if there could be some sort of lower-cost
         | real-life strategy game with cheap(er) homemade drones
         | eventually, kind of like FPV racing now. I'm not a big RTS
         | person but that sounds really fun.
        
           | colechristensen wrote:
           | Robot wars but instead of 1v1 you have large teams of drones
           | on a well enclosed football field.
           | 
           | I can't decide if that would be cool or terrifying.
        
             | djmips wrote:
             | Both
        
           | mkoubaa wrote:
            | If the drones were not destroyed as part of normal
            | gameplay, it could make sense. So rather than a battle
            | royale, maybe something with drone laser tag mechanics.
        
         | szvsw wrote:
         | > feeling overwhelmed only for brief periods of time
         | 
          | There is something deeply, darkly comedic (depressing?)
          | about the qualitative language here - primarily the way it
          | intersects with modern discourse around wellness, anxiety,
          | and mental health in such a banal manner while carrying the
          | latent/implicit violence of the action (given that the
          | obvious subtext is operating semi-autonomous killing
          | machines).
        
           | parsimo2010 wrote:
            | Agreed - they write as if being overwhelmed 3% of the time is
           | a victory. A good system would have people feeling
           | overwhelmed 0% of the time.
        
             | colechristensen wrote:
             | >A good system would have people feeling overwhelmed 0% of
             | the time.
             | 
             | There are benefits to being pushed past your limits from
             | time to time. Also, there's just no such thing as 0. When
             | you're designing limits you don't say "this never happens",
             | you're saying "this event happens less than this rate for
             | this cohort".
        
               | parsimo2010 wrote:
               | I'd agree that it is worth pushing your limits during
               | training, but the best-case scenario during actual
               | conflict is to be as close to 0% overwhelmed as you can
               | be.
        
             | some_random wrote:
              | Yeah, I really don't like that phrasing. Takeoff and
              | landing are the most dangerous parts of flying but make
              | up only a tiny percentage of the total flight. If the
              | 3% of the time referenced is the most dangerous or most
              | critical 3%, then it hardly matters how easy the rest
              | of it is.
        
             | bluGill wrote:
              | The real question is what happens in that 3%. If the
              | operators are still able to control the drones, that is
              | very different from the drones ending up killing your
              | own people. (This is DARPA, so we can assume killing
              | people is a goal in some form.) There is a lot in
              | between, too.
        
           | colechristensen wrote:
           | It's DARPA, you're really past the moralizing about war stage
           | here, that's just out of context. I don't see UX experts
           | hand-wringing about the effects of advertising when they're
           | designing their products.
           | 
           | >discourse around wellness, anxiety, and mental health in
           | such a banal manner
           | 
            | It's not about "feelings", and that might disturb you,
            | but really, many things should be much less about
            | feelings. A whole lot of "wellness, anxiety, and mental
            | health" isn't about feelings but about being inside or
            | outside the limits of what a person is capable of
            | handling. Facts-based analysis of work and life, and of
            | people being too far outside their comfort zone, could do
            | a lot for many people dealing with mental health issues.
           | 
            | DARPA does and obviously _needs to_ study these things.
            | One of the most important subjects here is pilots,
            | especially during emergencies. It comes from both
            | directions: designing the machine to be manageable,
            | training the human to manage in exceptional
            | circumstances, and _knowing the limits_ of both.
        
             | szvsw wrote:
             | > It's DARPA, you're really past the moralizing about war
             | stage here, that's just out of context.
             | 
             | I don't really think I was moralizing... just commenting on
             | the funny juxtaposition of the language and the context -
             | or on the comedy of the language specifically when not
             | considering the whole context. I was not saying DARPA
             | should or should not be doing this - though I'll grant that
             | what I wrote could be read as an implicit criticism, even
             | though it was not my intention.
             | 
             | > I don't see UX experts hand-wringing about the effects of
             | advertising when they're designing their products.
             | 
             | Plenty do. Plenty don't. Similarly, plenty of machine
             | learning engineers might choose not to work on, say, a
             | predictive algorithm for facial recognition or a product
             | recommender system because they don't feel like being a
             | part of that system. Some people don't have that luxury, or
             | don't care. It's fine either way, though I of course
             | encourage anyone to do some reflection on the social
             | implications of their engineering projects from time to
              | time. Hamming, who worked on everything from the A-bomb
              | to telephones to the foundations of computer
              | programming (and everything in between), strongly
              | recommends this, and I agree. Working on weapons might
              | be necessary, but you still need to reflect and make a
              | conscious decision about it.
             | 
             | > It's not about "feelings" [...] It comes from both
             | directions, designing the machine to be manageable and
             | training the human to manage in exceptional circumstances
             | and _knowing the limits_ of both.
             | 
             | Of course, totally understand that. That doesn't mean we
             | can't find humor in decontextualizing the language! Or in
             | thinking about how science always must struggle with
             | euphemism for the purposes of concision.
        
           | alpaca128 wrote:
           | That sentence could come from an Onion news report about
           | worker productivity.
        
         | 0cf8612b2e1e wrote:
          | So Overwatch's D.Va is onto something.
        
         | parsimo2010 wrote:
         | > The most common reason for a human commander to reach an
         | overload state is when they had to generate multiple new
         | tactics or inspect which vehicles in the launch zone were
         | available for deployment
         | 
          | This seems misleading - what they said is that when
          | everything is on cruise control, the commander does not
          | feel overwhelmed. But if they have to do a high cognitive
          | load task (like reading statuses) or react to a complex
          | situation, the commander will feel overwhelmed, which is
          | bad. We want to be able to react quickly and appropriately
          | to all situations, which we can't do when overwhelmed.
          | Being able to handle dozens of bots in a calm situation is
          | meaningless. We need to staff our bot controllers/monitors/
          | commanders at a level where they can handle those top 3%
          | complex wartime scenarios.
        
         | nickpinkston wrote:
         | DARPA needs to partner with our Korean allies who already know
         | how to push up their APMs in these scenarios.
        
         | itishappy wrote:
         | Professional Starcraft players prove that this is possible, but
         | my own experience playing Starcraft indicates it's not all that
         | common.
        
         | mkoubaa wrote:
         | This is why we can't deal with China right now
        
         | 29athrowaway wrote:
         | Or AlphaStar from DeepMind.
        
       | Ajedi32 wrote:
       | Starcraft players presumably not surprised.
       | 
       | But seriously, isn't this just a function of how much babysitting
       | the robots require and how good the UI is for controlling them? I
       | don't see why there should be any fundamental limits here.
        
       | lenerdenator wrote:
       | This really is the sort of technology that I want the government
       | to be looking into.
        
         | kevin_thibedeau wrote:
         | Until your local paramilitary cosplay group decides to equip
         | their SWAT team with them.
        
           | kiddico wrote:
           | I think that was sarcasm... I hope that was sarcasm.
        
             | kevin_thibedeau wrote:
             | Don't worry. They'll use non-lethal weaponry to merely
             | blind innocent civilians with their military surplus gun
             | bots.
        
             | lenerdenator wrote:
             | I mean my response was sarcasm, idk about theirs ^
        
       | hooverd wrote:
       | I hope the robots have funny voice lines if you click on their
       | icons enough.
        
         | mystified5016 wrote:
         | "I'm no milkmaid!"
        
         | WD-42 wrote:
         | Me not that kind of robot
        
         | foobarian wrote:
         | "Your soundcard works perfectly!"
        
       | danielmarkbruce wrote:
        | This won't age well. 100?
        
       | excalibur wrote:
       | > For instance, in a particularly challenging, multiday
       | experiment in an urban setting, human controllers were overloaded
       | with the workload only 3 percent of the time.
       | 
       | That 3 percent is definitely the part where the innocent people
       | are killed
        
         | sitkack wrote:
         | Unintended surplus collateral loss.
        
         | tehjoker wrote:
          | Given the performance of the Israelis recently, it may be
          | more like the opposite. They would authorize collateral
          | damage of 300 people to get 1 militant, so their
          | "off-target" ratio could be as high as 99.7%.
          | 
          | Israel does pathfinding for what the U.S. military can get
          | away with.
        
       | sampton wrote:
        | Coincidentally, EA just open-sourced the C&C games.
        
       | deadbabe wrote:
       | Will we ever be able to build a war interface for remote
       | controlled drones that is so good it just feels like an RTS game?
        | Or will latency be an issue?
        
         | daveguy wrote:
          | Latency will always be an issue with tele-operation from
          | the control source. Best to have local autonomy while
          | waiting for delayed instructions. The more autonomous the
          | drone is, the farther away an effective control source can
          | be.
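A minimal sketch of the fallback described here: the drone obeys the operator's last command while it is fresh, and drops back to a local autonomous behavior once the command is older than a latency budget. Every name in it (`Drone`, `COMMAND_TIMEOUT_S`, `"hold_position"`) is an illustrative assumption, not anything from the article:

```python
# Assumed freshness budget for remote commands, in seconds.
COMMAND_TIMEOUT_S = 0.5


class Drone:
    def __init__(self):
        self.last_command = None
        self.last_command_time = None

    def receive_command(self, command: str, now: float) -> None:
        """Record a remote command and the time it arrived."""
        self.last_command = command
        self.last_command_time = now

    def step(self, now: float):
        """Obey a fresh remote command; otherwise act autonomously."""
        fresh = (
            self.last_command is not None
            and now - self.last_command_time <= COMMAND_TIMEOUT_S
        )
        if fresh:
            return ("remote", self.last_command)
        # Local autonomy placeholder: hold position and self-preserve.
        return ("local", "hold_position")


drone = Drone()
drone.receive_command("advance_north", now=0.0)
print(drone.step(now=0.2))  # fresh command: follows the operator
print(drone.step(now=5.0))  # stale command: falls back to local autonomy
```

The timeout makes latency tolerance explicit: the farther the control source, the longer the budget must be, and the more capable the local behavior has to be to cover the gap.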
        
       ___________________________________________________________________
       (page generated 2025-03-03 23:00 UTC)