[HN Gopher] JamRTC - WebRTC for Live Musicians
       ___________________________________________________________________
        
       JamRTC - WebRTC for Live Musicians
        
       Author : kimi
       Score  : 69 points
       Date   : 2021-03-18 17:18 UTC (5 hours ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
       | morsch wrote:
       | We previously talked about JackTrip WebRTC[1], which I first
       | thought was similar. But while JamRTC uses WebRTC, it's not
        | actually browser-based, while JackTrip WebRTC is. (And of course
        | they _are_ similar in that they are trying to do low-latency audio
       | conferencing via WebRTC.)
       | 
       | And then there is regular JackTrip, which the author refers to in
       | the FAQ, which is neither browser- nor WebRTC-based.
       | 
       | [1] https://news.ycombinator.com/item?id=25942829
        
       | deeblering4 wrote:
       | Some other interesting projects
       | 
       | ninjam/jamtaba - instead of minimizing latency, accept it and
       | delay everyone in a synchronized way
       | 
       | endlesss - multiple people adding layers to an 8ch looper
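The ninjam approach is worth spelling out: instead of fighting network delay, audio is chopped into fixed musical intervals, and each player hears the others' *previous* interval, so any delay shorter than one interval is inaudible. A minimal sketch of that scheduling idea (my reading of the design, not ninjam's actual code):

```python
# Interval-sync sketch: chunks are quantized to musical interval
# boundaries rather than played as soon as they arrive.
import math

def interval_length_s(bpm: float, beats_per_interval: int) -> float:
    """Length of one sync interval in seconds."""
    return beats_per_interval * 60.0 / bpm

def playback_time_s(arrival_s: float, interval_s: float) -> float:
    """Schedule a received chunk at the next interval boundary."""
    return math.ceil(arrival_s / interval_s) * interval_s
```

At 120 BPM with 16-beat intervals, anything that arrives during the current 8-second interval plays back cleanly at the next boundary.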
        
       | bsimpson wrote:
        | I played with WebMIDI a few years ago. Thought it would be fun to
       | build a player piano broadcast service, where anyone with a MIDI
       | device could play a piano remotely. Never built it though.
       | 
       | https://github.com/appsforartists/midicast
        
       | jm_l wrote:
        | I built a similar project for in-browser jamming with MIDI
        | keyboards.
       | 
       | https://github.com/jminjie/fourhands
       | 
        | The benefit of this is that it takes no set-up for non-tech
        | users. The major downsides are that it's two-player only and it
        | only works for MIDI instruments.
       | 
       | On Fourhands you can actually achieve very low latency (<20ms)
       | with WebRTC for somewhat close players on wired connections. If
       | you're having latency trouble like your readme says, I'm guessing
       | it's not the WebRTC part.
        
         | oever wrote:
         | I'm getting <20ms with jamulus on a usb audio card on ubuntu
         | studio (rt-linux). This is for a weekly rehearsal with a big-
         | band. The server runs on an ODROID in the house of one of the
         | players. Connection is with ethernet and glass fiber.
         | 
         | The experience is very nice.
        
       | lgrebe wrote:
       | Isn't there a physical limit on delay that prohibits live, non-
       | co-located music making?
        
         | cma wrote:
         | A marching band might span a whole American football field
         | during a performance. That's potentially 135ms to the center of
          | the field from the edge, or more from a corner. Their style of
          | music may be designed around that limitation, as well as around
          | having a visual metronome in the conductor (though you could
          | estimate one, and even make it audible instead of visual, using
          | half the round-trip time for internet jam sessions).
         | 
         | People seated at near field level at opposite corners may hear
         | a 350ms+ difference in some instruments, but they may spread
         | them out so that it just sounds like a reverb/wash of echoes.
         | 
          | For these kinds of apps, wear headphones: you cut around 3 ms of
          | delay per meter of distance to your speakers, due to the speed
          | of sound.
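The figures above follow directly from the speed of sound (roughly 343 m/s in room-temperature air); a quick sanity check:

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at ~20 degrees C

def acoustic_delay_ms(distance_m: float) -> float:
    """One-way delay for sound travelling distance_m through air."""
    return distance_m / SPEED_OF_SOUND_M_S * 1000.0

# acoustic_delay_ms(1.0) is ~2.9 ms, i.e. the "around 3 ms per meter"
# figure; roughly half a football field (~46 m) gives ~134 ms, in line
# with the marching-band estimate above.
```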
        
         | BjoernKW wrote:
          | On one hand there's the speed of light; on the other, a maximum
          | latency of roughly 30 ms, the boundary below which the human
          | mind perceives sounds as synchronous.
         | 
          | However, neither prohibits non-colocated live music
          | altogether.
        
           | ericwood wrote:
           | Keep in mind the threshold is much lower when playing an
           | instrument is involved. For electric guitar, for example,
           | I've found anything above 13ms is perceptible and makes
           | playing in time more difficult.
           | 
            | This paper dives into it further than I can anecdotally:
           | https://online.ucpress.edu/mp/article-
           | abstract/36/1/109/9206...
        
         | wishinghand wrote:
          | I recall another project for this, JackTrip, stating that
         | people need to be within 500 miles of each other, all else
         | being equal.
        
         | analog31 wrote:
         | I've been jamming with some friends on Jamulus. A complicating
         | factor is that the effect of latency depends on what instrument
         | you play. When I play an electric instrument (bass guitar) that
         | makes little or no sound of its own, then it's easy to monitor
         | my timing in the mix through headphones, and my fingers just
         | adapt by playing ahead of the beat as needed. Our bodies have
         | to do that anyway in order to play in time because initiating a
         | note happens before the note comes out of virtually any
         | instrument.
         | 
          | It's much harder when you don't have that isolation. For
          | instance, when I play an acoustic instrument I have to turn up
          | my headphones to drown out the acoustic sound. And it gets still
         | harder if you hear that instrument by bone conduction, e.g.,
         | horns or voice. The band adapts by the "easy" instruments
         | taking charge of the tempo and letting the other instruments
         | float around a bit. Still, it takes a lot of concentration to
         | avoid getting lost, and just doesn't sound as good as a live
         | performance of a "tight" band.
         | 
         | The band members have agreed that it's better than not playing
         | at all, but might not be a good listening experience for an
         | audience. Not necessarily because the delays are perceptible
         | but because it sounds like the band is struggling to stay
         | together. Of course other things are happening as well, such as
         | lack of visual cues.
         | 
         | Fortunately we're all "of a certain age," and between the
         | weather warming up and us getting vaccinated, we'll be playing
         | together again.
        
       | gd2 wrote:
       | I'm zero on this tech, but I'd like to see this type of project
       | work.
        
       | OliverJones wrote:
       | Awesome stuff. If you use it, do yourself a latency favor and
       | avoid WiFi connections. Use ethernet.
       | 
       | Some SDP munging in the clients can shorten Opus packets, which
       | helps. If they're longer than 20ms things get ugly.
       | 
       | I tried using a swarm approach (full mesh of point-to-point
       | WebRTC connections) for plain audio, no video. I got the latency
       | down to about 70 ms transcontinental, but my musician alpha
       | tester (brother) didn't go for it.
       | 
       | Let's get this to work!
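The Opus-packet munging mentioned above can be illustrated with a generic SDP rewrite. This is a hypothetical sketch: the attribute names (`ptime`, `minptime`) come from RFC 7587, but the payload-type handling is simplified for illustration and this is not JamRTC's actual code.

```python
# Rewrite an SDP blob to request 10 ms Opus packets instead of the
# 20 ms default, by editing the offer/answer before it is applied.
import re

def shorten_opus_ptime(sdp: str, ptime_ms: int = 10) -> str:
    out = []
    for line in sdp.splitlines():
        if line.startswith("a=fmtp:") and "minptime=" in line:
            line = re.sub(r"minptime=\d+", f"minptime={ptime_ms}", line)
        out.append(line)
        # request the shorter packetization right after the Opus rtpmap
        if line.startswith("a=rtpmap:") and "opus" in line.lower():
            out.append(f"a=ptime:{ptime_ms}")
    return "\r\n".join(out)
```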
        
       | Dnguyen wrote:
        | I worked on a similar project when the pandemic started last
        | year. Ran into latency problems and just couldn't find a way
        | around them. On top of the latency issue there's also the global
        | metronome.
        
       | neolog wrote:
       | I would like to see latency benchmark comparisons for each of
       | these tools.
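One complication with such benchmarks is that only the round-trip time is directly measurable from one end; the usual trick (also hinted at upthread for an audible metronome) is to assume a symmetric path and take half the RTT. A minimal NTP-style sketch with hypothetical timestamp arguments:

```python
def estimate_one_way_ms(t_send: float, t_remote: float, t_recv: float):
    """Estimate one-way latency from a single ping/echo exchange.

    t_send / t_recv are local clock readings (seconds) when the probe
    left and its echo returned; t_remote is the remote clock reading at
    the moment it echoed. Returns (one_way_ms, remote_clock_offset_ms).
    Assumes the path is symmetric, which real networks only approximate.
    """
    rtt = t_recv - t_send
    one_way = rtt / 2.0
    offset = t_remote - (t_send + one_way)
    return one_way * 1000.0, offset * 1000.0
```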
        
         | askvictor wrote:
         | My experiences playing with Jamulus during our lockdown follow.
         | 
         | Wired networking has less latency than wireless.
         | 
          | Sound cards and drivers add to the latency mix, which is why
          | Jamulus requires ASIO drivers (on Windows at least) - the
         | inbuilt mic on your laptop does a heap of processing to clean
         | up etc, but adds to latency; unfortunately, if you remove this
         | processing there's a lot of noise. A decent external sound card
         | is pretty much required.
         | 
         | If you can isolate the sound of your instrument so you only
         | hear it via the network (i.e. you're wearing isolating
         | headphones), you can get away with more latency as your brain
          | will compensate. That way, everyone will be listening through
          | the network, so everyone should be in sync (assuming the
          | software
         | compensates for different latencies). But this makes things
         | really hard for vocalists or drummers, where isolation is
         | really hard or impossible.
         | 
         | The upshot is that this is really hard for amateurs - I was
         | trying to get something going with my community orchestra but
          | gave up as half of them were on iPads or didn't know what an
          | ethernet cable was, let alone an external sound card. Was
          | tempted to look into building a raspberry-pi based standalone
          | 'jamming' appliance, but then the lockdown ended and I lost
         | interest. Still, exploring public Jamulus servers felt a bit
         | like the early days of the Internet :)
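To put numbers on the sound-card part of that latency budget: each audio buffer of N frames at sample rate R adds N/R seconds of delay, and the signal passes through at least one input and one output buffer. A quick sketch:

```python
def buffer_delay_ms(frames: int, sample_rate_hz: int) -> float:
    """Delay contributed by one audio buffer of the given size."""
    return frames / sample_rate_hz * 1000.0

# A typical ASIO setting of 64-frame buffers at 48 kHz costs ~1.3 ms
# per buffer, so roughly 2.7 ms for input plus output. A stock consumer
# stack running 1024-frame buffers adds ~21 ms each way before the
# network is even involved.
```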
        
       | xrd wrote:
       | I love these kinds of projects.
       | 
        | We are layering AI on top of all kinds of problems. It feels like
        | the sync problem introduced by latency is something you could
        | solve with some fairly dumb AI.
       | 
       | Is AI solving sync part of the third wave for musicians composing
       | over the internet in real time?
        
         | EGreg wrote:
         | How you gonna solve sync when the roundtrip adds a few hundred
         | ms? Is the AI going to predict sounds 20 mins into the future
         | after a while lol
         | 
         | While you're at it, have the AI predict images from the NASA
         | rovers
        
       ___________________________________________________________________
       (page generated 2021-03-18 23:00 UTC)