Post AT1O2DA3znIfyve3G4 by therealfakemoot@techhub.social
 (DIR) Post #AT1KqVapLFDfoDtL8a by nicole@pkm.social
       2023-02-25T00:01:49Z
       
       0 likes, 0 repeats
       
       Hypothesis: some general principles for improving the performance of
       distributed systems can be applied to increase human productivity.
       
       In general, when you want something done:
       1. increase the number of workers doing the thing (multithreading, parallelism)
       2. make existing workers more efficient (comparative advantage, work distribution, sharding)
       3. get a good manager to run interference (circuit breaker pattern, load balancer, brokers)
       
       What did I miss?
       
       https://notes.nicolevanderhoeven.com/Principles+of+improving+work+performance#TIL
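
       Principle 1 above can be sketched in Go as a small worker pool — the
       job data and the `process` function here are invented for illustration,
       not part of the original post:

```go
package main

import (
	"fmt"
	"sync"
)

// process stands in for "the thing" a worker does; doubling the
// input is purely illustrative.
func process(job int) int {
	return job * 2
}

func main() {
	jobs := make(chan int)
	results := make(chan int)
	var wg sync.WaitGroup

	// Principle 1: increase the number of workers doing the thing.
	// Three goroutines drain the same jobs channel in parallel.
	for w := 0; w < 3; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for job := range jobs {
				results <- process(job)
			}
		}()
	}

	// Feed in the work, then close the channel so workers exit.
	go func() {
		for i := 1; i <= 5; i++ {
			jobs <- i
		}
		close(jobs)
	}()

	// Close results once every worker is done.
	go func() {
		wg.Wait()
		close(results)
	}()

	sum := 0
	for r := range results {
		sum += r
	}
	fmt.Println(sum) // 2+4+6+8+10 = 30
}
```

       Adding workers only helps when the jobs are independent, which is
       roughly where ike's point about coordination overhead comes in.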
       
 (DIR) Post #AT1MF3bOeE3gBtQZ0q by ike@pkm.social
       2023-02-25T00:17:28Z
       
       0 likes, 0 repeats
       
       @nicole It might fall under "circuit breaker", but I think there is
       overhead with parallelism, sharding, etc., in that there is increased
       collaboration friction, dependency mapping, and timing considerations.
       
       Sometimes, rather than adding more "workers" or "workstreams", it is
       simply better to remove processes/governance/controls and other sources
       of overhead. In fact, I think most organizations would be better served
       by always starting by asking what can be removed rather than what can
       be added.
       
 (DIR) Post #AT1O2DA3znIfyve3G4 by therealfakemoot@techhub.social
       2023-02-25T00:37:33Z
       
       0 likes, 0 repeats
       
       @nicole I think it really boils down to two degrees of freedom:
       
       1) Make the work easier. This includes your examples about managers
       running interference, as well as finding algorithms with smaller big O.
       
       2) Do more work at once. Parallelism obviously falls here, but as I
       write, I realize I'm on the fence about concurrency. The latter isn't
       intrinsically connected to running more code at the same time, although
       well-written concurrent code can usually be parallelized. Concurrency
       is more about separating concerns, which to my mind is making the work
       easier (to observe and optimize).
       
 (DIR) Post #AT26vuSz5MnaCUrebo by nicole@pkm.social
       2023-02-25T09:00:35Z
       
       0 likes, 0 repeats
       
       @ike Excellent point! There probably needs to be a fourth one for removing components rather than adding them. Thanks!
       
 (DIR) Post #AT276MKuJzMtX64Lz6 by nicole@pkm.social
       2023-02-25T09:02:35Z
       
       0 likes, 0 repeats
       
       @therealfakemoot One could argue that doing more work at once makes the work easier as well, but it's a good way of looking at it too. Can you explain a bit more what you mean about concurrency being more about separating concerns?
       
 (DIR) Post #AT2ViUa4Vx2x0UsWAa by therealfakemoot@techhub.social
       2023-02-25T13:38:18Z
       
       0 likes, 0 repeats
       
       @nicole Preface: my understanding of concurrency comes from my time
       with Golang, but I'm relatively confident the principles translate.
       
       Writing (good) concurrent code effectively reduces to this: separate
       I/O-bound work from CPU-bound work. Pure (in the math sense) functions
       get lifted out of network operations. This separation of concerns lets
       the runtime spend its time more effectively: when it's waiting on I/O,
       it can context switch to computations that aren't bottlenecked.
       
       And it's hard to write good concurrent code that separates these
       concerns until your API has been cleaned up and had other concerns
       separated.
       
       I'm on mobile till tomorrow so my usual eloquence is gonna take a hit.
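
       A minimal Go sketch of that separation — the `fetch` stub (canned data
       standing in for a network call) and the word-count function are
       hypothetical examples, not from the post:

```go
package main

import (
	"fmt"
	"strings"
)

// countWords is pure (in the math sense): no I/O, so it is easy to
// test and free to run while other goroutines wait on the network.
func countWords(body string) int {
	return len(strings.Fields(body))
}

// fetch is the I/O-bound part, kept separate from the computation.
// It is stubbed with canned data here; a real version would do an
// HTTP GET and block on the network.
func fetch(url string) string {
	return "some fake page body for " + url
}

func main() {
	urls := []string{"https://example.com/a", "https://example.com/b"}
	results := make(chan int)

	// The I/O boundary lives in the goroutines; while one would be
	// blocked on a real network call, the runtime can context switch
	// to the pure computation for another page.
	for _, u := range urls {
		go func(u string) {
			body := fetch(u)            // I/O-bound
			results <- countWords(body) // CPU-bound, pure
		}(u)
	}

	total := 0
	for range urls {
		total += <-results
	}
	fmt.Println(total)
}
```

       Because countWords takes a plain string rather than a connection or
       response object, it can be observed and optimized on its own — which
       is the "cleaned up API" point above.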
       
 (DIR) Post #AT6ESkvd2r2QUEiBZQ by nicole@pkm.social
       2023-02-27T08:43:55Z
       
       0 likes, 0 repeats
       
       @therealfakemoot Thanks for the explanation! I hadn't considered it in that context. I was thinking more about concurrency in the case of having multiple users (real or virtual, as in a load test) accessing an application. You're right that concurrency does require a separation of concerns to some degree...Thanks!