[HN Gopher] Square Roots and Maxima
___________________________________________________________________
Square Roots and Maxima
Author : surprisetalk
Score : 70 points
Date : 2024-11-30 15:33 UTC (7 hours ago)
(HTM) web link (leancrew.com)
(TXT) w3m dump (leancrew.com)
| ndsipa_pomu wrote:
| Matt Parker's video on Square Roots and Maxima:
| https://www.youtube.com/watch?v=ga9Qk38FaHM
| raegis wrote:
| The original post verifies the fact (experimentally) using
| simulations, and includes example code. However, the guy in the
| video you referenced does a lot of talking around the problem,
| and includes a VPN advertisement in the middle. I'd never heard
| of Matt Parker before, and I'm not knocking his talent (he has
| 1.24 million subscribers!), but the only coherent part of the
| video is where he includes an explanation from another channel,
| 3blue1brown.
| stouset wrote:
| They target different audiences. 3B1B tends to aim at those
| who want to know more of the underlying math and develop good
| analytical thinking, while Matt Parker often keeps things at a
| more approachable level for those who aren't as inclined.
| magicalhippo wrote:
| Matt Parker is an ex-math teacher who does stand-up math
| comedy shows[1], FWIW.
|
| [1]: https://standupmaths.com/
| ndsipa_pomu wrote:
| He's a comedian and maths communicator (including author). He
| appears in quite a few Numberphile videos too. He seems to be
| friends with quite a few YouTube maths presenters such as
| Grant Sanderson and Hannah Fry (she seems to have moved over
| to TV presenting now, which is good - I think she'd be an
| excellent choice to do a James Burke style Connections series
| as she also has a dry wit).
| dahart wrote:
| Either I haven't seen this before or I've forgotten it, but it's
| surprising because I use the sum of independent uniform variables
| every once in a while -- the sum of two vars is a tent function,
| the sum of three is a smooth piecewise quadratic lump, and the
| sum of many tends toward a normal distribution. And the
| distribution is easily calculated as the convolution of the input
| box functions (uniform variables). Looking it up just now I
| learned the sum of uniform variables is called an Irwin-Hall
| distribution (aka uniform sum distribution).
|
| The min of two random vars has the opposite effect as the max
| does in this video. And now I'm curious - if we use the function
| definition of min/max -- the nth root of the sum of the nth
| powers of the arguments -- there is a continuum from min to sum
| to max, right? Are there useful applications of this generalized
| distribution? Does it already have a name?
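|
| A minimal Python sketch of the sum-of-uniforms point above (my own
| throwaway code, not from the original post): histogram the sum of
| two and of twelve uniforms and eyeball the tent shape and the
| near-normal bell.
|
|     import random
|     from collections import Counter
|
|     def hist_of_sums(n_terms, trials=100_000, bins=20):
|         """Coarse histogram of the sum of n_terms Uniform(0,1) draws."""
|         counts = Counter()
|         for _ in range(trials):
|             s = sum(random.random() for _ in range(n_terms))
|             counts[int(s * bins / n_terms)] += 1  # bin index in [0, bins)
|         return [counts.get(b, 0) for b in range(bins)]
|
|     print(hist_of_sums(2))   # roughly triangular ("tent")
|     print(hist_of_sums(12))  # roughly bell-shaped (Irwin-Hall -> normal)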
| max_likelihood wrote:
| Perhaps you are thinking of Order Statistics?
| https://en.wikipedia.org/wiki/Order_statistic
| dahart wrote:
| Ah, fascinating, I've never used Order Statistics. It doesn't
| look exactly like what I was thinking, but there is also a
| continuum from min to median to max, similar to min to
| mean/sum to max. I'm not sure but I might guess that for the
| special case of a set of independent uniform variables, the
| median and the mean distributions are the same? Does this
| mean there's a strong or conceptual connection between the
| Bates distribution and the Beta distribution? (Neither
| Wikipedia page mentions the other.) Maybe Order Statistics
| are more applicable & useful than what I imagined...
| falseprofit wrote:
| The median and the mean do not have the same distribution.
| Consider three uniform values: for the median to be small,
| two of them need to be small, but for the mean to be small,
| all three do.
|
| I think order statistics are more useful than what you
| described, because "min" and "max" are themselves quantiles
| and more conceptually similar to "median" than to "mean".
|
| Trying to imagine how to bridge from min/max to mean, I
| guess you could take weighted averages with weights
| determined by order, but I can't think of a canonical way
| to do that.
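|
| A quick Python check of the three-value example (a sketch of my
| own, with an arbitrary threshold of 0.25): estimate P(median < 0.25)
| and P(mean < 0.25) for three uniforms; they come out near 0.156 and
| 0.070, so the two statistics clearly differ in distribution.
|
|     import random
|     from statistics import mean, median
|
|     trials = 200_000
|     med_hits = mean_hits = 0
|     for _ in range(trials):
|         xs = [random.random() for _ in range(3)]
|         med_hits += median(xs) < 0.25
|         mean_hits += mean(xs) < 0.25
|     print(med_hits / trials, mean_hits / trials)  # ~0.156 vs ~0.070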
| adgjlsfhk1 wrote:
| The canonical mapping is via power means: the limit p -> -inf
| gives the min, p = 1 gives the mean, and p -> +inf gives the max.
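|
| A small Python illustration (my own sketch, assuming the power-mean
| definition (sum(x^p)/n)^(1/p) for positive inputs): as p sweeps from
| very negative to very positive, the p-mean of a fixed set of numbers
| moves from the min through the arithmetic mean (p = 1) to the max.
|
|     def power_mean(xs, p):
|         """Power (generalized) mean of positive numbers xs."""
|         return (sum(x ** p for x in xs) / len(xs)) ** (1 / p)
|
|     data = [0.2, 0.5, 0.9]
|     for p in (-50, -1, 1, 2, 50):
|         print(p, round(power_mean(data, p), 4))
|     # p = -50 is close to min(data), p = 1 is the mean,
|     # p = 50 is close to max(data)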
| jvanderbot wrote:
| I built a whole TTRPG around this fact, so that it's easier to
| create realistic performance curves for characters as they
| skill up.
|
| Yeah I'm real fun at parties.
| somat wrote:
| One of the things I liked about the Heavy Gear tabletop game
| was that the roll mechanic was to roll N dice and pick the
| highest, where N was your skill level. Now this did make the
| game somewhat brutal, but there was a lot less of the absurd
| high-skill whiffing you see in a D&D-type system.
|
| The other neat thing Heavy Gear did was that it had none of the
| ablative armor bullshit you see in Battletech. The armor
| either works and you take no damage, or it gets penetrated and
| you take full damage.
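|
| Roll-N-keep-highest is the discrete cousin of the max-of-uniforms
| effect in the article. A rough Python sketch (assuming six-sided
| dice purely for illustration; not a claim about the actual Heavy
| Gear rules):
|
|     import random
|
|     def best_of(n, sides=6, trials=100_000):
|         """Average of the highest die when rolling n and keeping the best."""
|         return sum(max(random.randint(1, sides) for _ in range(n))
|                    for _ in range(trials)) / trials
|
|     for skill in (1, 2, 3, 4):
|         print(skill, round(best_of(skill), 2))
|     # roughly 3.5, 4.47, 4.96, 5.24: each extra die helps,
|     # with diminishing returns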
| keithalewis wrote:
| Front page material? P(max{X_1, X_2} <= x) = P(X_1 <= x, X_2 <=
| x) = P(X_1 <= x) P(X_2 <= x) = x^2. P(sqrt(X_3) <= x) = P(X_3 <=
| x^2) = x^2. It is late in the day when midgets cast long shadows.
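|
| The identity is also easy to sanity-check numerically in Python
| (a throwaway sketch; the names and the test point x are mine):
|
|     import random
|
|     trials, x = 100_000, 0.7
|     max_le = sum(max(random.random(), random.random()) <= x
|                  for _ in range(trials))
|     sqrt_le = sum(random.random() ** 0.5 <= x for _ in range(trials))
|     print(max_le / trials, sqrt_le / trials, x * x)  # all around 0.49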
| refulgentis wrote:
| I felt the same way, came here to decide whether to comment
| something negative, and saw your comment had just been posted.
| But having first read the top comment from 2 hours ago,
| apparently this is news-ish to stats people because it's
| counterintuitive to common methods? _shrug_
| prof-dr-ir wrote:
| If X1...Xn are independently uniformly distributed between 0 and
| 1 then:
|
| P(max(X1 ... Xn) < x) =
|
| P(X1 < x and X2 < x ... and Xn < x) =
|
| P(X1 < x) P(X2 < x) ... P(Xn < x) =
|
| x^n
|
| Also,
|
| P(X^{1/n} < x) = P(X < x^n) = x^n
|
| I guess I am just an old man yelling at clouds, but it seems _so_
| strange to me that one would bother checking this with a
| numerical simulation. Is this a common way to think about, or
| teach, mathematics to computer scientists?
| coliveira wrote:
| > it seems so strange to me that one would bother checking this
| with a numerical simulation
|
| I believe that some people know programming but have little
| experience with mathematics, so the first thing they think
| of is to "check" numerically that something is true. That
| doesn't actually prove anything, so they would do better to
| spend the time learning some math for these situations.
| ValentinA23 wrote:
| "you can't learn maths on your own, you need a master"
|
| -- my math teacher during my second year at university, who also
| happened to be a chaos theorist working on cool stuff such as
| cryptography via chaos synchronization.
|
| He was by far the worst teacher I ever had in terms of mental
| calculation abilities, but he was also the most advanced. I
| remember a conversation where he explained how he would
| always implement his algorithms at least twice, on entirely
| different software and hardware stacks.
| Vinosawd wrote:
| Similarly,
|
| P(min{X1, X2, ..., Xn} < x) =
|
| P(X1<x or X2<x ... or Xn < x) =
|
| P(not(not(X1 < x) and not(X2 < x) ... and not(Xn < x))) =
|
| 1-P(not(X1 < x) and not(X2 < x) ... and not(Xn < x)) =
|
| 1-P(not(X1 < x)) · P(not(X2 < x)) ... · P(not(Xn < x)) =
|
| 1-(1-P(X1 < x)) · (1-P(X2 < x)) ... · (1-P(Xn < x)) =
|
| 1-(1-x)^n
|
| which, as a curve in the [0, 1]^2 square, is just x^n rotated
| by 180 degrees around (1/2, 1/2).
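|
| A short Python check of the min formula (my own sketch, using n = 3
| and x = 0.2 as arbitrary example values): the empirical P(min < x)
| lands near 1 - (1 - 0.2)^3 = 0.488.
|
|     import random
|
|     n, x, trials = 3, 0.2, 100_000
|     hits = sum(min(random.random() for _ in range(n)) < x
|                for _ in range(trials))
|     print(hits / trials, 1 - (1 - x) ** n)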
| cowsandmilk wrote:
| As a mathematician, one of the first programs I wrote was to
| numerically estimate pi by generating random points in a box
| and seeing what percentage fell inside the inscribed circle. It
| was a fun introduction to programming. So I found it to be the
| opposite: numerical simulations were a way to teach
| mathematicians programming.
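|
| That exercise is only a few lines of Python (a generic sketch, not
| the commenter's original program): sample points in the unit square
| and count how many land inside the quarter circle of radius 1.
|
|     import random
|
|     trials = 1_000_000
|     inside = sum(random.random() ** 2 + random.random() ** 2 <= 1
|                  for _ in range(trials))
|     print(4 * inside / trials)  # approaches pi as trials grows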
| gxs wrote:
| Just a side comment on what a great little video.
|
| Short, to the point, and the illustrations/animations actually
| helped convey the message.
|
| Would be super cool if someone could recommend some social media
| account/channel with collections of similar quality videos (for
| any field).
___________________________________________________________________
(page generated 2024-11-30 23:00 UTC)