[HN Gopher] How funny is this word? The 'snunkoople' effect (2015)
___________________________________________________________________
How funny is this word? The 'snunkoople' effect (2015)
Author : tintinnabula
Score : 11 points
Date : 2021-01-27 05:30 UTC (17 hours ago)
(HTM) web link (www.sciencedaily.com)
(TXT) w3m dump (www.sciencedaily.com)
| ggm wrote:
| Mad Magazine cartoonist Don Martin, was extremely adept at
| choosing .. sound-words.
|
 | Spladoingg... Galoosh. Snert. You never knew what noise a man
 | would make falling down a manhole. He pretty much confined his
 | coinages to sound effects.
| na85 wrote:
| >sound-words
|
| Those are called onomatopoeia:
| https://en.wikipedia.org/wiki/Onomatopoeia
| ggm wrote:
 | Except Don Martin appears to have operated by taking that
 | list, and Roget, and inventing new ones, none of which
 | (AFAIK) subsequently entered widespread use, unlike the
 | Yiddish slang Mad used, which I think the magazine helped
 | perpetuate into modern times.
| steve_g wrote:
| This article describes higher probability letter combinations as
| higher entropy and lower probability combinations as lower
| entropy. For example, "yuzz-a-ma-tuzz" is low entropy because z's
| generally appear with lower probability in English.
|
| I thought entropy worked the other way round - high probability
| means low entropy and low probability means high entropy. Which
| is it?
| refactor_master wrote:
| It does indeed sound like it's written the wrong way around.
|
| Random information = high entropy. A common word is therefore
| low entropy.
|
| Here's a good analogy I found:
|
| > Informally, the amount of information in an email is
| proportional to the amount of "surprise" its reading causes.
| For example, if an email is simply a repeat of an earlier
| email, then it is not informative at all. On the other hand, if
| say the email reveals the outcome of a cliff-hanger election,
| then it is highly informative. Similarly, the information in a
| variable is tied to the amount of surprise that value of the
| variable causes when revealed. Shannon's entropy quantifies the
| amount of information in a variable, thus providing the
| foundation for a theory around the notion of information.
|
| https://arxiv.org/pdf/1405.2061.pdf
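 |
 | To make the direction concrete, here is a minimal sketch (my
 | own illustration, not from the article): Shannon's surprisal
 | of an outcome x is -log2 p(x), and entropy is its expected
 | value, so low-probability letters contribute more bits. The
 | letter frequencies below are rough, assumed values.
 |
 |     import math
 |
 |     # Approximate English letter frequencies (assumed, rough values).
 |     FREQ = {
 |         'e': 0.127, 't': 0.091, 'a': 0.082, 'o': 0.075, 'i': 0.070,
 |         'n': 0.067, 's': 0.063, 'h': 0.061, 'r': 0.060, 'd': 0.043,
 |         'l': 0.040, 'c': 0.028, 'u': 0.028, 'm': 0.024, 'w': 0.024,
 |         'f': 0.022, 'g': 0.020, 'y': 0.020, 'p': 0.019, 'b': 0.015,
 |         'v': 0.010, 'k': 0.008, 'j': 0.002, 'x': 0.002, 'q': 0.001,
 |         'z': 0.001,
 |     }
 |
 |     def mean_surprisal(word):
 |         """Average -log2 p(letter) over the word's letters, in bits."""
 |         letters = [c for c in word.lower() if c in FREQ]
 |         return sum(-math.log2(FREQ[c]) for c in letters) / len(letters)
 |
 |     for w in ["table", "snunkoople", "yuzz-a-ma-tuzz"]:
 |         print(f"{w:>15}: {mean_surprisal(w):.2f} bits/letter")
 |
 | Words built from rare letters like 'z' score higher, i.e. they
 | are more surprising and carry more entropy per letter, which
 | supports the reading that the article has the terms reversed.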
| fredophile wrote:
 | This is an interesting result, but it doesn't tell us whether
 | this is a universal aspect of humor or a cultural thing. I'd be
 | very interested in further studies showing whether this applies
 | to other languages and cultures.
___________________________________________________________________
(page generated 2021-01-27 23:01 UTC)