[HN Gopher] Reducing the Computational Cost of Deep Reinforcemen...
___________________________________________________________________
Reducing the Computational Cost of Deep Reinforcement Learning
Research
Author : theafh
Score : 50 points
Date : 2021-07-13 18:15 UTC (4 hours ago)
(HTM) web link (ai.googleblog.com)
(TXT) w3m dump (ai.googleblog.com)
| sxp wrote:
| > By doing so, in addition to reducing the environmental impact
| of our experiments...
|
| How does reducing the cost of individual experiments reduce
| the total environmental impact? Won't more efficient
| experiments (measured as experiments per watt-hour or per
| dollar) just trigger
| https://en.wikipedia.org/wiki/Jevons_paradox as more people
| take advantage of the lower cost?
| throwawaygh wrote:
| Strikes me as one of those cases where good research uses a
| silly motivation to get brownie points.
|
| Use P100s hosted in data centers powered by renewables, stop
| flying to conferences, and don't do DRL just because it's
| sexy.
| monocasa wrote:
| It reads to me as a response to Gebru's paper that led to her
| getting fired, a paper on the externalities of training large
| models, such as the environmental impact of the large amount
| of computation required: "On the Dangers of Stochastic
| Parrots: Can Language Models Be Too Big?"
|
| https://dl.acm.org/doi/10.1145/3442188.3445922
|
| That would explain why such a mundane subject is in a blog
| post: it's aimed at the tech press to signal "we really do
| care about this thing" rather than at academia.
| visarga wrote:
| > Can Language Models Be Too Big?
|
| That's interesting. A GPT-n with 100T parameters would be
| bored just reading the whole internet: too little
| information, repetitive, and on average junk.
|
| You can try the question in reverse too: Can evolution use
| too much energy? How much energy has it already consumed?
| qorrect wrote:
| > just trigger https://en.wikipedia.org/wiki/Jevons_paradox as
| more people take advantage of the lower cost?
|
| I'm guessing that's exactly what will happen.
| melling wrote:
| So, it really sounds like we need to reduce our carbon
| emissions for electricity generation.
|
| Wind turbines and solar are still on the way. It looks like
| the new goal in the US is 2035:
|
| https://www.scientificamerican.com/article/bidens-
| infrastruc...
| DrNuke wrote:
| 2016's 8GB GTX 1070s will live another day as the
| bang-for-the-buck ML/DL/DRL intro graphics cards, then.
| TaylorAlexander wrote:
| Amazing that they fired Timnit Gebru [1][2] after she pushed back
| against the removal of this very subject from one of her research
| papers, [3] only to publish their own work on it without
| mentioning her.
|
| [1] https://www.nytimes.com/2017/12/31/technology/google-
| images-...
|
| [2] https://www.economist.com/science-and-
| technology/2017/03/02/...
|
| [3] https://www.theverge.com/22309962/timnit-gebru-google-
| harass...
| aceon48 wrote:
| "Fired" aka she said I'm resigning unless you meet my list of
| demands, and they accepted her resignation.
| monocasa wrote:
| She said that they could discuss a possible resignation when
| she returned from her preplanned vacation. They said, "Don't
| bother, you don't work here anymore."
|
| Threatening to quit is no more quitting than threatening to
| fire is actually firing.
| igorkraw wrote:
| I'm a bit surprised this paper wasn't in the citations and
| that the authors didn't build on it:
| https://arxiv.org/abs/1906.05243
| creato wrote:
| One of the authors of that paper is acknowledged here, so at
| the very least the authors were aware of each other before this
| was posted.
___________________________________________________________________
(page generated 2021-07-13 23:00 UTC)