Question:

The text describes some of the unintended side effects that might result from artificial intelligence technologies, perhaps from externalities in a system's objective function. Here, consider one potential externality: energy use (and, by proxy, greenhouse gas emissions) associated with training deep learning models. Strubell et al. (2019) argue that modelers ought to pay more attention to the demands of training, citing Amodei (2018) as evidence that the compute required by popular deep learning models grew some 300,000-fold between 2012 and 2018, a trend that appears to continue at the time of this writing. Indeed, Strubell et al. estimate that training a single big transformer model, the state of the art for many natural language processing tasks, would emit about 192 lbs of CO2, roughly a tenth of the emissions attributable to one passenger flying round trip from New York to San Francisco. They estimate that training the well-known BERT model would emit 1,438 lbs of CO2, a bit under the cost of that cross-country round trip for a single passenger. Both of these figures grow thousands of times larger when neural architecture search, tuning, and other hyperparameter experiments are run.
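The two flight comparisons above can be sanity-checked with simple arithmetic. In the sketch below, the ~1,984 lb figure for one passenger's New York to San Francisco round trip is an assumption drawn from Strubell et al.'s reported estimates, not stated in this passage:

```python
# Sanity-check the CO2 comparisons in the passage.
# round_trip_lbs is an assumed value (Strubell et al.'s per-passenger
# estimate for a NY <-> SF round-trip flight); the other two figures
# are quoted directly in the text.
transformer_lbs = 192    # training one big transformer model
bert_lbs = 1438          # training BERT
round_trip_lbs = 1984    # one passenger, NY <-> SF round trip (assumed)

print(transformer_lbs / round_trip_lbs)  # ~0.097 -> "about a tenth"
print(bert_lbs / round_trip_lbs)         # ~0.72  -> "a bit under" one round trip
```

Both ratios are consistent with the passage's qualitative claims.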

a. Why might deep learning models be growing at such a rate? (Namely, training compute is doubling every 3.4 months, as opposed to the roughly every-two-years pace that historically characterized Moore's law; Amodei (2018).)
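As a point of reference for part (a), the 300,000-fold figure and the 3.4-month doubling period are roughly consistent with each other, as this short check shows (the numbers come from the passage; the comparison against a 24-month Moore's-law doubling period is an illustrative assumption):

```python
import math

# How many doublings does a 300,000x increase in compute imply,
# and how long would they take at each doubling period?
factor = 300_000
doublings = math.log2(factor)        # ~18.2 doublings

months_fast = doublings * 3.4        # ~62 months (~5 years), near the 2012-2018 window
months_moore = doublings * 24        # ~437 months (~36 years) at a Moore's-law pace

print(round(doublings, 1), round(months_fast, 1), round(months_moore, 1))
```

In other words, growth that would have taken decades under the historical Moore's-law cadence occurred in about five years.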

b. The cited calculations are rife with uncertainty. Where do you think that uncertainty arises? 

c. What might be done to change the objective function of these models (or of the community of modelers) to incorporate the costs of energy?
