Making Deep Learning More Energy Efficient

IBM is proposing a way to make deep learning less of an energy drain by reducing the numerical precision used for training from 16 bits to 4, reports MIT Technology Review.

Deep learning is a notorious energy hog. Training models requires massive amounts of data and abundant computational resources, which drives up electricity consumption.
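The intuition behind low-precision training is that each number is restricted to a small set of representable values, cutting memory and compute costs at the price of rounding error. The sketch below simulates 4-bit quantization with a simple uniform scheme over a fixed range; the function name, range, and scaling are illustrative assumptions, not IBM's actual method.

```python
def quantize_4bit(values, lo=-1.0, hi=1.0):
    """Round each float to one of 16 evenly spaced levels (4 bits)
    between lo and hi, then map it back to a float. This simulates
    the precision loss of 4-bit arithmetic; the uniform scheme is
    an illustrative assumption, not IBM's proposed format."""
    levels = 15                           # 2**4 - 1 steps between lo and hi
    scale = (hi - lo) / levels            # width of one quantization step
    out = []
    for v in values:
        v = min(max(v, lo), hi)           # clip to the representable range
        q = round((v - lo) / scale)       # integer code in [0, 15]
        out.append(lo + q * scale)        # dequantize back to a float
    return out

# Every output lands on one of at most 16 distinct values.
print(quantize_4bit([-0.8, 0.05, 0.49, 1.2]))
```

With only 16 levels, the worst-case rounding error is half a step (here 1/15); the research challenge is keeping training stable despite that error.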

Read the article on MIT Technology Review
