**8. Carbon footprint (gCO2eq) and Artificial Intelligence**

AI plays an important role in our daily lives and in the science of the healthcare system. Training deep learning models (computer models trained to identify patterns in a data set) requires computationally intensive hardware, a large amount of power, and therefore produces the associated carbon emissions. Researchers from the University of Massachusetts Amherst estimated that training a certain type of Artificial Neural Network (ANN) architecture emits roughly 626,000 pounds of carbon dioxide [33]. The burden grows further during the model development phase, when many candidate networks are trained, and the resulting deep neural networks are then deployed on diverse hardware platforms with different computational properties.
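The kind of estimate reported above can be sketched as a simple back-of-the-envelope calculation: emissions are the energy drawn (average power × training time × datacentre overhead) multiplied by the grid's carbon intensity. The function below is an illustrative sketch, not the methodology of [33]; the figures in the example call are hypothetical.

```python
def training_emissions_gco2eq(avg_power_kw: float,
                              hours: float,
                              pue: float = 1.58,
                              carbon_intensity: float = 475.0) -> float:
    """Estimate gCO2eq emitted by a training run.

    avg_power_kw     -- average hardware power draw in kW (assumed)
    hours            -- wall-clock training time in hours
    pue              -- power usage effectiveness (datacentre overhead factor)
    carbon_intensity -- grid carbon intensity in gCO2eq per kWh (region dependent)
    """
    energy_kwh = avg_power_kw * hours * pue
    return energy_kwh * carbon_intensity

# Hypothetical run: a 2 kW rig for 100 hours, PUE 1.5, grid at 400 gCO2eq/kWh.
print(training_emissions_gco2eq(2.0, 100, pue=1.5, carbon_intensity=400.0))
# → 120000.0 gCO2eq, i.e. 120 kg CO2eq
```

The dominant uncertainty in such estimates is usually the grid carbon intensity, which varies by region and by time of day.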

Researchers from the MIT-IBM Watson AI Lab introduced a novel AI system, the "once-for-all" network, with improved computational efficiency and a smaller carbon footprint. In their approach, the system trains one large neural network comprising many subnetworks of different sizes, serving a large number of IoT devices connected to the network. Each subnetwork can then be tailored to a different hardware platform without retraining. The authors estimate that their computer-vision model requires roughly 1/1300 of the carbon emissions of existing neural architecture search approaches, while also reducing inference time by a factor of 1.5–2.6 [33].
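The deployment step of this idea can be illustrated with a toy sketch: after the supernet is trained once, each device simply selects the most accurate subnetwork that fits its latency budget, with no retraining. The subnetwork names, latencies, and accuracies below are hypothetical, not figures from the MIT-IBM work.

```python
# Subnetworks of a once-trained supernet: (name, estimated latency ms, accuracy).
# All values are illustrative placeholders.
SUBNETS = [
    ("width-0.25", 5.0, 0.70),
    ("width-0.50", 12.0, 0.74),
    ("width-0.75", 25.0, 0.77),
    ("width-1.00", 60.0, 0.80),
]

def pick_subnet(latency_budget_ms: float) -> str:
    """Return the most accurate subnetwork meeting a device's latency budget."""
    feasible = [s for s in SUBNETS if s[1] <= latency_budget_ms]
    if not feasible:
        raise ValueError("no subnetwork fits this latency budget")
    return max(feasible, key=lambda s: s[2])[0]

print(pick_subnet(15.0))   # mid-range IoT device → width-0.50
print(pick_subnet(100.0))  # server-class device  → width-1.00
```

The carbon saving comes from amortisation: the expensive search-and-train step happens once, while per-device specialisation reduces to a cheap table lookup.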

Another approach, for tracking and predicting the energy and carbon footprint of training deep learning models, is described in [34]. The tool, "Carbontracker", reports energy use and carbon footprint alongside the performance metrics of model development and training. To assess how accurately the tool predicts the carbon footprint, and thereby helps reduce it, the authors experimentally evaluate it on different convolutional neural network (CNN) architectures and healthcare data sets.
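The tracking-and-prediction pattern can be sketched as follows: wrap each training epoch, measure its duration, and extrapolate total-run emissions from the epochs observed so far. This is a minimal self-contained sketch in the spirit of Carbontracker [34], not the actual library's API; the assumed power draw and grid intensity are placeholders.

```python
import time

class EpochEnergyTracker:
    """Toy per-epoch tracker: measures epoch durations and extrapolates
    whole-run emissions from an assumed average power draw."""

    def __init__(self, total_epochs: int, avg_power_kw: float = 0.3,
                 carbon_intensity: float = 475.0):
        self.total_epochs = total_epochs
        self.avg_power_kw = avg_power_kw          # assumed device draw (kW)
        self.carbon_intensity = carbon_intensity  # gCO2eq per kWh (assumed)
        self.epoch_seconds = []

    def epoch_start(self):
        self._t0 = time.perf_counter()

    def epoch_end(self):
        self.epoch_seconds.append(time.perf_counter() - self._t0)

    def predicted_emissions_gco2eq(self) -> float:
        """Extrapolate total-run emissions from the epochs measured so far."""
        mean_s = sum(self.epoch_seconds) / len(self.epoch_seconds)
        energy_kwh = self.avg_power_kw * (mean_s * self.total_epochs) / 3600.0
        return energy_kwh * self.carbon_intensity

tracker = EpochEnergyTracker(total_epochs=50)
for _ in range(2):                # measure a couple of epochs...
    tracker.epoch_start()
    time.sleep(0.01)              # ...with a sleep standing in for training
    tracker.epoch_end()
print(f"{tracker.predicted_emissions_gco2eq():.4f} gCO2eq (predicted)")
```

Early prediction is the point: if the extrapolated footprint is unacceptable after a few epochs, the run can be stopped, moved to a lower-carbon region, or rescheduled before most of the energy is spent.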
