Dr. Tamar Eilam discusses sustainable computing and AI sustainability. Training large deep learning models demands enormous amounts of computation and energy, and both the demand for AI and the size of language models are growing exponentially. Foundation models, in which a broad pre-trained model is adapted to many specific tasks, are becoming the norm. Continuing to train ever-larger models, however, risks driving energy consumption sharply higher. Sustainable AI research therefore aims to track energy and carbon usage dynamically during training, and to give data scientists transparency into computational cost and model performance so they can choose the most efficient training strategy.
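
To make the idea of pairing energy and carbon accounting with model-quality metrics concrete, here is a minimal Python sketch. It is purely illustrative and not any specific tool or method described in the discussion: the average power draw, GPU count, and grid carbon intensity are assumed constants (in practice they would come from hardware power counters and regional grid data), and `train_step` is a hypothetical user-supplied function.

```python
# Minimal sketch: estimate a training run's energy and carbon footprint
# alongside its final loss. All constants below are illustrative assumptions.

import time

GRID_CARBON_INTENSITY_KG_PER_KWH = 0.4  # assumed regional grid average, kg CO2e per kWh
GPU_AVG_POWER_WATTS = 300.0             # assumed average draw per GPU; measure in practice
NUM_GPUS = 8                            # assumed size of the training job


def estimate_footprint(train_hours: float) -> dict:
    """Estimate energy (kWh) and carbon (kg CO2e) for a training run."""
    energy_kwh = GPU_AVG_POWER_WATTS * NUM_GPUS * train_hours / 1000.0
    carbon_kg = energy_kwh * GRID_CARBON_INTENSITY_KG_PER_KWH
    return {"energy_kwh": energy_kwh, "carbon_kg_co2e": carbon_kg}


def train_with_tracking(train_step, num_steps: int) -> dict:
    """Run a training loop and report its estimated footprint with the final loss."""
    start = time.time()
    last_loss = None
    for step in range(num_steps):
        last_loss = train_step(step)  # hypothetical user-supplied step returning a loss
    hours = (time.time() - start) / 3600.0
    report = estimate_footprint(hours)
    report["final_loss"] = last_loss
    return report


if __name__ == "__main__":
    # Toy stand-in for a real training step.
    report = train_with_tracking(lambda step: 1.0 / (step + 1), num_steps=1000)
    print(report)
```

Reporting energy and emissions next to the loss in one place is the kind of transparency that lets a data scientist weigh a marginal accuracy gain against its computational and carbon cost.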