Large AI models require massive computing power and energy to train. For instance, recent studies estimate that training GPT-3 in 2020 consumed 1,287 MWh of electricity1, enough to power more than 100 US homes for a year.2 That’s why sustainability must be built into every decision you make to train, tune and deploy your AI models. AI requires a holistic approach that prioritizes sustainability, from the infrastructure and software, to where the models are trained and run, and how they are powered and cooled, including the use of renewable energy.
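
As a rough back-of-the-envelope check on that comparison, the sketch below divides the estimated training energy by an assumed average US household electricity consumption of about 10.7 MWh per year (an approximate EIA figure, not taken from the studies cited above); the resulting count is an illustration, not a precise measurement.

```python
# Rough sanity check of the "more than 100 US homes" comparison.
# Assumption: an average US household uses roughly 10.7 MWh of
# electricity per year (approximate EIA estimate for 2020).

GPT3_TRAINING_ENERGY_MWH = 1_287   # estimated energy to train GPT-3
AVG_US_HOME_MWH_PER_YEAR = 10.7    # assumed average household consumption

homes_powered_for_a_year = GPT3_TRAINING_ENERGY_MWH / AVG_US_HOME_MWH_PER_YEAR
print(f"~{homes_powered_for_a_year:.0f} US homes powered for a year")  # ~120
```

Under that assumption, the training energy works out to roughly 120 home-years of electricity, consistent with the "more than 100 US homes" claim above.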