Green AI: When Technology and the Environment Go Hand in Hand

In recent years, AI has become a hot topic, from ChatGPT answering questions to AI that edits photos, generates videos, or even assists doctors in diagnosing diseases. Behind this intelligence, however, lies a heavy cost: developing AI, especially large-scale models, requires massive computing power. According to a report from Knowledge at Wharton, training GPT-3 consumed around 1,287 megawatt-hours (MWh) of energy and emitted about 502 tons of CO2, equivalent to the yearly emissions of roughly 112 gasoline-powered cars, and that covers only the training phase. Real-time processing can account for as much as 60% of AI's total energy consumption [1], not counting ongoing updates and improvements. Today, OPEN-TEC (Tech Knowledge Sharing Platform), powered by TCC TECHNOLOGY GROUP, will take you through the Green AI concept, which can no longer be overlooked.
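To put those numbers in perspective, here is a quick back-of-the-envelope check in Python. The per-car emission factor used below (about 4.6 metric tons of CO2 per year, a commonly cited estimate for a passenger vehicle) is our own assumption and does not come from the Wharton report.

```python
# Back-of-the-envelope check of the GPT-3 figures cited above.
# The per-car factor (~4.6 t CO2/year) is an assumed, commonly cited estimate,
# not a number taken from the source article.

TRAINING_ENERGY_MWH = 1_287      # reported energy to train GPT-3
TRAINING_EMISSIONS_T = 502       # reported CO2 emissions, metric tons
CAR_EMISSIONS_T_PER_YEAR = 4.6   # assumed annual emissions of one gasoline car

# Implied carbon intensity of the electricity used for training (kg CO2 per MWh)
implied_intensity = TRAINING_EMISSIONS_T * 1000 / TRAINING_ENERGY_MWH

# How many "car-years" of emissions the training run corresponds to
car_equivalents = TRAINING_EMISSIONS_T / CAR_EMISSIONS_T_PER_YEAR

print(f"Implied grid carbon intensity: {implied_intensity:.0f} kg CO2/MWh")
print(f"Car-year equivalents: {car_equivalents:.0f} (the article cites ~112)")
```

The result lands close to the article's figure, with the small gap explained by whichever per-car factor the original report used.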

The Beginning of Green AI

In response to AI’s energy challenges, the concept of Green AI emerged. It holds that AI progress should be measured not only by accuracy or intelligence, but also by energy cost and environmental impact. The goal of Green AI is to create AI that performs efficiently while consuming as few resources as possible. According to IBM, one key approach is the development of Small Language Models (SLMs), models that require less memory and computing power than large-scale models. These are well suited to resource-constrained environments such as edge devices, mobile applications, or even offline operation [2]. The approach also includes model compression, an engineering technique that reduces a model’s complexity while maintaining satisfactory performance.
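To make model compression a little more concrete, the sketch below applies post-training dynamic quantization to a toy PyTorch model, storing its linear-layer weights as 8-bit integers instead of 32-bit floats. The toy model and the choice of PyTorch's quantize_dynamic utility are illustrative assumptions on our part; the IBM article does not prescribe this particular recipe.

```python
# Minimal sketch of one model-compression technique: post-training dynamic
# quantization. Linear-layer weights are stored as 8-bit integers instead of
# 32-bit floats, cutting memory use and the cost of each inference on CPU.
# The tiny model below is a stand-in for a much larger language model.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 2048),
    nn.ReLU(),
    nn.Linear(2048, 512),
)

quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8   # quantize only the Linear layers
)

x = torch.randn(1, 512)
print(quantized(x).shape)  # same interface as the original, smaller to store and run
```

The compressed model keeps the same interface, so it can drop into existing code while consuming less memory and energy per prediction, which is the trade-off Green AI is after.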

Big Tech Companies Supporting SLMs

Green AI is no longer confined to academic research; it is actively applied by major tech companies such as Google and Microsoft, both of which prioritize SLM development to cut energy use. A Microsoft report revealed that its “Phi” model delivers high efficiency with significantly lower computing requirements, resulting in substantial energy savings [3] and a clear reduction in carbon emissions. Meanwhile, Google reported that its Gemma model, an SLM supporting text, images, audio, and video, can run directly on end-user devices [4]. This approach reduces the data that must travel to data centers and shifts energy use to end devices instead. The development of SLMs is therefore not just an AI innovation but a concrete effort by big tech companies to reduce environmental impact.
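As a rough illustration of the on-device idea, the sketch below runs a small open-weight model locally through the Hugging Face transformers pipeline instead of calling a large hosted model. The specific model identifier is an assumption chosen for illustration; the cited reports do not describe this exact setup.

```python
# Minimal sketch: running a small language model locally instead of sending
# every request to a large model in a data center. The model id below is an
# illustrative assumption; any small open-weight model available locally
# would do (weights are downloaded on first use and may require accepting a license).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="google/gemma-2-2b-it",   # assumed small-model id, for illustration only
    device_map="auto",              # run on whatever local hardware is available
)

result = generator(
    "Summarize why small language models can reduce energy use:",
    max_new_tokens=80,
)
print(result[0]["generated_text"])
```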

AI in Daily Life and the Hidden Burden

Many might think Green AI is an issue only for big companies, with little connection to daily life. In reality, AI is already embedded in our everyday routines, whether in smartphones that use AI to search for information, electric cars with autonomous driving systems, or smart home devices that respond to voice commands. Every time these devices operate, hidden energy consumption occurs. If overlooked, it can lead to environmental damage and wider societal costs.

Lastly, smarter AI symbolizes technological progress, but such progress is meaningless if it comes at the cost of environmental destruction. Green AI is therefore not just an option; it is a necessity that we must all collectively support. In the future, AI should not only be intelligent in solving problems but also wise in preserving our planet.

References

  1. Walther, C. C. (2024, November). The hidden cost of AI energy consumption. Knowledge at Wharton, The Wharton School, University of Pennsylvania. https://knowledge.wharton.upenn.edu/article/the-hidden-cost-of-ai-energy-consumption/
  2. Caballar, R. D. (2024, October 31). What are small language models? International Business Machines Corporation (IBM). https://www.ibm.com/think/topics/small-language-models
  3. Microsoft. (2025, January). Accelerating sustainability with AI: Innovations for a better future. Microsoft Corporation.
  4. Sherwood, M., Chan, M., & Ikonomidis, M. (2025, May 20). On-device small language models with multimodality, RAG, and function calling. Google LLC. https://developers.googleblog.com/en/google-ai-edge-small-language-models-multimodality-rag-function-calling/