Given the rapid pace at which AI is developing, it has the potential to help alleviate one of today's most pressing problems: climate change. AI applications such as smart electricity grids and sustainable agriculture are expected to mitigate environmental issues. At the same time, integrating AI in this field can also be counterproductive because of the high energy demand of the systems involved. If AI helps us transition to a more sustainable lifestyle, the question is: at what cost?
The last decade saw exponential growth in data demand and in the development of Large Language Models (LLMs): computational models, such as ChatGPT, designed to generate natural language. These models drive up energy consumption because of the large data volumes and computational power they require, as well as water consumption, which is needed to cool the data centers that process this data. This, in turn, leads to higher greenhouse gas emissions (Fig. 1). For example, training GPT-3 on a 500-billion-word database produced around 550 tons of carbon dioxide, equivalent to flying 33 times from Australia to the UK [1]. Moreover, information and communications technology (ICT) accounts for 3.9% of global greenhouse gas emissions, surpassing global air travel [2]. As the number of training parameters grows, so does the energy consumption, which is expected to exceed 30% of the world's total by 2030. These environmental concerns about AI implementation have given rise to a new term: Green AI.
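To put such figures in perspective, the carbon footprint of a training run can be approximated as the energy consumed multiplied by the carbon intensity of the electricity grid. The short sketch below performs this calculation; the GPU count, power draw, data-center overhead (PUE), and grid intensity are illustrative assumptions, not the measured values behind the GPT-3 estimate in [1].

```python
# Rough emissions estimate for a training run:
#   emissions (tCO2e) = energy (kWh) * grid carbon intensity (kgCO2e/kWh) / 1000
# All numbers below are illustrative placeholders, not measurements from [1].

def training_emissions_tonnes(gpu_count: int,
                              gpu_power_kw: float,
                              training_hours: float,
                              pue: float,
                              grid_intensity_kg_per_kwh: float) -> float:
    """Estimate CO2-equivalent emissions of one training run, in tonnes."""
    # PUE (Power Usage Effectiveness) accounts for cooling and other overheads.
    energy_kwh = gpu_count * gpu_power_kw * training_hours * pue
    return energy_kwh * grid_intensity_kg_per_kwh / 1000.0

# Hypothetical run: 1,000 GPUs at 0.3 kW each for 30 days, PUE 1.1,
# on a grid emitting 0.4 kgCO2e per kWh -> roughly 95 tonnes CO2e.
print(round(training_emissions_tonnes(1000, 0.3, 30 * 24, 1.1, 0.4), 1), "tCO2e")
```

Plugging in measured values for a specific model, hardware, and grid yields estimates of the kind reported in [1].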
Green algorithms fall into two categories: green-in and green-by AI (Fig. 2). Algorithms that apply AI to tackle environmental issues are referred to as green-by AI. Green-in-design algorithms (green-in AI), on the other hand, are those that maximize energy efficiency to reduce the environmental impact of AI itself.
Green-by AI has the potential to reduce greenhouse gas emissions by enhancing efficiency across many sectors, including agriculture, biodiversity management, transportation, and smart mobility.
- Energy Efficiency. Machine Learning (ML) algorithms can optimize heating, air conditioning, and lighting by analyzing data from smart buildings, making them more energy efficient [4][5] (see the sketch after this list).
- Smart Mobility. AI can predict and avoid traffic congestion by analyzing current traffic patterns and optimizing routes. Moreover, ML contributes to autonomous vehicles by executing tasks like road following and obstacle detection, which improves overall road safety [6].
- Sustainable agriculture. Data from sensors and satellites analyzed by ML can give farmers insights into crop health, soil conditions, and irrigation needs. This enables them to use resources with precision and reduce environmental impact. Moreover, predictive analytics minimizes crop loss by allowing farmers to treat diseases in time [7].
- Climate Change. Computer-vision technologies can detect methane leaks in gas pipelines, reducing emissions from fossil fuels. AI also helps reduce wasted electricity by forecasting both demand and the supply from solar and wind power.
- Environmental Policies. AI’s ability to process data, identify trends, and predict outcomes will enable policymakers to come up with effective strategies to combat environmental issues [8].
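As a concrete illustration of the energy-efficiency item above, the sketch below trains a model to predict a building's next-hour heating demand from sensor readings, so heating can be scheduled in advance rather than run reactively. The features, the synthetic data, and the linear model are simplifying assumptions; real smart-building systems rely on richer data and models.

```python
# Minimal sketch: predict next-hour heating demand from building sensor data.
# The data is synthetic and the model deliberately simple.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic sensor log: [outdoor temperature (C), occupancy count, hour of day]
X = np.column_stack([
    rng.uniform(-5, 15, 500),   # outdoor temperature
    rng.integers(0, 50, 500),   # number of occupants
    rng.integers(0, 24, 500),   # hour of day
])
# Synthetic heating demand (kWh): colder and busier means more heating, plus noise.
y = 30 - 1.2 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 2, 500)

model = LinearRegression().fit(X, y)

# Forecast a cold, busy morning hour so the HVAC system can pre-heat just enough.
predicted_kwh = model.predict([[-2.0, 35, 8]])[0]
print(f"Predicted heating demand: {predicted_kwh:.1f} kWh")
```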
Green-in AI, on the other hand, is energy-efficient AI with a low carbon footprint, better-quality data, and logical transparency. To ensure people's trust, it offers clear and rational decision-making processes, which also makes it socially sustainable. Several promising approaches to achieving green-in AI include algorithm, hardware, and data-center optimization. Specifically, more efficient graphics processing units (GPUs) or parallelization (distributing computation among several processing cores) can reduce the environmental impact of training AI. Anthony et al. showed that increasing the number of processing units up to 15 decreased greenhouse gas emissions [9]. However, the reduction in runtime must be significant enough for parallelization not to become counterproductive: when the reduction in execution time is smaller than the increase in the number of cores, emissions worsen (the sketch below illustrates this trade-off). Other methods include computing at the locations where the data is collected, to avoid data transmission, and limiting the number of times an algorithm is run.
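This trade-off can be made concrete with a small calculation. The fixed node overhead power, the per-core power, and the Amdahl-style speedup with a 95% parallel fraction used below are illustrative assumptions, not measurements from [9]; energy is used as a proxy for emissions, since the two scale together for a given grid.

```python
# Sketch of the parallelization trade-off: energy ~ total power * runtime.
# Adding cores shortens the run but raises power draw, so there is a sweet spot.
# All constants are illustrative assumptions, not values from [9].

def relative_energy(cores: int,
                    base_power_w: float = 200.0,      # assumed fixed node overhead
                    core_power_w: float = 30.0,       # assumed draw per core
                    parallel_fraction: float = 0.95) -> float:
    """Energy of one run (arbitrary units), relative to a single-core runtime of 1."""
    # Amdahl's law: the serial part of the workload limits the speedup.
    runtime = (1.0 - parallel_fraction) + parallel_fraction / cores
    return (base_power_w + core_power_w * cores) * runtime

for cores in (1, 2, 4, 8, 15, 32, 64):
    print(f"{cores:>2} cores -> energy {relative_energy(cores):6.1f}")
```

Under these assumptions, energy drops as cores are added until roughly the point where an extra core no longer shortens the run enough to offset its power draw, after which it rises again, which is the counterproductive regime described above.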
Now that we know about AI’s impact and the ways to reduce it, what trends can we expect in the future?
- Hardware. Innovation in hardware design is focused on creating AI accelerators that are both eco-friendly and powerful, minimizing energy consumption [10].
- Neuromorphic computing. This emerging area of computing technology aims to create more efficient computing systems. It draws inspiration from the human brain, which performs complex tasks with far less energy than conventional computers (a minimal sketch of the idea follows this list).
- Energy-harvesting AI devices. Researchers are exploring ways in which AI devices can harvest energy from their surroundings, for example from ambient light or heat [11]. This way, AI can rely less on external power and become more self-sufficient.
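To give a flavor of the neuromorphic idea from the list above, the sketch below implements a leaky integrate-and-fire neuron, a basic building block of spiking systems. The parameters are illustrative; the point is that output, and therefore downstream computation, occurs only at spike events, and that sparsity is where the potential energy savings come from.

```python
# Minimal leaky integrate-and-fire neuron: it integrates its input, leaks over
# time, and emits a spike only when a threshold is crossed. Parameters are
# illustrative, not tuned to any particular neuromorphic hardware.

def lif_neuron(inputs, threshold=1.0, leak=0.9, weight=0.3):
    """Return the time steps at which the neuron spikes."""
    potential, spikes = 0.0, []
    for t, x in enumerate(inputs):
        potential = leak * potential + weight * x   # leak, then integrate the input
        if potential >= threshold:                  # fire only on crossing the threshold
            spikes.append(t)
            potential = 0.0                         # reset after a spike
    return spikes

# Mostly silent input: the neuron, and anything listening to it, stays idle most
# of the time, which is the source of the energy savings in event-driven systems.
activity = [0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1, 1, 0, 0]
print("spike times:", lif_neuron(activity))
```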
In conclusion, while AI holds great potential for alleviating many environmental issues, we should not forget its own negative impact. Training AI models produces substantial greenhouse gas emissions, but there are many ways to reduce energy consumption and make AI more environmentally friendly. Although we discussed several future trends in green-in AI, it is important to remember that this field is still evolving and that new innovations will continue to emerge.
References:
[1] D. Patterson, J. Gonzalez, Q. Le, C. Liang, L.-M. Munguia, D. Rothchild, D. So, M. Texier, J. Dean, Carbon emissions and large neural network training, 2021, arXiv:2104.10350.
[2] B. Knowles, "ACM TPC TechBrief on Computing and Carbon Emissions," Association for Computing Machinery, Nov. 2021. www.acm.org/media-center/2021/october/tpc-tech-brief-climate-change
[3] Nestor Maslej, Loredana Fattorini, Raymond Perrault, Vanessa Parli, Anka Reuel, Erik Brynjolfsson, John Etchemendy, Katrina Ligett, Terah Lyons, James Manyika, Juan Carlos Niebles, Yoav Shoham, Russell Wald, and Jack Clark, “The AI Index 2024 Annual Report,” AI Index Steering Committee, Institute for Human-Centered AI, Stanford University, Stanford, CA, April 2024.
[6] M. Bojarski, D. Del Testa, D. Dworakowski, B. Firner, B. Flepp, P. Goyal, L.D. Jackel, M. Monfort, U. Muller, J. Zhang, et al., End to end learning for self-driving cars, 2016, arXiv preprint arXiv:1604.07316.
[9] L.F.W. Anthony, B. Kanding, R. Selvan, Carbontracker: Tracking and predicting the carbon footprint of training deep learning models, 2020, arXiv preprint arXiv:2007.03051.
[11] S. Divya, S. Panda, S. Hajra, R. Jeyaraj, A. Paul, S.H. Park, H.J. Kim, T.H. Oh, Smart data processing for energy harvesting systems using artificial intelligence.