Understanding the Energy Demands of AI: Local vs. Cloud Systems
The growing use of artificial intelligence (AI) has sparked questions about its energy consumption, particularly the contrast between the efficiency of local devices and the demands of large cloud-based systems. Dave Taylor, a technology expert, addresses this disparity, noting that while AI features run smoothly on smartphones and tablets, cloud services from companies like OpenAI and Google require substantial energy resources to operate.
Understanding the difference in power needs is crucial. Local AI features, such as those found on devices like the Google Pixel 10 and Apple iPhone 17, often utilize a Neural Processing Unit (NPU) to perform tasks like voice transcription or language translation. This allows them to maintain efficiency while operating independently of the internet. However, many AI features still rely on online connectivity for more complex processing capabilities.
AI systems like ChatGPT, Gemini, and Grok benefit from vast amounts of data and robust processing power, operating from data centers equipped with thousands of servers. These facilities necessitate extensive cooling systems and substantial electricity, raising questions about their environmental impact.
In practical terms, individual queries to large language models (LLMs) typically consume around 0.001 kWh of electricity, which is comparable to a 10W bulb running for six minutes. More intensive tasks, such as image generation, can require between 0.01 kWh and 0.1 kWh per image, approximately equivalent to charging a smartphone twice.
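The equivalences above are simple unit conversions. A minimal sketch of that arithmetic, using the article's per-query and per-image estimates plus two assumed reference values (a 10 W bulb and a typical ~15 Wh smartphone battery), might look like this:

```python
# Back-of-envelope energy equivalences for the estimates above.
# QUERY_KWH and the image range come from the article; BULB_W and
# PHONE_KWH are assumed reference values for comparison.

QUERY_KWH = 0.001   # one LLM text query (article's estimate)
IMAGE_KWH = 0.03    # within the 0.01-0.1 kWh per-image range
BULB_W = 10         # assumed: a 10 W bulb
PHONE_KWH = 0.015   # assumed: typical smartphone battery (~15 Wh)

def bulb_minutes(kwh, watts=BULB_W):
    """Minutes a bulb of the given wattage runs on `kwh` of energy."""
    return kwh * 1000 / watts * 60

def phone_charges(kwh, battery_kwh=PHONE_KWH):
    """Number of full smartphone charges `kwh` represents."""
    return kwh / battery_kwh

print(bulb_minutes(QUERY_KWH))   # 6.0 minutes, matching the article
print(phone_charges(IMAGE_KWH))  # about two charges
```

Function names here are illustrative; the point is that one text query (1 Wh) is tiny, while a single generated image can cost tens of times more.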
Water consumption is another consideration, particularly for data-center cooling systems. Estimates suggest that processing a text query may use about 500 mL of water, while generating an image could require 2 to 3 liters. Although much of this water is not permanently consumed and can be cycled back through the cooling system, it still highlights the resource footprint of AI operations.
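These per-request figures scale linearly with usage, so a rough daily estimate is just a weighted sum. A minimal sketch, assuming the article's estimates and an illustrative helper name:

```python
# Rough cumulative cooling-water estimate from the per-request figures above.
QUERY_WATER_L = 0.5   # ~500 mL per text query (article's estimate)
IMAGE_WATER_L = 2.5   # midpoint of the 2-3 L per-image range

def water_estimate_liters(queries, images):
    """Estimated cooling-water use (liters) for a mix of requests."""
    return queries * QUERY_WATER_L + images * IMAGE_WATER_L

print(water_estimate_liters(10, 2))  # 10.0 liters for 10 queries + 2 images
```

This is an upper-bound style estimate, since some of that water is recirculated rather than consumed.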
The distinction between local and cloud-based AI systems underscores a significant trade-off: local tools can minimize environmental impact, while cloud systems offer greater capabilities and access to broader knowledge. The future of AI may hinge on optimizing these systems to balance power consumption with performance.
Dave Taylor, who has been a part of the online landscape since the inception of the Internet, continues to explore these topics on his platform, AskDaveTaylor.com. Readers are invited to subscribe to his newsletter for insights into the evolving tech world.
