Understanding the Energy Demands of AI: Local vs. Cloud Systems

The growing use of artificial intelligence (AI) has raised questions about its energy consumption, particularly the contrast between efficient on-device processing and the demands of large cloud-based systems. Dave Taylor, a technology expert, addresses this disparity, noting that while AI applications run smoothly on smartphones and tablets, cloud platforms such as those from OpenAI and Google require substantial energy resources to function.
Understanding the difference in power needs is crucial. Local AI features, such as those on the Google Pixel 10 and Apple iPhone 17, often use a dedicated Neural Processing Unit (NPU) to handle tasks like voice transcription or language translation. This lets them run efficiently without an internet connection. Many AI features, however, still hand off more complex processing to the cloud.
AI systems like ChatGPT, Gemini, and Grok benefit from vast amounts of data and robust processing power, operating from data centers equipped with thousands of servers. These facilities necessitate extensive cooling systems and substantial electricity, raising questions about their environmental impact.
In practical terms, individual queries to large language models (LLMs) typically consume around 0.001 kWh of electricity, which is comparable to a 10W bulb running for six minutes. More intensive tasks, such as image generation, can require between 0.01 kWh and 0.1 kWh per image, approximately equivalent to charging a smartphone twice.
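The per-task figures above lend themselves to simple back-of-envelope estimates. The sketch below turns the article's cited numbers into a daily usage calculation; the constants are illustrative assumptions drawn from the text, not measured benchmarks, and the function name is hypothetical.

```python
# Back-of-envelope energy estimates for LLM usage, based on the
# rough per-task figures cited in the article (assumptions, not benchmarks).

KWH_PER_TEXT_QUERY = 0.001   # ~ a 10 W bulb running for six minutes
KWH_PER_IMAGE_LOW = 0.01     # lower bound for one generated image
KWH_PER_IMAGE_HIGH = 0.1     # upper bound for one generated image

def daily_energy_kwh(text_queries: int, images: int,
                     kwh_per_image: float = KWH_PER_IMAGE_HIGH) -> float:
    """Total electricity for a day's mix of text queries and image generations."""
    return text_queries * KWH_PER_TEXT_QUERY + images * kwh_per_image

# Example: 20 text queries plus 5 images at the high-end estimate
print(round(daily_energy_kwh(20, 5), 3))  # roughly half a kilowatt-hour
```

Even at the high end, a heavy day of personal use stays well under one kWh, though the totals add up quickly across millions of users.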
Water consumption is another consideration, especially for data center cooling systems. Estimates suggest that processing a text query may use about 500 mL of water, while generating an image could require 2 to 3 liters. Although this water is not entirely consumed, since much of it is cycled back through the cooling system, the figures still highlight the resource demands behind AI operations.
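The water figures can be tallied the same way. This minimal sketch converts the article's per-task estimates into liters for a given usage mix; the constants and function name are assumptions for illustration.

```python
# Rough cooling-water estimates per AI task, based on the article's
# cited figures (illustrative assumptions, not measurements).

ML_PER_TEXT_QUERY = 500   # ~500 mL of water per text query
ML_PER_IMAGE = 2500       # midpoint of the 2-3 liter range per image

def water_use_liters(text_queries: int, images: int) -> float:
    """Approximate cooling water associated with a usage mix, in liters."""
    total_ml = text_queries * ML_PER_TEXT_QUERY + images * ML_PER_IMAGE
    return total_ml / 1000

# Example: 10 text queries and 2 generated images
print(water_use_liters(10, 2))  # 10.0 liters
```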
The distinction between local and cloud-based AI systems underscores a significant trade-off: local tools can minimize environmental impact, while cloud systems offer greater capabilities and access to broader knowledge. The future of AI may hinge on optimizing these systems to balance power consumption with performance.
Dave Taylor, who has been a part of the online landscape since the inception of the Internet, continues to explore these topics on his platform, AskDaveTaylor.com. Readers are invited to subscribe to his newsletter for insights into the evolving tech world.