Artificial intelligence may live in “the cloud,” but its footprint is firmly on the ground. As AI systems grow more powerful, the data centers that train and run them are consuming massive amounts of land, water, and electricity—and reshaping regional power grids. What does this surge in demand mean for the environment, energy infrastructure, and the future of innovation?
In this episode, we speak with Prof. Andrew Chien, a computer scientist at the University of Chicago and a senior computing scientist at Argonne National Laboratory. An expert in large-scale and cloud computing, he explains why these data centers require so much power, why they’re stirring such controversy—and proposes a sustainable approach to data centers that could keep our energy use in check.