    AI Thirst

    I recall the days when I wished my Google connections were powerful enough to grant me an audience with LaMDA. At the time, the allegedly sentient proto-chatbot that had managed to enchant one of the engineers training it (and get him fired) sounded as distant and unreachable as something straight out of a sci-fi novel.

    It was less than a year ago. Like many of you, I’ve since had the pleasure of conversing at leisure with a free version of OpenAI’s ChatGPT. So far, the experience has been thrilling enough to get me hooked and slightly jealous each time my new friend locks me out to serve the needs of others. Of course, these days I can be unfaithful, too. There is Bard, and Bing, and Jasper, to name but a few, all eager to quench my AI thirst while GPT is looking the other way.

    The vertiginous speed at which this tech revolution is evolving feels both exhilarating and scary. I can certainly relate to the experts’ desperate call to bridle the galloping AI race that is, by their account, getting out of control. Pause and reflect, they urge, and use the time to make these powerful state-of-the-art systems “more accurate, safe, interpretable, transparent, robust, aligned, trustworthy, and loyal.”

    Not that anyone seems to care, of course. The dam has already broken. ChatGPT alone managed to amass over 100 million users within a couple of months of its release, making it the fastest-growing consumer product ever. Humanity is rapidly getting addicted to the bounty of AI, thirsting to use and abuse it at will.

    As it turns out, however, we are not the only thirsty ones. According to scientific research published earlier this month, large AI models devour enormous quantities of water during their training process. And it doesn’t stop there. Once it has ‘graduated’, a model like ChatGPT needs to ‘drink’ a 500 ml bottle of fresh water for every simple conversation of roughly 20-50 questions and answers, the researchers estimate.

    This huge water footprint has so far gone under the radar, overshadowed by AI’s even larger carbon footprint. Training a model to understand and generate human language requires some heavy-duty processing power. And that considerable upfront power cost is less than half of the energy needed for the actual use of the model, once billions of requests start pouring in. A recent attempt to quantify the carbon footprint of BLOOM, a 176-billion-parameter language model, reads like a cautionary tale, in many ways similar to that of the crypto industry.

    Luckily, it’s not all gloom and doom. Most companies developing these models are aware that their creations are gulping down unreasonable amounts of energy and water. Tech-savvy geniuses are racking their brains for innovative ways to reduce the environmental footprint, from using renewable energy and implementing sustainable practices to recycling water and improving the efficiency of hardware and cooling systems.

    The most promising course of action, however, appears to be putting our new intelligent friends themselves to work on saving the planet. A joint study conducted by PwC and Microsoft (and therefore a potentially biased one) estimates that using AI for environmental applications could boost global GDP by 3.1–4.4% while also reducing global greenhouse gas emissions by around 1.5–4.0% by 2030, relative to business as usual. Examining the adoption of AI in four subsectors – agriculture, energy, transport, and water – the researchers find many reasons to be optimistic.

    Given that AI models are expected to do much of the heavy lifting in saving resources and enhancing productivity, should I perhaps forgive GPT for being so thirsty? Or should I think twice before launching my next not-strictly-necessary query? “If you are concerned about my environmental impact, there are steps you can take to reduce your own footprint, such as using energy-efficient devices, reducing unnecessary device usage, and supporting companies that prioritize sustainability,” advises ChatGPT. “I am just one small component of a larger technology ecosystem, and it is up to individuals, businesses, and policymakers to work together to minimize the impact of technology on the environment.”

    Image courtesy of Steve Johnson on Unsplash
    Julia Axelsson, CAIA
    Julia has more than 20 years of experience in asset management, accumulated in Stockholm and Beijing, spanning portfolio management, asset allocation, fund selection and risk management. In December 2020, she completed a program in Sustainability Studies at the University of Linköping. Julia speaks Mandarin, Bulgarian, Hindi, Russian, Swedish, Urdu and English. She holds a Master’s in Indology from Sofia University and has completed studies in Economics at both Stockholm University and the Stockholm School of Economics.