Thanks to the tech we built, we can now watch a genocide unfold live, yet we still debate about which side to take, as our elected leaders continue bending to corporate interests. What else do we have to witness to acknowledge that business as usual isn’t working? To acknowledge that we can’t keep stealing, extracting, and killing people to grow an unsustainable system to its breaking point.
I try to focus on the positive and spread messages of hope so that we can imagine better futures, different from the insidious Black Mirror visions of tech we are told are inevitable in the name of progress. But as I read through various climate reports and watch children and young people from Gaza report a genocide in real time, my faith in humanity falters.
Tell me, how is AI, AGI, superintelligence, or whatever you want to sell it as, going to stop us from killing each other to ensure the continued extraction of natural resources? How is AI going to help humanity when we’re still obsessed with winning, dominating, conquering, colonizing, crushing our enemies? Are these the values we’ll align our AI models with?
We have a lot to rethink as a society before we should even dare to think about building superintelligence. Before playing gods, we should invest in human intelligence to realize how stupid we’re actually being in our disregard for life on our planet.
The children are watching and judging us. Will you tell your children stories about smashing KPIs and scrolling social media when they inevitably ask you what you were doing to keep the planet habitable for them and their children?
Yeah, you’re too busy. You don’t have the right title. But who does, if not you and me?
Cease fire. Rethink the stories we tell. And start valuing all life on planet Earth.
Stories used to belong to the land. They were born out of the beauty of the land, the hardships of trying to survive with the land, the languages shaped by the land, the ecosystems of the land. Stories told in the right place at the right time.
Stories haunted with the ghosts of the past, shaped by worries of the present, and reflecting our dreams and hopes for the future. Stories that helped us find purpose and meaning in the land.
We called it progress when our stories became untethered from the land and started placing selfish human desires above all else. Spreading a false sense of superiority and entitlement: man above nature, man above land, man above man, man above woman. We started building fences with stories and thus dividing lands with imaginary lines. Land becoming a thing we aspired to control and own.
To extract more from the land we now thought we owned, we invented machines. And stories to justify the violence against the land, the injustice against the others that the building of the machinery of progress required. The more machines we invented, the more machine-like we became in the stories we told, further detaching ourselves not just from the land but from our spirit. Just cogs in the greater machinery of progress.
With this loss of connection and responsibility came our endless quest for purpose and meaning, a gaping hole in our chest, a pit of despair in our stomach. We now find ourselves spinning new stories to fill these voids that are expanding as we drift further apart from each other and the land. Yet none of these new stories that maximize shareholder value sit right in our bodies.
The land calls to us, but we answer the call by flying to different lands, to visit the cities we built over our lands. The cities that are becoming interchangeable, devoid of the land’s color and character. Local climate a mere nuisance, not an invitation to get intimate with the new land you now have the privilege of visiting. The land becoming a mere backdrop for selfies, its stories repackaged in mass-produced souvenirs for easy consumption.
We try to further escape the land by building virtual lands. Lands not shaped by the elements and physics over millennia, but digital places built in pixels and code, appearing in the digital cloud overnight and often disappearing just as fast.
These places now connect us over distances and timezones. Seemingly timeless, not changing with seasons and cycles of our planet’s celestial dance we now barely pay attention to, further solidifying the illusion of our separation from land.
Places and stories that abstract away their weight. Data centers hidden from sight, new stories of genius and magic hiding the unpaid bills of materials, energy, and exploitation.
Yet our bodies still remember the land as we design digital interfaces and landscapes, and try to capture the essence of the land on our screens. The pixel lands not quite satisfying our longing, even in 4K.
And now we’re building machines that instantly generate images of land and people with simple incantations. And because we still feel empty inside, we’re pursuing even greater machines that would rule over the men who place themselves above all others. What kind of arrogance is it to engineer our own obsolescence and aim for complete independence from the land that shaped us?
It is the great forgetting of our times. Stories getting mixed up, finding themselves in places where they don’t belong at the wrong times, stripping us of hope and autonomy. If we could only remember the power of stories, we could remember the stories that can lead us back to the land and into each other’s embrace once more.
We could tell stories that are grounded in place, mindful of the ghosts of the past, shaped by the demands of our times, that reflect our dreams and hopes for a future in which the land is still habitable and our children thriving.
Listening to the whispers of land once more does not mean giving up everything we learned since untethering our stories from the land. But it does mean rewriting our current stories from a place of care for each other and our home planet. Stories that will inspire us to respect the land and even build digital places of a different kind that don’t see our bodies as machines, but as children and storytellers of the land.
I know, I know, the title of this post is atrociously click-baity. But if we allow Big Tech to get away with making outrageous claims about how their technology is saving humanity and release software that hallucinates, I hope you can forgive me for using the same tactic to bring up an important topic. We need to have a serious conversation about the environmental costs of our chats with OpenAI’s ChatGPT1 and other Large Language Models (LLMs) like it.
Now, why would I bring up such a depressing topic AND top it off with a reminder of how we’re harming polar bears, eh? Well, thinking about polar bears and ice helped me survive another record-breaking hot summer. And watching as parts of my country got swept away by record-breaking floods, it’s clear that climate change is here, whether you want to think about it or not. Yet, we are spending so much time this year thinking and talking about ChatGPT that I felt like I had no choice but to recruit polar bears to help me get this very real existential concern across.
Still with me? Ok, good. You might have heard – or not, it’s not like Big Tech wants this in the news – that training LLMs is expensive in more ways than one. You need A LOT of data scraped off the internet – a fancy way of saying you’re stealing what everyone is posting online without asking for permission – and a lot of computers munching on all this data day and night. Obviously, all these computers need energy to run. Data centers – where these computers live – need a lot of energy and water to stay cool. Not to mention you need a lot of material resources, including rare earth metals, to make those computers. Yeah, digital doesn’t mean green or clean, especially when done at the scale LLMs require to appear intelligent.
Microsoft – OpenAI’s partner – proudly talks about how thousands of NVIDIA GPUs were linked together to train OpenAI’s models, but what’s suspiciously missing from these press releases is the S word: sustainability. I find this unusual given how big Microsoft is on sustainability and how much they like to talk about how AI is going to help us deal with climate change. So why is the carbon footprint of ChatGPT & co. still shrouded in mystery and speculation?
Because that’s all we have when it comes to the environmental impact of the most popular LLMs: estimates. My curiosity about this topic was fueled by reports of a study that estimated 700,000 liters of clean freshwater could have been used to train OpenAI’s GPT-3 in state-of-the-art US-based data centers, and that “ChatGPT needs to ‘drink’ a 500ml bottle of water for a simple conversation of roughly 20-50 questions and answers, depending on when and where ChatGPT is deployed”. A bottle of water per conversation doesn’t sound too bad? Multiply that by 100 million active users. Daily.
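To give that multiplication some shape, here’s a back-of-envelope sketch. The per-conversation figure comes from the study above; the one-conversation-per-user-per-day assumption is mine, purely for illustration:

```python
# Back-of-envelope water estimate for ChatGPT inference.
# Study estimate: ~500 ml of water per conversation of 20-50 prompts.
# Illustrative assumption (not a measured number): each of the
# ~100 million active users has just ONE conversation per day.

WATER_PER_CONVERSATION_L = 0.5       # 500 ml bottle, in liters
DAILY_ACTIVE_USERS = 100_000_000

daily_liters = WATER_PER_CONVERSATION_L * DAILY_ACTIVE_USERS
yearly_liters = daily_liters * 365

print(f"{daily_liters:,.0f} liters per day")    # 50,000,000 liters per day
print(f"{yearly_liters:,.0f} liters per year")  # 18,250,000,000 liters per year
```

Under those assumptions, inference would use more water in a single day than the estimated 700,000 liters it took to train GPT-3. The real numbers are unknown, which is exactly the problem.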
And there are similar studies on other hidden costs of AI that try to estimate the environmental – and social – impact of training and using LLMs like ChatGPT, with media and blogs occasionally surfacing the most worrying numbers. And I keep wondering, why are we still playing this guessing game given the reach ChatGPT has? Especially when you consider that Microsoft is baking it into Windows, Office, and other products with staggering reach and impact.
So I decided to go directly to the source. At the time of writing, OpenAI doesn’t have a page dedicated to sustainability on their website, but one can learn a lot about the men behind the OpenAI curtain from the way ChatGPT answers questions. And so I typed: “What is the environmental impact of ChatGPT?” into ChatGPT.
On December 7, 2022 – barely a week after public launch – ChatGPT tried to convince me that it does not “consume any resources or produce any waste or pollution”. I had to type in an additional prompt to get it to admit that our conversation does indeed use energy, but it was still very confident that “the overall impact on the environment is minimal and can be offset by the many benefits that these models provide.” How convenient for OpenAI!
Well, I decided to give ChatGPT another chance by asking the same question on September 5, 2023 – almost 9 months later. And lo and behold! It looks like somebody did a bit of reading on the topic! This time, ChatGPT2 was more informed on its environmental impact, but still reassured me that: “Efforts to mitigate this impact are ongoing, with a focus on increasing energy efficiency and transitioning to cleaner energy sources.” Lovely.
But hey, ChatGPT, what about water usage? “Water usage is not typically a direct concern associated with the operation of AI models like ChatGPT.” Ah. How about mining resources for data centers? “The impact of mining resources needed for data centers, including the extraction of materials like rare earth metals and other minerals, is an important environmental consideration associated with the development and operation of data centers, including those used for training and deploying AI models like ChatGPT.” That’s a bit better, were you allowed to read the Atlas of AI?
However, ChatGPT also made sure to add that: “It’s important to note that these environmental concerns are not unique to data centers but are part of a broader conversation about the environmental impact of the technology industry as a whole.” If you want to blame somebody, blame the entire industry! That’s actually a valid point, ChatGPT.
But again, I hate having to quote an LLM in this post, I would much rather quote actual data from OpenAI. I’m sure that Microsoft has the resources to support them in calculating the carbon emissions of their models, at the very least. I’m also sure the smart folks at OpenAI are familiar with Carbon Emission model cards supported by Hugging Face and the various packages that can help with simple carbon emissions calculations. Heck, OpenAI employs enough researchers and engineers that they could come up with even more precise estimation tools. But OpenAI is anything but open when it comes to its environmental impact and costs.
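To show just how simple a first-order estimate could be, here’s a sketch of the kind of calculation those packages automate: energy consumed times the carbon intensity of the local grid. Every input below is an illustrative placeholder, not a measured figure for any OpenAI model:

```python
# First-order carbon estimate: emissions = energy (kWh) x grid intensity (gCO2e/kWh).
# Real trackers measure hardware power draw and look up regional grid data;
# the numbers here are made up for illustration.

def carbon_kg(energy_kwh: float, grid_gco2e_per_kwh: float) -> float:
    """Return estimated emissions in kilograms of CO2-equivalent."""
    return energy_kwh * grid_gco2e_per_kwh / 1000.0

# Hypothetical training run: 1,000 GPUs drawing 300 W each for 30 days,
# on a grid emitting 400 gCO2e per kWh.
gpus, watts, days = 1_000, 300, 30
energy_kwh = gpus * watts * days * 24 / 1000  # watts -> kilowatt-hours
print(f"{carbon_kg(energy_kwh, 400):,.0f} kg CO2e")  # 86,400 kg CO2e
```

If a three-line formula with public grid data gets you this far, a company with OpenAI’s resources has no technical excuse for silence; only the inputs are secret.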
I guess I’ll have to check back in with ChatGPT in 9 months and see if it can generate answers that assume more responsibility?
When I made a pledge in June to learn more about the environmental costs of LLMs, I thought I’d be able to read a lot more research. But I quickly grew frustrated reading similar articles citing numbers from a handful of estimate-based studies.
I came to the conclusion that there are three main problems that we need to bring way more attention to when talking about this topic:
You might be saying: that’s all depressing and infuriating, but what can I, as a non-billionaire, do about all of this? Is there any hope left for polar bears?
I’m glad you asked! (Or let’s pretend you did.) Because there is indeed hope left as long as you:
Speak up! When talking about how magical ChatGPT is with friends or at work, bring up the environmental concerns. Bring up the topic on social media, in blog posts, podcasts, and other content you create; we need to keep making noise. (As an example, here’s a LinkedIn post I recently wrote on the topic.) When sharing articles that talk about estimates but fail to acknowledge the responsibility of OpenAI, Microsoft, Google, Facebook and other companies with the resources to be more transparent, call out the lack of transparency and their misdirection tactics, especially when they distract from climate justice!
Limit your use of LLMs: While I do think it is important to experiment with LLMs to understand them better, we can also be mindful of what we use them for. If you can find the answer using a search engine, do that instead. Learn what LLMs are actually good for and where they fall short, and limit your usage in scenarios where other tools – or humans! – do the job better.
Advocate for responsible deployment: This one applies to those of you who work in tech and have some input on how companies choose to deploy AI-based products. Does your product actually need an LLM, or can a similar result be achieved with a different approach? Similar to how we learned that not every database needs to become a blockchain, we need to learn that not every product needs generative AI. Every technology we invent is like fire. It’s exciting and warm, and we are easily seduced by its power. But fire isn’t the solution to everything, and when you do decide to build fires, you should learn about responsible firekeeping that helps you balance the fire of technology with other elements.
I linked the resources I used to explore this whenever relevant, but I also wanted to highlight a couple of articles that were particularly helpful in shaping my understanding so far:
I also wanted to thank my ResponsibleTech.Work co-conspirator Daniel Hartley for all the in-depth discussions we’ve had on this topic in the past couple of months. Daniel keeps me well-supplied with links to research and has way more patience for reading various estimate studies than I do. He keeps an excellent list of AI Resources on his website, and I often find myself referring to the sustainability section.
AI usage disclaimer: The post, typos and all, was written without the assistance of generative AI tools, except for the quotes generated by ChatGPT. DALL·E was used to generate the cover image. Hopefully, the impact on polar bears was minimal, but we won’t know for sure until we can convince Big Tech to be more transparent.
I’m focusing on ChatGPT and OpenAI in this post because of ChatGPT’s popularity and hence impact. But Google’s Bard and other Big Tech LLMs are no better. ↩
The answers quoted in this post were generated by the default GPT-3.5 model on the specified dates using the web interface at chat.openai.com without any custom instructions. ↩
Can you imagine how different our phones and other devices would be if they weren’t designed in Silicon Valley, where the individual is the center of the universe, where winning is always the goal?
Can you imagine how our mindsets would shift if our apps didn’t care about crushing it and smashing it, and replaced leaderboards with collaborative sessions?
If our phones stayed silent when we’re together and didn’t nudge us to buy more of the stuff we don’t need.
What kind of apps would we build if we didn’t have to fund our own existence and could instead spend more time building what we care about? Would we even build apps or just spend more time with each other, exploring and learning together?
What if our phones, these magical little computing devices, actually helped us to learn more about the world around us, instead of being a distraction from boredom, a way to escape discomfort?
I do love the way we can now learn from different perspectives and collaborate across timezones, but why are we only doing it under the guise of productivity or by risking the wrath of trolls?
Why is it all for profit? Why is public discussion a place to vent frustrations, why are comments so toxic, and why is it that we cannot have nice things on the internet?
Why are we measured, compared, disliked, pitted against each other by algorithms that try to keep us glued to our screens, selling our attention to the highest bidder?
There are values baked into the operating systems and devices we use every day. And there are other ways to think about the world where the individual is not the center of the universe and profit not the only guiding force of innovation.
Let’s make the collective decision to question the assumptions, and build technology that empowers rather than divides, that invites collaboration rather than competition, that supports thriving rather than winning.