I finished reading World Wide Waste by Gerry McGovern. I'd consider it essential reading for anyone working with computers!


It's well cited (though I still need to check those citations) & uses maths effectively to make its point.

Its point: that computers + (surveillance) capitalism are actually worse for the environment than the predigital era. That we can and must move slow and fix things, and fund that vital work directly.


Don't get me wrong, computers can absolutely help us regain our environmental efficiency. They just *aren't*.

Not as long as we're:
* constantly syncing everything to the cloud,
* expecting same-hour delivery,
* funding our clickbait via surveillance advertising,
* buying a new phone every year,
* using AIs because they're cool rather than useful,
* running bloated software & webpages,
* buying into "big data",
* etc.

Computing is environmentally cheap, but it rapidly adds up!

@alcinnz @zensaiyuki I see you talking about “sending everything to the cloud” and couldn’t agree more.

If you are versed in that matter, you may be interested in @hergertme’s Bonsai project.

A bit stale at the moment since Christian is working on a lot of very nice things, but definitely worth having a look at.


"Computing is environmentally cheap (...)"

Computers themselves are not, though, unfortunately: their production is an often-overlooked massive energy expense, one that can exceed the running energy consumption of their entire lifespan.

There is probably a better way to deal with this through repair and reuse, but either way computers are highly environmentally problematic even before they get to compute anything. :/

This article has some good info: solar.lowtechmagazine.com/2009

@unicorn Yeah, there's a good reason I listed: "buying a new smartphone every year". I just couldn't fit *why* in my toot.

You can add TVs and internet boxes to the list.
5G will bring its useless internet-of-shit devices and antennas.
I've heard there are 30 years of rare-earth metal resources left...

@numahell @unicorn Wouldn't be surprised. Our phones each contain all but one of those metals...

@alcinnz @numahell @unicorn and then most of us don't bother handing in old electronics for recycling.

I wonder how long it'll take us before we start "mines" in landfills to get rare earth metals from them

@alcinnz I do find it interesting that even as it has become cheaper and more efficient than ever to have local storage and computation, we're centralizing it more and more heavily.

But I think Rob Pike had a point when he said he wants no local storage anywhere near him except maybe caches. Managing redundancy and backups is *hard*. And any p2p storage system that a) I would trust and b) mere mortals could be comfortable with, may not be very efficient energy-wise.
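A quick back-of-envelope sketch of that efficiency concern (all numbers below are illustrative assumptions, not measurements of any real network): if independent peers are each online with some probability, the replication factor needed for decent availability grows fast as peers get flakier.

```python
# Back-of-envelope: replicas needed so at least one copy of a
# p2p-stored object is reachable with high probability. Assumes
# peers fail independently, which is optimistic for real networks.
import math

def replicas_needed(p_online: float, target_availability: float) -> int:
    """Smallest k with 1 - (1 - p_online)**k >= target_availability."""
    p_all_offline = 1.0 - target_availability
    return math.ceil(math.log(p_all_offline) / math.log(1.0 - p_online))

# Flaky home machines (online 30% of the time) vs. a reliable node:
print(replicas_needed(0.3, 0.9999))   # dozens of copies needed
print(replicas_needed(0.99, 0.9999))  # a couple suffice
```

The gap between those two numbers is roughly the energy overhead freakazoid is pointing at: every extra replica is more disks spinning somewhere.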

@freakazoid I think a lot but not all of this comes down to corporate propaganda.

But there's been a lot of promising developments recently in p2p. We just need to turn it into something useful, and stop focusing exclusively on blockchains!

@alcinnz Well datacenters are something I have a pretty deep understanding of, having worked IN a datacenter for several years and then having worked in SRE at Facebook and Google. I've also done a lot of research into performance-per-watt of CPUs. Efficient "bin-packing" really swamps all other considerations at the end of the day. Power is the vast majority of the cost of a datacenter, so the companies have a big incentive to use as little of it as possible.

@alcinnz Google especially were willing to spend a ton of engineering time to make even tiny improvements in efficiency, whereas at smaller companies the math usually went the other way because their engineer-per-CPU-hour ratio is much higher. "You mean I can spend more on AWS to avoid having to spend an engineer-month implementing this efficiency improvement? Sign me up!"

@alcinnz The other, possibly more important difference is that Google is no longer growing like gangbusters, so they can't just burn money to avoid spending time fixing things like smaller faster growing companies can.

@alcinnz Large datacenters are incredibly efficient, energy-wise, not just because the bigger processors are more efficient but because when you have that much to work with in terms of workload, you can engage in a lot of neat tricks like shutting off unused machines or running batch workloads in the unused capacity. And with the PCIe fabrics the datacenters are deploying now, you can even do the same tricks with individual cards.

@alcinnz I have yet to see anything about datacenter energy consumption that actually compares it to some actual alternative. They always compare it to some other activity, probably cherry-picked to be as shocking as possible.

I totally agree on the instant-gratification shipping thing, though even there it's not like they're achieving it with a lot more miles. A lot of that is being done with improvements in logistics using... computers!

@alcinnz I think that it would make a lot more sense to focus on the point of actual ecological damage rather than the consumer end of things. In particular, we desperately need a substantial carbon tax. Even if it's revenue neutral, we'd rapidly see what's really important to people.

@freakazoid The book I was citing there took the approach of performing the comparisons on a global rather than individual basis, and computing how many trees we'd need to plant.

The central point being that computing is environmentally cheap but rapidly adds up. That we can and must do better.

Shipping is an interesting case, showing how computers can help us be more efficient. But computing/instant-gratification can also encourage us to be less.

@alcinnz I think that looking at specific things that could be improved is exactly the right approach. Otherwise you're in the land of unfalsifiable claims, because we don't actually know what would happen if we just shut off the Internet or any given service.

@freakazoid Absolutely. Efficient computing ultimately comes down to the fuzzy field of usecases.

That's *one* reason I want people to be funding quality work directly, rather than fund clickbait via surveillance advertising.

@freakazoid I'm all for taking every measure we can!

And I'm glad serverfarms are so efficient, but that won't stop me from discouraging their use. It will however encourage me to recommend Microsoft or Google (or my local Catalyst Cloud) clouds over Amazon's.

@alcinnz Well I don't know anything about AWS's efficiency. I only know anything about AWS from a user standpoint.

Don't get me wrong; the centralization makes me really uncomfortable, and I'm happy to spend energy/money/etc to move more control back into people's hands.

@freakazoid According to Greenpeace's analysis only MAGAF's datacenters are that green, and amongst them Amazon lags behind.

@alcinnz Ah, ok. I probably saw that. I remember being quite floored when Greenpeace started saying positive things about Google and Facebook. But it also gave me a lot more respect for them, since they were willing to actually say when a company had made substantial improvements.

There were certainly plenty of cynics at both Facebook and Google, but most of us really believed in actually making the datacenters green.

@alcinnz (The main things that made me lose respect for Greenpeace were their opposition to nuclear power, which in my view was necessary for going carbon-free (though I have since changed my mind, they were still wrong before the economics of solar really changed), and the dishonesty of their "Kleercut" campaign against Kimberly-Clark, which talked about them clearcutting "in old-growth forests" when they were only cutting trees they'd planted themselves.)

@alcinnz On the blockchain thing, my friends and I have been wishing for SOMETHING like that since at least '00, and not specifically for cryptocurrency (I could not conceive of anything other than Chaumian e-cash or something like Ripple, which is why I wrote Bitcoin's very first obituary back in 2010). At the time, Spread seemed the most interesting. Stellar's consensus protocol and Avalanche both seem pretty similar to Spread.


@alcinnz IPFS seems closest to reaching critical mass to me, and its implementation also seems the most principled. I can't imagine its association with FileCoin hasn't contributed significantly to the amount of attention it's getting. And FileCoin is an attempt to solve the biggest problem with p2p storage, which is that you have to waaay overreplicate when even large nodes can drop off at any time and there's no guarantee it's not the same organization operating 10k nodes.

@freakazoid @alcinnz

I don't understand why friends can't just help each other host for free or in a pool. we do it on feddy, why does it suddenly need to be monetized? fuck that shit man

@freakazoid @alcinnz you know, I was surprised to learn in Dercuano that there's very little efficiency difference between different processors. Check out notes/keyboard-powered-computers.html in Dercuano; everything is within an order of magnitude of 1 nJ per instruction. 64-bit instructions do more actual computation than 8-bit instructions, it's true... but that's still usually wasted. I'd be interested to learn there's something I overlooked!
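For what it's worth, the ~1 nJ/instruction ballpark is easy to sanity-check against a familiar chip (the clock rate and IPC below are assumed round numbers, not figures from Dercuano):

```python
# Sanity check of the ~1 nJ/instruction ballpark.
# Clock rate and instructions-per-cycle are assumed round numbers.
NJ_PER_INSTRUCTION = 1e-9  # joules, the claimed ballpark
CLOCK_HZ = 3e9             # a 3 GHz core
IPC = 2                    # instructions retired per cycle (assumed)

instructions_per_second = CLOCK_HZ * IPC
watts = instructions_per_second * NJ_PER_INSTRUCTION
print(watts)  # 6.0 -- single-digit watts per core, a plausible order of magnitude
```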

@kragen @alcinnz The range on the Green500 list is almost 2 orders of magnitude. And the range of what can be done in a single instruction is well beyond the difference between an 8 bit and a 64 bit computer when you start talking about SIMD, VLIW, etc. While it's true that there are inefficiencies at scale, modern CPUs can also dynamically scale clocks independently on different cores, shut down unused units, etc.

@kragen @alcinnz I think it's absolutely true that modern *software* is not particularly efficient, though even there larger scale means you can do more. For example, MapReduce is highly efficient through the whole memory/storage hierarchy, because it moves the code to where the data is instead of vice versa, and it has very high locality of reference.
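For readers unfamiliar with the model, here's a toy single-machine sketch of the map/shuffle/reduce data flow (a word count; this is an illustration, not Google's implementation, and it omits the key point above that real MapReduce runs the map where the data lives):

```python
# Toy MapReduce-style word count: map emits (key, value) pairs,
# shuffle groups them by key, reduce folds each group.
from collections import defaultdict

def map_phase(line):
    for word in line.split():
        yield word, 1

def reduce_phase(word, counts):
    return word, sum(counts)

def mapreduce(lines):
    groups = defaultdict(list)          # the "shuffle" step
    for line in lines:
        for key, value in map_phase(line):
            groups[key].append(value)
    return dict(reduce_phase(k, v) for k, v in groups.items())

print(mapreduce(["the cat", "the dog"]))  # {'the': 2, 'cat': 1, 'dog': 1}
```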

@kragen @alcinnz Not having to move that data in the first place, because you're working with a small dataset, would also be efficient, but it's unlikely your data processing needs are going to exactly fit your machine at that level.

So I think putting the same effort into making software more efficient is just going to have a much larger return at larger scale.

@kragen @alcinnz And it's not like Google engineers are looking at marketing brochures for CPUs and getting fooled. They have a bottom line. They are constantly testing with different kinds of hardware and actually measuring power consumption using real (and by real I mean production) workloads. My team was involved in just such a project, in fact. Boring, though, because we were using these newfangled CPUs to serve up ads :)

@kragen @alcinnz There's also the fact that an ever-increasing amount of computing isn't even being done on microprocessors anymore but on GPUs and TPUs. And the energy savings in moving model execution to a TPU are pretty enormous. And we're talking like half or more of the total computing workload when it was all executing on CPUs.

@freakazoid @kragen I'm thinking of applying the GPU to process richer DGGS data, once I've got a usecase or API to build against...

I was discussing GPUs just the other day, as a simplified example of the performance challenges CPUs face!

@alcinnz @kragen Are you thinking of representing, say, roads, as a contiguous series of hexagonal cells, essentially a hierarchical raster representation with hexagonal pixels, instead of as paths?

@freakazoid @kragen For road maps it would probably make sense to store it in vector form. Which is related to other important operations that need to be supported!

I'm currently thinking a divide-and-conquer lookup table is the way to go for that...

@alcinnz @kragen Now you've got me thinking I should use nVidia Jetson Nanos as additional compute nodes for my Kubernetes cluster rather than Odroid C2s! 128 cores anyone?

@alcinnz @kragen You know, something just occurred to me. I think the thing a Kardashev type 1, or even a type 2, civilization is most likely to use the vast majority of its available energy for is computation. Even if that's not the case, at least at the moment the amount of computation we do is limited primarily by the cost of the power it uses. So increasing its efficiency won't reduce the total energy devoted to computing. In fact, it's likely to increase it.

@alcinnz @kragen Oh, and something I forgot to mention earlier: the place where I think computing has most failed to live up to its potential to reduce our ecological footprint is remote work, mostly due to the amount of organizational inertia among employers. Imagine if everyone whose job didn't require actually being in an office stopped commuting! I did, a few months before lockdown.

@freakazoid @kragen One simplification I find interesting to highlight here is geospatial.

There's an order of magnitude simplification when switching to an area-based model ("DGGS") rather than a point-based model ("lat-long"). I'm really keen to see if that reduces the need for servers! Even if others see it as a way to lighten the workload on their servers.

@alcinnz @kragen Oh wow DGGS looks awesome! I've been thinking about ways to represent OpenStreetMap data so that I can process it without a huge Postgres server, and this might be my answer! This also gives me something to do with my new Kubernetes cluster :D

@alcinnz "running bloated software & webpages" is a huge contributor to climate change, wish people would realize that.

Someone close to me works for AT&T, and they realized they would run out of energy to supply their growing server farms, so they put research and dev into fixing it, and they did, for now. We need financial incentive to solve this on a large scale, a tax on shitty software perhaps (like a carbon tax? lol).

@dcharles525 @alcinnz it is not a huge contributor to climate change, no. all the energy consumption of all the computers in the world is a single-digit percentage of even world electrical consumption, much less total energy consumption, and even most of that is a result of screen backlights, not compute. you can run a one-watt cellphone all day on the energy used by a single elevator trip or starting your car's engine.
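The arithmetic behind that comparison checks out to within an order of magnitude (the elevator numbers below are rough assumptions: a counterweighted car effectively lifting a few hundred kilograms several floors):

```python
# Rough energy comparison: a 1 W phone running all day vs. one
# elevator trip. Elevator mass and height are assumed round numbers.
PHONE_WATTS = 1.0
SECONDS_PER_DAY = 24 * 3600
phone_joules = PHONE_WATTS * SECONDS_PER_DAY        # 86,400 J

G = 9.8                     # m/s^2
EFFECTIVE_MASS_KG = 300.0   # payload minus counterweight offset (assumed)
HEIGHT_M = 20.0             # ~6 floors (assumed)
elevator_joules = EFFECTIVE_MASS_KG * G * HEIGHT_M  # ~59,000 J

print(phone_joules, elevator_joules)  # same order of magnitude
```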

@dcharles525 @alcinnz If, as you say, a lot of energy use is being caused by shitty software, then a carbon tax IS a tax on shitty software.

@alcinnz Can you please help me understand how syncing everything to the cloud is environmentally destructive? I see the big cloud players all taking big steps to minimize their environmental impact, and given that, isn't 1000 racks of storage in a data center backing 100000 people more efficient than spinning up 100000 magnetic platters on home electricity?

@alcinnz Mind you, I'm not saying syncing everything to the cloud is an unmitigated good *AT ALL*. Recognizing what's important and taking full possession of critical bits is the only way to go, but for many people who won't realistically back up ever, at all, having a cloud drive for important docs seems prudent.

@feoh Most people and organizations have a whole lot of useless information sitting around on their hard drives. Most data is junk that never actually gets used. My understanding (though not the book's) is that that's not wasteful unless you actually do something with said data, especially on modern filesystems, or buy new hard drives because of it.

So yeah if you are backing up files, take some time to delete unwanted files so they don't use up precious bandwidth or processing time.
