I've just read Greenpeace's report on the energy sources for server farms.

There are some interesting takeaways. It looks even more vital now to promote alternative entertainment outside of the "streaming services"!

Streaming is not green. Downloading won't be much better at first, but it will be once you've rewatched the download instead of streaming something else!

Another interesting takeaway: Only GAFAM are doing well at sourcing green energy. The Asian giants are doing terribly on that measure.

Of them, Amazon lags worryingly behind (worrying because they provide infrastructure for most other Internet services).

@alcinnz what about P2P content? Like torrents or IPFS. Do you think those would improve something?

@grilix In important cases, yes I do think so! At the very least it won't hurt.

@grilix Same here: I personally think we need valid numbers to compare. What's the power consumption for keeping data stored in a P2P network (which possibly will have to be massively redundant)? What about the costs of encryption here (which I consider a "must have", but which seems costly as it pretty likely will require a lot of compute power on various ends)? I sometimes fear we're trying to solve too many issues at once in a rather "straightforward" way where it actually ...

@alcinnz

@z428 @grilix @alcinnz Elliptic curve cryptography is quite lean. It's one of the reasons the NSA fought to keep anyone from using it for decades.

I don't know what post-quantum algorithms are going to be like, though. ECC is very strong and cheap, but from what I read from the cypherpunks it seems all of the algorithms used today will be considered broken within the next ten years if not already (cf. Google's "quantum supremacy.")
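
To illustrate "lean" concretely, here's a minimal sketch comparing public-key sizes at a comparable (~128-bit) security level. It assumes the third-party Python `cryptography` package, and the "P-256 ≈ RSA-3072" equivalence is the usual rule of thumb, not a claim from the thread:

```python
# Sketch: compare public-key sizes of ECC vs RSA at a comparable
# (~128-bit) security level. Requires the third-party "cryptography"
# package (pip install cryptography).
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec, rsa

ec_key = ec.generate_private_key(ec.SECP256R1())              # 256-bit curve
rsa_key = rsa.generate_private_key(public_exponent=65537,
                                   key_size=3072)             # ~same strength

def der_len(key) -> int:
    # Serialize the public half to DER and measure it.
    return len(key.public_key().public_bytes(
        serialization.Encoding.DER,
        serialization.PublicFormat.SubjectPublicKeyInfo))

print("P-256 public key:   ", der_len(ec_key), "bytes")   # ~91 bytes
print("RSA-3072 public key:", der_len(rsa_key), "bytes")  # ~422 bytes
```
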
@z428 @alcinnz @grilix also keep in mind there is a prevailing ideology to use slow scripting languages (Python, Ruby, to a lesser extent Node) because "servers are cheap." There was a web company that migrated from Ruby to Go and they dropped from ten app servers to one.

@icedquinn This prevailing ideology in turn also has two aspects to it: (a) In most environments I know, getting skilled developers is close to impossible, so you'd rather focus on high-level languages and spend some more hardware to run them. And (b) Go essentially is a Google project (product?) ... 😐

@alcinnz @grilix

@z428 @alcinnz @grilix it is, although they don't seem to use it as a point of influence. it's basically Rob Pike's sequel to Limbo without a VM.
@icedquinn @z428 @alcinnz @grilix Is the whole quantum computing narrative the same as "AI will take all our jobs away in 10 years", but it never happens? #quantumcomputing #ai #employment #cryptography
@breaktheirbank @alcinnz @grilix @z428 No, Shor's algorithm is a thing. And a few people do have quantum computers right now.
@icedquinn @alcinnz @grilix @z428 Who has one? Is it in public or private hands? Does anyone know what they do with it?

@icedquinn @alcinnz @grilix @breaktheirbank @z428 No quantum computers are large enough to come anywhere close to breaking any widely used curves. They don't act like classical computers where you can fake having more bits by doing multiple operations, you really do need (many) more qubits than the bit size of the curves you're working with, and it's believed to be impractical to entangle enough qubits at the same time to do it.
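
For a sense of scale (an editor's back-of-the-envelope addition, based on Roetteler et al.'s 2017 resource estimates rather than anything in this thread): Shor's algorithm against an n-bit curve is estimated to need roughly

$$ Q(n) \approx 9n + 2\lceil\log_2 n\rceil + 10 \quad\Rightarrow\quad Q(256) \approx 2330 \text{ logical qubits,} $$

and error correction multiplies that into orders of magnitude more physical qubits than any current device has.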

@breaktheirbank @icedquinn @grilix @z428 Perhaps.

But from a network security perspective, we might as well be prepared for it. We don't want security to all collapse in the face of a quantum computer, even if powerful enough ones don't come to pass.

@icedquinn @alcinnz @grilix @z428 No. There are not many algorithms with quantum speed-up, but those that we have can be used for breaking RSA and ECC. However, these algorithms are completely useless for symmetric crypto.

@icedquinn @alcinnz @grilix @z428 There is another reason why quantum computing is not a threat now. While quantum supremacy gives us up to exponential speed-up, real quantum computers surely disobey Moore's law. Even more, I would say that creating an n-qubit quantum computer is an O(e^n) problem.)))

@icedquinn @alcinnz @grilix @z428 The reason is that all quantum algorithms require the ability to entangle arbitrary qubits with each other. Usually, it is entangling pairs of qubits. If you have n qubits, you have to be able to entangle any of the n(n-1)/2 possible pairs.
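
A quick sanity check on that pair count (editor's arithmetic, reusing the ~2330-qubit estimate above):

$$ \binom{n}{2} = \frac{n(n-1)}{2}, \qquad \text{e.g. } \binom{2330}{2} \approx 2.7\times10^{6} \text{ pairs.} $$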

@icedquinn @alcinnz @grilix @z428 This is the main problem with Google's superconducting chips. They have to cope with the full range of solid-state quantum noise.

@grilix ... would require a few iterations of collecting requirements and most likely balancing mutually exclusive priorities (like energy consumption vs. digital independence...).

@alcinnz

@z428 @grilix @alcinnz I think P2P redundancy isn't that big of an issue:
- It emphasises keeping good data for yourself or others, thus reducing downloading it every time (not important for most content, but for stuff like music, especially over YouTube, this yields enormous savings)
- Centralised servers also need redundancy, and a lot of it; they basically can never be down because stuff like HTTP isn't fail-resilient. 10 peers for most torrents is largely enough; 5 is quite minimal. That is the size of a basic CDN or worldwide platform.

@lanodan Agree. Two things to add however: Personal (datacenter) experience is that redundancy in a "central environment" also is an issue, but it's a bit easier to address because at least all the machines are under your control. In a decentralized environment (where redundancy is provided by third parties) this becomes a bit more difficult if you want to do it in a reliable way. Plus: This "keeping good data" is an interesting aspect. I'm not thinking all along the ...

@alcinnz @grilix

@lanodan ... lines of "good data" but rather of providing an environment as "performant", reliable, and scalable as today's centralized infrastructure, yet in a P2P manner. No assumptions made here so far, I really have no idea. Seems a fairly complex equation if done right. 😀

@alcinnz @grilix

@z428 @alcinnz @grilix Well it is a fairly complex one, but thanks to laziness I tend to believe that when people do things themselves they'd rather have it be efficient.
Especially because they see the costs of everything rather than just the total, or nothing at all.

Also for the server-at-home kind of thing, it should be noted that a hard drive can spin down, and you could have a NAS suspend itself and use wake-on-LAN.
And a NAS is just an Ethernet disk adapter; you could turn it off like an external HDD (or even better, as network protocols can cope quite well with a target being offline).
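
As a sketch of how little is involved in waking such a NAS on demand, here's a minimal wake-on-LAN sender in Python (the MAC address is a hypothetical placeholder, and WOL must be enabled in the NAS's NIC/firmware):

```python
# Sketch: send a wake-on-LAN "magic packet" so a suspended NAS can be
# woken just before use.
import socket

def wake_on_lan(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """A magic packet is 6 bytes of 0xFF followed by the target MAC
    repeated 16 times, sent over UDP broadcast."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))

wake_on_lan("00:11:22:33:44:55")  # hypothetical NAS MAC address
```
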
@icedquinn @alcinnz @grilix @z428 ARM being expensive? I'm not so sure there, and for laptop HDDs I tend to avoid them because they seem more prone to failures (all my 2.5" HDDs are dead, very few of my 3.5" HDDs are).
@lanodan @alcinnz @grilix @z428 Prior to the RPi 4 coming out recently, 4/8 GB of RAM was hard to come by on an ARM board. You could mostly only get 512 MB or 1 GB. Maybe 2 GB if you sought it out. And even then you didn't have USB 3/USB-C, so accessing the hard drive was limiting.

RPi 4s might actually make respectable file servers though.
@icedquinn @alcinnz @grilix @z428 The RockPro64 has had a 4 GB RAM option for a while, and I've seen similar kinds of boards over the last few years; my BananaPi (which has done most of my hosting at home) has 2 GB of RAM.
And sure, 2 GB or 4 GB may not seem like much for a server, but it's actually plenty if you avoid RAM eaters.

And I will keep on using SATA for HDDs, mostly because there tend to be no HDDs that natively use USB, and the few that do are external ones where you would need to throw away the enclosure.

@lanodan
Sorry for jumping in, but may I ask what those "RAM eaters" are when you speak about self-hosting?

@txusinho Mastodon, Matrix's Synapse, probably heavy CMSes like WordPress and MediaWiki, …

They're basically not written with small self-hosting in mind.

@lanodan Don't get me wrong - I'm not arguing against this idea here. 🙂 I'd actually just be interested whether there are *any* figures or numbers to compare these approaches, including as many "side-effects" as possible. Most of the calculations I've seen so far seem pretty "biased".

@alcinnz @grilix

@alcinnz

I would recommend this book if you haven't read it. He has a podcast (streamed) too. gerrymcgovern.com/books/world-

@jamesmullarkey It's interesting that, judging by that report on that singular facet of "tech"'s operations, MAGAF are our environmental saviours!

At the same time they heavily push (via Dark Patterns) consumption patterns that (I strongly suspect) are environmentally damaging, campaign against our Right To Repair, & make excessive use of bandwidth & the client's battery for the sake of a profit. I don't know how it balances out.

@alcinnz @jamesmullarkey very well put. Facebook runs very efficient data centers (hardware-wise; not so sure about software, given how widely Hack and Python are used), but a lot of the CPU cycles are used to determine how best to keep eyeballs on the site and to optimize ad delivery -- leading to even more unnecessary consumption.

If not for being able to work on open source projects that advance the state of the art, my internal calculus would probably push me out.

@michel_slm @jamesmullarkey I totally get you. It's hard to make an income off pushing the state of efficient software forward, but I'm trying.

@alcinnz

Through surveillance capitalism they are dragnetting all possible data and storing it indefinitely even if it doesn't have any use.

They are the destroyers. :(

@alcinnz

I'm about to launch a download only podcast BTW for this very reason.

@alcinnz I've recently been pretty torn about that. From one perspective, I whole-heartedly applaud the idea of downloading instead of streaming. Then again, looking for example at friends running Synology boxes, QNAPs or large Linux servers at home, 24x7, with large disks to store all the stuff they've downloaded by now, I wonder whether we have a "total", energy-wise look at this issue. 😐

@z428 Two thoughts here:

1) Do they still care about all those downloads?
2) I think I've seen terabyte-sized USB sticks. Strip off the surrounding computer, leave the storage.

@z428 3) It all depends on what the individual wants to do. Streaming *can* be more efficient for those who don't like rewatching.
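
To make that trade-off concrete, here's a toy break-even calculation. All the per-view, transfer, and storage figures are made-up placeholders purely to show the shape of the comparison, not measurements from the thread or the report:

```python
# Toy break-even model: stream a title N times vs. download once and
# keep it on a mostly spun-down disk. ALL FIGURES ARE ILLUSTRATIVE
# ASSUMPTIONS.
STREAM_KWH_PER_VIEW = 0.08    # assumed network + datacenter energy per view
DOWNLOAD_KWH = 0.08           # assumed one-off transfer, similar to one stream
STORAGE_KWH_PER_YEAR = 0.02   # assumed share of an idle disk's annual energy

def streaming_kwh(views: int) -> float:
    return views * STREAM_KWH_PER_VIEW

def download_kwh(years_kept: float) -> float:
    return DOWNLOAD_KWH + years_kept * STORAGE_KWH_PER_YEAR

for views in (1, 2, 5):
    print(f"{views} view(s): streaming {streaming_kwh(views):.2f} kWh"
          f" vs downloading {download_kwh(years_kept=2):.2f} kWh")
# With these made-up numbers the download pays off from the second
# viewing on; watch-once viewers are better served by streaming.
```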

@alcinnz Both talking about movies and music, I think it boils down to keeping a "collection". Back in our youth we used to have MC tapes, later CDs, literally hundreds of them, and it was a mess to find what you wanted to listen to, especially in case you had visitors and wanted music for a party. As far as I see it used, YouTube and Spotify mainly satisfy this need - provide a virtually infinite collection or archive that "has everything available". Mimicking this locally, I'm at best ...

@alcinnz ... thinking along the lines of simple, extensible USB disk subsystems that can easily be powered up and down but do not need to be running *all the time*; yet I have yet to find things like that. So far, using large external USB single drives creates another interesting issue: How to back up these beasts? Keeping backups of, like, 6..10TB of media files is interesting in personal use, both in terms of media and in terms of process... 😉

@z428 @alcinnz So one maybe unpopular opinion: you don't really need backups of that data. At best, you need to back up a list of stuff you have and then redownload it in case a disk goes bananas.

Most of the movies, songs and ebooks are widely available on the Internet, so no need for backup.

Such portable hard disks ideally spend so little time running that they will be too small before they break. At least that's my experience.
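
A minimal sketch of that "back up the list, not the data" idea (the media path is a hypothetical placeholder):

```python
# Sketch: back up a *list* of the library instead of the data itself,
# so titles can be re-downloaded after a disk failure.
import json
from pathlib import Path

LIBRARY = Path("/mnt/media")  # hypothetical mount point

manifest = [
    {"path": str(p.relative_to(LIBRARY)), "bytes": p.stat().st_size}
    for p in sorted(LIBRARY.rglob("*"))
    if p.is_file()
]

# A few kilobytes instead of terabytes; this fits on any backup medium.
Path("library-manifest.json").write_text(json.dumps(manifest, indent=2))
```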

@phel Works for most cases, true. Becomes nasty if, say, your audio collection folder contains both downloaded stuff and music extracted from local CDs, as well as music that has custom metadata attached to it... 😉

@alcinnz

@z428 @alcinnz
Synology boxes have power saving features. They'll power down the drives and idle at under 4W.

(Plus I run mine entirely on electricity from renewables.)

I think the bigger problem is people reusing old hardware or standard desktop or gaming PCs as servers, as they generally aren't designed for low power consumption.
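
For rough scale (the 4 W figure is from the post above; the 60 W idle-desktop figure is an assumed placeholder):

$$ 4\,\text{W}\times 8760\,\text{h/yr} \approx 35\,\text{kWh/yr} \qquad\text{vs.}\qquad 60\,\text{W}\times 8760\,\text{h/yr} \approx 526\,\text{kWh/yr}. $$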

@mathew @z428 Though there's a good reason to reuse hardware: it's environmentally costly to produce.

People often forget to incorporate that into their calculations, in part because it wasn't such a big problem before computers.

@alcinnz @z428
Indeed, though there are limits. I once helped someone move a PDP into his attic; I doubt it makes any sense to keep using it from an environmental point of view, though. 🙂

@alcinnz @z428
Another thing people often forget to factor in is that if you have fewer power-hungry computers heating up the office, or more efficient devices, then you need noticeably less air conditioning. I was quick to get rid of all my CRTs once LCD panels became affordable.

@mathew @alcinnz @z428 I remember reading a study about this wrt. laptops; it takes about seven years for a new, more energy-efficient laptop to break even with the energy needed to produce it. Of course those numbers will be drastically different with different degrees of utilisation and power efficiency, but I think the energy-efficiency effect is typically overestimated.
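
The break-even arithmetic behind such studies is straightforward (the numbers here are illustrative placeholders, not the study's figures):

$$ t_{\text{break-even}} = \frac{E_{\text{production}}}{\Delta P \times t_{\text{use}}}, \qquad \text{e.g. } \frac{250\,\text{kWh}}{15\,\text{W}\times 8\,\text{h/day}\times 365\,\text{d/yr}} \approx 5.7\,\text{yr}. $$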

@michiel Yes.... that's an aspect I didn't even have in mind at the moment. But as always... things get more difficult, the closer you look. No real idea how to ever come to a meaningful conclusion here. Every "solution" seems to have pretty "interesting" drawbacks in other dimensions. 😐

@mathew @alcinnz

@z428 @mathew @alcinnz Agreed. It's frustrating; I'd like to contribute something tangible to the reduction of fossil fuel use, and this *is* a field I'm at least familiar with, but it's clear that there are too many trade-offs to easily find big wins. Which doesn't mean that these big wins don't exist ...

@michiel @z428 @alcinnz
Best way to reduce your carbon footprint is to not have children.

Second best is to be vegan or vegetarian.

Reducing electricity consumption is pretty far down the list.

sciencemag.org/news/2017/07/be

@mathew Oh well... I've at least ruled out one of these options, and I'm vegetarian, so I guess that counts a bit as well... 😀

@michiel @alcinnz

@mathew @z428 @alcinnz Yes, and I hope you don't take my exasperation with this advice personally: it makes individuals responsible for a collective problem.

I'm a software engineer; my entire supposed value in the labor market comes from my ability to solve problems *once* and reuse the solution infinitely many times over.

If we investigate and educate each other about good choices in this problem space, we *can* have a significant impact as a profession.

@michiel @z428 @alcinnz
Oh, absolutely. Individual approaches are minor in scope compared to the effect if the 20 worst corporations took action.

@mathew @z428 @alcinnz We can't reasonably expect someone watching Netflix to realize how much energy they're using. Making the consumer responsible for something that happens completely out of their sight is not a reasonable solution.

We (I think we're all tech geeks here) are the ones who know how the sausages get made. We even have some influence about what goes into them.

@michiel @z428 @alcinnz
I've just realized that I mostly stream TV and movies, because I'm unlikely to watch them again; but I download music, which I listen to repeatedly. This isn't a position I ended up with for energy efficiency reasons, though. It's just a happy coincidence.

(The few movies that I know I'll want to watch repeatedly, I still buy on disc.)

@mathew @michiel @z428

> This isn't a position I ended up with for energy efficiency reasons, though. It's just a happy coincidence.

The more often we as software developers can make this the case, the stronger the impact we can have!
