No it’s cool. Just cancel your Netflix and pirate your media. Thanks Big Think!
Better yet - boycott Big Media so they have even less of a reason to keep gigantic data centers to begin with.
Ok, I’ll binge watch videos on some other streaming platform instead. I’m helping!
Man out in the wilds we can cover 10 times that distance in that time and not even have to worry about the law man.
Here’s a Big Think. I used to drive 4 miles to blockbuster then pick out a plastic coated VHS, then play it on my plastic coated VCR on my TV that was at least 10x the width of current TVs.
Then, 3 days later, drive like a mad racer with no brakes to get back to the Blockbuster 2 minutes before they closed.
And that’s just like the other 80% of America who don’t have trains, buses or decent bike lanes. So kindly FO on guilt tripping us for our streaming habits, TYVM.

Honestly, if jets left trails that looked like this then nobody would want to fly so much, and public outcry would be a lot louder.
Yeah but she’s the One Good Billionaire™️ for reasons nobody is able to articulate.
I thought that was Gabe Newell
I would hazard a guess that Gabe rarely leaves the house. He’s probably more carbon-neutral than 90% of the self-congratulatory twats on this platform.
The man owns six (6) yachts.
Carbon-neutral my fucking ass.
Gabe actually does good, though. Valve has been a major contributor to several open source products
and at the same time makes bank from kids gambling with Counter Strike skins, so a bit of a mixed bag there
Does good != does only good
Does good != does good on average
Does good != is good
Taylor Swift also arguably contributes something of value: music that a lot of people really like. That doesn’t mean either of them should be able to amass that much wealth. The tax system in the US is broken. In 1961, for example, stock buybacks were illegal (so stocks paid dividends, which are taxable income), and income above $32,000/year was taxed at 50%, rising to a marginal rate of 91% on income above $400,000/year. In contrast, the highest marginal tax rate in the US in 2024 was 37%, on income above $731,200/year, and companies now mostly buy back stock rather than issue dividends. Further, most millionaires and billionaires amass wealth through stocks rather than income, taking loans against the stock for cash, meaning they pay almost no taxes while continuing to amass personal wealth.
That might have more to do with how bad the others are than whatever she does herself…
Yep. She’s not publicly eating babies and being carried on a palanquin by the poors. She ain’t good, but her badness is orders of magnitude below some of the mustache twirling villains out here tying people to railroad tracks.
She’s so good (chorus: how good is she?), she’s so good… that I don’t even know why people think she is bad. (Also relevant is that I don’t truly care enough to look it up?🤪)
But in contrast, everyone knows about The Musk, and Bezos, and so on. Taylor Swift, on the other hand, is known to travel in a jet (which considering how she likely goes with a full retinue of 10-50 people each time, and how much she would be absolutely MOBBED even by her adoring fans if she tried to use more public transportation options, definitely is the safest and might even be a very efficient form of travel under those circumstances?).
Not the same at all.
Then there’s the guy that was Luigi’d, not even a billionaire but with that mindset. So if one doesn’t need to be a billionaire to act like one, then it stands to reason that one could theoretically become one without needing to act like the others. Maybe. Though whether that applies to her I have no idea. Perhaps she simply has a better PR team.
I do find it funny they use an old 1950s smoky long range bomber as an example
If a B-52 is smoking like that it does NOT have a long range anymore
Now try saying this about AI
The funny thing is, with AI each individual token is surprisingly efficient, but each query burns 10s or 100s of tokens, and a single interaction can lead to 10s or 100s of queries. Factor in the forced AI integrations into things that don’t need it, the millions of active users, and the near-constant training of new models, and suddenly it’s ballooned into an amount of energy that’s noticeable on a global scale.
Using the words thank you to respond to Alexa uses the same amount of gasoline a wood chipper takes to consume eleven spotted owls.
Unleaded, though!
Blue checkmarks fund Nazis
This screenshot is from before twitter was acquired.
Top account still has their blue checkmark… and bottom still hasn’t deleted their account.
Funding Nazis and participating alongside them is still supporting them.

watching Netflix is a skill issue anyway. learn to use xdcc or i2p torrents or something
are there many movie torrents on i2p?
oh, yes
Piracy is the green option
Kill an oil exec and then binge watch your fav series!
Real… I myself pirate everything… 🗿
Am I the only one who sees this as an endorsement for self hosting?
I would love to see the numbers for 5 people watching an already downloaded movie off a hard drive.
If the house has solar panels and is net zero, it would make the emissions 0 right?
self hosting is wildly less efficient… one of the biggest costs in data centres is electricity, and one of the biggest constraints is electrical infrastructure… you have pretty intense power budgets in data centres and DC equipment is pretty well optimised to be efficient
meanwhile a home server doesn’t likely use server hardware (server hardware is far more efficient), is probably about 5-10y or more out of date, and isn’t likely particularly dense: a single 1500w server can probably service ~20 people in a DC… meanwhile an 800w home server could probably handle ~5 people
add the fact that netflix pre-transcodes their vids in many different qualities and formats, whilst home streaming - unless streaming original quality - mostly re-transcodes which is a very energy-hungry process
heck even just the hard drives: if everyone ran their own servers and stored their content that’s thousands if not hundreds of thousands more copies of the data, and all that data is probably on spinning disks
a single 1500w server can probably service ~20 people in a DC
I’m guessing you dropped a zero or two on the user count, also added an extra zero to the wattage (most traditional colocation datacenters max out at around 2,000 concurrent watts per 48U rack, so each server is going to target around 50-75w per rack unit of average load)
Netflix is going to be direct-playing pre-transcoded streams, so the main constraint would be bandwidth. If we average all streams out to 5 Mb/s, that’s about 200 streams per gigabit of network bandwidth. Chances are that server has at least 10 gigabit networking, probably more like 50 gigabit if they have SSDs storing the data (especially with modern memory caching). That’s between 2,000 and 10,000 active clients per server.
Back of the envelope math says that’s around 0.075 watts per individual stream for a 150w 2U server serving 2000 clients, which looks pretty realistic to my eyes as a Sysadmin.
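A quick sanity check of that arithmetic in Python (the bitrate, NIC, and wattage figures are the assumptions above, not measurements):

```python
# Assumptions from above: ~5 Mb/s per stream, 10-50 Gb/s of NIC
# bandwidth, and a 150 W 2U server serving ~2,000 clients.
stream_mbps = 5

for nic_gbps in (10, 50):
    streams = nic_gbps * 1000 // stream_mbps
    print(f"{nic_gbps} Gb/s NIC -> ~{streams} concurrent streams")

server_watts = 150
clients = 2000
print(f"~{server_watts / clients:.3f} W per stream")
# 10 Gb/s -> ~2000 streams, 50 Gb/s -> ~10000, ~0.075 W per stream
```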
Granted for a service the size of Netflix we aren’t talking about individual servers we’re looking at a big orchestrated cluster of servers, but most of that is handling basic web server tasks that are a completely solved problem and each individual server is probably serving a few million clients thanks to modern caching and acceleration features. The real cost and energy hit is going to be in the content distribution which I covered above.
I’m guessing you dropped a zero or two on the user count
i was being pretty pessimistic because tbh i’m not entirely sure of the requirements of streaming video… i guess yeah 200-500 is pretty realistic for netflix since all their content is pre-transcoded… i kinda had in my head live transcoding here, but also i said somewhere else that netflix pre-transcodes, so yeah… just brain things :p
also added an extra zero to the wattage
absolutely right again! i had in my head the TDP eg threadripper at ~1500w - it’s 350w or lower
Hey if you were thinking live-transcode I can definitely see why you’d think around 20 clients per server for CPU transcode and I can also see where such a high wattage would come from!
Edit: fun bonus fact! Netflix offers caching servers to ISPs that they can place on their side of the interconnect to mutually reduce bandwidth costs. From memory of a teardown I saw on reddit like a decade ago, it was a pretty standard 1U single socket server (probably a supermicro whitebox if we’re being real) with 4-6 HDDs to serve the media files
yeah i remember that as well! considering the bandwidth netflix takes up i’m not surprised at all! i think it’s like 15% of global internet bandwidth or something crazy?
My home server uses 5w at idle and 9w while streaming. Add another 10w for the hard drive.
According to your example, a single Netflix user would use 75w.
That doesn’t include the internet cost, which I bet is significant as well.
There’s a reason Netflix is like $20 a month and internet is like $50-100, whereas self hosting costs close to $1/month in electricity and nothing extra for internet during usage.
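A ballpark for that $1/month figure (the $0.12/kWh rate is my assumption, roughly the US average; wattages are the ones above):

```python
# Monthly electricity cost for the home server + drive above.
rate = 0.12                      # $/kWh, assumed (~US average)
for label, watts in (("idle", 15), ("streaming", 19)):
    kwh = watts * 24 * 30 / 1000
    print(f"{label}: {kwh:.1f} kWh/mo -> ${kwh * rate:.2f}")
# idle: 10.8 kWh/mo -> $1.30; streaming: 13.7 kWh/mo -> $1.64
```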
an n150 mini pc - largely considered a very efficient package for home servers - consumes ~15w max without the gpu, and ~9w idle
a raspberry pi consumes 3-4w idle
none of that is supporting more than a couple of people streaming 4k like we’re talking about in the case of netflix
and a single hard drive isn’t even close to what we’re talking about… you’re looking at ~30w at least for the disks alone
as for internet cost, it’s likely tiny… my 24 port gigabit switch from 15 years ago sips < 6w… i can only imagine that’s pretty inefficient compared to today’s standards (and 24 port is pretty tiny for a DC, and port power consumption doesn’t scale linearly)
data centres are just straight up way more efficient per unit of processing than your home anything; it pretty much doesn’t matter how efficient your home gear is, or what the workload is unless you switch it off most of the time - which doesn’t happen in a DC
Idk where you’re getting your numbers from.
Here is an article that talks about HDD read power usage being less than 10w:
https://www.solved.scality.com/high-density-power-consumption-hdd-vs-qlc-flash/
Even with 30w, it’s still lower than the 75w you mentioned.
Also, that hard drive can serve multiple purposes, whereas Netflix is only for streaming movies and TV shows (not music, so you’d have to add Spotify usage to be fully fair).
my numbers are coming from the fact that anyone who’s replacing all their streaming likely isn’t using a single disk… WD red drives (as in NAS drives) according to their datasheet use between 6 and 6.9w when in use (3.6-3.9w at idle)… a standard home NAS has 4-6 bays, and i’m also assuming that in a typical NAS setup they’re in some kind of RAID configuration, which likely means some level of striping so all disks are utilised at once… again, i think all of these are decent assumptions for home users using off the shelf hardware
i’m ignoring sleep here, because sleep for NAS drives leads to premature failure… this is why if you buy WD green drives for your NAS, for example, and you use linux, you use hdparm to turn off sleep and avoid constantly parking and unparking the heads, which leads to significantly reduced life (afaik many NAS products do this automatically, or otherwise manage it)
the top end of that estimate for drives (6 drives) is 41.4w, and the low end (4 drives) is 24w… granted, not everyone will have even those 4 drives, so perhaps my estimate is a little off, but i don’t think 30w for drives is an unreasonable assumption
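quick check of those drive numbers (datasheet active power from above; 4 and 6 bays assumed):

```python
# WD Red datasheet figures quoted above: 6-6.9 W per drive active.
low_w, high_w = 6.0, 6.9
for bays in (4, 6):
    print(f"{bays} drives: {bays * low_w:.1f}-{bays * high_w:.1f} W")
# 4 drives: 24.0-27.6 W; 6 drives: 36.0-41.4 W
```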
again, here’s where data centres just do better: their utilisation is spread much more evenly… the idle power of drives is not hugely less than their full speed read/write, so it’s better to have constant access over fewer drives, which is exactly what happens with DCs because they have fewer traffic spikes (and can legitimately manage drive power off for hours at a time because their load is both predictable, and smoother due just to their scale)
also, as someone else in the thread mentioned: my numbers for servers were WAY off for a couple of reasons, but basically
Back of the envelope math says that’s around 0.075 watts per individual stream for a 150w 2U server serving 2000 clients, which looks pretty realistic to my eyes as a Sysadmin.
that also sounds realistic to me, having realised i fucked up my server numbers by an order of magnitude for BOTH power use, and users served
servers and data centres are just in a class of their own in terms of energy efficiency
here for example: https://www.supermicro.com/en/products/system/storage/4u/ssg-542b-e1cr90
this is an off the shelf server with 90 bays that has a 2600w power supply (which even then is way overkill: that’s ~29w per drive)… with 22tb drives (off the top of my head because that’s what i use, as it is/was the best $/byte) that’s almost 2pb of storage… that’s gonna cover a LOT of people for that 2600w, and imo 2600w is far beyond what they’re actually going to be pulling
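and the arithmetic on that box, for anyone checking (the 22tb drive size is my assumption, from my own setup; the PSU rating is from the spec page):

```python
# 90-bay Supermicro above: 22 TB drives (assumed), 2600 W PSU.
bays, drive_tb, psu_w = 90, 22, 2600
print(f"{bays * drive_tb / 1000:.2f} PB raw, ~{psu_w / bays:.0f} W per bay")
# ~1.98 PB raw, ~29 W per bay (real draw sits well below the PSU rating)
```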
That’s bullhonkey of the highest fucking order!
One hour of streaming video typically uses around 0.08 kWh, or 288000 Joules, while an electric car can drive a mile with 0.346 kWh, or 1245600 joules, which is to say driving 4 miles is equivalent to 17.3 hours of Netflix!
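The arithmetic, for anyone who wants to verify (both energy figures are the ones cited above):

```python
# 0.08 kWh per streaming hour vs 0.346 kWh per EV mile.
stream_kwh_h = 0.08              # = 288,000 J
ev_kwh_mile = 0.346              # = 1,245,600 J
print(f"4 miles ~= {ev_kwh_mile * 4 / stream_kwh_h:.1f} h of streaming")
# -> ~17.3 h
```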
I haven’t heard “bullhonkey” since 1991 and am a big fan of what you’ve done here.
The original tweet’s claim is false.
TLDR: It referenced an oral interview from a French think tank called The Shift Project. They have since acknowledged it as an error after a fact check from the International Energy Agency. BigThink originally tweeted this in 2019 along with a corresponding article. They have since issued a correction on the article and deleted the tweet. The IEA estimated that it would take around 45 hours of Netflix streaming to generate the carbon emissions of driving 4 miles.
The IEA estimated that it would take around 45 hours of Netflix streaming to generate the carbon emissions of driving 4 miles.
Just a little 90x error, lol
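Assuming the deleted tweet used the widely circulated “30 minutes of Netflix = a 4 mile drive” framing (an assumption on my part; the tweet is gone), the factor checks out:

```python
# Assumed original claim: ~30 min of streaming per 4 mile drive.
# The IEA fact check puts it at ~45 hours instead.
claimed_h, iea_h = 0.5, 45
print(f"off by ~{iea_h / claimed_h:.0f}x")   # ~90x
```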
And they call themselves BigThink… They really ought to change their name; it does not suit them, as they clearly don’t think before they scream.
https://www.iea.org/commentaries/the-carbon-footprint-of-streaming-video-fact-checking-the-headlines
Turns out my choice to not own a TV is green.
Also, that number is utter bullshit.
Netflix, like all major streaming platforms, has an incredibly optimised system for delivering media. A 4 mile drive emits ~1.6-2kg of CO2, whereas one hour of streaming from Netflix emits up to 100g as per Netflix themselves (and even that figure is being questioned, with newer studies putting it around 30-40g). Meaning you’d need to stream for the better part of a day at the high estimate, and well over two days at the newer ones, to even get near the emissions of a 4 mile drive.
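Roughly, using the figures just cited:

```python
# Hours of streaming needed to match a 4 mile drive (~1.6-2 kg CO2),
# at the per-hour emission figures cited above.
drive_g = (1600, 2000)
for g_per_h in (100, 40, 30):
    print(f"{g_per_h} g/h: {drive_g[0]/g_per_h:.0f}-{drive_g[1]/g_per_h:.0f} hours")
# 100 g/h: 16-20 h; 40 g/h: 40-50 h; 30 g/h: 53-67 h
```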
2kg of CO2? The molecular weight of CO2 is about 44, of which carbon is 12, so ~27% of CO2’s mass is the carbon that came from the gasoline. I know that gasoline contains more than just hydrocarbon chains, and that the chains also contain hydrogen. But for the sake of this back of the envelope calculation I’ll disregard both.
27% of 2kg is 0.54kg, and according to https://en.wikipedia.org/wiki/Gasoline a liter of gasoline is 0.755kg. So 2kg of CO2 is the result of burning ~0.72L of gasoline. Driving 4 miles, or 6.44km, on 0.72L is 9km/L, or 21.2mpg. 1.6kg of CO2 would be 0.57L, giving 11.3km/L or 26.6mpg.
Maybe I shouldn’t have disregarded the additives and the hydrogen, but unless they account for about 50% of the weight of the gasoline, those 4 miles were driven in something very uneconomical.
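The same back-of-envelope in code, with the hydrogen and additives deliberately ignored as above:

```python
# Treat the CO2's carbon as the full mass of the gasoline burned
# (hydrogen and additives ignored, as stated above).
carbon_frac = 12 / (12 + 2 * 16)   # ~0.27 of CO2 mass is carbon
density = 0.755                    # kg/L gasoline (Wikipedia figure)
km = 4 * 1.609

for co2_kg in (1.6, 2.0):
    litres = co2_kg * carbon_frac / density
    mpg = 4 / (litres / 3.785)
    print(f"{co2_kg} kg CO2: {litres:.2f} L -> {km/litres:.1f} km/L, {mpg:.1f} mpg")
# 1.6 kg: 0.58 L -> 11.1 km/L, 26.2 mpg
# 2.0 kg: 0.72 L ->  8.9 km/L, 21.0 mpg
```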
Well the average I found was for the US, and you guys do love your SUVs even in completely unreasonable areas/spaces. And SUVs do get around 15-20MPG when used properly.
Wise fucking words. Aside from boycotting certain businesses we have almost no ability to control the environmental side of things.
Remove encryption, let users download more files. Problem solved.
What problem?