Well, you know that you can use decimals?
How is -40.000001°F any finer than -40.00000000001°C?
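For reference, -40 is the one point where the two scales read the same number; a quick sketch using the standard conversion (°F = °C × 9/5 + 32) shows why:

```python
# Find where Fahrenheit and Celsius give the same reading:
# solve c = c * 9/5 + 32  =>  -4/5 * c = 32  =>  c = -40.
def celsius_to_fahrenheit(c):
    return c * 9 / 5 + 32

assert celsius_to_fahrenheit(-40) == -40  # -40 °C and -40 °F are the same temperature
```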
23°C is a nice room temperature.
18°C is a bit chilly but still a comfortable temperature.
If you want to go for a finer distinction then we can say 18.5°C is warmer, but I personally can’t feel the difference.
I can feel the difference between 71°F and 73°F in my house.
At 73°F, my kids’ room is uncomfortably hot. At 71°F, it has a perfect chill for sleeping.
What is your point? That people who use Celsius can’t feel the difference between 21.7°C and 22.8°C?
If you’re worried about your thermometer, you’ll be happy to hear that metric ones usually have finer precision than Fahrenheit ones, since they go in 0.5°C steps. A +1°F step is +5/9°C (about 0.56°C), so it’s the Fahrenheit markings that have less precision!
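A minimal sketch of the arithmetic behind that claim, assuming the 0.5 °C graduation mentioned above and whole-degree markings on the Fahrenheit side:

```python
# Compare thermometer step sizes, both expressed in °C.
step_celsius = 0.5        # typical 0.5 °C graduation on a metric thermometer
step_fahrenheit = 5 / 9   # a 1 °F step is 5/9 °C, roughly 0.56 °C

# The 0.5 °C graduation resolves a smaller temperature change than a 1 °F step.
print(step_celsius < step_fahrenheit)  # True
```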
The point was that they need that extra decimal because C isn’t well matched to human temperature sense.
It’s not like you’re prohibited from using decimals in Fahrenheit. It’s that you don’t need three digits, because the scale works better for people.
And fuck you for making me defend the most ass backwards measurement system on the planet.
It’s just an incredibly weak defense. Why is it worse for C to use an extra decimal for these differences? I can just as well argue that C is the more accurate representation, because small differences in temperature show up as small differences in the number. Just like your argument, this is purely an opinion. Until you can show me that not needing the extra decimal is objectively better, or I can show you that representing smaller differences as smaller numbers is objectively better, neither claim holds any weight.
It’s the same reason we use abbreviations and contractions when speaking. A trivial simplification is still a simplification.
Why bother with Celsius at all when there is Kelvin? Even Kelvin is arbitrary. Best to use Planck-normalized temperature: a scale from 0 to 100, where 0 is absolute zero and 100 is 10^32 Kelvin.
So whenever you have to tell someone the temperature outside, you say it’s 0.000000000000000000000000015237 Planck.
If 3 digits isn’t even a tiny bit more cumbersome than 2, then 32 digits is fine too.
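To put the proposed scale in perspective, here is a rough sketch that takes the proposal’s own convention (100 on the scale equals 10^32 kelvin) at face value; the exact figure quoted above depends on how you pin down the upper end:

```python
# Hypothetical 0-100 "Planck-normalized" scale, per the proposal above:
# 0 is absolute zero, 100 is 10^32 K.
TOP_OF_SCALE_K = 1e32  # the proposal's stated upper end (the Planck temperature
                       # itself is roughly 1.4e32 K)

def kelvin_to_planck_scale(t_kelvin):
    return 100 * t_kelvin / TOP_OF_SCALE_K

# A mild ~22 °C day (about 295 K) lands around 3e-28 on this scale.
print(kelvin_to_planck_scale(295.15))
```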
We don’t have issues with decimals in many places. For example, why are there pennies? Why aren’t dollars just scaled up by a factor of 100? Generally speaking: why don’t people immediately shift to the lower unit when talking about, e.g., 3.5 miles? If you’re correct, those should be simplified too, yet they aren’t.
Because Celsius uses a scale based on temperatures you encounter in everyday life.
Why? That scale is still arbitrarily chosen.
Our bodies are mostly water, so why not use a system that reflects this?
The universe is mostly empty space with an average temperature of like… 4 Kelvin or some shit. Why not use a system that reflects that? Oh, we do? Right. Celsius is just Kelvin minus 273.15.
…Rankine glowers in your general direction…
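For completeness, a small sketch of how the scales named in this thread line up, using the standard relations (K = °C + 273.15, °F = °C × 9/5 + 32, °R = °F + 459.67):

```python
# Convert one Celsius reading into the other scales mentioned in the thread.
def to_kelvin(c):
    return c + 273.15

def to_fahrenheit(c):
    return c * 9 / 5 + 32

def to_rankine(c):
    return to_fahrenheit(c) + 459.67  # Rankine: Fahrenheit-sized degrees counted from absolute zero

room = 23.0  # the "nice room temperature" from earlier in the thread, in °C
print(to_kelvin(room))      # 296.15 K
print(to_fahrenheit(room))  # 73.4 °F
print(to_rankine(room))     # ~533.07 °R
```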
Are you made of mostly empty space? Your response does leave me questioning. Please acknowledge that you are made of 64% water and not 4 K of nothing.
As a matter of fact…