• CluckN@lemmy.world
    8 months ago

    Fahrenheit has a fine granularity that is lost in cold climates. It’s why the Bahamas/Belize use it as well.

    • Johanno@feddit.de
      8 months ago

      Well, you know that you can use decimals?

      How is -40.000001°F any finer than -40.00000000001°C?

      23°C is a nice room temperature.

      18°C is a bit chilly but still a comfortable temperature.

      If you want to go for a finer distinction, then we can say 18.5°C is warmer, but I personally can’t feel the difference.

      • Wolf_359@lemmy.world
        8 months ago

        I can feel the difference between 71 and 73 in my house.

        At 73, my kids room is uncomfortably hot. At 71, it has a perfect chill for sleeping.

        • FooBarrington@lemmy.world
          8 months ago

          What is your point? That people who use Celsius can’t feel the difference between 21.7°C and 22.8°C?

          If you’re worried about your thermometer, you’ll be happy to hear that metric ones are usually more precise than Fahrenheit ones, since they’re marked in 0.5°C steps. A +1°F step equals +5/9°C (about +0.56°C), so Fahrenheit actually gives you less precision!
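          Just to put numbers on the step sizes, a quick sketch (plain Python; the function name is mine, nothing from the thread):

```python
# One Fahrenheit degree is 5/9 of a Celsius degree (~0.56 C),
# so a 1 F step is coarser than the 0.5 C marks on a typical
# metric thermometer.
def f_to_c(f):
    """Convert a Fahrenheit reading to Celsius."""
    return (f - 32) * 5 / 9

step_f_in_c = f_to_c(1) - f_to_c(0)  # size of one F step, in C
print(round(step_f_in_c, 4))  # 0.5556
```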

          • Blue_Morpho@lemmy.world
            8 months ago

            The point was they need that extra decimal because C isn’t good for human temperature sense.

            It’s not like you are prohibited from using decimals in Fahrenheit. It’s that you don’t need 3 digits because it works better for people.

            And fuck you for making me defend the most ass backwards measurement system on the planet.

            • FooBarrington@lemmy.world
              8 months ago

              It’s just an incredibly weak defense. Why is it worse for C to use an extra decimal for these differences? I can just as well argue that C is a more accurate representation, because small differences in temperature are smaller. Just like your argument, this is purely an opinion - until you can show me that not needing the extra decimal is objectively better, or until I can show you that smaller differences being represented as such is objectively better, neither of them holds any weight.

              • Blue_Morpho@lemmy.world
                8 months ago

                It’s the same reason we use abbreviations and contractions when speaking. A trivial simplification is still a simplification.

                Why bother with Celsius at all when there is Kelvin? Even Kelvin is arbitrary. Best to use Planck-normalized temperature. The scale would run from 0 to 100, where 0 is absolute zero and 100 is 10^32 Kelvin.

                So whenever you have to tell someone the temperature outside, you say it’s 0.000000000000000000000000015237 Planck

                If 3 digits isn’t even a tiny bit more cumbersome than 2, then 32 digits is fine too.
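                Out of curiosity, the 0-to-100 scale described above is a one-liner to sketch (plain Python; the function name is mine, and the exact digits depend on which outdoor temperature you plug in):

```python
# Hypothetical scale from the comment above: 0 is absolute zero,
# 100 is 1e32 K, linear in between.
def kelvin_to_planck_scale(t_kelvin):
    """Map a Kelvin temperature onto the 0..100 toy scale."""
    return t_kelvin / 1e32 * 100

# A mild day (~20 C = 293.15 K) lands around 2.9e-28 on this scale.
print(f"{kelvin_to_planck_scale(293.15):.2e}")  # 2.93e-28
```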

                • FooBarrington@lemmy.world
                  8 months ago

                  We don’t have issues with decimals in many places. For example, why are there pennies? Why aren’t dollars just scaled up 100? Generally speaking: why don’t people immediately shift to the lower unit when talking about e.g. 3.5 miles? If you’re correct, those should be simplified too - yet they aren’t.

                  Why bother with Celsius at all when there is Kelvin?

                  Because Celsius uses a scale that relies on temperatures you’re encountering in your everyday life.

                  Even Kelvin is arbitrary. Best to use Planck-normalized temperature. The scale would run from 0 to 100, where 0 is absolute zero and 100 is 10^32 Kelvin.

                  Why? That scale is still arbitrarily chosen.

      • CluckN@lemmy.world
        8 months ago

        Our bodies are mostly water, so why not use a system that reflects this?

        • NegativeInf@lemmy.world
          8 months ago

          The universe is mostly empty space with an average temperature of like… 4 Kelvin or some shit. Why not use a system that reflects that? Oh, we do? Right. Celsius is just Kelvin minus 273.15.

    • imaqtpie@sh.itjust.works
      8 months ago

      Save yourself before it’s too late.

      Do not say anything positive about Fahrenheit in this thread… the Temperature Scale Inquisition is watching closely for any dissent from the party line.