“Oh but 100 Fahrenheit means 100/100 on the hot scale, it just makes intuitive sense!”
WHAT DOES THAT EVEN MEAN?? Fahrenheit lovers literally don’t know how ridiculous they sound
Americans just assume their experience is the universal one. They can’t possibly imagine that F might be gibberish to someone who grew up with Celsius, or that people who grew up with Celsius can intuitively tell how hot something is in Celsius.
What’s not to get lol. Think about when it’s really really cold outside. That’s 0. Think about when it’s really really hot outside. That’s 100.
No? Lmao
Not cold enough for you?
Nah, it’s more like: one degree Fahrenheit is the smallest change in temp that the average human can sense.
I call bullshit. Like yeah, I’m sure that’s the smallest degree or whatever, but how ‘hot’ or ‘cold’ something feels comes down to way more than just temperature: humidity, wind chill, whether it’s sunny or cloudy. So in a real example I doubt a person can tell the difference between a 66°F day and a 65°F day, because there are so many other factors. And you know what Fahrenheit is actually really bad at? Telling people when stuff freezes. You think someone from Texas or Nevada or any place that usually doesn’t get cold enough knows the exact freezing point in Fahrenheit? Most people will guess around 30 when it’s actually 32, while pretty much everyone knows that the freezing point of water in Celsius is 0°.
When it’s above 100, people who have options for something lower will generally go for them. Similarly for under 0. OK, so as PancakeLegend@mander.xyz pointed out, such sensitivities might be specific to US culture, but theoretically, how much would we have to expand the 0-100 Fahrenheit range so that 0 is too cold for pretty much everyone and 100 is too hot for pretty much everyone? 0 goes to -10, 100 to 140? A new-Fahrenheit degree would still be more precise than a Celsius degree.
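Quick sanity check on that last bit, treating -10°F to 140°F as the hypothetical expanded range from above (my numbers, not anything official):

```python
# Back-of-the-envelope check: how fine is one "new-Fahrenheit" degree
# if the 0-100 comfort range is stretched to cover -10°F..140°F?

old_span_f = 140 - (-10)                    # the expanded range spans 150 real °F
new_degree_in_f = old_span_f / 100          # one new degree = 1.5 °F
new_degree_in_c = new_degree_in_f * 5 / 9   # a °F interval converted to a °C interval

print(f"1 new degree = {new_degree_in_f} °F = {new_degree_in_c:.2f} °C")
# 1 new degree = 1.5 °F ≈ 0.83 °C -> still a finer step than 1 °C
```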
My point is that “really hot” and “really cold” are not useful reference points, no matter what numbers you’re using. If I was coming up with a measurement system for brightness and I said 1000 was “really bright”, would you be able to tell me anything about 500? No, because you literally have no reference frame for what I mean by “really bright”. It’s the same thing when Americans describe Fahrenheit to the rest of the world. You have to experience the data points, and at that point, whether you use 0 to 100, -20 to 40, or 250 to 310, it doesn’t matter. You will just intuitively understand the scale, so there’s no inherent benefit.
Right? Lol