It seems like Americans really hate the metric system.
Metric really makes more sense in a lot of ways, being based on decimal measurements. On the other hand, it is true that we get a general sense of what feet, miles, and pounds are; and it would be difficult to get these same senses from different units, even if they’re easier in the long run. It still sounds kind of wrong to me for thirty-degree weather to be hot. I read a trivia item back in the eighties that said only the United States, Burma (now Myanmar), and Liberia still used the British Imperial System (strictly speaking, ours is United States customary units, which differ from Imperial in a few places, like the gallon); and that appears to still be the case, although some nations (including the United Kingdom, which gave us most of our weird measurements in the first place) will use both. A lot of the units that are officially part of our system don’t really remain in common use, at least not in this country. Even yards seem to be relegated to fabric these days. They sometimes do retain some bearing on the units we do use, though. Traditional units were often defined by something common that wasn’t always the same length, like a person’s foot. From what I’ve seen on Wikipedia, a furlong (660 feet) is the distance a team of oxen could plow without resting, an acre the area of land that could be plowed in a day by one person with one ox, and a league (no longer an officially used unit anywhere) the distance a man could walk in a day.
The mile, as defined by the Romans, was 1000 paces, with a pace eventually being defined as five feet, so a Roman mile was 5000 feet. With the agricultural system used in England, which was largely based on the number four, a statute under Queen Elizabeth I in 1593 declared an English mile to be eight furlongs, which came out to 5280 feet.
5280 is pretty well fixed in my brain without having to use a mnemonic, but that doesn’t make it any less strange that we’re still using it.
The furlong also had something to do with the length of the standard surveyor’s rod, which was about the length of an ox-goad; there were forty rods in a furlong. And since a hogshead is sixty-three gallons, that means Grampa Simpson’s car gets incredibly terrible mileage. In 1620, Edmund Gunter invented a chain four rods long, divided into one hundred links, which allowed for some ease in converting to decimal measurements.
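All of that arithmetic is easy enough to check; here’s a quick Python sketch (the numbers and names are just the ones above, nothing official):

```python
# Old English length units, built up from the surveyor's rod.
ROD_FEET = 16.5        # the standard rod, about the length of an ox-goad
RODS_PER_FURLONG = 40
RODS_PER_CHAIN = 4     # Gunter's chain of 1620...
LINKS_PER_CHAIN = 100  # ...divided into one hundred links
FURLONGS_PER_MILE = 8  # Elizabeth I's statute mile

furlong_feet = RODS_PER_FURLONG * ROD_FEET    # 660.0
mile_feet = FURLONGS_PER_MILE * furlong_feet  # 5280.0
chain_feet = RODS_PER_CHAIN * ROD_FEET        # 66.0
link_feet = chain_feet / LINKS_PER_CHAIN      # 0.66 feet, just under 8 inches

# The Roman version: 1000 paces of five feet each.
roman_mile_feet = 1000 * 5                    # 5000

print(furlong_feet, mile_feet, chain_feet, roman_mile_feet)
```

It all hangs together, in its own way; it’s just that none of the ratios is a power of ten.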
We know that it’s much easier to convert between meters and kilometers than feet and miles, but where did the meter come from in the first place? It was defined as one ten-millionth of the distance from the North Pole to the Equator along a meridian, originally the one passing through Paris, because the French came up with it. That probably explains why the British were so reluctant to adopt the metric system, really.
It’s been slightly altered a few times, most recently in 1983, when it was tied directly to the speed of light, though not by any tidy power of ten: it’s the length traveled by light in a vacuum during 1/299,792,458 of a second. Fortunately, that doesn’t come up much, as it’s much harder to remember than chains and furlongs. I’m not sure why exactly ten million, either, unless it comes back to a meterstick being convenient to hold, which is pretty much where yards and rods came from as well.

A gram was originally defined as the mass of a cubic centimeter of water at the freezing point. A liter is 1000 cubic centimeters, and while that is metric, it isn’t officially part of the International System of Units. While technically you can attach any numerical prefix to any metric unit, that doesn’t necessarily mean people always do: meters, centimeters, and kilometers are in common usage, but not decimeters or hectometers. In American usage, a megagram is referred to as a metric ton; other countries just call it a tonne (spelled that way). I’m not sure why the larger prefixes are from Greek and the smaller from Latin, at least up to 1000 and one-thousandth. Beyond that, “mega” and “micro” come from the Greek for “great” and “small,” while “giga” means “giant,” “tera” means “monster,” and “nano” means “dwarf,” again all from Greek. I’m still not sure why scientific language tends to combine Greek and Latin.
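The decimal convenience is easy to see in code. This little sketch (my own toy example, not any standard library) shifts powers of ten between prefixes, and checks the 1983 meter definition while it’s at it:

```python
# Powers of ten for the common metric prefixes.
PREFIXES = {
    "giga": 1e9, "mega": 1e6, "kilo": 1e3, "hecto": 1e2, "deca": 1e1,
    "": 1e0,  # the bare, unprefixed unit
    "deci": 1e-1, "centi": 1e-2, "milli": 1e-3, "micro": 1e-6, "nano": 1e-9,
}

def convert(value, from_prefix, to_prefix):
    # Converting is just multiplying and dividing by powers of ten --
    # no 5280s or 63s to memorize.
    return value * PREFIXES[from_prefix] / PREFIXES[to_prefix]

print(convert(1500, "", "kilo"))   # 1500 meters is 1.5 kilometers
print(convert(1, "mega", "kilo"))  # a megagram (metric ton) is 1000 kilograms

# The 1983 meter: the distance light travels in 1/299,792,458 of a second.
SPEED_OF_LIGHT = 299_792_458  # meters per second, exact by definition
print(SPEED_OF_LIGHT / 299_792_458)  # one meter, definitionally
```

Compare that to converting furlongs to miles, which requires actually knowing something.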
The origin of the Fahrenheit temperature scale is rather bizarre as well. Daniel Gabriel Fahrenheit, who invented it in 1724, mostly wanted it to be reproducible and easy to use, so he went with zero degrees as the coldest temperature he could create in his laboratory, that of a mixture of ice, water, and salt. There are some conflicting accounts of how he decided on the rest, but he seems to have settled on thirty-two as the freezing point of water and ninety-six as the normal human body temperature (he was off by a few degrees) because they were easy to divide on the thermometer he was using, and wouldn’t require a lot of messing around with fractions. They were also roughly the warmest and coldest temperatures people dealt with most of the time, at least in Germany when he lived. Later, the scale was slightly adjusted so that there would be exactly 180 degrees, or half a circle, between the freezing and boiling points of water.

Anders Celsius, also working in the first half of the eighteenth century, based his scale simply on the freezing and boiling points of water, originally proposing zero as the boiling point and one hundred as freezing. Jean-Pierre Christin switched this around in 1743. The scale was generally called Centigrade until 1948, when it was renamed in honor of Celsius.

The International System officially uses Kelvin, in which a degree is the same size as in Celsius, but zero is absolute zero, −273.15 on the Celsius scale, as first calculated in 1848 by William Thomson, First Baron Kelvin. And they’re not called degrees in Kelvin, just kelvins. So for most of its history a kelvin was still defined by a property of water (eventually the triple point rather than the freezing and boiling points), even though absolute zero isn’t directly related to water; only in 2019 was it redefined in terms of the Boltzmann constant instead. Maybe we’ll have to change these scales someday if we ever make contact with life that doesn’t rely on water, but I’m sure that, if the US still exists, we’ll still be using the ox-based system.
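Since the scales share reference points, converting between them is simple arithmetic; here’s a quick sketch using the figures above (the 180-to-100 ratio and the 273.15 offset):

```python
def fahrenheit_to_celsius(f):
    # 180 Fahrenheit degrees cover the same span as 100 Celsius degrees,
    # and the scales line up at water's freezing point: 32 F = 0 C.
    return (f - 32) * 100 / 180

def celsius_to_kelvin(c):
    # A kelvin is the same size as a Celsius degree; only the zero moves.
    return c + 273.15

print(fahrenheit_to_celsius(32))     # 0.0 -- water freezes
print(fahrenheit_to_celsius(212))    # 100.0 -- water boils
print(fahrenheit_to_celsius(96))     # about 35.6, Fahrenheit's slightly-low body temperature
print(celsius_to_kelvin(-273.15))    # 0.0 -- absolute zero
```

That 96 coming out around 35.6 rather than 37 is the “off by a few degrees” mentioned above.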