# Why was the number 299,792,458 chosen as the definition of a metre instead of a more rounded-off number like 300,000,000?

The principle was to keep the definition consistent with previous measurements, within their uncertainty. We already had a definition for the metre, just not as precise as the current definition, and we want the new definition to be as consistent as possible, but just easier to measure precisely. Rounding to 300,000 km/s would change the definition of the metre by about 0.07%. That would just make life different for everybody: we'd have to specify if we're talking about the "old" metre or the "new" metre, because that 0.07% change is big enough to matter. It'd change the circumference of the Earth by about 30 km, for instance - a big enough difference that it's measurable, even if it's small.
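
The 0.07% figure and the ~30 km shift can be sanity-checked in a few lines. This is just a quick sketch using the values quoted above:

```python
# Rough check of the figures quoted above.
c_actual = 299_792_458   # m/s, exact by definition since 1983
c_rounded = 300_000_000  # m/s, the hypothetical "nice round" value

# Relative change in the metre if c were instead fixed at the round value
rel_change = (c_rounded - c_actual) / c_actual
print(f"relative change: {rel_change:.4%}")  # about 0.07%

# Effect on Earth's polar circumference (~40,000 km)
print(f"shift: {40_000 * rel_change:.1f} km")  # roughly 28 km
```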

Just in case anybody didn't catch it, they're working on redefining all measurement units based on universal constants. This will allow for much more consistent and precise definitions. Iirc we've done almost everything, possibly the only exception is time?

How did you get the 0.1 mm? The circumference of Earth is about 1/7.5 light second, so +- 0.5 m for one light second leads to +- 6 cm for the circumference of Earth. And indeed this is 0.0000003% * 40,000 km/2 (divided by 2 to account for the +- 0.5 m instead of 1 m).
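
For reference, the ±6 cm estimate works out like this (a quick sketch using the same round 40,000 km circumference):

```python
c = 299_792_458                   # m, one light-second
max_error = 0.5                   # m, worst-case rounding of one light-second
earth_circumference = 40_000_000  # m, roughly 1/7.5 of a light-second

# Scale the per-light-second rounding error down to Earth's circumference
error = earth_circumference * (max_error / c)
print(f"{error * 100:.1f} cm")    # about 6.7 cm
```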

According to National Geographic (not something physicists cite often), the original definition of the meter was set by the French Academy of Sciences back in 1791 as 1/10,000,000 of the distance from the Equator to the North Pole.

Another thing is that if you’re doing calculations by hand (which is when a nice round 300,000 km/sec is helpful) you’re probably perfectly willing to accept that 0.07% error. I’m not actually laying hot dogs from here to Alpha Centauri, so being off a bit is okay.

What is the old definition?

Didn't it used to be some even fraction of the distance from Paris to the north pole or something?

Then what happened with the Kilogram (the object) and the kilogram (the unit)?

But how is a second defined?

That makes sense.... Does the ring around the circumference of the earth assume it's touching the ground the whole time, thus accounting for differences in elevation? Or does it sort of average all elevation?

Not to mention, even if we had set it at a nice round exact number, as our ability to precisely measure stuff (specifically the speed of light) in the future changes, we would end up with either a slightly different metre, or have to change the definition to a less round number anyhow.

Isn’t it weird that humans picked both a metre and a second basically at random, and it just so happens that light travels at almost exactly 300,000 km/s? Your simulation is showing.

Don't forget also that if we picked a different number for the meter, then centimeters and millimeters would be way off as well, since they are based on the meter.

I vaguely remember that a meter was 100 cm, where 1 cm³ is the volume of 1 g of water at STP. I'm not sure where that comes from now, as I can see that the meter has been defined based on the speed of light since 1983...

I seem to recall hearing that the meter was originally conceived as an approximation of one human step, but I can't find anything supporting that now that I actually look. Is this the case? If not, I'd like to know what the reasoning is for the meter being as long as it is.

Is the fact that the number is really close to a nice round number just a coincidence? Or was it originally set to be 3e8 and then later refined?

So basically the same reason Americans use imperial, it would be better to change to the system that makes sense, but it's too much effort

Does this mean that a meter is shorter at the intersection between air and water?

The meter was originally defined as 1/10,000,000 of the distance from the equator to the North Pole, measured along the meridian through Paris.

This is the correct answer. The meter was chosen such that 40,000 km is the circumference of the Earth around the poles. The speed of light is incidental.

To add to this, in May this year we redefined the last of the metric system's measures: the kilogram. Previously, like the metre, it was defined by a man-made artefact kept in a vault. Now, like the metre being defined in relation to a universal constant — the speed of light — the kilogram is defined via the Planck constant, the speed of light and the second (time), and can be measured with a watt balance, which determines mass from the electric current and voltage needed to compensate for the weight of the object.

The Measure of All Things by Ken Alder - tells the story of how they fixed the distance between those points. The fudge factor they snuck in, even at the beginning of this whole measurement saga, kinda illustrates the politics embedded in even the most serious science.

Huh. And here I was thinking that it was based on the distance a pendulum swings in a 1 second arc/interval

The goal was to define the meter using constants, not redefine the meter itself. So, the people who defined it said "how long does it take light to travel one meter in a vacuum?" and did the math. Light will travel that distance anywhere in the Universe, so the definition is now not subject to change unless we are measuring the speed of light incorrectly. If we used some other metric, like a distance on Earth, that distance is subject to tectonic adjustments and is not quite as constant, plus it can't be derived on other planets (not that that matters right now).

There have been 4 definitions of the meter through history:

I've often wondered if there is a wavelength of radiation emitted by a particular stable element that would work well for establishing a meter.

Hm. How long is a meter [which is measured] a meter away from the event horizon of a black hole? Or is that like asking how much 27 degrees Celsius weighs?

Doesn't the metric expansion of space mean that it's constantly changing length?

More precisely, the goal was to redefine the meter using constants without changing its length.

The meter was originally defined in the late 18th century as 1/10,000,000th of the distance between the equator and the North Pole. The definition as a fraction of a light-second was later adopted in 1983 so it would depend on a constant.

How did they measure that distance in the first place? Maths?

This number was not chosen. A metre is a specified distance. It is done to keep the physics logical. Acting against a force of 1 Newton through a distance of 1 metre will require 1 joule of energy.

If we stick to strict mathematics, then nearly nothing is as simple as a well-rounded whole number. Rounding is convenient, and it's what we usually do when a value doesn't have to be exact, but in precision work those few digits off can mean the difference between something working and failing.

The circumference of the earth was a good way to measure at one point, a long time ago. Today, how can you measure it accurately to less than a meter? We have these things called mountains, oceans, valleys, etc. Do you count the surface of the water, or the bottom of the ocean? Also, the earth is wider at the Equator than at the poles. I think the guy who originally worked out the Earth's size used shadows to do his calculations. Pretty clever.

It sure beats the previous definition, where 1 metre was equal to the length of "this stick".

They did choose the more convenient one.

I've read a lot of the responses here and I think it can be summed up thus:

It's because it's a retro-active definition. We already have the meter and use it, and have used it since well before we knew what the speed of light is. Now we need an always-consistent definition for scientists to use, so we're gonna arbitrarily decide to measure how much time it takes for light to travel a meter. And then we'll use that as the new definition of the meter.

Aside from the more nuanced explanations about the nature of the metre, once you get beyond the kind of measurements humans need for their day-to-day lives (where it actually matters that calculations are quick and sums easily subdividable), it becomes far more trouble than it's worth to worry about having "convenient" numbers.

Because the speed of light in a vacuum is 299,792,458 meters per second. The meter had defined length before that, so rather than redefining the speed of light and the meter, it was decided to use the speed of light, a physical constant, to define the meter.

This number wasn't "chosen"; it falls out of James Clerk Maxwell's laws of electromagnetism. The number essentially comes from the speed at which an electromagnetic wave travels, which is derived from constants that were already in use elsewhere, so that's what nature's given us. It also paved the way for special relativity etc.

Actually, the meter was previously defined somewhat arbitrarily but was later redefined as exactly 1/299,792,458 of a lightsecond which fixes the speed of light. This number was chosen to match the length of the "new" meter with the "old" meter (as the speed of light had been measured as almost exactly 299,792,458m/s using the "old" meter). The meter could also have been redefined as 1/300,000,000 of a lightsecond in which case the speed of light would have been fixed to 300,000,000m/s, which is what OP is asking about.
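
The trade-off described here can be made concrete. A sketch (my own arithmetic, following the comment above): see how long a one-metre "old" object measures under each candidate redefinition.

```python
c_measured = 299_792_458  # m/s, speed of light as measured with the "old" metre

for c_fixed in (299_792_458, 300_000_000):
    # A "new" metre is the distance light travels in 1/c_fixed seconds,
    # so one old metre measures c_fixed / c_measured new metres.
    # The second choice would shrink the metre by about 0.07%.
    print(f"c fixed at {c_fixed:,}: old metre = {c_fixed / c_measured:.6f} new metres")
```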

You're thinking of the modern method of measuring a metre by the distance light travels in a certain time. That was done for accuracy, but really it is a conversion of the previous method of measurement, which is what a meter is derived from: a unit of measure based on a naturally occurring increment here on earth. The number you listed is just the updated version of that number.

Because the speed of light was first measured using the old definitions of the metre and the second, and it was measured pretty accurately. When the time came to redefine the units, the second was redefined first, and then it was easy to define the metre via a fixed speed of light in vacuum. A close round number was chosen, but not too round, because rounding further would shift the metre to a noticeably different value. That would wreak havoc on world industry and all previously published scientific literature; it would be too different to consider the same unit.

The original measurement was based on a fraction of the size of the earth as understood at the time of the French Revolution. This is just a way of calibrating from that which holds up better than standard meter sticks in the capital city being checked against the international platinum-iridium alloy standard in Paris every so often.

Like the meter, the kilogram was already a unit of measurement before we defined it in terms of physical constants. A kilogram, until just a few months ago, was equal to the mass of a specific lump of metal kept in France. It is now defined in terms of the Planck constant.

I do believe the standard for the kilo was actually losing mass, even though it was in a completely sealed container.

I am reminded of the decline of the famous mathematician Grothendieck.

Quite simply because we want a meter from before the change and after the change to be mostly the same. The speed of light was measured to be 299,792,458-point-a-whole-bunch-of-decimals, so if we cut off the decimals, we get a much easier number to do calculations with, without making much of an impact on the length of a meter.

I have always had a related question: Right now I get that sticking to the old metre is practical, but with units like the kilogram now also being based on universal constants, it only seems logical that in the long run distance will be based on the constant speed of light. Would that be the case and why/why not?

It's a pure coincidence that light speed is close to 3×10⁸ m/s. The measurement was done with high precision, and for complex reasons the reference point got changed: instead of measuring the speed of light with ever more precision, any improvement in precision now alters other constants/measurements, and light speed was fixed at whatever precision was available when the standard changed. By not rounding it to 3×10⁸, you avoid complications like a mile being different from a nautical mile, or a metric ton from an imperial ton.

We have to be really careful when we use words like convenience. A liter, for instance, is 1/1000th of a cubic meter. A liter of water, in turn, masses 1 kg. That's not currently the definition of a kilogram (that now comes from the Planck constant).

Because the original meter wasn't based off this constant. It also amuses me to note that the imperial foot is actually closer to a round number of light-time: the speed of light is about 9.836×10⁸ feet per second, so adjusting the foot to equal one billionth of a light second would shorten it by about 0.2 inches. That would have no impact on the circumference of the earth, since that was never a metric for the foot to begin with, but it would change the mile by roughly 87 feet.
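
The foot/light-second numbers in that comment can be checked; this quick sketch uses the exact international foot (0.3048 m):

```python
c = 299_792_458   # m/s
foot = 0.3048     # m, international foot (exact)

light_ns = c * 1e-9  # metres light travels in one nanosecond
print(f"c = {c / foot:.4g} ft/s")  # about 9.836e+08

# Redefining the foot as one light-nanosecond would shorten it:
print(f"change per foot: {(foot - light_ns) / 0.0254:.2f} in")  # about 0.20 in

# ...and shorten the 5280 ft mile, measured in old feet:
print(f"change per mile: {5280 * (foot - light_ns) / foot:.1f} old feet")
```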

The speed of light is related to the permittivity of electric fields and the permeability of magnetic fields in space. These two are universal constants. In order for the speed of light to be rounded like that, these constants (and the values they are derived from) would have to change as well. The speed of light wasn't necessarily plucked from a hat. Its value is derived from constants related to how electric and magnetic fields propagate through a vacuum.

This is a little misleading: although the permittivity and permeability of free space (and the speed of light) are indeed constant, the values of those constants depend on the definition of the units used, in this case the meter. If we'd defined "1 m = 39 inches", those constants would still be the same in reality, but their values in measurement systems using meters would be different.

Those may be constants, but they are not dimensionless, so they depend on our choice of base units, which definitely can be picked out of a hat. E.g. in Planck units the speed of light is 1, and so are the vacuum permittivity and permeability. The real reason is that people didn't want to change the length of the original meter, which just coincidentally happened to come out close to 300,000 km/s together with our definition of the second.
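
As a numeric aside on the c = 1/√(μ₀ε₀) relation discussed above, the sketch below plugs in the conventional pre-2019 value of μ₀ (exactly 4π×10⁻⁷) and the CODATA value of ε₀:

```python
import math

mu0 = 4 * math.pi * 1e-7  # N/A^2, vacuum permeability (pre-2019 exact value)
eps0 = 8.8541878128e-12   # F/m, vacuum permittivity (CODATA)

# Maxwell: electromagnetic waves propagate at c = 1 / sqrt(mu0 * eps0)
c = 1 / math.sqrt(mu0 * eps0)
print(f"c = {c:,.0f} m/s")  # 299,792,458 to within a few m/s
```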

But would it be universal near a blackhole?

Why the metre is defined that way is beyond me; it's a confusing and pretty bad definition, since the two are on completely different scales. The metre is based on the circumference of the earth, from long before we started calculating the speed of light. The metre is the constant here, defined pretty well by itself IMO. Then again, it is good to have something to compare it with.

The meter used to be a rod of platinum. Based on that, every country would calibrate a few prototypes of its own, against which all industrial and non-industrial measuring equipment would be calibrated in a progressive manner: a couple of large industries would calibrate their precision rulers, then smaller ones would calibrate against those, etc.

Sadly this is down to our timekeeping devices. If our second were defined about 0.07% longer (making up the 207,542 m/s shortfall), the speed of light would come out to a whole 300,000,000 m/s.
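
The redefinition factor implied here can be checked quickly (a sketch; the 0.07% matches the figure earlier in the thread):

```python
c = 299_792_458       # m/s, current exact value
target = 300_000_000  # m/s, the round number

# How much longer the second would need to be (metre unchanged)
# for light to cover a round 300,000,000 metres per second:
factor = target / c
print(f"second longer by {factor - 1:.4%}")  # about 0.07%, ~1 part in 1445
```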