• tom
    1.5k
    Interesting, but the point is this. The reason why the frequency is precisely 9,192,631,770 times per second, rather than 5 billion, 10 billion, or some other arbitrary number, is that the second is already defined in relation to the year. So if they chose one of those other numbers, 5 billion times per second, for example, there would not be the right number of seconds in a day, and in a year. So what this statement ("9,192,631,770 times per second") represents is a relationship between the activity of those caesium atoms and the motion of the earth in relation to the sun. If that relationship is not absolutely stable, then that number cannot be represented as a stable number.Metaphysician Undercover

    No it doesn't. The second is DEFINED with respect to a material property of Caesium. The new definition would have been chosen to be close to a previous definition which it superseded, for convenience, but needn't be the same. I presume you are familiar with leap seconds (and leap years)?
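
    As a rough sketch of what a definition like that amounts to in practice (a toy illustration, not anything from the SI documents - only the caesium frequency below is the real defined value):

        # Toy sketch: the second as a count of caesium transitions.
        CAESIUM_HZ = 9_192_631_770  # SI-defined transitions per second

        def elapsed_seconds(transition_count):
            """Convert a count of hyperfine transitions into elapsed seconds."""
            return transition_count / CAESIUM_HZ

        # A day's worth of transitions yields exactly 86,400 seconds, by
        # definition - no reference to the Earth's motion is involved.
        print(elapsed_seconds(CAESIUM_HZ * 86_400))  # 86400.0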

    As a matter of interest, all Imperial units also changed slightly when they became defined in terms of S.I. units.
  • Metaphysician Undercover
    13.2k
    No it doesn't. The second is DEFINED with respect to a material property of Caesium. The new definition would have been chosen to be close to a previous definition which it superseded, for convenience, but needn't be the same. I presume you are familiar with leap seconds (and leap years)?tom

    Yes, the second is defined that way; I am fully aware of this. However, the year is defined by the earth's orbit. For fdrake's claim that the caesium clock will continue to be as accurate as it is now for 100 million years to be true, the relationship between the earth's orbit and the caesium frequency must remain the same for 100 million years. The use of leap seconds demonstrates that this is highly unlikely.
  • tom
    1.5k
    Yes, the second is defined that way; I am fully aware of this. However, the year is defined by the earth's orbit. For fdrake's claim that the caesium clock will continue to be as accurate as it is now for 100 million years to be true, the relationship between the earth's orbit and the caesium frequency must remain the same for 100 million years. The use of leap seconds demonstrates that this is highly unlikely.Metaphysician Undercover

    Right, so the second is defined by a physical constant, but the year is defined by a varying quantity. Certain mechanisms are employed to keep the invariant quantity and the varying quantity in good agreement. These include leap years and leap seconds.
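
    The leap-year half of that reconciliation is simple enough to state exactly; a sketch of the Gregorian rule (leap seconds, by contrast, are inserted irregularly on the basis of observation, precisely because the Earth's rotation is not predictable to that accuracy):

        # The Gregorian leap-year rule: a counting mechanism that keeps a
        # unit built from whole days in step with the varying year.
        def is_leap_year(year: int) -> bool:
            return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

        print(is_leap_year(2000), is_leap_year(1900))  # True False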

    The clock will not be as accurate as it is now in 100,000,000 years. No one is claiming that. The clock will certainly not exist then. However, in 100,000,000 years, clocks may be 100% accurate.
  • Metaphysician Undercover
    13.2k
    Right, so the second is defined by a physical constant, but the year is defined by a varying quantity.tom

    That's an arbitrary assumption, that the second is constant and the year is variant. Because of this arbitrary assumption, any and all discrepancy in measurement is assigned to a variance in the year, and no variance is assigned to the second, despite the fact that some discrepancy might actually be due to a variance in the second.

    Consider this example. Assume that the length of the day is constant, and that the length of the year is also constant. However, they are not completely compatible, so there is a necessity of leap years. This does not indicate that one of the two, the year or the day, is constant and the other variable; it simply indicates that the two are incommensurable. Likewise, in the comparison of the second and the year, the need for leap seconds does not indicate that one is constant and the other is variable, it indicates that the two are incommensurable.

    The clock will not be as accurate as it is now in 100,000,000 years. No one is claiming that.tom
    Actually, that seems to be exactly what fdrake was claiming.
  • tom
    1.5k
    That's an arbitrary assumption, that the second is constant and the year is variant.Metaphysician Undercover

    We know the Earth is moving away from the Sun and that the year is getting longer. It's been measured.

    We can measure and calculate the energy of transition between hyperfine ground states of the caesium atom.
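
    The textbook relation connecting the two is Planck's E = h*f; as a rough check (the two defined constants below are the real SI values, the rest is arithmetic), the transition energy implied by the caesium frequency:

        # Planck relation E = h * f, applied to the caesium hyperfine transition.
        H_PLANCK = 6.62607015e-34   # J*s (exact, by SI definition)
        F_CAESIUM = 9_192_631_770   # Hz (exact, by SI definition)

        energy_joules = H_PLANCK * F_CAESIUM            # ~6.1e-24 J
        energy_ev = energy_joules / 1.602176634e-19     # ~3.8e-05 eV
        print(energy_joules, energy_ev)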

    For the energy of transition of caesium atoms to change - a change affecting all caesium atoms everywhere simultaneously I presume - what laws of physics do you propose to change?
  • fdrake
    6.7k
    @Metaphysician Undercover

    Actually, that seems to be exactly what fdrake was claiming.

    Well, we had an argument over whether metaphysical necessity of physical law was required for the measurement to be accurate at that point. I tried to argue that that was a category error; you tried to argue that I required it. Whether in 100 million years the clock has the same error rate depends on whether the physical laws would change. One way of preventing the change conceptually would be the application of necessity to physical law. I tried to argue that that would be sufficient but not necessary; what is required is that the laws would change, not that they could or must: a contingent fact, rather than the possibility of its negation or its elevation to necessity.

    The quantification of the error in terms of 1 sec/100 mil years and its equivalence to the stated error rate in the paper is a separate issue. If you want to treat it as a separate issue now, that's fine with me - to me that looks like progress. Since you were arguing as if the metaphysical necessity of physical law was required for scaling the error to an equivalent rate, I argued that it wasn't.

    So we had this super-discussion of the necessity of physical law - neither of us believed that it was necessary. But yeah, if you want to talk about the scaling of the error rate without, in my view, muddying the waters with all this talk of the metaphysical necessity of physical law, I'd be interested in chatting about it again.
  • Metaphysician Undercover
    13.2k
    We know the Earth is moving away from the Sun and that the year is getting longer. It's been measured.tom

    OK, that's an example of how something which is assumed to be constant from observation in the short term may prove to be less constant in the long term.

    We can measure and calculate the energy of transition between hyperfine ground states of the caesium atom.tom

    So, according to the paper that fdrake referred to, this has been proven to be constant for a period of one month. On what basis does one claim that it will remain constant for 100 million years?

    For the energy of transition of caesium atoms to change - a change affecting all caesium atoms everywhere simultaneously I presume - what laws of physics do you propose to change?tom

    You are falling into the same pattern of argumentation as fdrake did, asking me to prove that things will change. Fdrake insisted that this particular activity will remain the same for that extrapolated time period, so the onus is on fdrake to demonstrate that it will. From my perspective, I just need to demonstrate that change is possible, to refute fdrake's claim that this activity will necessarily stay the same.

    For example, if, prior to scientists knowing that the year is getting longer, some people thought that the year would remain constant for billions of years, and someone like me argued that this was a faulty extrapolation, how would that person be expected to know just exactly what was changing? It is not necessary to know what is changing in order to make this argument. All that is necessary to prove wrong the claim that things will remain the same is to demonstrate the possibility of change. If change is possible, then the claim that things will stay the same is unsound.

    My argument is that the extrapolation is faulty because there are too many unknowns which could influence things. So if you want to defend the extrapolation, then you should demonstrate that there are no such unknowns; do not ask me what the unknowns are, or how they will affect the proposed constant activity, because they are unknowns. However, I did indicate one such unknown factor, and that is what is called "dark energy".

    Well, we had an argument over whether metaphysical necessity of physical law was required for the measurement to be accurate at that point.fdrake

    What do you mean by metaphysical necessity of physical law?

    Whether in 100 million years the clock has the same error rate depends on whether the physical laws would change.fdrake

    Remember, we went through this: physical laws are descriptions produced by human beings. Let's see if we can maintain a distinction between "physical laws" and "the way things are". That the caesium atom has x number of cycles per second is a physical law. The evidence of experimentation gives reason to believe that this is the way things were for a period of one month. In other words, the physical law which states x cycles per second of the caesium atom has been demonstrated to be accurate for a month of time.

    The quantification of the error in terms of 1 sec/100 mil years and its equivalence to the stated error rate in the paper is a separate issue.fdrake

    I don't see how this is a separate issue; it is the issue. The question is whether such an extrapolation is valid.

    So we had this super-discussion of the necessity of physical law - neither of us believed that it was necessary. But yeah, if you want to talk about the scaling of the error rate without, in my view, muddying the waters with all this talk of the metaphysical necessity of physical law, I'd be interested in chatting about it again.fdrake

    Perhaps I misunderstand what you mean by metaphysical necessity of physical law, but I do believe that if you want to extrapolate the way that you do, you need some principles whereby you can argue that what was observed to be the case for one month will continue to be the case for 100 million years.

    Take tom's example, that it has now been proven that the earth is getting further from the sun, and the year is getting longer. That difference is so slight that people in the past would never have noticed it. They would do projections into the future, extrapolations as you do, without realizing that every year the length of the error grows by the tiniest amount. After a very long time, this tiniest amount multiplies into a larger amount. What if something similar is the case with the caesium frequency? This is just one example of one possibility, but have you considered this possibility, that the error is cumulative?
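
    To make that cumulative possibility concrete, here is a toy calculation - the drift figure is entirely invented for illustration, not a measured value:

        # Suppose each year is longer than the assumed fixed length by a
        # further tiny increment delta, so the discrepancy in year n is
        # n * delta and the accumulated error grows quadratically.
        DELTA = 1e-8  # seconds of lengthening per year (made up)

        def accumulated_error(n_years):
            # delta + 2*delta + ... + n*delta
            return DELTA * n_years * (n_years + 1) / 2

        for n in (1, 1_000, 100_000_000):
            print(n, accumulated_error(n))
        # 1 year: ~1e-08 s. 100 million years: ~5e+07 s - a drift far too
        # small to notice annually becomes enormous over the extrapolation.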
  • fdrake
    6.7k


    Take tom's example, that it has now been proven that the earth is getting further from the sun, and the year is getting longer. That difference is so slight that people in the past would never have noticed it. They would do projections into the future, extrapolations as you do, without realizing that every year the length of the error grows by the tiniest amount. After a very long time, this tiniest amount multiplies into a larger amount. What if something similar is the case with the caesium frequency? This is just one example of one possibility, but have you considered this possibility, that the error is cumulative?

    The possibility of error in the measurement of the year induced by the Earth getting further away from the sun, based upon the assumption that the Earth has a constant elliptic orbit, isn't the reason why that measurement was flawed. The reason why the measurement was flawed is that there was an actual error in the measurement of the year induced by the Earth getting further away from the sun. The possibility of error does not invalidate a measurement; the actuality of error does. And 'the actuality of error' consists in the claim that 'the actual quantity ascribed in the measurement error analysis is wrong'. Not that it's possibly wrong. Of course it's possibly wrong, scientific knowledge is fallible. But the fact that it's possibly wrong gives no reason to reject it.

    Perhaps I misunderstand what you mean by metaphysical necessity of physical law, but I do believe that if you want to extrapolate the way that you do, you need some principles whereby you can argue that what was observed to be the case for one month will continue to be the case for 100 million years.

    I actually did this. I made a case that the error rate would be the same for the same measurement process in 100 million years. There are things that would make atoms behave in different ways, like if all their protons decay (which is possible). If there were no protons, there'd be no caesium or strontium atoms, and no optical lattices, so no caesium clocks. If something like that were predicted to happen within 100 million years, the claim that 'the measurement error of the clock would be the same in 100 million years' would have some evidence against it. So I quoted you some stuff about the chronology of the universe - the stelliferous era, the one which we are in now, is predicted to have the same atomic physics through its duration. The end of the stelliferous era will be in about 1000 more universe lifetimes, much, much longer than 100 million years. This is a matter of being consistent or inconsistent with physical theories, not one of their possibility of error. There's just no good reason to believe that atomic physics will change in a meaningful way in 100 million years. It's a tiny amount of time on the scale of the universe's chronology - 100 million years is 1*10^-9% of the lifetime of the stelliferous era, which we are in and will still be in.

    Instead of focussing on what we can believe evidentially about the actuality of the laws of nature changing, you instead internalised the laws of nature to scientific consensus - claiming that the laws of nature change because of changes in science. In some trivial sense this is true; laws are descriptions of patterns in nature, and if our descriptions change, the linguistic formulation of the patterns changes, or new patterns are given descriptions. General changes in scientific consensus imply nothing in particular about the measurement error analysis of that clock. Changes in the operation of nature might, if they influence the patterns atomic physics is concerned with in a meaningful way. Notice might, not will: to establish that changes in the operation of nature will invalidate the error analysis, a flaw has to be found in the error analysis. Not the possibility of a flaw - that is a triviality, since scientific thinking is fallible - but the establishment of a particular flaw in the error analysis.

    And in this, you provide the claim that the behaviour of oscillations between hyperfine states has been observed for one month, therefore measurement error analysis based on that month's observations cannot be used to calculate an error rate which extends beyond the month. Maybe not beyond the month - you've been admittedly imprecise on exactly how 'the data was gathered in a month' actually changes the error analysis. You say that 'it was gathered in a month' invalidates the quantification of error in the measurements, while having no idea of how.

    In general, this argumentative pattern is invalid. I have generalised here because you have not provided, and cannot provide, a way in which the duration of the data gathering for the paper influences the derived error rates. So, if the principle is 'we cannot say that the error is less than the data gathering duration because of a possible multiplicative effect on the error due to changes in physical law' - which is still imprecise, as it provides no translation of uncertainties in quantities of different dimensions (like temperature and time) - we end up in a situation I detailed a bit earlier, but will provide more detail on now.

    (1) You read the temperature from the thermometer at time t. Say that the duration of your observation was 1 second.
    (2) There is a possible error associated with the thermometer and its error analysis which can multiply the error in an unbounded fashion.
    (3) After 1 second, you do not know the temperature in the room since the error is possibly so large.

    Try as you might, there isn't going to be any way you can establish the constancy of the laws of nature within a second through an a priori argument. All we have are perceptions of regularity, and that stuff seems to work in the same way through terrestrial timescales in the real world. If this were something that could be reconciled a priori, Hume's arguments against it, the Wittgensteinian-Kripkean analogues in philosophy of language, and the whole problem with grue and blue wouldn't be there. It's always going to be possible that there's a huge unaccounted-for error in the thermometer, therefore we don't know the temperature in the room on the thermometer's basis.

    I would like to think you would also believe that this argument form is invalid, since it leads to the complete absurdity that it's impossible to form opinions based on measurements. Just substitute in 'measuring process' for 'thermometer' and index a quantity instead of 'temperature'; the argument works just the same.

    And again, this is an issue independent of whether it's appropriate to ask the question 'how many seconds are required to make the caesium clock produce an error of 1 second' - that already assumes the clock is functioning, or would be functioning, in the manner it did in the experiment for that time period. Counterfactually: if same process, same measurements, same errors. You can answer that question with a simple algebraic operation - taking the reciprocal. If my pulse has an error of 0.1 seconds per second, then it takes 10 seconds for my pulse to accumulate 1 second of error.
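
    The arithmetic is just inversion, and it is indifferent to the scale of the observation window - a minimal sketch:

        # Time needed to amass 1 second of error is the reciprocal of the
        # error rate (in seconds of error per second of operation).
        def seconds_to_amass_one_second(error_rate):
            return 1.0 / error_rate

        print(seconds_to_amass_one_second(0.1))  # the pulse example: 10.0

        # Rescaling numerator and denominator together leaves the rate
        # unchanged (b is a power of two so the float arithmetic is exact):
        a, k, b = 3.0e-16, 1.0, 2.0 ** 20
        assert (a * b) / (k * b) == a / k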

    At this point, you said taking the reciprocal and saying the clock has amassed that error assumes the clock is working for that long. In a trivial sense it does - since if the clock didn't function for that long it would have a different amassed error but not a different error rate. Unless, for some reason, you undermine the measurement process of the clock by saying it requires the constancy of the laws of nature...

    In that case, we end up in the absurd position that an error rate of a*10^x per k isn't the same as an error rate of (b*a)*10^x per b*k - which is an error in basic arithmetic.

    Edit: when I say there's no good reason to believe atomic physics will change in 100 million years, I mean that there's no good reason to believe that operation of nature relevant to atomic physics will change, not that the scientific understanding of atoms won't change in that time period. It will, it will get more expansive and more precise. If we're still even alive as a species by that point, ho hum.
  • fdrake
    6.7k
    @Metaphysician Undercover

    By metaphysical necessity, I mean the metaphysical necessity of a proposition. By the metaphysical necessity of a proposition, I mean that it's something true which is not contingent. Something that must be the case of necessity, and cannot change. I'm sure you can see that 'the physical laws will not change' is implied by 'the physical laws cannot change' - and in the latter statement is the expression of what I mean by metaphysical necessity of physical law. I don't think it holds. I don't think it's necessary for the clock to function as it does, and I don't think it's required for reciprocating the error rate in terms of seconds/seconds to get how many seconds are required for amassing a single second of error.
  • Metaphysician Undercover
    13.2k
    The possibility of error does not invalidate a measurement; the actuality of error does.fdrake

    I don't claim that the possibility of error invalidates the measurement. I assume that the measurement is accurate. I claim that the extrapolation is invalid due to the likelihood of unknown factors in relating the micro time scale to the macro time scale.

    You keep on assuming that the extrapolation is the actual measurement. It is not. The measurement was for a one month period. The extrapolation is for 100 million years.

    So I quoted you some stuff about the chronology of the universe - the stelliferous era, the one which we are in now, is predicted to have the same atomic physics through its duration.fdrake

    You still have not accounted for dark energy. If I understand correctly, the so-called expansion of the universe indicates that frequencies such as that of the caesium atom are changing. I assume that all your statements concerning the stability of the stelliferous era are unjustified until dark energy is properly accounted for.

    Instead of focussing on what we can believe evidentially about the actuality of the laws of nature changing, you instead internalised the laws of nature to scientific consensus - claiming that the laws of nature change because of changes in science. In some trivial sense this is true; laws are descriptions of patterns in nature, and if our descriptions change, the linguistic formulation of the patterns changes, or new patterns are given descriptions.fdrake

    Yes, the laws of physics, which are the human descriptions of nature, change. But this is not trivial, as you claim. They change because human beings really have a very limited understanding of the vast universe, and they are always learning new things which make them reassess their old principles. You seem to think that our knowledge concerning the universe is already conclusive, and that there is nothing which is unknown. Therefore you claim that our descriptions and principles of measurement will remain the same. I think this is naïve. And my example of dark energy indicates that a huge part of the universe, that which falls into the concept of spatial expansion, remains essentially unknown.

    And in this, you provide the claim that the behaviour of oscillations between hyperfine states has been observed for one month, therefore measurement error analysis based on that month's observations cannot be used to calculate an error rate which extends beyond the month. Maybe not beyond the month - you've been admittedly imprecise on exactly how 'the data was gathered in a month' actually changes the error analysis. You say that 'it was gathered in a month' invalidates the quantification of error in the measurements, while having no idea of how.fdrake

    As I said, I don't say that there are errors in measurement, just in the extrapolation. Do you understand the difference between measuring something and producing an extrapolation from that measurement?

    (1) You read the temperature from the thermometer at time t. Say that the duration of your observation was 1 second.
    (2) There is a possible error associated with the thermometer and its error analysis which can multiply the error in an unbounded fashion.
    (3) After 1 second, you do not know the temperature in the room since the error is possibly so large.

    Try as you might, there isn't going to be any way you can establish the constancy of the laws of nature within a second through an a priori argument. All we have are perceptions of regularity, and that stuff seems to work in the same way through terrestrial timescales in the real world. If this were something that could be reconciled a priori, Hume's arguments against it, the Wittgensteinian-Kripkean analogues in philosophy of language, and the whole problem with grue and blue wouldn't be there. It's always going to be possible that there's a huge unaccounted-for error in the thermometer, therefore we don't know the temperature in the room on the thermometer's basis.
    fdrake

    We are not talking about measuring something, then turning away for a second, and asking whether the measurement is still valid, we are talking about measuring something then turning away for 100 million years, and asking whether the measurement is still valid. So your analogy is really rather ridiculous.

    I would like to think you would also believe that this argument form is invalid, since it leads to the complete absurdity that it's impossible to form opinions based on measurements.fdrake

    Again, as I've stated over and over, the issue is not the measurement, it is the extrapolation. For some reason you seem to still be in denial that there is an extrapolation involved here.

    At this point, you said taking the reciprocal and saying the clock has amassed that error assumes the clock is working for that long. In a trivial sense it does - since if the clock didn't function for that long it would have a different amassed error but not a different error rate. Unless, for some reason, you undermine the measurement process of the clock by saying it requires the constancy of the laws of nature...fdrake

    If the frequency of the caesium atom is actually changing over time, as in the example of the earth's orbit actually changing over time, then the error rate will change over time, unless the frequency rate is adjusted to account for that change.

    Edit: when I say there's no good reason to believe atomic physics will change in 100 million years, I mean that there's no good reason to believe that operation of nature relevant to atomic physics will change, not that the scientific understanding of atoms won't change in that time period. It will, it will get more expansive and more precise. If we're still even alive as a species by that point, ho hum.fdrake

    The point is: how well do the laws of atomic physics represent what is really the case with the activities of the atoms? Hundreds of years ago, people would have said that there is no good reason to believe that the length of a year would change in millions of years. Now they've been proven wrong. Do you not think that the atomic physicists of today will be proven wrong in the future?

    By metaphysical necessity, I mean the metaphysical necessity of a proposition. By the metaphysical necessity of a proposition, I mean that it's something true which is not contingent. Something that must be the case of necessity, and cannot change. I'm sure you can see that 'the physical laws will not change' is implied by 'the physical laws cannot change' - and in the latter statement is the expression of what I mean by metaphysical necessity of physical law. I don't think it holds. I don't think it's necessary for the clock to function as it does, and I don't think it's required for reciprocating the error rate in terms of seconds/seconds to get how many seconds are required for amassing a single second of error.fdrake

    I can't grasp your point here at all. If you take a measurement of one month, and extrapolate that measurement for 100 million years, then in order for your extrapolation to be correct, the physical law produced by your measurement "cannot change". Therefore any possibility of change negates the validity of your extrapolation.
  • TheMadFool
    13.8k
    That is quite a startling claim given that the relationships between mass and energy, energy and wavelength, mass and velocity, length and velocity, time and velocity, ... ... (I could go on and on) were all discovered in Theory before any measurement or reason for measurement could be conceived.tom

    Empiricism!?

    All we would need to do is measure g and use that to DEFINE L and T. But of course, no such physical system exists.tom

    The unit of g is m/s^2... time has to be measured accurately first.

    But atomic transitions do exist, and the energy of transition can be measured. Because theory tells us the relationship between energy and frequency, and that transitions are induced in atoms when subjected to EM radiation of that frequency, we may DEFINE the second via that frequency.tom

    Science is empirical. Measurement, time, length, mass, etc. come first.
  • tom
    1.5k
    Empiricism!?TheMadFool

    One of my favourite fallacies!

    The unit of g is m/s^2... time has to be measured accurately first.TheMadFool

    With some rudimentary algebra, s = sqrt(m/g). So we really can derive the second from other units if we wish, which is in effect what the SI standard does when it defines the second: it defines a frequency in terms of a measurable energy.
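
    Spelling that dimensional step out - a toy derivation only, not the actual SI procedure, with standard gravity used as the example measurable:

        import math

        # With a length standard (1 metre) and a measured acceleration g in
        # m/s^2, sqrt(length / g) carries the dimension of time.
        G = 9.80665    # m/s^2, standard gravity
        LENGTH = 1.0   # m

        derived_time_unit = math.sqrt(LENGTH / G)
        print(derived_time_unit)  # ~0.319: a time unit derived from m and g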

    Science is empirical. Measurement, time, length, mass, etc. come first.TheMadFool

    Comes before what?
  • TheMadFool
    13.8k
    One of my favourite fallacies!tom

    Really? Empiricism is the working principle of science. Why is it that scientists perform experiments if empiricism is a fallacy?

    With some rudimentary algebra, s = sqrt(m/g)tom

    I'm not saying g = m/s^2. The unit of g is m/s^2.
  • TheMadFool
    13.8k
    Comes before what?tom

    Before we discover relationships (laws).
  • tom
    1.5k
    Really? Empiricism is the working principle of science. Why is it that scientists perform experiments if empiricism is a fallacy?TheMadFool

    To test their theories.

    I'm not saying g = m/s^2. The unit of g is m/s^2.TheMadFool

    I was doing some dimensional analysis for you. The fundamental units are arbitrary.

    Before we discover relationships (laws).TheMadFool

    Then why were gravitational waves known about 100 years before we could detect them?