• vesko
    19

    I think that time is measured by the rotation of the Earth, but I don't know why the unit hour is 1/24 of one turn of the Earth.
  • vesko
    19
    So we do not have time; we have only the rotation of the Earth, which we consider as time.
  • TheMadFool
    13.8k
    There is absolutely nothing mysterious here. It isn't philosophy, it's well-established engineering and mathematics.fdrake

    How do you check the accuracy of your watch? You must compare it to some standard clock, say A. The same question applies to A too and so on...ad infinitum. We can never be sure of the accuracy of a clock.


    My counter example works fine with nanoseconds.noAxioms

    Imagine a clock, A, that's supposed to mark off seconds (1 tick = 1 second) but actually marks off 0.9 seconds (1 tick = 0.9 seconds). How long would it take for clock A's error to be noticed if our discerning power is 1 second? Let x be the number of ticks. We have the following inequality:

    x - 0.9x > 1, i.e. x > 10: only after 10 ticks (about 9 true seconds) will we be able to notice the error.

    Imagine now that for A 1 tick = 0.9999999999 seconds. Plug that in and:

    x - 0.9999999999x > 1

    Doing the math, x > 10^10 ticks: we need at least about 317 years before we can find the error in clock A's time.

    So the smaller the difference between true time and a clock's time, the longer it will take to detect the error.
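    The arithmetic above can be sketched directly; the function name and the 1-second detection threshold are illustrative, not from the thread:

```python
# Sketch of the detection-time arithmetic above: a clock whose tick
# really lasts `tick_length` true seconds (instead of 1 s) accumulates
# error (1 - tick_length) per tick; the error becomes noticeable once
# it exceeds our discerning power (here, 1 second).
def ticks_until_detectable(tick_length, threshold=1.0):
    return threshold / (1.0 - tick_length)

print(ticks_until_detectable(0.9))           # ~10 ticks
print(ticks_until_detectable(0.9999999999))  # ~1e10 ticks, roughly 317 years
```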
  • fdrake
    6.6k
    @TheMadFool

    How do you check the accuracy of your watch? You must compare it to some standard clock, say A. The same question applies to A too and so on...ad infinitum. We can never be sure of the accuracy of a clock.

    Except no, because this isn't an infinite regress. It stops at whatever measurement of time is conventionally accepted as the definition. The duration of a second now means:

    "the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom."

    And so comparisons ultimately derive from this one.
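    As a small arithmetic aside (the variable names here are mine), that definition fixes a transition frequency of about 9.19 GHz, so one period lasts roughly a tenth of a nanosecond:

```python
# The SI second is defined as 9,192,631,770 periods of the radiation
# from the caesium-133 hyperfine transition.
PERIODS_PER_SECOND = 9_192_631_770

frequency_hz = PERIODS_PER_SECOND   # ~9.19 GHz transition frequency
period_s = 1 / PERIODS_PER_SECOND   # ~1.09e-10 s per period
```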
  • Metaphysician Undercover
    13.1k
    The duration of a second now means:

    "the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom."

    And so comparisons ultimately derive from this one.
    fdrake

    The problem, though, is that this defined "second" is also related to the length of a "day", which is defined by the rotation of the Earth, so there is a specific number of those seconds in every day (one rotation of the Earth). When, after a long period of counting that specific number of seconds without adding any "leap seconds", the changeover point from one day to the next drifts out of sync, moving from midnight toward morning or something like that, we have to ask: what is the true constant, the caesium-133 atom or the rotation of the Earth?
  • fdrake
    6.6k


    The entire point of calibrating measurements of time is that there is a privileged time-measurer and other measurements of time are calibrated through their relationship to the privileged one. This is then what it means for two time-measurers to be in accord. If they are out of accord, they can be corrected. If the privileged one behaves in an unexpected way, it will be changed.

    This is because the conventional definition of time with respect to the Earth's motion relative to the Sun is slightly different from the conventional definition of time with respect to the oscillations of a caesium atom. Thus the introduction of the leap second is precisely an attempt to keep the atomic-clock second calibrated against the second defined as a fraction of the year. This is so that we can keep the conventional organisation of time in terms of hours, days, months, years and not reinvent the wheel purposelessly.
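    As a rough, illustrative sketch of why leap seconds accumulate at all: the 2 ms figure below is my order-of-magnitude assumption about how much longer the mean solar day runs than 86,400 SI seconds; it is not a number from this thread:

```python
# Illustrative only: assume the mean solar day is ~2 ms longer than
# 86,400 SI seconds. The mismatch then grows to one full second after
# about 500 days, which is roughly when a leap second gets inserted
# to re-align civil time with the Earth's rotation.
EXCESS_PER_DAY_S = 0.002  # assumed excess length of a mean solar day

days_until_leap_second = 1.0 / EXCESS_PER_DAY_S  # ~500 days
```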

    If you like you could become an advocate of year definitions without leap seconds.
  • Metaphysician Undercover
    13.1k
    The entire point of calibrating measurements of time is that there is a privileged time-measurer and other measurements of time are calibrated through their relationship to the privileged one. This is then what it means for two time-measurers to be in accord. If they are out of accord, they can be corrected.fdrake

    What gives "privilege" to one time-measurer over another? Why would the caesium-133 atom be more privileged than the rotation of the earth?

    If the privileged one behaves in an unexpected way, it will be changed.fdrake

    In other words, it may turn out in the future that we were wrong in assigning privilege to one time-measurer over the other.

    This is because the conventional definition of time with respect to the rotation of the Earth around the Sun is slightly different from the conventional definition of time with respect to the oscillations of a Caesium atom. And thus the introduction of the leap second is precisely an attempt to calibrate the atomic clock second with proportion of a year second. This is so that we can keep the conventional organisation of time in terms of hours, days, months, years and not reinvent the wheel purposelessly.fdrake

    Now this just validates The Mad Fool's point. Instead of handing privilege to one clock over another, we introduce leap seconds and live with the inconsistency. One person can argue that the caesium clock gives the more accurate measure of time, and another can argue that the Earth's rotation gives a more accurate measure of time. The leap second doesn't resolve anything; it just papers over the inconsistency without determining which is more accurate. To determine which is more accurate, we turn to a third time-measurer, the revolution of the Earth around the Sun. But now we still have inconsistencies, and we still have not adequately determined which is more accurate, so we could compare another time-measurer, and on and on, as The Mad Fool says, ad infinitum.
  • fdrake
    6.6k
    @Metaphysician Undercover

    Convention privileges a measurer of time as a definer of the second. Then other ways of measuring time are calibrated to it.

    What's the time where you are MU?
  • TheMadFool
    13.8k
    How do we know that the cesium atom's radiation is regular - that one instance of 9,192,631,770 periods (let's call this a cycle) is the same as the next 9,192,631,770 periods?

    If one cycle of cesium atom A differs from its next cycle we have no way of detecting the error that'll creep into cesium atom A's clock. We'll need another more accurate clock to detect the error and what if that clock is also irregular?
  • Metaphysician Undercover
    13.1k
    Convention privileges a measurer of time as a definer of the second. Then other ways of measuring time are calibrated to it.fdrake

    That's what I mean: it's just a convention, not necessarily an accurate way of measuring time. So the conventions change from time to time, and we still haven't found a measurer which has proven to be accurate.
  • vesko
    19
    OK. If I am a person from the planet Mars, for instance, what can be my measure for time: the rotation of Mars around its axis, or its revolution around the Sun? Obviously it will be DIFFERENT from our own here on our planet. So time in the universe will depend on the place (planet) where a civilisation measures it. So a second here could be a year on planet X ☺
  • fdrake
    6.6k
    Every clock has a measurement error associated with its time. This is literally a quantification of how accurate the clock is. For the caesium-133 clock, this is an error of 1 second in 100 million years. The reason the atomic clock was switched to over the mean-solar-day definition was that it was more accurate: it had less measurement error and variability.

    The cycles of caesium atoms don't differ in any meaningful way. That's kinda the point. They're regular enough to make a measurement of time to the tune of 1 second of error in 100 million years.

    Accuracy = precision of measurement. Precision of measurement = small measurement error. The absence of measurement error is impossible; all that matters is whether it is low enough to make good measurements. If a new time-measuring device is more accurate, like this one, which won't accumulate a second of error until the universe doubles in age from now, then definitions can be made with respect to the more accurate clock.
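    For concreteness, the "1 second in 100 million years" figure can be restated as a dimensionless fractional error; this is my quick sketch, using the Julian-year length of 365.25 days:

```python
# Convert "1 second of error in 100 million years" into a fractional
# (dimensionless) error, the form in which clock accuracies are quoted.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # Julian year, ~3.156e7 s

fractional_error = 1.0 / (100e6 * SECONDS_PER_YEAR)
print(fractional_error)  # ~3.2e-16
```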

    This is why the second standard based on the Earth's motion around the Sun was rejected: it was demonstrably less precise. But - but - we keep leap-seconds, leap-days etc. so that we stay calibrated with the Earth's motion around the Sun, since we don't want to reject the solar year and its monthly/daily/hourly divisions and come up with a new manner of organising time...

    This is also why the number of oscillations of the caesium atoms was chosen, since it was incredibly close to the current definition of the second but measured far more precisely.
  • Metaphysician Undercover
    13.1k
    For the caesium-133 clock, this is an error of 1 second in 100 million years.fdrake

    You don't seem to be getting The Mad Fool's point. By what principle do you derive that margin of error? You could only determine the clock's accuracy by comparing it to another clock. So why would you conclude that the caesium clock is more accurate than the other clock? Have you recorded it for a hundred million years? What makes you think that the caesium clock is so incredibly accurate, other than your assumption that another clock which it was compared to is less accurate?

    But - but - we keep leap-seconds, leap-days etc so that we stay calibrated with the Earth's rotation around the sun since we don't want to reject the solar year and its monthly/daily/hourly divisions and come up with a new manner of organising time...fdrake

    So, there is a need for leap-seconds. Why do you assume that this need is produced by the earth's rotation being less accurate than the caesium clock, instead of assuming that the caesium clock is less accurate than the earth's rotation?

    This is also why the number of oscillations of the caesium atoms was chosen, since it was incredibly close to the current definition of the second but measured far more precisely.fdrake

    See, you keep making assertions like this, without explaining what you mean by "far more precisely".
  • fdrake
    6.6k
    I did some googling for you. @Metaphysician Undercover

    Here's a paper that does measurement error analysis for a type of atomic clock.

    Here's one that does measurement error analysis for a modern optical lattice clock.

    Here's the wikipedia page on the adoption of the atomic clock standard.

    Measurement error estimates in general are obtained from making repeated measurements. When there are multiple components to the measurement error like in the error analysis for atomic clocks, individual component error can be obtained by varying one component independently of the others. The errors are then usually combined through the square root of the sum of their squares, or the square root of the sum of squared %errors.
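    The quadrature rule described above can be sketched as follows; the component values are made-up placeholders for illustration, not numbers from the cited papers:

```python
import math

# Combine independent error components in quadrature (root-sum-of-
# squares), as in the error budgets of atomic-clock papers.
def combined_uncertainty(components):
    return math.sqrt(sum(c * c for c in components))

# Hypothetical fractional-frequency components, for illustration only:
parts = [2.0e-17, 3.0e-17, 4.0e-17]
total = combined_uncertainty(parts)   # sqrt(29) x 1e-17, ~5.4e-17
```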
  • fdrake
    6.6k
    A cliffnotes version of the conclusion: errors in measuring the number of oscillations of atoms or lattices between different quantum states within a given duration are then translated into errors in time measurement.
  • TheMadFool
    13.8k
    The cycles of caesium atoms don't differ in any meaningful way. That's kinda the point. They're regular enough to make a measurement of time to the tune of 1 second of error in 100 million years.fdrake

    How do we know that? My watch's error can be detected by an atomic clock. How do we detect the error of an atomic clock? How do we know the ''1 second of error in 100 million years''?
  • noAxioms
    1.5k
    How do we know that? My watch's error can be detected by an atomic clock. How do we detect the error of an atomic clock?TheMadFool
    Read the links fdrake posted. They answer exactly this question. At the sort of accuracy they're talking about, two clocks would need to be in exactly the same environment. Put them in adjacent parking spaces and the difference in altitude will get them out of sync.
  • vesko
    19
    if you are in a space ship somewhere in the universe and you have no clock, how can you measure the time with some approximation?
  • vesko
    19
    the answer is as follows:
    A simple way can be measuring our pulse, which is a God-given interval we can use to measure time. Of course, the pulse is not constant, but we can take an average of 60 beats per minute (the minute being related to our measures on Earth, which we are aware of a priori).
    So the time is nothing else but a counting of repeated events done by humans.
  • noAxioms
    1.5k
    if you are in a space ship somewhere in the universe and you have no clock, how can you measure the time with some approximation?vesko
    the answer is as follows:
    A simple way can be measuring our pulse, which is a God-given interval we can use to measure time.
    vesko
    If you have a pulse, you have a clock. Lousy precision, but a clock nevertheless. You can time the boiling of your egg by counting heartbeats.
    So the time is nothing else but a counting of repeated events done by humans.vesko
    The counting can be (doesn't need to be) done by humans. The counting is not what time is. It is simply a human taking a measure of what time is. Plenty of non-human things utilize time measurement.
  • tom
    1.5k
    By taking the temperature of the Cosmic Microwave Background.
  • tom
    1.5k


    How do we know that? My watch's error can be detected by an atomic clock. How do we detect the error of an atomic clock? How do we know the ''1 second of error in 100 million years''?TheMadFool

    A physical process provides the definition of the second, the accuracy relates to the technology we have with which to measure that physical process.
  • vesko
    19
    I think that time is only a non-constant reflection in our human minds, and we humans can measure it, which animals etc. cannot.
  • Metaphysician Undercover
    13.1k

    The first article you referred to states that the measured frequency was found to remain stable for a month. How do you make a claim about the clock's accuracy over 100 million years from this?
  • fdrake
    6.6k


    Honestly I don't understand literally everything in the paper. I trust their error analysis. If you really want me to translate the error analysis in the paper to a more convenient form I could try, but not now.
  • Metaphysician Undercover
    13.1k
    Honestly I don't understand literally everything in the paper. I trust their error analysis. If you really want me to translate the error analysis in the paper to a more convenient form I could try, but not now.fdrake

    No need to do that. I just don't believe that it's possible to make a statement concerning the accuracy of a clock over a 100 million year time frame, when the activity which is used as that time-measurer has only been proven to be stable for one month.
  • fdrake
    6.6k


    I think you're mistaking the one measurement for another. Can you cite the passage?
  • Metaphysician Undercover
    13.1k

    One measurement for another? What do you mean by that?
    This is from the first article you referred to. The introduction, I believe.

    "Furthermore, two independent Sr clocks are compared and they agree within their combined total uncertainty over a period of one month.

    ...

    When the SrI and SrII clocks were compared over a period of one month, we found their frequency difference to be νSrII − νSrI = −2.8(2)×10^-17, well within their combined uncertainty of 5.4×10^-17."
  • fdrake
    6.6k
    They ran for a period of a month, and their fractional frequency difference was only 2.8 × 10^-17. That doesn't mean it's only proven to be stable for a month. Quite the contrary, the error accumulated in a month is so low that it's negligible.
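    For scale: a fractional frequency difference turns into an accumulated time offset when multiplied by the elapsed duration. This sketch (mine, not from the paper) shows why 2.8 × 10^-17 over a month is negligible:

```python
# Accumulated time offset = fractional frequency difference x elapsed time.
frac_diff = 2.8e-17
month_s = 30 * 24 * 3600          # ~2.6e6 seconds in a month

offset_s = frac_diff * month_s    # ~7.3e-11 s, i.e. tens of picoseconds
```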
  • TheMadFool
    13.8k
    Read the links fdrake posted. They answer exactly this question. At the sort of accuracy they're talking about, two clocks would need to be in exactly the same environment. Put them in adjacent parking spaces and the difference in altitude will get them out of sync.noAxioms

    There doesn't seem to be a law that clearly demonstrates true regularity of any physical process. Every clock is imperfect. All we've done is postpone the moment when our clocks will accumulate enough error to be noticeable. While this may be acceptable for everyday life in seconds, minutes, hours, days, months or years, we can't ignore it in doing science, where accuracy is vital.

    A physical process provides the definition of the second, the accuracy relates to the technology we have with which to measure that physical process.tom

    The physical process has to be regular. In my OP I mentioned how this is ''less'' of a problem with other quantities like length, mass, and volume, because we have a standard whose state has been specified. With time it's different, because we can never be sure of the regularity of a timepiece. We can't be 100% certain that one period of a cesium atom takes the same time as the next.

Welcome to The Philosophy Forum!