THE DIFFERENT ANSWERS ARE SEPARATED BY A LINE OF STARS

I used to handle the time calibration and all time scale conversions for the Rossi X-ray Timing Explorer, and from that experience I would very much counsel against any changes. I now have the same role for the Chandra X-ray Observatory, and the current determination method of UTC, combined with Bulletin C, is serving us well.

******************************************************************************

Given my previous employment at the U.S. Naval Observatory, I would note that over the past few years a number of key personnel doing Earth orientation related research and operational work to determine UTC (via VLBI and other techniques) have left and have not been replaced. It seems clear from afar that USNO's commitment to Earth orientation determination (including its commitment to operate the NEOS VLBI networks and to provide rapid service information to the IERS) is waning. I would strongly urge the IERS to examine this situation and determine whether USNO plans to continue to provide Earth orientation related information and services, and to determine whether other organizations (e.g. the U.S. NASA/GSFC VLBI group, or the U.S. NRAO) will be able to continue this work if USNO does not.

******************************************************************************

If you screw around with UTC, then it will be necessary to reinvent it.

******************************************************************************

We are very satisfied with the available information.

******************************************************************************

The only real question is whether or not we want astronomically determined time (UT1) to be in close agreement with civil time. If so, then some kind of adjustment is required every so often. The leap second is far superior to previous attempts to correlate UT1 and civil time.

******************************************************************************

I only need to determine the correct time of a measurement for comparison with other spacecraft and earth-based datasets. So long as we all agree on the convention, anything is fine with me.

******************************************************************************

Seconds should not be inserted June 30, only December 31, when people are already aware of possible roll-over problems. This may mean slightly higher UT1-UTC tolerance values are needed for some years.

******************************************************************************

Changing the definition of a second should not be considered as an option; it would break far too much, including changing all units derived from the SI second. Whatever option is chosen, it should be published well in advance of being implemented (say 3 years), to allow any assumptions in existing systems to be checked.

******************************************************************************

If we were going to change something about the leap second, it would be better to come up with an algorithm which predicts leap seconds. This could then be used by software rather than having to wait for a leap second adjustment to be announced.

******************************************************************************

Options c and d in question 4 are unacceptable. Navigation will still use UT1, autonomous systems will continue to use TAI, and options a and b will just make for fewer but larger corrections. The database of leap seconds is pretty small; I see no reason to try to make it smaller.
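The leap-second database mentioned above is indeed small. A minimal sketch in Python of such a table with a TAI-UTC lookup (the entries shown are an abbreviated sample of the real table; the names and layout are illustrative, not any standard format):

  from datetime import datetime

  # (effective UTC date, TAI-UTC in effect from that date), abbreviated;
  # the full table has one entry per leap second since 1972 and would be
  # refreshed from IERS Bulletin C.
  LEAP_TABLE = [
      (datetime(1972, 1, 1), 10),
      (datetime(1972, 7, 1), 11),
      # ...
      (datetime(1997, 7, 1), 31),
      (datetime(1999, 1, 1), 32),
  ]

  def tai_minus_utc(t):
      """Return TAI-UTC in seconds in effect at UTC time t."""
      offset = LEAP_TABLE[0][1]
      for effective, value in LEAP_TABLE:
          if t >= effective:
              offset = value
      return offset

  print(tai_minus_utc(datetime(2000, 6, 1)))   # -> 32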
******************************************************************************

Re: Computer Timekeeping and Leap Seconds

From a computing perspective, the biggest problem is that current systems do a terrible job of dealing with leap seconds. However, I think it's better to fix the software than to change the UTC system itself.

It's useful here to look at the history of computer timekeeping. Initially, most computers kept time in local time. This led to problems in comparing timestamps across timezones, as well as problems with time jumps and ambiguities at DST (Daylight Saving Time, a.k.a. Summer Time) changes. Hence some systems (notably Unix) began to use UTC as the internal timescale, with the timezone and DST being handled as part of the conversion to and from human-readable forms. At first, DST changes were based on a fixed (usually US) rule, and only integer offsets were allowed, but this was problematic for many countries. The next generation allowed the complete rule and (perhaps non-integral) offset to be incorporated into a long string in the TZ value, but that still didn't accommodate changes in the DST rule and/or offset in different years. The latest systems using the "zoneinfo" mechanism maintain the entire history of offsets and rules associated with any supported timezone.

Similarly, using UTC as the primary synchronized timescale suffers problems with "glitches" similar to DST changes (albeit much smaller), and even a system that adjusts for leap seconds correctly can't necessarily compute correct intervals for past times that cross leap-second boundaries. Just as systems evolved from LT to UTC as the primary timescale, I believe that they should evolve further from UTC to TAI (or TAI-K, where K is a constant integral number of seconds) as the primary timescale, with the appropriate TAI-UTC being taken into account as part of the input/output conversion process to or from UTC or LT. This does not mean that UTC and/or LT can't be used in appropriate contexts, but just that the basic "time counter" maintained by the OS should be unfettered by leap-second adjustments.

It's also worth noting that, just as a scalar representation of LT is undesirable due (at least) to DST ambiguities, a scalar representation of UTC is undesirable due to leap-second ambiguities. At the very least, the latter needs to include some form of "leap indicator" to disambiguate the final two seconds of a "leap-second day." This differs from the LI field of NTP packets, which would more appropriately be called "leap warning" than "leap indicator" (it would be far better to base NTP on TAI-K than on UTC, as it is currently defined). Naturally, a scalar representation of TAI-K is perfectly reasonable, even at high resolutions.

If such a system were introduced, the value of K could be chosen to initially synchronize TAI-K to UTC at some particular time, just as GPS time was initially synchronized to UTC as of Jan-1980 but tracks with TAI rather than UTC. A value of 32s would match UTC both right now and at the recent "millennium boundary" (regardless of which definition of "millennium" one uses :-)). Even if it took a few years to deploy such a system, the low value of d(TAI-UT1)/dt at this particular time means that the largest changeover discrepancy between TAI-32s and UTC at the time probably wouldn't be more than a couple of seconds.
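A minimal sketch of this arrangement in Python, assuming K = 32 s, an epoch of 1999-01-01 (when TAI-UTC was 32 s), and one invented future leap second (no real announcement is implied): the internal counter ticks uniformly, and leap seconds appear only in the output conversion.

  from datetime import datetime, timedelta

  EPOCH = datetime(1999, 1, 1)   # instant where TAI-32s == UTC
  # Internal-counter values at which inserted leap seconds begin;
  # hypothetical: one pretend leap second at the end of 2000-12-31 UTC.
  LEAP_STARTS = [63_158_400]

  def utc_label(t):
      """Render a uniform TAI-K seconds count t as a UTC time string."""
      leaps_before = sum(1 for s in LEAP_STARTS if t >= s + 1)
      if any(s <= t < s + 1 for s in LEAP_STARTS):
          # Inside an inserted leap second: display second :60.
          base = EPOCH + timedelta(seconds=t - leaps_before - 1)
          return base.strftime("%Y-%m-%d %H:%M:") + "60"
      return (EPOCH + timedelta(seconds=t - leaps_before)).strftime(
          "%Y-%m-%d %H:%M:%S")

  print(utc_label(63_158_399))   # 2000-12-31 23:59:59
  print(utc_label(63_158_400))   # 2000-12-31 23:59:60  <- leap second
  print(utc_label(63_158_401))   # 2001-01-01 00:00:00

Note that the internal counter is never stepped or halted; only the label changes, and the ambiguous second :60 exists only at the output boundary.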
Although I'm primarily discussing general-purpose computer systems here, I don't see why a similar approach (TAI-K internally, UTC or LT for humans) couldn't be used by the "telecommunication and navigational" systems referred to above. In fact, GPS already takes this approach by using TAI-19s as its internal timescale. And the "historical" aspect could be greatly simplified (two values of TAI-UTC with a changeover time) in systems that aren't concerned with times significantly different from the present. Other simple time-aware devices (e.g. VCRs) could do something similar, although most of them are too inaccurate to be concerned about leap seconds, anyway.

Re: Determination and Dissemination of Leap Seconds

I don't see the current six-month lead in knowing leap seconds as problematic (perhaps others do), but in keeping with the above, a philosophical change in the process would make sense. In the scheme I propose, general-purpose computer systems would have a complete historical table of TAI-UTC values as a function of TAI or UTC. These values are entirely well-determined for the past, and are also well-determined for some amount of time (currently on the order of six months) into the future. Since converting between TAI-K and UTC might be meaningful for any value of TAI or UTC (including those in the future), any information about predicted TAI-UTC values could be useful, even when it's not sufficiently accurate to be "frozen." Given completely accurate models and the complete lack of unpredictable perturbations, it would of course be possible to schedule leap seconds thousands of years in advance, but in reality the uncertainties are much higher. Nevertheless, even approximate predictions of TAI-UTC are likely to be more useful than merely assuming that TAI-UTC remains constant for all time after the next adjustment opportunity, so extending tentative TAI-UTC predictions considerably into the future would seem reasonable.

Thus, the scheme I would favor is one in which a complete leap-second "history" (covering the past and the future) is published (in a well-standardized form suitable for computers), and updated as required. The periodic announcements of leap seconds (or lack thereof) would be replaced (or augmented) by periodic announcements of which portion of the history is "frozen" versus "tentative." The distance into the future for which leap seconds are "committed" should be determined mainly by how far it *can* be done while meeting the |DUT1| goal, and that timeframe might vary with the particulars.

******************************************************************************

We only use it as a reference for all our computers. We do not worry whether it is accurate with respect to the earth's rotation.

******************************************************************************

Ideally, TAI (or TT) should be the standard, everywhere on Earth. Local times, DST, etc., should be abandoned in the end (but that is of course another discussion). The general availability of TAI (or TT) and deltaT is all that is needed.

******************************************************************************

[This suggestion can be considered as my SECOND choice; however, I still prefer the current system.] Announce leap seconds for 'brackets' of, say, 5-year periods (1 to 6 years into the future). Allow the UT1-UTC tolerance to be within +/- 2 seconds.
If a decision is made (under this system) for 3 leap seconds in the 5-year period concerned, but only 2 leap seconds were necessary, then subtract 1 from the prediction for the following 5-year period, in order for the 2 time scales to converge again. If there were too few leap seconds, then put too many in the next 5-year period. This will allow some short-term certainty, while still allowing for the instability of UT1. DUT1 transmissions would need an extra tone to indicate +1 or -1 second, on top of the present CCIR DUT1 code, but I feel that that can be accommodated fairly easily.

******************************************************************************

We could tolerate some of the proposed changes, but not all:

a. No leap second [NO]
b. Increase tolerance for |UT1-UTC| [YES]
c. Smooth over leap second step [YES]
d. Redefine the second [NO]

******************************************************************************

Refer to GPS solution.

******************************************************************************

Suggest that both UTC and TAI be available (GPS time for TAI). The user can thus choose which is more appropriate for his/her application.

******************************************************************************

I would like to have a web page with all the leap seconds and UT1-UTC from the beginning. Last year there was a page ftp://maia.usno.navy.mil/ser7/deltat.data but unfortunately it has been removed.

******************************************************************************

My main concern is to be able to correctly time-stamp real events with a clock synchronised to a GPS source. As UTC is defined, it is quite easy to derive TAI time from a GPS source and synchronize our computers to that time. When I have no concern with time-overlapping events, UTC is a correct approximation of legal time, which is fine for me. I am really happy with the current definition of UTC.

******************************************************************************

I would favour no change until the time difference is much larger, say 1 minute, and then 1 big change to correct things. I don't believe this would affect the "man on the street" in any way, as most people are not aware of leap seconds anyway.

******************************************************************************

A web page with a simple text table - with metadata in the header to point a lot of search engines there. Also list the UTC-GPS values. List values for the past decade or more so people working with historical data can easily get the information.

******************************************************************************

UTC is a convenient piecewise continuous scale related to what we see in the sky. The 1 second accuracy is appropriate for typical simple calculations of astronomical positions. I imagine if the time community stops providing it, the astronomical community will have to generate its own timescale with a definition equivalent to the current definition of UTC, leading to unwelcome proliferation or divergence of standards and standards bodies.

******************************************************************************

Many devices and programs are designed with the leap second in mind.

******************************************************************************

In no case should the second be redefined. This would result in a need to change all numerical values in celestial mechanics, planetary theories, etc.
If no leap second is introduced in the future, then it would indeed be better to broadcast TAI, not a sort of new UTC. However, in my opinion it would still be better to broadcast and use the Dynamical Time instead of TAI. The difference of 32 seconds between TAI and Dynamical Time resulted from a lack of coordination between astronomers and physicists around 1972. After all, the Dynamical Time (ex Ephemeris Time) is the time scale used by astronomers. Again, using a new UTC would result in an unnecessary complication, as we would have *three* "parallel" time scales: Dynamical Time, TAI, and UTC.

******************************************************************************

I think that any system that requires such precision in its measurement of time that the UTC leap seconds cause a problem should use TAI internally, converting TAI to UTC when it is necessary to present the time to humans. Since a conversion is already necessary at this point (from the internal form used for calculations to the m/d/y h:m:s form expected by humans, and probably to correct for the time zone as expected by most humans), adding the TAI to UTC conversion at this point would be relatively easy. (It might be more convenient to use TAI-32s, effectively using the current UTC and ignoring leap seconds, to avoid the 32 second jump to "real" TAI.)

Such systems would need to know the current value of UTC-TAI, of course. The correct method of obtaining this value depends on how long the devices that present the time to the humans are expected to operate correctly when they are not receiving information from an external source, and on whether it is necessary to continue to use existing data formats or devices.

For a device that is of little use if it is not receiving information from an external source (a handheld GPS unit, for instance) which uses a data format that is newly designed or can be modified to include UTC-TAI, the solution is simple: the external source transmits UTC-TAI at some suitable interval and the device uses it. If the device is expected to operate in isolation at times, it will usually be adequate to simply retain the last transmitted UTC-TAI value. Devices with this level of accuracy are not usually expected to be isolated for long, so the most that would need to be added would be an indication of whether a leap second will be added at the next opportunity to do so.

Devices which cannot receive UTC-TAI automatically will have a problem. Some of them will be able to use an alternative data path to get the data, but some will need to get it manually. Even those devices with an alternative data path may not have that path available at all times (a handheld unit that can be connected to a computer to get the data, for instance). If such devices will exist, it would be useful for the leap seconds to be forecast farther into the future than is currently done. I would suggest that the forecast extend to five years, updated every six months. The forecast for the next opportunity for a leap second would, as it is now, be definitive. The forecasts for the later opportunities would be subject to change, with the probability of change increasing with time. An attempt would be made to honor previous forecasts whenever possible. A device which was loaded with such a forecast, but whose operator neglected to obtain the latest forecast, would still function correctly for its primary purpose.
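A minimal sketch, in Python, of how such a device might consume a stored forecast (all dates, values, and the definitive horizon below are hypothetical, for illustration only):

  from datetime import date

  # Hypothetical five-year forecast of TAI-UTC, refreshed every six
  # months when the device can be connected; only the first entry is
  # definitive, later ones are tentative and may be revised.
  FORECAST = [
      (date(2001, 1, 1), 32),    # definitive
      (date(2002, 7, 1), 33),    # tentative
      (date(2004, 1, 1), 34),    # tentative
  ]
  DEFINITIVE_UNTIL = date(2001, 7, 1)

  def tai_minus_utc(today):
      """Return (TAI-UTC, is_definitive) for the given date."""
      value = FORECAST[0][1]
      for effective, v in FORECAST:
          if today >= effective:
              value = v
      return value, today < DEFINITIVE_UNTIL

  print(tai_minus_utc(date(2003, 3, 1)))   # (33, False): usable, tentative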
The human-readable time such a device presented would sometimes be one or two seconds off, but this is probably acceptable to an operator who does not get the latest data.

******************************************************************************

Please do not remove leap seconds from UTC. Please keep |UT1-UTC| <= 2 sec, or better yet keep it < 1 sec as it is now.

******************************************************************************

Keep it as it is.

******************************************************************************

We have a saying here in the US: 'If it ain't broke, don't fix it!'

******************************************************************************

UTC was introduced and adopted for civil timekeeping purposes in the early 1970s. It seems incredible that, in the meantime, any group could have built a system which requires accurate timekeeping yet neglects the operation of leap seconds. The telecommunications and navigational communities are stated to be the proponents of the abandonment of leap seconds, yet their arguments have never been satisfactorily explained to the rest of us.

On 2000/07/05 Demetrios Matsakis issued a report to the respondents of the previous leap second questionnaire in which he stated:

  A discussion of the motivations for a change and of possible solutions
  has been published by McCarthy and Klepczynski in the Innovations
  Section of the November, 1999 issue of GPS World.

How many people would have ready access to "GPS World"? He continues:

  The authors consider the most significant reason for a change to be
  keeping spread-spectrum communication systems and satellite navigation
  systems compatible with each other and with civil times. Another reason
  is the emerging need in the financial community to keep all computer
  time-stamps synchronized.

This is not an explanation! If all the world's astronomical observatories can keep their timekeeping in sync, why can't these others? And if it's so difficult for them to implement leap seconds, why don't they simply use TAI?

Matsakis states that one half of the respondents to the previous questionnaire were opposed to a change while one quarter were in favour (and the rest indifferent). In summarizing the arguments for either side he states that for those in favour:

  Along with responses based upon reasons already covered in the GPS
  World Article, there were also ...

Thus, in eliding the reasons covered in "GPS World", the vast majority of us who have no ready access to that journal still have not seen a substantive argument in favour of eliminating leap seconds. This would appear to be a farcical situation; a major change in civil timekeeping is proposed but the arguments in favour of it are secreted away in some obscure journal which appears to be focussed on GPS navigation! While a web search for this journal did uncover www.gpsworld.com, critically this site does not include the back-issue containing the article. Given the overwhelming ratio of 2-to-1 against dropping leap seconds in the last questionnaire (5-to-3 if you split the indifferents), I would also question why the issue has been raised a second time.

******************************************************************************

I have participated in the development and maintenance of several satellite orbit estimation/prediction/navigation systems over the last 23 years. Programmers who don't have a thorough understanding of astrodynamics and timekeeping have always had trouble coding leap second and UT1-UTC handling properly, requiring several iterations of design and review before it was correct.
One of the mature systems I worked with had problems built into it that hadn't been discovered for many years. Operationally, it is not always understood that input of the compatible UT1-UTC must accompany entry of the leap second. Elimination of future leap seconds, by either allowing UT1-UTC to grow negatively or just using TAI, would save on system development costs as well as maintenance, and simplify operational use.

******************************************************************************

Telecommunications and navigation users just have to adapt. This is not a problem for GPS navigation, because they use GPS time, with no leap seconds. The receiver clock bias relative to the GPS satellite clock (with general relativity rate change) is estimated along with the navigation position.

******************************************************************************

It works well; don't change it. I answered question 4-1 despite answering NO to question 4 because the questionnaire otherwise fails to provide an opportunity for those opposing any change to express a preference for the least disruptive change.

******************************************************************************

I suggest that you announce the leap seconds every four or five years. Then we'll adjust the leap seconds (not a leap second) just every four or five years.

******************************************************************************

Although I DO think the service is perfectly sufficient, when dealing with leap seconds in a Telescope Control System it becomes very awkward to reconcile the hardware-served UTC and the software-controlled DUT1. NTP does not do much to help this problem, though that is a software<->hardware problem.

******************************************************************************

Since I'm not professionally involved, I can't determine what is best. But if leap seconds continue to be used, it is interesting to know in advance when they will take place.

******************************************************************************

UTC, with leap seconds, should remain the primary high-accuracy standard, but the profile of TAI should be raised now that it is extremely accessible through GPS.

******************************************************************************

The second should not be redefined merely to eliminate "leap seconds" (indeed, since UT1 wanders unpredictably, this would not work anyway). Redefine the second ONLY if a more accurate clock someday replaces the cesium clock. Finally, do not modify the current system of time scales without very good reason. Astronomers tend to be reluctant to change, as evidenced by the continuing wide use of discarded time scale notations such as ET and TDT, and the persistence of cgs units despite the recommendation of SI units by the IAU and IUPAP. Updates are adopted with glacial slowness!

******************************************************************************

I have seen many organizations spend enormous amounts of time, effort, and money dealing with leap seconds. It is my opinion that Delta-UT1 should be allowed to drift indefinitely -- as long as the current offset is published, why does it have to stay between -1 and +1 seconds? It makes no difference to anyone in my field whether Delta-UT1 is 1 second, 10 seconds, or 100 seconds, as long as we know what it is, so that we can correctly compute the Earth's orientation.
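The last point can be made concrete: Earth orientation needs only UT1, and UT1 follows from UTC plus the published offset, whatever its magnitude. A sketch in Python using the IAU 2000 Earth Rotation Angle expression (the function and variable names here are ours):

  import math

  def earth_rotation_angle(jd_utc, dut1):
      """Earth Rotation Angle in radians from UTC and a published UT1-UTC.

      jd_utc : Julian Date of the event on the UTC scale
      dut1   : UT1-UTC in seconds, of any magnitude, as long as it is known
      """
      tu = (jd_utc + dut1 / 86400.0) - 2451545.0   # UT1 days from J2000.0
      frac = 0.7790572732640 + 1.00273781191135448 * tu
      return 2.0 * math.pi * (frac % 1.0)

Nothing in the computation cares whether dut1 is 0.5 s or 100 s; it enters only as dut1/86400 days.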
******************************************************************************

An increased tolerance on UT1-UTC might be essentially the same as no further leap seconds (for some time into the future). At some point, the question may then become moot, as the dissemination of time for civilian use may be handled quite differently.

******************************************************************************

NOTE: I have been dealing with leap seconds and earth rotation since the early 1970s (since shortly after the current definition of UTC was adopted). At that time, leap seconds were predicted about 6-8 weeks before their epochs. To accommodate a software configuration management process that required a longer lead time, I developed my own leap-second prediction software that used IERS (then BIH) raw data as input. This software also provided very accurate determinations of the constant and linear terms in the conversion between UTC and UT2. I know of a number of legacy software systems still in use for tracking and controlling space satellites, all of which implement the current discontinuous leap-second definition. Changing that definition could cost a substantial amount of money to revise that software.

******************************************************************************

Redefine the size of the SI second to eliminate the secular rate of change between TAI and UT1. The inaccurate size of the SI second for UTC seems responsible for the GPS apparent Y axis bias (which is not a true acceleration) and problems with the solar radiation model during GPS eclipse. Also, the leap second insertion requires manual processing within the GPS control segment to adjust GPS satellite observations with the modified GPS time for several weeks to get a consistent database before and after the leap second insertion, because the Kalman filter of the GPS control segment does not smoothly handle jumps in the Earth-Centered Earth-Fixed (ECEF) frame from rotation. The GPS control segment makes this intensive manual effort so that external GPS users do not experience such a jump in the calculations of the ECEF frame during navigation. Coincidentally, the judicious increase in the SI second interval that empirically eliminates the Y axis bias also compares favorably with elimination of the leap second (by eliminating the linear component of the secular drift between TAI and UT1).

******************************************************************************

The current standard allows for leap seconds to be issued as frequently as monthly. It will be many hundreds or thousands of years before a monthly sampling frequency is insufficient. On the other hand, the alternative of not issuing leap seconds would establish a trap for future generations and a challenge for current practitioners to handle the rapidly diverging UT1 and civil time. Increasing the tolerance for |UT1-UTC| would itself establish the likelihood of Y2K-like disasters as lazy programming practices fail to account for even larger discontinuities on a much less frequent basis. The proper answer is to establish a monthly leap second decision schedule that will require projects that require UTC to use it (and leap seconds) properly. Alternatively, projects that would benefit from unsegmented time should be encouraged to use TAI directly. Most of the problems this unwise initiative is trying to solve are the result of precision timing projects having chosen the wrong time scale in the first place. The civil population should not have to pay for the mistakes of these projects.
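The "many hundreds or thousands of years" figure can be checked on the back of an envelope, assuming the often-quoted tidal lengthening of the mean solar day by roughly 2 ms per century (all numbers below are order-of-magnitude assumptions):

  # Monthly insertion allows up to 12 leap seconds per year, i.e. an
  # average excess length-of-day of about 33 ms; today's excess is of
  # order 2 ms and grows by roughly 2 ms per century.
  excess_lod = 0.002            # s/day, current excess, order of magnitude
  growth = 0.002 / 100.0        # s/day per year
  capacity = 12.0 / 365.25      # s/day drift that 12 leaps/year can absorb
  years = (capacity - excess_lod) / growth
  print(f"monthly leap seconds suffice for roughly {years:.0f} more years")
  # -> roughly 1540 more years

This is consistent with the claim above, at least on its "thousands of years" end.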
******************************************************************************

In my research, we use GPS time signals to synchronize data streams from different gravitational-wave detectors and to be able to integrate over long time periods (~months) to search for continuous-wave sources. We use UTC only secondarily, to compare our data against data from other sources (e.g. the times of gamma-ray bursts, which are given in UTC). But we rely on the assumption that the GPS second has fixed length. The role of the second as a fundamental constant is so important to many fields of modern science that redefining it would be unacceptable. I suppose it would be possible to differentiate between "scientific" (TAI) and "civil" (UTC) seconds, but I do not find that appealing, and I believe it would cause much confusion, since UTC is commonly used in scientific research too. And would the GPS system distribute TAI or UTC?

In any case, distributing information about a time-varying conversion between the lengths of TAI and UTC seconds (option c above) is at least as inconvenient as distributing leap-second information. Also, in my experience, the inconvenience of leap seconds arises from having to deal with them at all, not from how frequent they are; thus option b above does not seem to provide any benefit.

I think that the task of keeping track of leap seconds would be made significantly easier if the IERS would provide a reliable, machine-readable file on the web (perhaps mirrored to a few additional locations, e.g. web sites of national time-keeping authorities) containing complete leap-second information. Issues of Bulletin C, and the table of TAI-UTC on the IERS web site, are fine for human consumption, but a text file with a well-defined format (and a commitment to keep it up to date!) would allow users to write software to get this information automatically and directly from the IERS, without manual intervention or intermediate file formats. Besides the leap seconds themselves, the file should indicate how long the information is assured to be valid (i.e. a date before which it is certain that no additional leap seconds will be introduced), and perhaps also the expected date of the next update. (I actually have written software to parse the text of Bulletin C, but it assumes that the format will not change much in future issues! An ASCII file with a fixed format would be much better.)

******************************************************************************

I believe the problems that people have with UTC are in all cases a problem of those users of UTC having poorly chosen which time scale to use. There is no real problem with the definition, determination, or operation of UTC. The problem is that almost nobody involved in the specification and design of information systems which keep timestamps is aware of the different timescales that they can choose from. In many cases, TAI would be a good choice. But out of ignorance, they choose UTC alone.

One thing that might help is to encourage those that distribute time signals to distribute, in one form or another, both UTC and TAI, and a complete (as far as is currently known, both historical and announced future) table of leap seconds. And documentation of such signals should present UTC and TAI with equal emphasis. This would force those who want to get accurate time to realize that they have a decision to make. Do they want a timescale that just keeps on ticking? (If so, they want TAI.)
Or do they want a timescale that easily and accurately corresponds to civilians' notions of what time it is? (If so, they want UTC.) In cases where they want both, then they probably should build a system that keeps both UTC and TAI and is able to map between them (and these days it should be no problem for anyone to muster the small bit of VLSI, software, and communication necessary to get this right). What is needed is more education of the user community, not any changes in how things are done.

******************************************************************************

It makes no sense to have UTC be just another version of TAI offset by some number of seconds. UTC should stay the way it is.

It seems to me that any change in the way UTC works will require international agreement. This will take time to achieve. The agreement itself will have to recognize that no change can be made to UTC without publishing the specifications of the change well in advance. The agreement will also have to recognize that any change to UTC cannot occur until after some significant period of years elapses, in order to give everyone time to retire old equipment and replace it with new systems that implement the change to the way UTC works. I would estimate that the likely timescale for such a process is on the order of decades, perhaps even as much as 30 years. If that is the case, the change to UTC will not be happening until the difference between TAI and UTC is about one minute. This will also occur at about the same time as the system clocks in 32-bit Unix computers will expire, and this will provide significant incentive for upgrading hardware.

I propose that UTC should stay as it is. I propose that in about 30 years, when TAI and UTC differ by exactly one minute, civil time be changed from UTC to TAI. The entire world can hold a colossal leap minute party to celebrate the event. It's not too early to start planning that party.

******************************************************************************

Although I would prefer TAI (answer 4.1.2), I assume 4.c is better for long-term use and continuity in the interpretation of UTC.

******************************************************************************

As I wrote above in 4-2, I believe that the current UTC determination method with leap second adjustments is the best.

******************************************************************************

Allowing UTC to drift from GMT would be a major problem for the astronomical community. Those who care know that UTC isn't exactly GMT, but the difference is small for most purposes. The current system seems the best compromise among the alternatives listed above.

******************************************************************************

The operational control system in use here is hard-coded with certain parameters set, which would inhibit our capability to change methodology or unnecessarily increase the work load.

******************************************************************************

Every time I am online, I contact DCF77 to adjust the clock in my PC. On my real desk I have a clock, assembled myself from a HOPF kit, which receives the time every second via longwave; and on my wrist is a JUNGHANS Mega 1. At all times I follow the correct time.

******************************************************************************

My main interest in this subject derives from my work with the current clock system of the GPS Block IIR satellites.
The GPS operation, which provides both navigation and time information, was started in 1980 and was designed from the beginning to work with the current UTC operation using leap seconds. As I understand it, the GPS system and its users work well with the current UTC operation.

I have heard allegations that the current definition of UTC time is related to the apparent Y axis bias problem as well as the modeling of solar radiation during eclipse. According to this view, these problems would be resolved by eliminating the secular drift between UTC and UT1 and changing the definition of the SI second. I concur with those who hold the opposing opinion that these issues would not be helped by eliminating the secular drift between UTC and UT1 and changing the definition of the SI second. I feel that the Y axis bias and the solar radiation issues are likely related to the fact that different sides of the GPS vehicles have differing temperatures, which give rise to radiation forces which modify the orbit. The Y axis bias and the solar-radiation-during-eclipse issues then arise because we have not been able to fully model the vehicle temperatures and emissivity. If radiation modeling is our problem, messing up the UTC definition will probably only hurt the GPS operation and not help it.

I have heard another allegation that the GPS Control Segment ground Kalman filter has problems with the occurrence of leap seconds and that extra effort is required at that time. My understanding is that the current ground Kalman filter is totally unaffected by the presence of leap seconds, because the Kalman filter uses GPS time, which does not have any leap seconds.

In summary, the current GPS system works well and has no problems with leap seconds. My concern is that changes might be made in the definition of UTC which would impact the operation of the GPS system and obsolete very expensive equipment such as the satellites and the millions of ground receivers which depend on GPS for navigation and timing information. On the other hand, there obviously are many users of UTC who are hurt by the present definition and would like to eliminate leap seconds or make them more predictable. One could make the leap seconds more predictable, so that equipment and software providers could design to a known set of leap seconds which would cover the life of the equipment. This new UTC leap second definition would depend on how well the astronomers can predict UT1 time and on an increase in the allowed absolute size of UT1-UTC.

******************************************************************************

As geophysicists, we need to know when, and that's it.

******************************************************************************

It would be highly desirable to have a real-time prediction of UTC.

******************************************************************************

UT1 is more appropriate than UTC for general purpose timekeeping in most cases, including calculations across long time spans, calculations that must be made in embedded systems (you generally can't supply updated leap second information to devices in the field), approximate timekeeping (e.g., wall clocks), and most things that try to keep track of calendar information. When UT1 is inappropriate (e.g., in a high precision operation), it is simpler to track an integral number of seconds (e.g., TAI).
Many applications track an integral number of seconds that are assumed to behave like UT1 (i.e., 86400 seconds per day) anyway (e.g., typical computer calendar calculations), and are forced either to put up with 0.9 s of inaccuracy or to convert to something like TAI anyway. Set the wall clocks and most computers to UT1. Use TAI exclusively when calendar information is not needed. Maybe we should rename the SI second to a "quantum" or something, use the label "second" for 1/86400 of a solar day, and treat them like two separate units (akin to degrees Celsius and kelvins). Almost all applications would be simpler (or more correct). Distributing accurate UT1 need be no more difficult than distributing leap second information.

******************************************************************************

In future, redefine the second.

******************************************************************************

For most computer users, synchronization of time stamps is far more important than accurate frequency and interval length. Simplicity of implementation (i.e., no rare special cases for application software) and wide availability of official reference time information (close relation to UTC/civilian time radio broadcasts) are equally important. Leap seconds (the special case 23:59:60) are not practical to implement widely in application software. TAI is not practical to be widely used, because it has no simple fixed relationship to UTC/civilian time.

It would be helpful if the custodians of UTC (IERS, ITU) formally introduced an additional timescale for computer applications that is identical to UTC except for a brief period surrounding a UTC leap second. This second timescale would smooth out the leap second by adjusting the clock frequency by, for example, 0.1% for 1 kilosecond, instead of inserting an out-of-range numbered second. Smoothing clock adjustments has a long tradition in computer operating systems (cf. the adjtime() system call in BSD Unix), and the official formal standardization of a smoothed variant of UTC would allow implementors to refer to and use "smoothed UTC" in a consistent and interoperable way.

"Smoothed UTC" would not affect time broadcasting, as any time signal receiver that knows classical UTC as well as short-term announcements of UTC leap seconds will be able to calculate "smoothed UTC" via a simple formula *locally*. In computer systems synchronized by some network time exchange protocol, the calculation of the "smoothed UTC" from the received normal UTC would typically be performed by the operating system kernel. Application software would never see 23:59:60, but would still enjoy almost all the benefits of strict synchronization to UTC.

The UTS time scale, a detailed and already widely discussed proposal of what the definition of such a smoothed variant of UTC could look like, is attached and is also available at http://www.cl.cam.ac.uk/~mgk25/uts.txt It never differs from UTC by more than 1.0 second, and even that can happen only during the 1000 seconds before a leap second. It can be implemented based on the leap second warning bits that most time broadcasting stations already transmit at least an hour in advance of any leap second. It is very similar to UTC, but less disruptive to software.
------------------------------------------------------------------------

Proposal for a Smoothed Coordinated Universal Time (UTS)
--------------------------------------------------------

Considering

a) that the definition of Coordinated Universal Time (UTC) in ITU-R Recommendation TF.460-4 relies on the astronomical observation of the UT1 time scale in order to introduce leap seconds,

b) that UTC is today the basis for almost all official national time scales and therefore widely available to users,

c) that most broadcast time signals provide UTC and rarely provide the uniform time scale International Atomic Time (TAI) or the current TAI-UTC difference,

d) that leap seconds and therefore TAI-UTC changes are announced by the International Earth Rotation Service (IERS) only six months in advance and therefore accurate information about them cannot be embedded practically as stable look-up tables into products,

e) that a very large and steadily increasing number of information systems have access to broadcast UTC time signals and use these to synchronize their internal clocks,

f) that broadcast UTC time signals commonly announce leap seconds at least one hour in advance but usually provide no other information about past or future leap seconds,

g) that numerous information and communication systems use an internal time scale based on a fixed length of the day of 86400 seconds, in which there exists no unique representation for points in time during an inserted UTC leap second, including the widely used POSIX time scale defined by ISO/IEC 9945-1:1996 in section 2.2.2.113,

h) that short-term time interval measurements and event scheduling on such systems are severely affected if UTC leap seconds are handled by either halting the internal clock during an inserted leap second or setting it back by a one second step after an inserted leap second, and by advancing the internal clock by a one second step at a deleted leap second,

i) that a temporary circa 0.1% modification of the speed of the internal clock near a UTC leap second, in order to provide a continuous mapping between the internal time representation and UTC, is far less disruptive on such systems than halting or resetting the system clock,

j) that a standardized recommended way of performing such a temporary modification of the speed of the clock in the internal time representation of information and communication systems will significantly improve the interoperability of such systems,

k) that the availability of such a standard could significantly reduce the need to modify the definition of UTC, as is currently under study by the International Union of Radio Science (URSI), the International Astronomical Union (IAU), the International Telecommunications Union (ITU-R) and the International Bureau for Weights and Measures (BIPM),

I suggest standardizing the UTS time scale defined in the following as a recommended basis for defining the internal time scales used on UTC-synchronized information and communication systems that do not provide a unique representation for points in time during inserted UTC leap seconds.

UTS shall be identical to UTC except during the last 1000 seconds of a UTC day that is shortened or extended by one leap second as announced by the International Earth Rotation Service in accordance with ITU-R Recommendation TF.460-4.

Inserted leap second

Whenever a UTC day is extended by an inserted leap second that lasts from 23:59:60 to 24:00:00, this leap second shall not appear on the UTS time scale.
The last 1000 seconds on the UTC time scale, from 23:43:21 to 24:00:00, will be represented on the UTS time scale by 999 modified seconds 23:43:21 to 24:00:00, each of which lasts 1000/999 seconds. The following table shows the exact and simultaneous display of a UTC and a UTS clock at various points in time near an inserted leap second:

  UTC           UTS
  23:43:20.000  23:43:20.000
  23:43:21.000  23:43:21.000  <- UTS starts to diverge from UTC
  23:43:22.000  23:43:21.999
  23:43:23.000  23:43:22.998
  23:43:24.000  23:43:23.997
  ... 995 seconds later ...
  23:59:59.000  23:59:58.002
  23:59:60.000  23:59:59.001  <- leap second starts
  00:00:00.000  00:00:00.000  <- leap second ends and UTS = UTC
  00:00:01.000  00:00:01.000

Deleted leap second

Whenever a UTC day is shortened by deleting a leap second that would otherwise last from 23:59:59 to 24:00:00, this leap second shall nevertheless appear on the UTS time scale. The last 1000 seconds on the UTC time scale, from 23:43:19 to 24:00:00, will be represented on the UTS time scale by 1001 modified seconds 23:43:19 to 24:00:00, each of which lasts 1000/1001 seconds. The following table shows the exact and simultaneous display of a UTC and a UTS clock at various points in time near a deleted leap second:

  UTC           UTS
  23:43:18.000  23:43:18.000
  23:43:19.000  23:43:19.000  <- UTS starts to diverge from UTC
  23:43:20.000  23:43:20.001
  23:43:21.000  23:43:21.002
  23:43:22.000  23:43:22.003
  23:43:23.000  23:43:23.004
  ... 995 seconds later ...
  23:59:58.000  23:59:58.999
  00:00:00.000  00:00:00.000  <- leap second skipped and UTS = UTC
  00:00:01.000  00:00:01.000

Algorithmic representation

In the algorithmic description below, the following variables appear:

  UTC            refers to the time given in the time signal received from
                 a Coordinated Universal Time reference clock

  UTS            refers to the resulting calculated UTS time that
                 corresponds to UTC

  midnight(UTC)  refers to the time of the start of the day identified in
                 UTC, that is the date of UTC with the time set to 00:00:00

  leap_warning(UTC) =  1 if the insertion of a leap second has already been
                         announced for the end of the day identified in UTC
                      -1 if the deletion of a leap second has already been
                         announced for the end of the day identified in UTC
                       0 otherwise

Where a variable represents a time, this representation is in the form of the time interval, measured in seconds, between some unspecified epoch and the point in time referred to. Constants of the form hh:mm:ss.sss refer to a time interval measured in hours, minutes and seconds. Note that 23:59:60 = 24:00:00 = 86400 s and 00:16:40 = 1000 s.

On reception of a UTC time, the corresponding UTS time can be determined by applying the following algorithm:

  if UTC - midnight(UTC) >= 23:59:60 then
    -- we are in the middle of an inserted leap second (i.e., the
    -- current UTC day is already longer than 86400 seconds)
    UTS := midnight(UTC) + 23:59:59.001 +
           (UTC - midnight(UTC) - 23:59:60) * 999/1000;
  else
    -- if not, calculate seconds remaining until the end of the day
    R := midnight(UTC) + 24:00:00 + leap_warning(UTC) * 00:00:01 - UTC;
    -- if less than 1000 seconds remain, start interpolation
    if R < 00:16:40 then
      UTS := UTC - (00:16:40 - R) * leap_warning(UTC) / 1000;
    else
      UTS := UTC;
    end if;
  end if;

This algorithm uses only information present in the currently available UTC time signal, which has to include a leap second announcement that starts to be available at least 1000 seconds before the end of the UTC day.
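For concreteness, a direct transcription of the above algorithm into Python follows; it is a sketch under the same conventions as the pseudocode, with all times expressed as plain seconds counted from an arbitrary epoch, and with midnight() and leap_warning() supplied by the receiving system:

  def uts_from_utc(utc, midnight, leap_warning):
      """Map a received UTC time to UTS (times in seconds from an epoch)."""
      day = utc - midnight(utc)          # seconds since start of the UTC day
      if day >= 86400.0:
          # in the middle of an inserted leap second (the current UTC day
          # is already longer than 86400 seconds)
          return midnight(utc) + 86399.001 + (day - 86400.0) * 999.0 / 1000.0
      # seconds remaining until the end of the (possibly adjusted) day
      r = midnight(utc) + 86400.0 + leap_warning(utc) * 1.0 - utc
      if r < 1000.0:                     # inside the 1000 s interpolation
          return utc - (1000.0 - r) * leap_warning(utc) / 1000.0
      return utc

Checked against the tables above: with leap_warning = +1, a UTC of 23:59:59.000 maps to 23:59:58.002 and 23:59:60.000 to 23:59:59.001; with leap_warning = -1, a UTC of 23:59:58.000 maps to 23:59:58.999.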
Rationale for some of the design choices:

- UTS uses linear interpolation because this avoids discontinuities and is simpler to describe, understand, and implement, and also more efficient to compute, than interpolation using for instance higher-degree polynomials. UTS is not primarily intended to be used for controlling large masses (e.g., the phase and frequency of generators in power plants), where a more complex interpolation technique (e.g., B-splines) that minimizes control forces might be preferable.

- Stretching the linear correction over 1000 seconds leads to nice and intuitive decimal UTS display values at the start of UTC seconds. This would not be the case if this correction time interval were an integral number of minutes. Requiring only 1000 seconds advance notice for an upcoming leap second leaves plenty of time for error checking and correction in a potentially noisy low-bandwidth time signal that starts to announce leap seconds 59 minutes before the end of the day (e.g., DCF77). Stretching the linear correction over only 100, 10, or even 2 seconds would have increased the potential error in time interval measurements and with it the number of applications that cannot tolerate this error.

- Keeping the time interval in which UTS differs from UTC entirely before the end of the day ensures that UTS can be accurately determined in real time using the type of leap-second announcement message that will disappear from the broadcast signal immediately after the end of the leap second. On the other hand, the disadvantage of this choice is that UTS diverges from UTC by up to 1 second. In contrast, centering the correction time interval around the leap second would limit the maximum divergence between UTS and UTC to 0.5 seconds.

Additional remarks

UTS is primarily intended as the basis for defining the internal clock representation used in information systems that have problems handling UTC leap seconds. Time service broadcasts and definitions of national and regional civilian reference times are not affected by this proposal and should continue to be based on UTC. The conversion from UTC to UTS is expected to happen only inside an end system, not in reference clock hardware or at a time service broadcasting station.

Users of systems with an internal time representation based on UTS will typically be shown UTS-based and not UTC-based times, which seems tolerable in practice, especially if UTS becomes a formally widely recognized standard for this purpose and the use of UTS instead of UTC as a basis for the synchronized system time is properly documented. It should be noted for comparison that a loss of connection with a UTC reference clock for just a few days will lead to more than one second of divergence from UTC with typical commercial crystal oscillators without temperature control.

If a system designer should nevertheless decide to broadcast a UTS-based reference time signal, then this time signal (like most UTC time signals) should include an announcement of forthcoming inserted or deleted UTC leap seconds, starting at least 59 minutes in advance. This information is essential to allow a receiver of a UTS time signal to distinguish normal seconds from 0.1% modified seconds and to allow it to convert a UTS time signal back into a UTC time signal.

Comments, endorsements, and suggestions on how to proceed with the potential formal standardization of this proposal by a relevant international body (ITU, IERS, IAU, BIPM, etc.) are welcome and solicited.