
NIST Time Scale Operation

1994 – 2000

The retirement of David Allan of “Allan variance” fame led to Howe’s reassignment in 1994 to NIST’s Atomic Time Standards Group, which Allan had formerly led. The Group is responsible for maintaining the SI physical standard of time interval, the second, and the national standard of time used as the basis of Coordinated Universal Time (UTC). NIST’s realization of Atomic Time improved during his seven-year tenure. The national time standard, known as UTC(NIST), is generated from a set of hydrogen masers that are calibrated against NIST’s primary cesium standard and coordinated worldwide to UTC at the BIPM in Paris through administered “steers” via two-way satellite time transfer and GPS.

At NIST’s highest level, atomic clocks and system noise must be characterized with world-class precision and accuracy. This requires a thorough understanding of standards, physical signals, various protocols, and a myriad of analysis techniques. In particular, he has intimate knowledge of, experience with, and problem-solving ability in reference standards and their measurement, specifically (1) random noise processes and uncertainties, (2) systematic or deterministic processes, (3) optimum prediction, (4) statistical combining, known as “ensembling,” and (5) synchronization methodologies.
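As an illustration of item (4), clock readings can be combined by inverse-variance weighting, so that quieter clocks count more toward the ensemble. The sketch below uses purely illustrative numbers and is not the algorithm actually used for NIST’s time scale:

```python
import numpy as np

# Hypothetical offsets of three clocks against a paper ensemble (seconds),
# and the estimated noise variance of each clock.  Values are illustrative.
readings = np.array([1.2e-9, 0.8e-9, 1.0e-9])
variances = np.array([4e-18, 1e-18, 2e-18])

# Inverse-variance weighting: the lowest-noise clock gets the most weight.
weights = (1.0 / variances) / np.sum(1.0 / variances)
ensemble_time = np.sum(weights * readings)
```

Here clock 2, with the smallest variance, dominates the weighted average, pulling the ensemble estimate toward its reading.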

Related publications:

SIGNAL AND NOISE ANALYSIS

Through in-depth, practical experience constructing and maintaining the U.S. national time standard, a unique skill, Howe developed world-class proficiency in the following subjects:

  • Statistical studies (population statistics, time-series analysis, etc.)
  • Variance, covariance, and auto-covariance analysis
  • Degrees of freedom and equivalent degrees of freedom
  • Statistical bias and signal-processing bias (for example, in servo systems, limited-live data, multipath, code correlations, etc.)
  • Cross- and auto-correlation functions, orthonormality, stationarity and ergodicity criteria
  • Confidence intervals, uncertainty, and error analysis
  • Regression analysis
  • Stochastic modeling
  • Applied probability
  • Estimation and prediction of systematic effects
  • Maximum likelihood estimation
  • Frequency response functions
  • Time, frequency, scale, and phase domain analysis
  • Computer simulation
  • Integral and differential vector calculus, numeric and analytic problem solving
  • Perturbation theory
  • Distribution functions (Gaussian (normal), chi-square, Student’s t, etc.)
  • Finite differences, structure functions, and interpolation techniques
  • Matrices and linear transforms
  • Characterization of noise (white, flicker, random-walk, etc.) and pseudo-noise codes (Gold, MLS, etc.)
  • Power measurements, power-spectrum estimation, and power-spectral density
  • Fourier transform techniques (discrete spatial, discrete time, continuous, fast Fourier, etc.)
  • Window functions (rectangular, Hamming, Hanning, cos-squared, Gaussian, etc.)
  • Leakage effects, truncated data effects, zero-padding effects
  • Wavelet analysis, transforms, mother wavelet functions, basis functions (Haar, Daubechies, Mexican-hat, etc.)
  • Modeling and prediction (using wavelet coefficients, Kalman, auto-regressive moving-average–ARMA, etc.)
  • Data compression and expansion algorithms, secure data, and data encryption techniques
  • Maximum entropy method
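Several of the items above (power-spectrum estimation, window functions, leakage, and zero-padding) can be illustrated with a short sketch. The sample rate, tone frequency, and window choices here are arbitrary, chosen only to make the leakage effect visible:

```python
import numpy as np

fs = 1000.0                          # sample rate, Hz (illustrative)
n = 1024
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 123.4 * t)    # tone between FFT bins, so it leaks

def psd(signal, window, nfft):
    """One-sided periodogram with a chosen window and zero-padding."""
    w = window(len(signal))
    spectrum = np.fft.rfft(signal * w, nfft)   # nfft > len zero-pads
    return np.abs(spectrum) ** 2 / (fs * np.sum(w ** 2))

nfft = 4 * n                          # zero-padding interpolates the spectrum
freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
p_rect = psd(x, np.ones, nfft)        # rectangular window: high sidelobes
p_hamm = psd(x, np.hamming, nfft)     # Hamming window: sidelobes suppressed
```

Comparing the two estimates far from the tone shows the Hamming window’s lower leakage floor, at the cost of a slightly wider main lobe.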

Related publications:

NEW DOCUMENT STANDARDS AND TESTS FOR STABILITY

Since the classical standard variance is not a suitable measure of oscillator and system instability, noise level, and noise type, the Allan variance, developed at NIST in 1965, has been the standard statistic for determining these fundamental parameters. Howe is lead author, with Dave Allan and Jim Barnes, of the paper entitled “Properties of Signal Sources and Measurement Methods.” This document has become a classic, referenced by members of the IEEE subcommittee on standards, the International Telecommunication Union, and other international governing committees that recommend publications for scientists interested in advanced, thorough treatments of time-series analysis as applied to communications and navigation systems involving oscillators and frequency standards.

The most significant impairment to assuring UTC(NIST) is the accuracy of, and confidence in, each clock in the Atomic Time scale. Beginning in 1995, he developed a new class of variances, called the TOTAL and THEO variances, which specify oscillator and network instability with unprecedented precision and accuracy, particularly for long averaging times and in the presence of data gaps, compared to any other statistic, including the sample Allan variance, which has poor long-term confidence. This advancement for national standards institutes, system developers, manufacturers, and users made his new variances an immediately recommended IEEE statistic for specifying oscillator-related system instability measurements [IEEE 1139-1999: Standard Definitions of Physical Quantities for Fundamental Frequency and Time Metrology – Random Instabilities]. These statistical tools have been used extensively in commercial software, user handbooks, and standardized clock statistics. They are long established as the best analytics for high-confidence, dynamic characterization of state-of-the-art atomic clocks and oscillators. US and Italian proposals are in place to formally add these statistics by a vote of the 2016 International Telecommunication Union (ITU) Standards Working Group 7 in Geneva.
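For reference, the overlapping Allan deviation that these newer variances build upon can be computed from phase data in a few lines. The function name and the synthetic white-frequency-noise demonstration below are illustrative, not NIST code:

```python
import numpy as np

def overlapping_adev(phase, tau0, m):
    """Overlapping Allan deviation from phase data (seconds), sampled
    every tau0 seconds, at averaging time tau = m * tau0."""
    x = np.asarray(phase, dtype=float)
    tau = m * tau0
    # Overlapping second differences of the phase record.
    d2 = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]
    avar = np.mean(d2 ** 2) / (2.0 * tau ** 2)
    return np.sqrt(avar)

# Demonstration: for white frequency noise, ADEV falls as tau**(-1/2),
# so quadrupling tau should roughly halve the deviation.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(0.0, 1e-9, 100_000))  # phase = integrated freq noise
adev1 = overlapping_adev(x, 1.0, 1)
adev4 = overlapping_adev(x, 1.0, 4)
```

A pure frequency offset (phase linear in time) yields zero Allan deviation, since the second differences vanish; this is one reason the statistic separates noise from deterministic offset.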

Related publications:

Howe has worked closely with the U.S. military and several contract companies on GPS since its inception; the system is a constellation of 28 satellites run by the Air Force. Besides being an outstanding military resource and civilian navigation aid, GPS is now a main synchronization resource for many worldwide communication networks and national standards laboratories. He has intimate knowledge of a variety of synchronization and calibration methods that use GPS, having done original research and developed many operational systems over more than 25 years. He has consulted for several companies regarding GPS’s effect on systems that rely on it, including the analysis of timing errors, selective availability, network stability, multipath and VSWR effects on code tracking, vulnerability to local jamming and spoofing, and even military and national emergency policies. Since GPS is often used to demonstrate and deploy certain military and civilian technologies (such as synchronization of the power industry’s generators and grids, web-based Network Time Protocol (NTP), wireless OFDM 4G LTE cell-site synchronization, time tags of stock transactions, space-vehicle tracking, and surveying), he has also done specific work in developing alternative resources to reduce dependence on GPS’s signal. These include:

  • Common view time transfer
  • Two-way time transfer
  • Specifying, estimating, improving, and/or correcting node-oscillator drift and aging
  • Ring laser gyro and various inertial navigation systems
  • Broadcast services (for example, WWVB, Loran, etc.)
  • Fixed satellite service
  • Laser ranging
  • Fiber optic data communications
  • Oscillator combining, developing and testing ensemble algorithms, and redundant systems analysis
  • Managed and administered real-time optimization of network topology
  • Automated computer time services
  • Real-time time-interval noise analysis (called “cluster” analysis) for isolating and correcting cause of noise in a system
  • Automatic seek of lock to highest available stratum reference
  • Wide area augmentation systems (WAAS) and local area augmentation systems (LAAS)
  • Differential GPS
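The principle behind the first item, common-view time transfer, can be sketched with synthetic numbers: two sites measure their local clocks against the same GPS signal at the same time, so differencing the two measurements cancels the common-mode satellite and path error. All values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
gps_error = rng.normal(0.0, 50e-9, n)        # common-mode GPS error, 50 ns RMS
clock_a = 10e-9 + rng.normal(0.0, 2e-9, n)   # site A clock: +10 ns offset
clock_b = -5e-9 + rng.normal(0.0, 2e-9, n)   # site B clock: -5 ns offset

meas_a = clock_a - gps_error   # site A records (clock A - GPS)
meas_b = clock_b - gps_error   # site B records (clock B - GPS)
a_minus_b = meas_a - meas_b    # the common GPS error cancels exactly
```

The recovered A−B difference is limited only by the two sites’ local measurement noise, not by the much larger GPS error, which is why common view links remote laboratories so effectively.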

Related publications:

COMPUTER SCIENCE AND RELATED SKILLS

The maintenance of time synchronization among standards laboratories is labor-intensive, requiring a great deal of manual work to set up equipment and collect data if these procedures are not automated. From January 1994 to May 1995, Howe authored a study of options for implementing a variety of computer architectures, with the ultimate goal of integrating the time from advanced worldwide frequency-standards laboratories, via two-way satellite time transfer, or TWSTT (see item 1, next section), as efficiently as possible into NIST’s existing atomic time-scale computer without sacrificing performance criteria. This required not only theoretical analysis, modeling, and optimization of computer-logic timing sequences, but also choices of architecture and programming language to achieve a maximum level of automation, speed, and uncompromised performance. His combined formal background in math, physics, and electronics enabled him to perform the computer science tasks needed to complete this study, as well as subsequent computer-related tasks using both PC and custom time-scale computers and FPGAs.

He has installed servers and contributed significantly to NIST’s Automated Computer Time Service (ACTS) and the Internet Time Service (ITS). The following is a list of various skills related to this work:

  • Proficient in major computer operating systems
  • Proficient in major computer languages
  • Proficient in major high-level computer programs

In addition, he has substantial knowledge in the operation of highly specialized test equipment and hardware built by the following manufacturers: Hewlett-Packard (now Keysight), Rohde & Schwarz, Fluke, Tektronix, and IBM, as well as any Intelsat-compliant satellite equipment, any FCC type-accepted transmitter and receiver, and any ISO 9000- and ITU-compliant measurement, test, and analysis equipment.
