How NOAA Is Using Microwaves To Measure Tides
By John Oncea, Editor
Tides can have a marked effect on maritime-related activities. NOAA is using microwave radar water level sensors to measure water levels around the nation and help ensure that those activities are safe.
Predicting tides helps ships navigate safely through shallow-water ports and intracoastal waterways. Engineers depend on tidal schedules when planning and constructing bridges and docks. Increased marine traffic in channels leaves little room for error, requiring a better understanding of a channel’s depth during high and low tides.
“Tidal data is also critical to fishing, recreational boating, and surfing,” writes the National Oceanic and Atmospheric Administration (NOAA). “Commercial and recreational anglers use their knowledge of the tides and tidal currents to help them improve their catches. Depending on the species and water depth in a particular area, fish may concentrate during ebb or flood tidal currents.”
Measuring tides is not a recent need, either. People have been doing so for centuries with increasingly sophisticated methods developed over time. Here, we take a look at those methods, from visual observations on tide poles dating back to the 17th century to the use of microwaves today.
A Brief History Of Tide Measurement
People have been measuring tides for thousands of years and, in 325 BCE, Greek explorer Pytheas of Massalia reportedly took note of the tides of Great Britain and became the first to notice a correlation between tides and the Moon. Some 2,000 years later, at the tail end of the 1600s, the first systematic measurements of sea levels were made with tide poles.
These tide poles – along with all tide gauges – “are instruments that measure coastal sea level relative to the land on which they are grounded; hence, their recording quantity is termed as relative sea level,” writes ScienceDirect. “Because of their relevance for maritime navigation and harbor operation and safety, sea level measurements from tide gauges are among the longest geophysical instrumental records.”
“One of the disadvantages of tide staffs,” writes NOAA, “was that measurements could only be taken if an observer was present. This meant making tidal observations was time-consuming. Improvements in technology led to tide gauges that worked with increasingly less human intervention.”
The first of these improvements came about in 1851 when Joseph Saxton invented a self-recording tide gauge that used a pen and a rotating paper drum to record the changing water level. “While this gauge was not the first self-recording tide gauge, it was an improvement over existing instruments and was the type first deployed by the U.S. Coast Survey,” NOAA notes.
Saxton’s self-recording tide gauge represented a significant advancement in tidal measurement technology. This instrument utilized a float housed within a stilling well, which rose and fell with the changing tide. The float was connected to a wire that interfaced with a system of mechanical pulleys and gears, ultimately controlling the movement of a stylus across a strip chart.
As the tide fluctuated, the stylus traced a continuous graph on the strip chart, providing a visual representation of tidal changes over time. To ensure accurate time correlation, spring-wound clocks were employed to advance the strip chart at a consistent rate. This innovative design allowed for the creation of a detailed, time-stamped record of tidal movements, which could later be analyzed by tidal computers in Washington, D.C. The implementation of these self-recording tide gauges marked a significant improvement in both the quantity and quality of tidal data collected by the Coast Survey.
Saxton’s gauge eventually gave way to the Standard Tide Gauge, a more compact analog instrument that remained in use until the late 1960s. It utilized a float attached by wire to a gearing mechanism inside a stilling well, typically a 12-inch-wide pipe.
The gearing mechanism controlled a pencil’s position relative to a rotating drum covered with paper, creating a continuous record of tidal changes. This setup, housed in a stilling well, allowed for accurate readings even in the presence of waves. The system employed spring-wound clocks for precise timing, resulting in synchronized pencil tracings that depicted the rise and fall of tides.
Processing tide records was a manual task that involved careful analysis of the paper strip charts. Technicians would advance these charts across a desk using rollers and calibrated rulers to determine the times and heights of high and low tides, as well as hourly heights. These readings were then recorded on paper forms, from which monthly mean values of various tidal parameters were calculated by hand.
Portable versions of the Standard Tide Gauge were also developed for short-term deployments during hydrographic surveys, using wax marigrams that required weekly replacement and manual processing similar to their fixed counterparts.
Tide Measurement Enters The Digital Age
The Analog-to-Digital Recorder (ADR) tide gauge, introduced in 1966, marked a significant advancement in tide measurement technology. While retaining the float, wire, and stilling well components of its predecessor, the Standard Tide Gauge, the ADR incorporated innovative features that revolutionized data recording and processing.
The ADR utilized a punch paper tape system to record water levels at six-minute intervals, replacing the analog paper charts of earlier gauges. Solid-state battery-powered timers superseded the spring-wound clocks, improving timing accuracy. This new system generated computer-compatible data, which could be read by an optical reader and transferred to nine-track magnetic tape for computer processing.
The transition from Standard Tide Gauges to ADR gauges occurred gradually over a decade, from 1966 to 1976. ADR gauges remained in use until 2003, when NOAA completed its shift to the Next Generation Water Level Measurement System (NGWLMS). This system modernized every aspect of water level measurement: sensors, data collection, transmission, processing, and database management. Key features of the NGWLMS include:
- Primary sensor: A downward-looking acoustic sensor that sends acoustic energy down a PVC sounding tube and measures the travel time of the reflected signal to determine the water level.
- Backup sensor: A strain gauge pressure transducer that records data on a separate platform.
- Measurement frequency: The system takes a measurement every six minutes, each consisting of 181 water level samples taken at one-second intervals.
- Data transmission: Measurements are transmitted via GOES satellite every three hours, with additional telephone connections available for data retrieval and system interaction.
- Ancillary sensors: The system can handle up to 11 different oceanographic and meteorological sensors, measuring parameters such as wind speed, water temperature, and barometric pressure.
- Accuracy: The system is designed to meet NOAA's accuracy standard of ±0.01 foot or 0.2 percent of the effective stage for most applications.
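The six-minute measurement cycle described above can be sketched in a few lines of Python. The 181-sample average follows the NGWLMS description in the list; the three-standard-deviation outlier filter and the sample values are illustrative assumptions, not NOAA's actual processing code.

```python
import statistics

def six_minute_water_level(samples, sigma_limit=3.0):
    """Reduce 181 one-second water level samples (meters) to one
    six-minute measurement. Samples farther than sigma_limit standard
    deviations from the mean are discarded before re-averaging
    (the outlier filter is an assumed refinement for illustration)."""
    mean = statistics.fmean(samples)
    sd = statistics.pstdev(samples)
    if sd == 0:
        return mean
    kept = [s for s in samples if abs(s - mean) <= sigma_limit * sd]
    return statistics.fmean(kept)

# 181 one-second samples: a 1.50 m mean tide with small wave noise
samples = [1.50 + 0.02 * ((i % 7) - 3) for i in range(181)]
level = six_minute_water_level(samples)  # close to 1.50 m
```

Averaging over three minutes of samples smooths out wind waves and boat wakes, which is the same role the stilling well played mechanically in the older gauges.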
The NGWLMS represented a complete modernization of water level measurement technology, improving data collection, processing, and dissemination capabilities compared to earlier analog systems. It remained the standard until about a decade ago when NOAA introduced the microwave radar water level sensor — a revolutionary step forward in measuring water levels around the nation.
Measuring Tides With Microwave Radar
In the past, before the use of computers to record water levels, “tide houses” were constructed to house permanent tide gauges. These structures held the necessary instruments, including a well and a mechanical pen-and-ink recorder, while a tide or tidal staff was affixed outside.
The tide staff, serving as a large measuring stick, allowed scientists to manually record tidal levels. These manual observations were then compared to readings taken every six minutes by the recorder. Monthly maintenance was required for the tide houses and the data they recorded. During these sessions, scientists collected the data tapes and sent them to headquarters for manual processing.
“While similar in design to older tide houses, newer tide station enclosures are designed to protect sensitive electronics, transmitting equipment, and backup power and data storage devices,” writes NOAA. “The older stilling well has been replaced with an acoustic-sounding tube and the tidal staff with a pressure sensor. The new field equipment is designed to operate with the highest level of accuracy with a minimum of maintenance, transmitting data directly back to NOAA headquarters for analysis and distribution.”
Recently, NOAA began transitioning from acoustic water level sensors to the microwave radar water level sensor, a major step forward in how water levels are measured that will replace the hundreds of older acoustic sensors deployed around the nation.
The microwave radar sensor operates by emitting high-frequency radio waves that bounce off the water surface and return to the sensor, notes the American Meteorological Society. The time between emission and reception is used to calculate the distance to the water surface and determine the water level.
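In round-trip-time terms, the conversion just described amounts to a few lines of arithmetic. The sketch below assumes a hypothetical mounting geometry (sensor height above a zero datum); the speed-of-light constant and the halving of the round-trip time follow directly from the time-of-flight principle in the text.

```python
# Approximate speed of light in air (m/s); the pulse travels
# to the water surface and back, so the echo time is two-way.
C = 299_702_547.0

def water_level_from_echo(round_trip_s, sensor_height_m):
    """Convert a measured round-trip echo time into a water level
    relative to the sensor's reference datum.

    round_trip_s    -- time between pulse emission and echo reception
    sensor_height_m -- sensor height above the zero datum
                       (a hypothetical geometry for illustration)
    """
    distance_to_water = C * round_trip_s / 2.0  # one-way range
    return sensor_height_m - distance_to_water

# A sensor 10 m above the datum seeing a ~66.7 ns echo is looking
# through ~10 m of air, i.e., the water sits right at the datum.
```

Because the echo time for a few meters of air is tens of nanoseconds, the sensor's timing electronics, not the arithmetic, set the practical accuracy limit.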
Microwave radar sensors offer several advantages over acoustic sensors. Because they never contact the water, they are free of the hydraulic effects caused by pressure variations. They are also insensitive to temperature variations, have no submerged parts that require cleaning or maintenance, and can penetrate vapor layers and insulators that might absorb sound waves.
They are also more cost-effective and efficient than acoustic systems, with an accuracy of ±0.03% of the measured range and an average accuracy of around ±0.02 mm in NOAA CO-OPS use.
Typically mounted up to 131 feet above the water surface, microwave radar sensors must be aimed perpendicular to the water surface and require careful evaluation of the beam pattern to avoid interference from mounting structures. On the downside, the signal can be scattered or blocked by rain, ice, or floating debris; accuracy can be degraded by interference echoes from attachments or debris; and the variable surface-area footprint introduces a spatial filter effect.
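The range-dependent accuracy and the footprint effect mentioned above can be put in numbers. The 0.03%-of-range figure comes from the text; the 10-degree beamwidth below is an assumed value for illustration, since actual beamwidths vary by sensor model.

```python
import math

def range_accuracy_m(range_m, pct_of_range=0.0003):
    """Worst-case measurement error as ±0.03% of the measured range."""
    return range_m * pct_of_range

def footprint_diameter_m(height_m, beamwidth_deg=10.0):
    """Diameter of the radar footprint on the water surface for a
    conical beam (beamwidth_deg is an assumed figure). A wider
    footprint averages over more surface area, which is the
    spatial-filter effect noted in the text."""
    return 2.0 * height_m * math.tan(math.radians(beamwidth_deg) / 2.0)

# At the maximum ~131 ft (about 40 m) mounting height:
h = 40.0
err = range_accuracy_m(h)       # ~0.012 m worst-case range error
spot = footprint_diameter_m(h)  # ~7 m wide footprint on the water
```

The footprint grows linearly with mounting height, so a high bridge installation trades a larger smoothing footprint for the convenience of keeping the sensor clear of waves and traffic.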
NOAA is carefully transitioning from acoustic to microwave sensors across its National Water Level Observation Network. This involves:
- Extensive testing and comparison of sensors
- Running both old and new sensors side-by-side for 1-2 years before fully switching
- Field studies to compare performance against other sensor types
Overall, the microwave radar sensors offer significant advantages in terms of maintenance, cost, and performance for NOAA’s water level monitoring network. The transition is ongoing as NOAA carefully validates the new technology across different environments.