This guest blog post is part of a series written by Edward J. Farmer, PE, ISA Fellow and author of the ISA book Detecting Leaks in Pipelines. To download a free excerpt from Detecting Leaks in Pipelines, click here. If you would like more information on how to purchase the book, click this link. To read all the posts in this series, scroll to the bottom of this post for the link archive.
When we analyze observations of processes, we think about changes in important parameters over time. A controller finds a flow to be low, for example, and moves a valve by some amount with the goal of quickly bringing the flow to the correct value.
Continuous or periodic adjustments steer observed parameters toward optimal values over time to accomplish our objectives. Generally speaking, a graph of this process would have some parameter of interest on the vertical scale, with “time” on the horizontal scale.
When we get into analyzing processes and designing methods to eliminate problems and optimize results, we use test inputs such as the impulse function and the unit step function to help us categorize the deeper characteristics of the occurrences we need to understand, what we can do with them, and how it might be done. Such analysis involves transitioning from the “time domain” into the “frequency domain.”
A long time ago a French mathematician and physicist named Jean-Baptiste Joseph Fourier developed a mathematical process (a transform) for decomposing occurrences in the time domain into a collection of single-frequency sinusoids (in the frequency domain), so that the same result can be characterized either way.
Essentially, the inverse process adds the contribution of each of the component frequencies to reproduce the original signal. Click this link to watch a video demonstration. This is the musical relationship between individual notes and the chord one hears when they are played together.
Transforming a time-domain depiction of a process variable exposes the magnitude and frequency of each of its frequency-domain component parts. The base shape of the time-domain signal usually results from low frequencies with wavelengths commensurate with the length of the time-domain presentation. Rapid turns, inflection points, sharp corners and edges, and other detail-oriented stuff are usually the product of higher frequencies.
Each component in a Fourier analysis is characterized by its frequency, its magnitude, and its phase. A rectangular pulse is a time-domain signal that rises instantly above a baseline to a finite value, continues for a time, and then returns instantly to baseline. The basic shape of its time-domain depiction is usually the result of a low-frequency sinusoid of commensurate wavelength. Sharper features (steep rise and fall times, sharp corners, and greater detail) involve contributions from higher frequencies. Essentially, a pulse without high frequencies in its spectrum looks more like a sinusoid.
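To make that concrete, here is a minimal NumPy sketch (an illustration added for this post, not an example from the book): rebuilding a rectangular pulse from only a few of its lowest-frequency components produces a rounded, sinusoid-like shape, while keeping many components restores the sharp edges.

```python
import numpy as np

# Sample a rectangular pulse: baseline 0, rising to 1 for the
# middle half of the window, then returning to baseline.
n = 1024
t = np.linspace(0.0, 1.0, n, endpoint=False)
pulse = np.where((t >= 0.25) & (t < 0.75), 1.0, 0.0)

# Decompose into frequency components.
spectrum = np.fft.rfft(pulse)

def rebuild(spectrum, keep):
    """Inverse-transform using only the lowest `keep` frequency bins."""
    truncated = np.zeros_like(spectrum)
    truncated[:keep] = spectrum[:keep]
    return np.fft.irfft(truncated, n)

smooth = rebuild(spectrum, 4)    # few low frequencies: rounded, sinusoid-like
sharp = rebuild(spectrum, 200)   # many frequencies: steep rise, flat top

# Root-mean-square error against the true pulse shows how much
# detail each reconstruction preserves.
err_smooth = np.sqrt(np.mean((pulse - smooth) ** 2))
err_sharp = np.sqrt(np.mean((pulse - sharp) ** 2))
```

The low-frequency reconstruction misses the edges badly, while the wide-band one tracks them closely, which is exactly the “pulse without high frequencies looks like a sinusoid” effect described above.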
Suppose one wanted to filter a time-domain signal in a precise way. The time-domain stream could be transformed to the frequency domain, specific frequency components mathematically removed, and the result converted back to the time domain.
This is the idea behind digital filtering. A low-pass filter, for example, could assign the magnitude of all frequency components above some cut-off frequency to zero, eliminating them. When the processed signal is transformed back to the time domain, the effect is apparent: the sharp, fast-response characteristics have been smoothed away.
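A minimal sketch of such a low-pass filter, using NumPy’s FFT routines (the function name and the 1 Hz/60 Hz signal values here are illustrative assumptions, not from the original post):

```python
import numpy as np

def fft_lowpass(signal, sample_rate_hz, cutoff_hz):
    """Zero all frequency components above cutoff_hz,
    then transform back to the time domain."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    spectrum[freqs > cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, len(signal))

# A slow 1 Hz process signal contaminated with 60 Hz noise.
rate = 1000.0
t = np.arange(0.0, 1.0, 1.0 / rate)
clean = np.sin(2 * np.pi * 1.0 * t)
noisy = clean + 0.5 * np.sin(2 * np.pi * 60.0 * t)

# With a 10 Hz cut-off, the 60 Hz noise is removed and the
# filtered signal closely tracks the clean one.
filtered = fft_lowpass(noisy, rate, cutoff_hz=10.0)
```

Hard zeroing of bins like this is the simplest possible filter; practical digital filters taper the cut-off gradually to avoid ringing in the time-domain result.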
Similarly, a high-pass filter can be created by assigning the coefficients below some cut-off frequency to zero while preserving all the others. A band-pass filter, of course, results from combining a high-pass and a low-pass filter, each with its cut-off frequency at the desired band edge.
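The same masking idea yields a band-pass filter in a few lines (again an illustrative NumPy sketch; the three tone frequencies are arbitrary choices):

```python
import numpy as np

def fft_bandpass(signal, sample_rate_hz, low_hz, high_hz):
    """Keep only frequency components between low_hz and high_hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0.0
    return np.fft.irfft(spectrum, len(signal))

rate = 1000.0
t = np.arange(0.0, 1.0, 1.0 / rate)
# Three tones: 2 Hz, 25 Hz, and 200 Hz mixed together.
mixed = (np.sin(2 * np.pi * 2 * t)
         + np.sin(2 * np.pi * 25 * t)
         + np.sin(2 * np.pi * 200 * t))

# A 10-50 Hz pass band leaves only the 25 Hz tone.
mid = fft_bandpass(mixed, rate, 10.0, 50.0)
```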
Filtering data can reveal a great deal about process conditions and about sources of noise in the measurement signals. Obviously, the process itself has some finite bandwidth, so frequency components beyond it aren’t really there, or aren’t the result of things in which we are interested.
Eliminating them (filtering them out) can improve clarity and reduce processing time. Shifts in baseline can be eliminated by setting the zero-hertz (DC) component’s amplitude to zero; all the dynamic characteristics of the process remain visible, without the shifting bias. A step-function test produces a pulse in the various measured outcomes, and their frequency-domain transforms disclose the effective bandwidth involved in each of them.
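Zeroing the 0 Hz component looks like this in a small NumPy sketch (the 3.2-unit bias and 5 Hz signal are arbitrary illustrations):

```python
import numpy as np

# A 5 Hz signal riding on a constant 3.2-unit baseline shift.
t = np.arange(0.0, 1.0, 0.001)
signal = 3.2 + np.sin(2 * np.pi * 5 * t)

spectrum = np.fft.rfft(signal)
spectrum[0] = 0.0                          # zero the 0 Hz (DC) component
debiased = np.fft.irfft(spectrum, len(signal))

# The dynamic 5 Hz content is untouched; only the bias is gone.
```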
Rounded and indistinct results indicate low-pass filtering, for example. When a control loop involves a measurement that is a long distance from the control device, there is often dead time between a change at the control device and seeing it in the measurement. This dead time introduces oscillation in the control system with a frequency related to the dead time. In the frequency domain this shows up as a large component with a stable frequency that occurs for no immediately apparent reason. Seeing such a thing, and relating its frequency to wavelength, provides evidence about the location and behavior of the precipitating equipment.
In pipeline work, it becomes apparent that long runs of line pipe tend to low-pass filter the fluid transport process. A change that looks fast and abrupt when observed near an event becomes smoother and less distinct as distance from the event increases.
The shape difference between such response curves can complicate accurate timing, which in turn affects control loop operation and time-interval-dependent calculations such as leak location.
Again, resolving the stochastic nature of such happenings, and the conditions in which they occur, emphasizes the importance of focusing on the underlying nature and characteristics of the process, not just the way the signals “wiggle” at different places. Once upon a time I patented an algorithm that estimated the distance from a measurement point to the event’s precipitating location based, essentially, on waveform degradation.
While it worked when enough was known about the fluid and the pipe, the stochastic nature of precipitating events, their locations, wave-travel differences, changes in fluid characteristics resulting from the event, and simultaneous random events made it difficult to count on commercially accurate and specific results.
The world we observe happens in the time domain, but many of its secrets and idiosyncrasies are easier to imagine and observe in the frequency domain.
Learn more about pipeline leak detection and related industry topics
Book Excerpt + Author Q&A: Detecting Leaks in Pipelines
How to Optimize Pipeline Leak Detection: Focus on Design, Equipment and Insightful Operating Practices
What You Can Learn About Pipeline Leaks From Government Statistics
Is Theft the New Frontier for Process Control Equipment?
What Is the Impact of Theft, Accidents, and Natural Losses From Pipelines?
Can Risk Analysis Really Be Reduced to a Simple Procedure?
Do Government Pipeline Regulations Improve Safety?
What Are the Performance Measures for Pipeline Leak Detection?
What Observations Improve Specificity in Pipeline Leak Detection?
Three Decades of Life with Pipeline Leak Detection
How to Test and Validate a Pipeline Leak Detection System
Does Instrument Placement Matter in Dynamic Process Control?
Condition-Dependent Conundrum: How to Obtain Accurate Measurement in the Process Industries
Are Pipeline Leaks Deterministic or Stochastic?
How Differing Conditions Impact the Validity of Industrial Pipeline Monitoring and Leak Detection Assumptions
How Does Heat Transfer Affect Operation of Your Natural Gas or Crude Oil Pipeline?
Why You Must Factor Maintenance Into the Cost of Any Industrial System
Raw Beginnings: The Evolution of Offshore Oil Industry Pipeline Safety
How Long Does It Take to Detect a Leak on an Oil or Gas Pipeline?
Pipeline Leak Size: If We Can’t See It, We Can’t Detect It
An Introduction to Operations Research in the Process Industries
The Enigma of Process Knowledge
Energy in Fluid Mechanics: How to Ensure Physical Line and Operating Data Are Consistent