Often, the act of performing a measurement on a signal can alter the signal itself. For example, your measurement instrument could cause extra current to flow between the device under test (DUT) and the measurement tool, resulting in a signal that is slightly different from the original output.
Sometimes a slight change in a signal is fine. Other times, when you need a highly accurate measurement, you must minimize the impact your measurement devices have on the signal. Taking an accurate, high-quality measurement without compromising signal integrity is therefore a balancing act, and not always an easy one, because there is a lot to think about.
Considerations for Minimizing a Measurement’s Impact on a Signal
To minimize the impact of a measurement on a signal, first determine which quality of the signal matters most to your measurement. This helps you decide what to focus on preserving, such as the signal’s amplitude, phase, rise time, or frequency content. During a test, some signal measurements may not be necessary, and others may be more tolerant of variation. Therefore, before performing a measurement, it’s best to think about the following questions:
- How much added noise is OK?
- How much crosstalk is permissible?
- Is any amount of signal attenuation acceptable?
- Are there any common mode issues?
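One way to answer the noise question is to set an explicit signal-to-noise ratio (SNR) budget and check candidate measurement setups against it. The sketch below is illustrative only (the `snr_db` helper and the example voltage levels are assumptions, not from the original post):

```python
import math

def snr_db(signal_rms: float, noise_rms: float) -> float:
    """Signal-to-noise ratio in dB, computed from RMS voltages."""
    return 20 * math.log10(signal_rms / noise_rms)

# Example budget check: a 1 V RMS signal with 1 mV RMS of added
# measurement noise yields 60 dB of SNR.
print(snr_db(1.0, 0.001))  # 60.0
```

If your application needs, say, 70 dB, this kind of quick calculation tells you immediately that the setup above does not meet the budget.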
You also need to fully understand the physical implementation or source of your signal. Some of these considerations include the following:
- What type of connector is being used (electrical, pneumatic, etc.)?
- Are you measuring a pair of signals or a single signal referenced to ground?
- What is the impedance of the signal source? Signals with low source impedance are less susceptible to loading by the measurement instrument.
- What is the current or voltage range? Some signals are so small that even minor interference can corrupt the measurement.
- Can your measurement path be modeled as lumped elements or a distributed system?
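Two of the bullets above can be made concrete with quick back-of-the-envelope calculations: resistive loading of a source by the instrument’s input impedance (a simple voltage-divider model), and the common λ/10 rule of thumb for deciding whether a path can be treated as lumped. Both helpers and the example values below are illustrative assumptions, not from the original post:

```python
def loaded_fraction(r_source: float, r_input: float) -> float:
    """Fraction of the open-circuit source voltage the instrument sees,
    using a simple resistive voltage-divider model of loading."""
    return r_input / (r_source + r_input)

# 50 ohm source into a 1 Mohm scope input: loading is negligible.
print(loaded_fraction(50, 1e6))      # ~0.99995
# 100 kohm source into the same input: roughly 9% attenuation.
print(loaded_fraction(100e3, 1e6))   # ~0.909

def is_lumped(path_length_m: float, freq_hz: float,
              velocity_factor: float = 0.66) -> bool:
    """Rule of thumb: treat the path as lumped if it is shorter than
    about 1/10 of the signal wavelength in the cable."""
    wavelength_m = velocity_factor * 3e8 / freq_hz
    return path_length_m < wavelength_m / 10

print(is_lumped(1.0, 1e6))    # True: a 1 m cable at 1 MHz is lumped
print(is_lumped(1.0, 100e6))  # False: at 100 MHz, treat it as distributed
```

The second example is why the same 1 m cable can be ignored at audio frequencies but must be modeled as a transmission line at RF.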
It’s also important to think about the limits of the measurement instrument’s physical interface. For example, if you exceed a certain voltage or current, will you destroy your measurement instrument? Does it have any bandwidth restrictions to consider?
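For the bandwidth question, a widely used rule of thumb relates an instrument’s minimum analog bandwidth to the fastest 10–90% rise time it must preserve: BW ≈ 0.35 / t_r (assuming a single-pole response). The helper below is an illustrative sketch of that rule, not the original post’s recommendation:

```python
def min_bandwidth_hz(rise_time_s: float) -> float:
    """Rule-of-thumb minimum instrument bandwidth (Hz) needed to
    preserve a given 10-90% rise time: BW ~= 0.35 / t_r."""
    return 0.35 / rise_time_s

# A 3.5 ns rise time calls for roughly 100 MHz of bandwidth.
print(min_bandwidth_hz(3.5e-9))  # 100000000.0
```

In practice, many engineers add margin (e.g., picking an instrument with three to five times this bandwidth) so the instrument’s own response does not dominate the measured rise time.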
Similarly, you should think about any safety considerations that need to be addressed, such as those presented by working with high voltages or currents. To protect your employees and equipment, you may need isolation between the signal source and the measuring device. However, if isolation is necessary, it must be implemented in a way that does not interfere with signal integrity.
Another consideration, especially for RF measurements, is electromagnetic interference (EMI). Since EMI can disrupt other electronic devices and equipment, you don’t want these signals to leak out into the environment. Additionally, government agencies such as the FCC limit emission levels so that you do not interfere with other organizations’ signals. To prevent leakage, you should shield the signal you’re trying to measure in a way that doesn’t corrupt your signal.
Overall, performing a high-quality signal measurement is a balancing act with a lot of factors that can quickly change the quality of your signal. While this post is just scratching the surface, in general, if you can identify and address potential signal integrity issues quickly and early, you can ensure the quality of your measurements, and ultimately, of your products.
For more information on creating highly accurate RF test systems, read our white paper: Five Best Practices for Quality RF ATE Measurements