When starting a design or looking at a test situation, knowing whether to take the digital or analog fork in the road can be quite difficult. As I mentioned last month, there is often a misplaced sense of accuracy attached to digital measurements or techniques.
Digital approaches suffer from both artifacts and granularity, which can be quite irritating in many applications. To really understand each design path, it helps to see what each brings as benefits and drawbacks to the design or test situation.
Analog techniques are continuous, high-accuracy, infinitely variable solutions, but they suffer from drift (especially temperature drift), and their data is hard to store and recall without degradation. They also have significant noise floor components that add in successive processing steps. Analog techniques offer the truest possible fidelity at the source, but cannot store or retrieve signals at that same fidelity, and have inescapable signal-to-noise limits that are often thermally related. Analog systems generally degrade gradually, and usually offer a usable "marginal zone" of degraded operation (ordinary AM or FM radio, for example).
Digital techniques are granular, discontinuous solutions with inescapable resolution errors, but they can be easily stored and retrieved without degradation, and copied or processed without degradation or accumulated noise floor problems. Digital techniques offer excellent reproduction, and potentially very high signal-to-noise ratios. But that replica also has uncorrectable error components linked to reference accuracy and bit resolution. Digital systems generally have catastrophic degradation, and have sharply demarcated operate/fail zones.
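The "resolution error" side of this tradeoff can be put in rough numbers. For an ideal N-bit converter, quantization noise limits the achievable signal-to-noise ratio to about 6.02N + 1.76 dB, which is easy to verify numerically. A minimal Python sketch (the sine-wave test signal and mid-tread converter model are illustrative assumptions, not from the column):

```python
import numpy as np

def quantize(signal, bits):
    """Round a [-1, 1] signal onto an N-bit uniform (mid-tread) grid."""
    levels = 2 ** (bits - 1)
    return np.round(signal * levels) / levels

# Full-scale sine at a frequency not synchronous with the sample clock,
# so the quantization error behaves like broadband noise.
t = np.arange(100_000)
x = np.sin(2 * np.pi * 0.01234 * t)

for bits in (8, 12, 16):
    err = quantize(x, bits) - x
    snr_db = 10 * np.log10(np.mean(x**2) / np.mean(err**2))
    theory = 6.02 * bits + 1.76
    print(f"{bits:2d} bits: measured {snr_db:5.1f} dB, theory {theory:5.1f} dB")
```

The measured figures land within a fraction of a dB of the 6.02N + 1.76 rule, showing how bit resolution alone caps the fidelity of any digital replica, no matter how good the rest of the chain is.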
It is also worthwhile to understand that when addressing most real-life situations in which an interface to signals is required, all digital systems will suffer some analog degradation in the conversion stage. Their benefit is that this occurs only at one stage, or in the worst case, twice if an analog output is also required. In this way, intermediate digital processing can provide improvements in long-term reproduction, noise, processing and drift, and is the principal attraction of modern techniques like music CDs.
Two areas that have proven very problematic recently are the crossover to digital radio techniques and to digital measurement instruments, such as oscilloscopes. Digital radio can potentially offer good data transfer, but is somewhat less stellar for voice communication, as many narrow-channel digital cell phone users are discovering.
The shift to digital radio is often motivated by shrinking bandwidth allocations or the desire for privacy. The problem is that voice processed at too low a sampling rate is often incapable of reproducing consonants and sibilants correctly (considerable high-frequency content is required to do this), resulting in voice transmissions of very low intelligibility. In addition, users often experience catastrophic loss of communication during signal fading or interference, as the encryption sequence or data stream is interrupted.
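The sampling-rate problem described above is the Nyquist limit at work: any energy above half the sampling rate does not simply vanish, it folds back into the band as something else. A short sketch (the 8 kHz channel rate and 7 kHz sibilant frequency are illustrative assumptions, not figures from the column):

```python
import numpy as np

# Hypothetical numbers: sibilant energy near 7 kHz, sampled at a
# narrow-channel rate of 8 kHz (Nyquist limit 4 kHz).
fs = 8000.0
n = np.arange(16)

samples = np.cos(2 * np.pi * 7000.0 * n / fs)

# The identical sample stream is produced by a 1 kHz tone: the 7 kHz
# content folds down to |fs - f| = 1 kHz and is unrecoverable.
alias = np.cos(2 * np.pi * 1000.0 * n / fs)
print(np.allclose(samples, alias))  # → True
```

Once the samples are taken, no amount of downstream processing can tell the 7 kHz sibilant from a 1 kHz tone, which is why under-sampled voice loses exactly the consonant detail that intelligibility depends on.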
There are many anecdotal accounts (such as the 1984 Olympic Games system failure) involving digital radio, especially encrypted digital radio, that highlight the marginal performance of such systems under real-world conditions. It is clear from tests and personal experience that this technology has much higher link failure rates than analog radio. But it is pressing ahead regardless, driven by many requirements for data transmission, which is really its best application.
Test equipment has made the shift to digital technology with mixed successes and failures. Digital multimeters and counters are runaway successes in most areas, offering good solutions for most applications and much more rugged operation than mechanical meters. Their main shortcoming has been peaking/nulling adjustments, where an analog meter movement is much easier to follow; many DVMs now include a supplemental "bar graph" display to address this. Higher-frequency RMS readings are also poorly implemented in most of these units, due to marginal RMS conversion in the A-to-D path.
The main digital technology failure has been oscilloscopes. Waveform fidelity is generally poor and very granular, and distinguishing noise or distortion from artifacts is almost impossible. Many earlier units, especially from HP, have bizarre specifications that seem to indicate wide vertical bandwidth. But in reality, they have very low sampling rates, making useful high-speed measurements quite difficult, unless simple repetitive waveforms are displayed.
The shift to digital techniques in oscilloscopes is largely motivated by cost concerns (high-quality CRTs and their support circuitry are expensive) and the desire for better portability, storage, or printing capability. An additional "shift to digital" pressure is that existing higher-frequency sampling designs are already largely digital; they can be translated easily into a fully digital display design, replacing the CRT system with an LCD or other display.
I would have to class many digital scope designs as failures, due to their many erroneous operating modes: aliasing (caused by too low a sampling rate), false noise (quantization error), waveform granularity (bit-depth limitations), and an almost total failure to catch transient phenomena (caused by successive waveform averaging). Many applications find them unusable because of these limitations, and few examples have been truly successful so far.
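The averaging failure mode is easy to demonstrate numerically: a one-shot transient that dominates a single sweep all but disappears once many sweeps are averaged together. A sketch with invented numbers (sweep count, noise level, and glitch amplitude are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical acquisition: 100 sweeps of 500 points each, baseline
# noise only, with a single large glitch on just one sweep.
sweeps = rng.normal(0.0, 0.05, size=(100, 500))
sweeps[37, 250] = 5.0  # one-shot transient, 100x the noise floor

averaged = sweeps.mean(axis=0)  # what an averaging scope displays

print(f"glitch in raw data:   {sweeps.max():.2f}")
print(f"glitch after average: {averaged.max():.2f}")
```

The 5-volt event shrinks by the sweep count to a bump of roughly 0.05, down at the noise floor of the displayed trace, which is exactly how an averaging scope can hide the transient you most need to see.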
Only Tektronix’s new DPO (digital phosphor oscilloscope) series has been able to do a credible job of replicating CRT performance through its clever use of intensity control shading, user control of averaging techniques, and gigasample data rates. These techniques give the DPO units very high utility beyond 300 MHz and relative freedom from misleading presentations. Tektronix’s family of scopes and the well thought out Fluke 123 (a low frequency portable design) are the only examples so far that seem to offer truly usable performance in their bench and field test categories.
Keep in mind that even these "best of class" designs cannot show real-time transient phenomena correctly, if at all, and if you really want to see what is happening in real-time, there is no complete substitute for an analog scope. We live in an analog world. Skillfully translating that world into useful designs and measurements is what really separates the men from the boys.
Walter Shawlee 2 welcomes reader comments and can be reached by e-mail at [email protected]