Difference between analog and digital readings
Hi guys, could someone help me out with this question?
If I were to take a load reading using
1. a standard analog clip-on ammeter
2. a digital true-RMS clip-on ammeter
what would the difference in readings be? Would the true-RMS reading be 25% more than the average reading?
It depends on the load. If the load draws a sine-wave current, both should be accurate, but if the load is non-sinusoidal and the current clamp cannot read true RMS, the reading will be inaccurate.
RMS is a way of measuring AC waveforms by comparing them to a DC equivalent. A true-RMS tester actually uses the heating effect to measure this, whereas a standard tester might measure the peak voltage and multiply it by 0.707 to calculate the RMS. An analog tester naturally responds to the average (rectified mean) value of the waveform, so its scale is simply calibrated to show RMS values rather than the average. The only time you might see a substantial difference between a standard analog clip-on ammeter and a digital true-RMS clip-on ammeter is if the circuit is supplied by a generator with a modified sine-wave output.
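To put some numbers on this, here is a quick sketch (illustrative only, not a spec for any real meter) comparing a true-RMS reading with an average-responding reading. The assumption is that the average-responding meter measures the rectified mean and scales it by the sine-wave form factor (about 1.1107) so it displays RMS correctly for a clean sine:

```python
import numpy as np

t = np.linspace(0, 1, 100_000, endpoint=False)  # one full cycle
sine = np.sin(2 * np.pi * t)                    # clean sine, 1 A peak
square = np.sign(sine)                          # heavily distorted load

# Form factor of a sine wave: RMS / rectified mean = pi / (2*sqrt(2))
FORM_FACTOR = np.pi / (2 * np.sqrt(2))  # ~1.1107

def true_rms(x):
    # what a true-RMS meter reads: root of the mean of the squares
    return np.sqrt(np.mean(x ** 2))

def average_responding(x):
    # rectified mean, scaled as if the waveform were a sine
    return np.mean(np.abs(x)) * FORM_FACTOR

for name, wave in [("sine", sine), ("square", square)]:
    print(f"{name:6s} true RMS = {true_rms(wave):.4f}  "
          f"avg-responding reads = {average_responding(wave):.4f}")
```

On the clean sine both readings agree (about 0.707 A for a 1 A peak), while on the square wave the average-responding meter over-reads by roughly 11%. Other waveforms can err in either direction, which is why a fixed 25% figure doesn't apply in general.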
If you have two testers that show a 25% difference on a load supplied by a clean sine wave, then it's time to send them in for calibration.