Unlike us, microprocessors have not grown up with the idea that 10 is
a convenient number of digits to use. We have taken it so much for
granted that we have even used the word digit to mean both a finger
and a number.
Microprocessors and other digital circuits use only two digits – 0 and
1 – but why? Ideally, we would like our microprocessors to do
everything at infinite speed and never make a mistake. Error free or
high speed – which do you feel is more important?
It’s your choice, but I would go for error free every time, particularly
when driving my car with its engine management computer or when
coming in to land in a fly-by-wire aircraft. I think most people would
agree.
So let’s start by having a look at one effect of persuading microprocessors
to count in our way.
The noise problem
If the input of a microprocessor is held at a constant voltage, say 4V,
it would appear as in Figure 2.1.
If we try to do this in practice, careful measurements would show
that the voltage is not constant but wanders continuously above and
below the mean level. These random fluctuations are called
electrical noise, and they degrade the performance of every electronic
circuit. We can take steps to reduce the effects, but preventing noise
altogether has, so far, proved impossible. We can see the effect by
disconnecting the antenna of our television. The noise causes random
speckles on the screen which we call snow. The same effect causes an
audible hiss from the loudspeaker. The effect of noise is shown in
Figure 2.2.
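This is why two levels beat ten. With only two voltage levels, the gap between them is wide, so noise has to be enormous before a 0 is mistaken for a 1. Squeeze ten levels into the same voltage range and the gaps shrink, so the same noise causes frequent misreadings. The following Python sketch illustrates the idea; the 0–5 V supply and the ±0.3 V noise amplitude are illustrative assumptions, not figures from the text.

```python
import random

random.seed(1)

SUPPLY = 5.0   # assumed supply range: signals lie between 0 V and 5 V
NOISE = 0.3    # assumed noise amplitude: readings wander by up to +/-0.3 V

def decode(voltage, levels):
    """Map a measured voltage back to the nearest of `levels` equally
    spaced signal levels, returning the level's index (0, 1, 2, ...)."""
    step = SUPPLY / (levels - 1)
    return round(voltage / step)

def error_rate(levels, trials=10_000):
    """Transmit random digits as voltages, add noise, decode, and
    report the fraction that were misread."""
    errors = 0
    for _ in range(trials):
        digit = random.randrange(levels)
        ideal = digit * SUPPLY / (levels - 1)        # intended voltage
        noisy = ideal + random.uniform(-NOISE, NOISE)  # what we measure
        if decode(noisy, levels) != digit:
            errors += 1
    return errors / trials

print("2-level (binary) error rate: ", error_rate(2))
print("10-level (decimal) error rate:", error_rate(10))
```

With two levels the gap is 5 V, so ±0.3 V of noise never crosses the halfway point and the binary error rate is zero; with ten levels the gap is only about 0.56 V, and the same noise regularly pushes a reading past the midpoint into a neighbouring level.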