Where to draw the line between analog and digital?
#1
Where, exactly, do you draw the line between analog and digital?

Analog describes the world before the middle of the 20th century. Digital electronics had not put in an appearance yet. Music was recorded as irregularities on vinyl. Files were collections of marks on paper stored in drawers. A photograph was made using light-sensitive film chemically processed in a darkroom. Then came the integrated circuit, ushering in the digital calculator and the home computer. Data morphed into ones and zeros, nibbles and bytes. The difference between analog and digital couldn't be more obvious. Right?

Not quite. When you look at a data sheet for a digital integrated circuit, you will find specifications that seem like they would be more at home in an analog context. Digital circuitry may be the realm of ones and zeros, not analog voltages, but what is the highest voltage that can still be considered a logic zero? What is the range of voltages that counts as a logic one? How does all that change when the power supply voltage changes? Once you quantify the decision points used at the inputs, how close to ideal are the output voltages? How much output voltage is still acceptable for a logic zero, and what exactly constitutes a logic one at an output? What about loading? How many inputs can you connect to one output before the output stops meeting its specifications? That number is called fanout. Early logic families were limited to single-digit fanouts, which could be a real limitation in practical circuits.
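
To put rough numbers on fanout, here is a quick sketch. The drive and load currents are illustrative, TTL-style figures I picked for the example, not values from any particular data sheet:

```python
# Rough fanout estimate: how many inputs one output can drive before it
# falls out of spec. All currents below are illustrative, not from a datasheet.

def fanout(i_ol_ma, i_il_ma, i_oh_ma, i_ih_ma):
    """Worst-case fanout is set by whichever logic level runs out of drive first."""
    low_side = i_ol_ma / i_il_ma    # output sinking current vs. per-input low-level load
    high_side = i_oh_ma / i_ih_ma   # output sourcing current vs. per-input high-level load
    return int(min(low_side, high_side))

# Example: 8 mA sink vs 0.4 mA per input, 0.4 mA source vs 20 uA per input
print(fanout(i_ol_ma=8.0, i_il_ma=0.4, i_oh_ma=0.4, i_ih_ma=0.02))  # -> 20
```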

Then, there is time. Nothing is instant. When an input changes, how long does it take for the corresponding output to respond? In a simple digital logic element, that delay is called a propagation delay. Then there is slew rate: how long it takes an output to transition from one logic level to the other. Is the transition time symmetrical in both directions? In a digital IC, a change at an input may pass through several propagation delays before a change appears at the output, and then you still have to allow for the slewing time. There is also temperature to account for. What happens if the circuit gets very hot or very cold? At some point it is going to stop working. All these factors are analog effects that can be used to argue that digital circuitry is just a form of analog circuitry dressed up in black and white to look binary. There is more than a little truth in those arguments.
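
A back-of-envelope timing sum makes the point. The delays and the hot-temperature derating factor here are made-up illustrative values:

```python
# Back-of-envelope path delay: each gate adds its propagation delay, and the
# final output still needs time to slew between logic levels.
# All numbers are illustrative values in nanoseconds.

gate_tpd_ns = [10.0, 12.0, 9.0]   # propagation delays through a chain of gates
transition_ns = 5.0               # time for the last output to slew to the new level
temperature_derating = 1.3        # assumed slowdown factor when the parts run hot

worst_case_ns = (sum(gate_tpd_ns) + transition_ns) * temperature_derating
print(f"Worst-case settling time: {worst_case_ns:.1f} ns")  # -> 46.8 ns
```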

On the other side, take the example of good old-fashioned light. You might first think a light wave is obviously analog, but you may remember learning something about photons. Photons are particles of light, and sometimes light behaves more like a stream of particles. A photon either exists or it doesn't. That would be digital, no? Or what about matter itself? Matter is made up of atoms, and atoms are particles, themselves made up of subatomic particles. Electrons jump from one orbit to another and do not occupy the spaces between. All these bits of accepted knowledge have a digital ring to them. Maybe what seems like the analog world around us is actually a digital world, with bits on the small side, so small that they all seem to run together?

Where does that leave the analog-to-digital converter? Or, for that matter, its sibling the digital-to-analog converter? None of the above changes the fact that, as a practical matter, we store and process digital and analog information very differently. Even so, in some cases the end results are nearly interchangeable. It hardly matters to the rods and cones in the eye whether an image is a high-quality digital image or an analog photograph. Maybe what counts is doing a good enough job of high-resolution digitizing that the distinction ceases to matter.
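
To give a sense of what "good enough" means, here is a trivial calculation of the quantization step for an N-bit converter. The 1 V full-scale range is just an assumption for the example:

```python
# How fine does digitizing have to be before the steps stop mattering?
# Quantization step for an N-bit converter over an assumed full-scale range.
full_scale_v = 1.0
for bits in (8, 12, 16, 24):
    step_v = full_scale_v / (2 ** bits)
    print(f"{bits:2d} bits -> step of {step_v * 1e6:10.3f} microvolts")
```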

Tom Lawson
Feb. 16, 2021
#2
The major difference between the two is that an analog signal is a continuous electrical signal, while a digital signal is a non-continuous (discrete) one.
#3
It is true that a theoretical digital signal is discontinuous, but the output of a digital gate is still a continuous voltage. Digital signals show up as continuous traces on an oscilloscope screen. So yes, an abstract digital process involves discrete ones and zeros, but digital circuitry is really an analog approximation of an abstract construct.

If digital circuitry were ideal, you could speed the clock up to arbitrarily high frequencies. In fact, the maximum clock frequency depends on how well the analog approximation matches the digital theory.
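
As a rough illustration (the numbers are made up, not from a data sheet), the next clock edge cannot arrive until the slowest signal path has settled:

```python
# Why the clock can't be arbitrarily fast: the clock period has to cover the
# slowest path plus the time the receiving flip-flop needs the data held stable.
worst_path_delay_ns = 42.0   # longest combinational path through the logic (assumed)
setup_time_ns = 3.0          # setup time of the destination flip-flop (assumed)
f_max_mhz = 1e3 / (worst_path_delay_ns + setup_time_ns)   # period in ns -> frequency in MHz
print(f"Maximum clock frequency ~ {f_max_mhz:.1f} MHz")   # -> ~22.2 MHz
```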