Thoughts on the Digital Revolution
#1
In the 1970s digital electronics began taking over from analog electronics. The most conspicuous form was the
personal computer, but it went far beyond that. At the time, I was working for a medical electronics company. One of
our products was a Cardiac Output Computer. It calculated cardiac output from a dye dilution curve using analog
multiplier-divider circuitry. The logic was mostly implemented in a large multi-wafer rotary switch with dozens of
interwired contacts. The result would appear on a large, old-school meter face which took up most of the front of the
instrument.

A bolus of dye was injected near the heart of a catheterized patient and sampled at a downstream artery with a pump
and a detector called a densitometer. The dye would appear as a peak with an exponential decay. Some of the dye
would take a shorter route back to the heart and would appear a second time, called recirculation. Recirculation would
cause the decreasing dye curve to reverse direction. Cardiac output was calculated from the area under the curve,
absent recirculation. The challenge was that the integration of the area under the curve got more and more complete as
time went on, but from the moment recirculation hit, the area under the curve no longer represented the cardiac
output. In order to estimate what the area would have been without recirculation, the integral representing the past
area was summed with a differential term representing the predicted area under the curve if the exponential decay
had continued.

The answer improves with time because the total area calculation consists more of the integral history and less of the
differential prediction. Then, when recirculation starts, the curve bends upward, and the differential term gets huge.
At that point the prediction becomes meaningless.
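
To make the arithmetic concrete, here is a rough sketch in C of that running estimate. It is not the original implementation, only an illustration: the sample interval, the injected dose, the clean single-exponential downslope, and all the names are assumed.

/* Running area estimate: integral history plus differential prediction.
 * c[] holds dye concentration samples in mg/L; DT is the sample interval
 * in seconds; both values are assumed for this sketch.                   */
#include <math.h>
#include <stdio.h>

#define DT      0.1    /* sample interval, s (assumed)    */
#define DOSE_MG 2.5    /* injected dye dose, mg (assumed) */

double estimate_total_area(const double *c, int n)
{
    /* Integral history: trapezoidal area under the samples so far. */
    double area = 0.0;
    for (int i = 1; i < n; i++)
        area += 0.5 * (c[i] + c[i - 1]) * DT;

    /* Differential prediction: while the curve is still a pure decay
     * c(t) = c0*exp(-k*t), the remaining tail area is c/k, and k can be
     * read off the local slope (k = -c'/c), so the tail is -c*c/c'.     */
    double slope = (c[n - 1] - c[n - 2]) / DT;
    if (slope < 0.0)
        area += -(c[n - 1] * c[n - 1]) / slope;
    /* Once recirculation bends the curve upward, slope >= 0 and the
     * prediction is meaningless -- the moment the needle hit the peg.    */
    return area;   /* mg*s/L */
}

/* Stewart-Hamilton relation: cardiac output = dose / area, scaled to L/min. */
double cardiac_output_lpm(const double *c, int n)
{
    return 60.0 * DOSE_MG / estimate_total_area(c, n);
}

int main(void)
{
    /* Synthetic curve: one-second rise, then exponential decay with no
     * recirculation.  Expect roughly 5-6 L/min with these numbers.       */
    double c[100];
    for (int i = 0; i < 100; i++) {
        double t = i * DT;
        c[i] = (t < 1.0) ? 5.0 * t : 5.0 * exp(-0.2 * (t - 1.0));
    }
    printf("estimated cardiac output: %.2f L/min\n",
           cardiac_output_lpm(c, 100));
    return 0;
}

As time goes on, the integral term dominates and the prediction term shrinks toward zero, which is exactly why the reading settled.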

The doctor injected the dye and watched the meter settle toward the answer; then the needle hit the peg when
recirculation arrived. The longer and steadier the needle held, the more confidence the doctor placed in the answer.
Simple enough, and quite intuitive.

One day the head of sales came into the engineering department with a nixie tube in his hand and said digital was the
new big thing. Sure, we responded, we could replace the big meter with a three-digit digital display. We did that, but it
wasn't quite the whole story. A changing digit tended to look like an 8, regardless of its value, because all the segments
lit up fast and dimmed down slowly. OK, that meant we needed to latch the answer, which didn't seem like it should be
too hard. We were wrong. If you latched too soon, you would get a substandard reading, and if you latched too late you
would have no reading at all. Plus, the doctor no longer knew how much confidence to place in the answer.
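
To show why the timing was so delicate, here is one naive latch rule, sketched in C. It is not what we actually shipped; the stability band, the hold time, and the names are all made up for illustration.

/* Naive latch rule: freeze the display once the reading has stayed inside
 * a small band for a fixed settling time.  Too loose a band or too short
 * a hold latches a premature number; too strict, and recirculation
 * arrives before anything ever latches.  Both thresholds are assumed.    */
#include <math.h>
#include <stdbool.h>

#define BAND_FRACTION 0.02   /* +/-2% stability band (assumed)       */
#define HOLD_SAMPLES  20     /* samples the band must hold (assumed) */

/* Feed successive cardiac-output estimates; returns true once latched
 * and writes the frozen value into *latched.                            */
bool latch_reading(double estimate, double *latched)
{
    static double last = 0.0;
    static int    stable = 0;
    static bool   done = false;

    if (done)
        return true;

    if (last > 0.0 && fabs(estimate - last) <= BAND_FRACTION * last)
        stable++;
    else
        stable = 0;
    last = estimate;

    if (stable >= HOLD_SAMPLES) {
        *latched = estimate;
        done = true;
    }
    return done;
}

Any choice of band and hold time trades a premature answer against no answer at all, which was exactly the bind we were in.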

The conclusion was that we needed to calculate a set of possible answers and evaluate them statistically to obtain a
confidence level. Enter a microprocessor. In those early days we had to design our own development system and create
our own tools, down to writing our own floating-point math routines in assembly language (there was no higher-level
language for microcomputers yet) for the statistics.
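
The general shape of the idea, again as a latter-day C sketch rather than the assembly we actually wrote: extrapolate from several different stretches of the downslope, treat each result as a candidate answer, and let the scatter of the candidates stand in for the confidence level. The pooling rule shown (mean plus coefficient of variation) is just one plausible choice, not the instrument's actual method.

/* Pool a set of candidate answers (e.g. extrapolations started from
 * different points on the downslope) into one reported value and a
 * confidence figure.  The statistic used here is an assumption.         */
#include <math.h>

/* Returns the mean of the candidates; *scatter gets the coefficient of
 * variation -- smaller means the candidates agree and the answer can be
 * trusted more.                                                         */
double pooled_answer(const double *candidates, int n, double *scatter)
{
    double mean = 0.0, var = 0.0;

    for (int i = 0; i < n; i++)
        mean += candidates[i];
    mean /= n;

    for (int i = 0; i < n; i++) {
        double d = candidates[i] - mean;
        var += d * d;
    }
    var /= n;

    *scatter = sqrt(var) / mean;   /* e.g. 0.03 means about 3% spread */
    return mean;
}

Even this much arithmetic was a real cost on a 1970s microprocessor running hand-rolled floating point, which is where the questions below came from.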

Complexity went through the roof. Do you iterate and rank different possible answers, then pick the best? Do you
report both an answer and a confidence level? How long will the doctor wait for your primitive computer to grind
before complaining to the salesman that it is too slow and the old analog instrument was better?

In the end, the switch to digital computation surely improved the results, but there was a long, uncomfortable period
between pure analog and mostly digital when a successful transition was no sure thing. It is easy to forget that, in
retrospect.

Tom Lawson
October 2020