The Fundamentals of Digital Electronics, Part 5 - The Microprocessor

Part 4 of this series got us to the digital calculator, the first widespread application of highly integrated digital logic. A calculator chip is dedicated to a single purpose: input comes from a keypad, output goes to a display, and the processing is basic arithmetic. The success of the pocket calculator was due to small size, low cost, and ease of use. The next big digital development was the generalized miniature digital processor, which could be programmed to do almost anything. One of the first high-volume applications for microprocessors was gasoline pumps. When putting gas in your car, you might not notice the difference, but the new pumps were cheaper, more reliable, and more flexible.

All that flexibility brought a huge step forward in possibility, but a big step back in the convenience of putting these new devices to work. Mainframe computers of the day used paper tape, punch cards, and modified electric typewriters for input. For output there was usually a noisy, wide-bed printer that required special fanfold, pin-feed paper. None of those input or output devices were practical for use with microprocessors. CRT computer monitors for displaying text didn't appear on the market until the 1970s. (It is easy to forget that television sets had only been available since the 1940s.) So when microprocessor chips became available and affordable at about that time, how could a person put one to use?

In the 1970s I was working at a medical electronics company, one of whose products was called a cardiac output computer. The computations were all analog, but the marketing department wanted trendy digital computation. We didn't have the budget for a rarefied development system with a CRT display, even if one had been available. Instead, phase one was designing and building an affordable development system that we could get up and running quickly.

We used a National Semiconductor SC/MP, the first microprocessor selling for under $10. The processor was an 8-bit device; that is to say, data was handled 8 bits at a time. Memory was a block of 8-bit storage registers addressed by what is called an address bus. The SC/MP address bus was 16 bits wide, allowing access to 65,536 storage registers. The memory itself was not included, and memory chips were expensive. We had under one kilobyte of physical memory actually in place for data storage. The central processing unit understood 56 simple instructions. Some instructions were arithmetic, like ADD, some were logical, like OR, and some were for flow control, like JZ (jump if zero).
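To make the idea concrete, here is a toy fetch-and-execute loop written in C for just three invented instructions (ADD, OR, and JZ) plus a Halt. The opcode values, the tiny 16-byte memory, and the little test program are all made up for illustration, not the real SC/MP encodings, but the shape of the loop is the same: fetch an instruction, decode it, act on the 8-bit accumulator, and move the program counter.

    /* Toy fetch-decode-execute loop.  Opcode values are invented for
       illustration; they are not real SC/MP encodings. */
    #include <stdio.h>
    #include <stdint.h>

    enum { OP_HALT = 0, OP_ADD = 1, OP_OR = 2, OP_JZ = 3 };

    int main(void)
    {
        uint8_t mem[16] = {
            OP_ADD, 5,      /* accumulator += 5                        */
            OP_ADD, 0xFB,   /* accumulator += 251 (wraps back to 0)    */
            OP_JZ,  8,      /* if accumulator == 0, jump to address 8  */
            OP_HALT,
            0,
            OP_OR,  0x0F,   /* accumulator |= 0x0F                     */
            OP_HALT
        };
        uint8_t  acc = 0;   /* 8-bit accumulator      */
        uint16_t pc  = 0;   /* 16-bit program counter */

        for (;;) {
            uint8_t op = mem[pc++];             /* fetch  */
            if (op == OP_HALT) break;
            uint8_t operand = mem[pc++];
            switch (op) {                       /* decode and execute */
            case OP_ADD: acc += operand;           break;
            case OP_OR:  acc |= operand;           break;
            case OP_JZ:  if (acc == 0) pc = operand; break;
            }
        }
        printf("accumulator = 0x%02X\n", (unsigned)acc);
        return 0;
    }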

We put LEDs on the data bus. That worked pretty well for an 8-bit word. With practice, you can mentally translate back and forth between binary and decimal when dealing with 8 bits at a time. But 16 LEDs on the address bus would have been too much to handle. We built a 13-bit binary-to-BCD encoder out of a pile of discrete TTL-family logic encoder chips (a software sketch of that conversion appears below). Low-power TTL wasn't available yet. The encoder enabled a 4-digit, 7-segment LED display to show the lower 13 bits of the address bus, up to location 8191, in decimal. Eight telephone switches allowed for data entry. (Telephone switches are obsolete now, too. I imagine most people today have never seen one.) A button caused the processor to execute the current instruction, which is called single-stepping. We entered the program code one binary byte at a time. Care was essential, since a mistake could send you into a state beyond recovery. A car battery backed up the memory. If power was lost, you had to start over, since there was no non-volatile storage like a cassette tape or a disk drive. Even if we could have afforded a disk drive, there was no operating system, so it wouldn't have been any use without a lot more work. Non-volatile memory was then available in the form of PROM, Programmable Read-Only Memory. You only got one try; a PROM with a mistake in it was useless. EPROM, or erasable PROM, became practical only a little while later.
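In software terms, the job of that hardware encoder was simply extracting decimal digits from a 13-bit binary value. The short C sketch below shows the same conversion by repeated division. The address value is made up, and this is only meant to illustrate the conversion, not to model the TTL circuit, which did its work with gates rather than division.

    /* Software illustration of what the hardware encoder did: turn a
       13-bit binary address (0..8191) into four decimal digits for a
       4-digit, 7-segment display.  A sketch of the conversion only. */
    #include <stdio.h>

    int main(void)
    {
        unsigned address = 0x1ABC & 0x1FFF;   /* keep only the low 13 bits */
        unsigned digits[4];                   /* thousands .. units        */

        digits[0] = (address / 1000) % 10;
        digits[1] = (address / 100)  % 10;
        digits[2] = (address / 10)   % 10;
        digits[3] =  address         % 10;

        printf("address %u -> display %u%u%u%u\n",
               address, digits[0], digits[1], digits[2], digits[3]);
        return 0;
    }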

Troubleshooting code was painfully slow. Either you single-stepped, one instruction at a time, or you inserted Halt commands and let the system run until it hit a Halt. We inserted blocks of NOPs, for no operation, to leave places for corrections and for later additions. A jump instruction would skip over the block of NOPs, so you didn't have to step through them. If there was a mistake and the processor got lost, it could overwrite sections of your program. The more code we wrote and tested, the more time it took to recover from such a mistake. So much for ease of use!
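The layout of one of those patch areas looked roughly like the sketch below, written here as a C byte array. The opcode values are invented, not real SC/MP encodings; the point is only the structure, with a jump at the front so normal execution passes over the spare NOPs until they are needed.

    /* Structure of a patch area, sketched as a C byte array.  The opcode
       values are hypothetical; only the layout is the point. */
    #include <stdint.h>

    #define OP_JMP 0x90   /* hypothetical "jump forward N bytes" opcode */
    #define OP_NOP 0x00   /* hypothetical no-operation opcode           */

    static const uint8_t code_fragment[] = {
        OP_JMP, 6,                    /* skip the six reserved bytes below  */
        OP_NOP, OP_NOP, OP_NOP,       /* spare room: overwrite these bytes  */
        OP_NOP, OP_NOP, OP_NOP,       /* later with a correction or addition*/
        /* ...main program continues here... */
    };

    int main(void) { (void)code_fragment; return 0; }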

There were many lessons to be learned from the process. Most obvious was the tradeoff between time spent improving the development system and time spent writing and testing code. But the implications of the second major difference between a microprocessor-based digital system and the analog circuitry it replaced became increasingly important: analog calculations are continuous, while digital calculations proceed in a series of discrete steps over a period of time. If the processor is fast enough, that difference doesn't appear to matter much, but the more you give the processor to do, the more the processing time becomes a factor.

We bogged down our microprocessor early in the game. It had a 1 MHz clock, and most instructions took several cycles. One task was establishing a baseline before the measurement. Another was taking the integral of the input signal minus the baseline. Another was taking the derivative of that input signal. Yet another was combining the integral and derivative results and scaling them according to an earlier calibration. Input controls needed to be responded to, and outputs and status indicators needed to be updated and displayed. The processor had to keep up in real time while the data was being gathered at hundreds of points per second. Then it had to complete its calculations and display results promptly. The doctors using the new digital version were accustomed to the earlier analog computers, so they expected to see their results immediately. We added circuitry to help with speed.
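As a rough illustration of the per-sample workload, here is a sketch in C, using a made-up sample rate of 250 samples per second, fake data, and modern floating-point for readability. With a 1 MHz clock and several cycles per instruction, the real processor had only on the order of a thousand instruction times per sample in which to do the equivalent work in 8-bit integer arithmetic, plus everything else.

    /* Rough sketch of the per-sample work, not the actual product code.
       Assumes a made-up sample rate of 250 samples per second. */
    #include <stdio.h>

    #define SAMPLE_RATE_HZ 250.0              /* assumed, for illustration */
    #define DT (1.0 / SAMPLE_RATE_HZ)

    int main(void)
    {
        double baseline = 0.0;    /* established before the measurement     */
        double integral = 0.0;    /* running integral of (sample - baseline)*/
        double previous = 0.0;    /* previous sample, for the derivative    */
        double samples[] = { 0.1, 0.4, 0.9, 1.2, 0.8, 0.3, 0.1 }; /* fake data */
        int n = sizeof samples / sizeof samples[0];

        for (int i = 0; i < n; i++) {
            double x = samples[i];
            integral += (x - baseline) * DT;           /* rectangle-rule integral */
            double derivative = (x - previous) / DT;   /* finite-difference slope */
            previous = x;

            /* In the real system, the integral and derivative were then
               combined and scaled by the calibration, the input controls
               were polled, and the display was updated, all before the
               next sample arrived. */
            (void)derivative;
        }
        printf("integral = %f\n", integral);
        return 0;
    }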

By the end, we had added an analog integrator, an analog differentiator, and an analog multiplier/divider. The microprocessor managed the system and took care of the display. It turns out that a digital system doing 16-bit calculations over an 8-bit data bus spends a lot of time just doing the simple arithmetic. We actually considered interfacing a calculator chip to work the numbers. That turned out to be a general need; a few years later, numeric co-processors were being paired with microprocessors to speed up calculations.
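To see why, consider a single 16-bit addition on a machine that can only add 8 bits at a time. The C sketch below spells out the byte-by-byte sequence: add the low bytes, note the carry, then add the high bytes plus the carry. Multiplication and division are far worse, since they have to be built out of many such steps.

    /* A 16-bit add done 8 bits at a time, the way an 8-bit processor
       has to do it: low bytes first, then the high bytes plus carry. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint16_t a = 0x12F0, b = 0x0345;

        uint8_t a_lo = a & 0xFF, a_hi = a >> 8;
        uint8_t b_lo = b & 0xFF, b_hi = b >> 8;

        uint16_t low    = a_lo + b_lo;            /* add the low bytes         */
        uint8_t  carry  = (low > 0xFF) ? 1 : 0;   /* did the low add carry?    */
        uint8_t  sum_lo = low & 0xFF;
        uint8_t  sum_hi = a_hi + b_hi + carry;    /* add high bytes plus carry */

        uint16_t sum = ((uint16_t)sum_hi << 8) | sum_lo;
        printf("0x%04X + 0x%04X = 0x%04X\n",
               (unsigned)a, (unsigned)b, (unsigned)sum);   /* 0x1635 */
        return 0;
    }

On an 8-bit accumulator machine, each of those steps is a separate load, add, and store, which is why even the simple arithmetic ate into the time budget.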

A whole range of other issues cropped up in the course of the microprocessor-based cardiac output computer project. Many of those issues are now commonplace and well understood in what came to be called computer science. In the mid-1970s, the entire enterprise was new and exotic.


Tom Lawson

March 2022