Engineers use many methods to minimize
logic redundancy in order to reduce circuit complexity. Lower complexity means fewer components and fewer potential errors, and therefore typically lower cost. Logic redundancy can be removed by several well-known techniques, such as
binary decision diagrams,
Boolean algebra,
Karnaugh maps, the
Quine–McCluskey algorithm, and the
heuristic computer method. These operations are typically performed within a
computer-aided design system.
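The effect of removing logic redundancy can be checked exhaustively for small functions. The Python sketch below is purely illustrative (not any particular EDA tool); it verifies that the redundant expression a·b + a·¬b collapses to just a:

```python
from itertools import product

def equivalent(f, g, n):
    """Exhaustively compare two n-input Boolean functions."""
    return all(f(*bits) == g(*bits) for bits in product([0, 1], repeat=n))

def redundant(a, b):
    # (a AND b) OR (a AND NOT b) -- the two product terms overlap
    return (a & b) | (a & (1 - b))

def minimized(a, b):
    # Boolean algebra (distribution and complement laws) reduces it to a
    return a

print(equivalent(redundant, minimized, 2))  # → True
```

For two variables this is a four-row check; tools like the Quine–McCluskey algorithm do the same job systematically for many variables.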
Embedded systems with
microcontrollers and
programmable logic controllers are often used to implement digital logic for complex systems that do not require optimal performance. These systems are usually programmed by
software engineers or by electricians, using
ladder logic.
==Representation==
A digital circuit's input-output relationship can be represented as a
truth table. An equivalent high-level circuit uses
logic gates, each represented by a different shape (standardized by
IEEE/
ANSI 91–1984). A low-level representation uses an equivalent circuit of electronic switches (usually
transistors). Most digital systems are divided into combinational and sequential systems. The output of a combinational system depends only on the present inputs. However, a sequential system has some of its outputs fed back as inputs, so its output may depend on past inputs in addition to present inputs, to produce a
sequence of operations. Simplified representations of their behavior called
state machines facilitate design and testing. Sequential systems divide into two further subcategories.
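A one-bit full adder is a small combinational system whose outputs depend only on the present inputs; the Python sketch below (a hypothetical software model, not hardware) enumerates its truth table:

```python
from itertools import product

def full_adder(a, b, cin):
    """Combinational logic: outputs depend only on the present inputs."""
    s = a ^ b ^ cin                     # sum bit
    cout = (a & b) | (cin & (a ^ b))    # carry-out bit
    return s, cout

# Print the truth table: every input combination and its outputs.
for a, b, cin in product([0, 1], repeat=3):
    s, cout = full_adder(a, b, cin)
    print(a, b, cin, "->", s, cout)
```

Each of the eight rows printed corresponds to one line of the circuit's truth table.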
"Synchronous" sequential systems change state all at once when a
clock signal changes state.
"Asynchronous" sequential systems propagate changes whenever inputs change. Synchronous sequential systems are made using
flip flops that store input voltages as a
bit only when the clock changes.
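The edge-triggered behavior of a flip flop can be modeled in a few lines of Python; the class below is an illustrative sketch (its names and structure are assumptions, not a standard API):

```python
class DFlipFlop:
    """Stores its D input as one bit, but only on a rising clock edge."""
    def __init__(self):
        self.q = 0          # the stored bit
        self._prev_clk = 0

    def tick(self, clk, d):
        if clk == 1 and self._prev_clk == 0:  # rising edge detected
            self.q = d
        self._prev_clk = clk
        return self.q

ff = DFlipFlop()
ff.tick(0, 1)   # clock low: input ignored, q stays 0
ff.tick(1, 1)   # rising edge: q captures 1
ff.tick(1, 0)   # clock still high: change on D is ignored
print(ff.q)     # → 1
```

Between clock edges the stored bit is immune to changes on the data input, which is exactly what makes synchronous design tractable.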
==Synchronous systems==
The usual way to implement a synchronous sequential state machine is to divide it into a piece of combinational logic and a set of flip flops called a
state register. The state register represents the state as a binary number. The combinational logic produces the binary representation of the next state. On each clock cycle, the state register captures the next-state value computed by the combinational logic and holds it as a stable input to the combinational part of the state machine. The clock rate is limited by the most time-consuming calculation in the combinational logic.
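This register-plus-combinational-logic structure can be sketched as a short simulation. The modulo-4 counter below is a hypothetical example, with `next_state` standing in for the combinational block and a plain variable standing in for the state register:

```python
def next_state(state, inp):
    """Combinational next-state logic: a modulo-4 counter that
    advances only when the input is 1 (illustrative example)."""
    return (state + inp) % 4

state_register = 0  # the flip-flop bank holding the state as a binary number
for inp in [1, 1, 0, 1, 1, 1]:
    # On each clock cycle the register captures the combinational output.
    state_register = next_state(state_register, inp)
print(state_register)  # → 1  (five counted pulses, modulo 4)
```

The loop body models one clock cycle: compute the next state combinationally, then latch it.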
==Asynchronous systems==
Most digital logic is synchronous because it is easier to create and verify a synchronous design. However, asynchronous logic has the advantage of its speed not being constrained by an arbitrary clock; instead, it runs at the maximum speed of its logic gates. Nevertheless, most systems need to accept external unsynchronized signals into their synchronous logic circuits. This interface is inherently asynchronous and must be analyzed as such. Examples of widely used asynchronous circuits include synchronizer flip-flops, switch
debouncers and
arbiters. Asynchronous logic components can be hard to design because all possible states, in all possible timings, must be considered. The usual method is to construct a table of the minimum and maximum time that each such state can exist and then adjust the circuit to minimize the number of such states. The designer must force the circuit to periodically wait for all of its parts to enter a compatible state (this is called "self-resynchronization"). Without careful design, it is easy to accidentally produce asynchronous logic that is unstable, meaning that real electronics will have unpredictable results because of the cumulative delays caused by small variations in the values of the electronic components.
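A counter-based switch debouncer, one of the asynchronous circuits mentioned above, can be modeled in software; the class name and threshold value below are illustrative assumptions:

```python
class Debouncer:
    """Counter-based debouncer: the output changes only after the raw
    input has disagreed with it for `threshold` consecutive samples."""
    def __init__(self, threshold=3):
        self.threshold = threshold
        self.stable = 0     # the debounced output
        self._count = 0

    def sample(self, raw):
        if raw == self.stable:
            self._count = 0             # agreement: reset the counter
        else:
            self._count += 1
            if self._count >= self.threshold:
                self.stable = raw       # held long enough: accept it
                self._count = 0
        return self.stable

db = Debouncer()
samples = [0, 1, 0, 1, 1, 1, 1]   # contact bounce, then settled high
outputs = [db.sample(s) for s in samples]
print(outputs)  # → [0, 0, 0, 0, 0, 1, 1]
```

The brief bounces never persist long enough to flip the output; only the settled value propagates.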
==Register transfer systems==
Many digital systems are
data flow machines. These are usually designed using synchronous
register transfer logic and written with
hardware description languages such as
VHDL or
Verilog. In register transfer logic, binary numbers are stored in groups of flip flops called
registers. A sequential state machine controls when each register accepts new data from its input. The outputs of each register are a bundle of wires called a
bus that carries that number to other calculations. A calculation is simply a piece of combinational logic. Each calculation also has an output bus, and these may be connected to the inputs of several registers. Sometimes a register will have a
multiplexer on its input so that it can store a number from any one of several buses. Asynchronous register-transfer systems (such as computers) have a general solution. In the 1980s, some researchers discovered that almost all synchronous register-transfer machines could be converted to asynchronous designs by using first-in-first-out synchronization logic. In this scheme, the digital machine is characterized as a set of data flows. In each step of the flow, a synchronization circuit determines when the outputs of that step are valid and instructs the next stage when to use these outputs.
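A single register-transfer step, with a bus, a combinational "calculation", and a multiplexer-style source select, might be sketched as follows (the register names, 8-bit bus width, and control signals are all assumptions for illustration):

```python
# Registers hold binary numbers; buses carry them to calculations.
regs = {"A": 5, "B": 3, "R": 0}

def alu_add(x, y):
    """A calculation: pure combinational logic on two input buses."""
    return (x + y) & 0xFF   # assumed 8-bit bus width

def clock_cycle(load, mux_sel):
    """One clock: a mux picks a source bus; the register loads if enabled."""
    buses = {"sum": alu_add(regs["A"], regs["B"]),  # ALU output bus
             "a_copy": regs["A"]}                   # pass-through bus
    if load:   # control signal issued by the sequential state machine
        regs["R"] = buses[mux_sel]

clock_cycle(load=True, mux_sel="sum")
print(regs["R"])  # → 8
```

The `load` and `mux_sel` arguments play the role of the control signals that the state machine asserts each cycle.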
==Computer design==
The most general-purpose register-transfer logic machine is a
computer. This is basically an
automatic binary
abacus. The
control unit of a computer is usually designed as a
microprogram run by a
microsequencer. A microprogram is much like a player-piano roll. Each table entry of the microprogram commands the state of every bit that controls the computer. The sequencer then counts, and the count addresses the memory or combinational logic machine that contains the microprogram. The bits from the microprogram control the
arithmetic logic unit,
memory and other parts of the computer, including the microsequencer itself. In this way, the complex task of designing the controls of a computer is reduced to the simpler task of programming a collection of much simpler logic machines. Almost all computers are synchronous. However,
asynchronous computers have also been built. One example is the
ASPIDA DLX core. Another was offered by
ARM Holdings.
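The microprogram-and-sequencer idea can be sketched as a table of control words addressed by a counter; the control fields below are invented for illustration and do not correspond to any real machine:

```python
# A microprogram: each entry sets every control field for one step,
# like one line of a player-piano roll.
microprogram = [
    {"alu_op": "load", "next": 1},
    {"alu_op": "add",  "next": 2},
    {"alu_op": "add",  "next": 3},
    {"alu_op": "halt", "next": 3},
]

acc, upc = 0, 0          # accumulator and microprogram counter
while True:
    word = microprogram[upc]       # the counter addresses the microprogram
    if word["alu_op"] == "load":   # control bits drive the ALU...
        acc = 2
    elif word["alu_op"] == "add":
        acc += 2
    elif word["alu_op"] == "halt":
        break
    upc = word["next"]             # ...and steer the sequencer itself
print(acc)  # → 6
```

The `next` field shows how the microprogram's bits can control the microsequencer as well as the datapath.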
==Computer architecture==
Computer architecture is a specialized engineering activity that tries to arrange the registers, calculation logic, buses and other parts of the computer in the best way possible for a specific purpose. Computer architects have put a lot of work into reducing the cost and increasing the speed of computers in addition to boosting their immunity to programming errors. An increasingly common goal of computer architects is to reduce the power used in battery-powered computer systems, such as
smartphones.
==Design issues in digital circuits==
Digital circuits are made from analog components. The design must ensure that the analog nature of the components does not dominate the desired digital behavior. Digital systems must manage noise and timing margins, parasitic inductances and capacitances. Bad designs have intermittent problems such as
glitches, vanishingly fast pulses that may trigger some logic but not others,
and runt pulses that do not reach valid
threshold voltages. Additionally, where clocked digital systems interface to analog systems or systems that are driven from a different clock, the digital system can be subject to
metastability where a change to the input violates the
setup time for a digital input latch. Since digital circuits are made from analog components, digital circuits calculate more slowly than low-precision analog circuits that use a similar amount of space and power. However, the digital circuit will calculate more repeatably, because of its high noise immunity.
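A setup-time check of the kind a timing tool performs can be sketched in a few lines; the function name and nanosecond values below are assumptions for illustration:

```python
def setup_violation(data_change_t, clock_edge_t, setup_time):
    """Flag a potential metastability hazard: the data input changed
    inside the setup window just before the clock edge."""
    return clock_edge_t - data_change_t < setup_time

# Hypothetical times in nanoseconds, with a 0.5 ns setup requirement.
print(setup_violation(data_change_t=9.8, clock_edge_t=10.0, setup_time=0.5))  # → True
print(setup_violation(data_change_t=9.0, clock_edge_t=10.0, setup_time=0.5))  # → False
```

When the check returns True, the latch may capture an indeterminate value, which is why unsynchronized inputs pass through synchronizer flip-flops first.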
==Automated design tools==
Much of the effort of designing large logic machines has been automated through the application of
electronic design automation (EDA). Simple truth table-style descriptions of logic are often optimized with EDA that automatically produces reduced systems of logic gates or smaller lookup tables that still produce the desired outputs. The most common example of this kind of software is the
Espresso heuristic logic minimizer. Optimizing large logic systems may be done using the
Quine–McCluskey algorithm or
binary decision diagrams. There are promising experiments with
genetic algorithms and
annealing optimizations. To automate costly engineering processes, some EDA can take
state tables that describe
state machines and automatically produce a truth table or a
function table for the
combinational logic of a state machine. The state table is a piece of text that lists each state, together with the conditions controlling the transitions between them and their associated output signals. Often, real logic systems are designed as a series of sub-projects, which are combined using a
tool flow. The tool flow is usually controlled with the help of a
scripting language, a simplified computer language that can invoke the software design tools in the right order. Tool flows for large logic systems such as
microprocessors can be thousands of commands long, and combine the work of hundreds of engineers. Writing and debugging tool flows is an established engineering specialty in companies that produce digital designs. The tool flow usually terminates in a detailed computer file or set of files that describe how to physically construct the logic. Often it consists of instructions on how to draw the
transistors and wires on an integrated circuit or a
printed circuit board. Parts of tool flows are debugged by verifying the outputs of simulated logic against expected inputs. The test tools take computer files with sets of inputs and outputs and highlight discrepancies between the simulated behavior and the expected behavior. Once the input data is believed to be correct, the design itself must still be verified for correctness. Some tool flows verify designs by first producing a design, then scanning the design to produce compatible input data for the tool flow. If the scanned data matches the input data, then the tool flow has probably not introduced errors. The functional
verification data are usually called
test vectors. The functional test vectors may be preserved and used in the factory to test whether newly constructed logic works correctly. However, functional test patterns do not discover all fabrication faults. Production tests are often designed by
automatic test pattern generation software tools. These generate test vectors by examining the structure of the logic and systematically generating tests targeting particular potential faults. This way the
fault coverage can closely approach 100%, provided the design is properly made testable (see next section). Once a design exists, and is verified and testable, it often needs to be processed to be manufacturable as well. Modern integrated circuits have features smaller than the wavelength of the light used to expose the photoresist. Software that is
designed for manufacturability adds interference patterns to the exposure masks to eliminate open circuits and enhance the masks' contrast.
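The stuck-at-fault idea behind automatic test pattern generation can be sketched for a toy circuit; the circuit, net name, and fault model below are illustrative assumptions:

```python
from itertools import product

def circuit(a, b, fault=None):
    """Tiny AND-OR circuit; `fault` forces one internal net stuck at 0."""
    n1 = a & b
    if fault == "n1_stuck_at_0":
        n1 = 0
    return n1 | (a ^ b)

# ATPG idea: find input vectors where the faulty circuit's output
# differs from the good one -- any such vector detects the fault.
tests = [v for v in product([0, 1], repeat=2)
         if circuit(*v) != circuit(*v, fault="n1_stuck_at_0")]
print(tests)  # → [(1, 1)]
```

Only the vector (1, 1) exercises the faulty net and propagates the difference to the output, so it is the one a test generator would emit for this fault.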
==Design for testability==
There are several reasons for testing a logic circuit. When the circuit is first developed, it is necessary to verify that the designed circuit meets the required functional and timing specifications. When multiple copies of a correctly designed circuit are being manufactured, it is essential to test each copy to ensure that the manufacturing process has not introduced any flaws. A large logic machine (say, with more than a hundred logical variables) can have an astronomical number of possible states. Obviously, testing every state of such a machine in the factory is infeasible, for even if testing each state took only a microsecond, there are more possible states than there are microseconds since the universe began! Large logic machines are almost always designed as assemblies of smaller logic machines. To save time, the smaller sub-machines are isolated by permanently installed
design for test circuitry and are tested independently. One common testing scheme provides a test mode that forces some part of the logic machine to enter a
test cycle. The test cycle usually exercises large independent parts of the machine.
Boundary scan is a common test scheme that uses
serial communication with external test equipment through one or more
shift registers known as
scan chains. Serial scans have only one or two wires to carry the data, and minimize the physical size and expense of the infrequently used test logic. After all the test data bits are in place, the design is reconfigured to be in
normal mode and one or more clock pulses are applied to test for faults (e.g., stuck-at low or stuck-at high) and capture the test result into flip-flops or latches in the scan shift register(s). Finally, the result of the test is shifted out to the block boundary and compared against the predicted
good machine result. In a board-test environment, serial-to-parallel testing has been formalized as the
JTAG standard.
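Shifting a test pattern through a scan chain can be modeled in a few lines; the class below is an illustrative sketch of the serial-shift idea, not the JTAG protocol itself:

```python
class ScanChain:
    """Boundary-scan sketch: test bits are shifted serially through a
    chain of flip-flops, one position per clock pulse."""
    def __init__(self, length):
        self.bits = [0] * length

    def shift_in(self, bit):
        out = self.bits[-1]               # bit leaving the far end
        self.bits = [bit] + self.bits[:-1]  # everything moves one place
        return out

chain = ScanChain(4)
for b in [1, 0, 1, 1]:    # load a test pattern, one bit per clock
    chain.shift_in(b)
print(chain.bits)  # → [1, 1, 0, 1]
```

After four clocks the pattern sits in the chain's flip-flops; in a real design the chain would then be switched to normal mode, clocked, and the captured result shifted back out the same way.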
==Trade-offs==
===Cost===
Since a digital system may use many logic gates, the overall cost of building a computer correlates strongly with the cost of a logic gate. In the 1930s, the earliest digital logic systems were constructed from telephone relays because these were inexpensive and relatively reliable. The earliest integrated circuits were constructed to save weight and permit the
Apollo Guidance Computer to control an
inertial guidance system for a spacecraft. The first integrated circuit logic gates cost nearly US$50. Mass-produced gates on integrated circuits became the least-expensive method to construct digital logic. With the rise of
integrated circuits, reducing the absolute number of chips used represented another way to save costs. The goal of a designer is not just to make the simplest circuit, but to keep the component count down. Sometimes this results in more complicated designs with respect to the underlying digital logic but nevertheless reduces the number of components, board size, and even power consumption.
===Reliability===
Another major motive for reducing component count on printed circuit boards is to reduce the manufacturing defect rate due to failed soldered connections and increase reliability. Defect and failure rates tend to increase along with the total number of component pins. The failure of a single logic gate may cause a digital machine to fail. Where additional reliability is required, redundant logic can be provided. Redundancy adds cost and power consumption over a non-redundant system. The
reliability of a logic gate can be described by its
mean time between failure (MTBF). Digital machines first became useful when the MTBF for a switch increased above a few hundred hours. Even so, many of these machines had complex, well-rehearsed repair procedures, and would be nonfunctional for hours because a tube burned out, or a moth got stuck in a relay. Modern transistorized integrated circuit logic gates have MTBFs greater than 82 billion hours. This level of reliability is required because integrated circuits have so many logic gates.
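A rough series-reliability estimate follows from the MTBF figures above: the failure rates of independent parts add, so a machine's MTBF is approximately the gate MTBF divided by the gate count. A minimal sketch, with the gate count assumed for illustration:

```python
def system_mtbf(gate_mtbf_hours, gate_count):
    """Series-reliability approximation: failure rates add, so the
    system MTBF is roughly the gate MTBF divided by the gate count."""
    return gate_mtbf_hours / gate_count

# 82e9-hour gates (figure from the text); one million gates is assumed.
print(system_mtbf(82e9, 1_000_000))  # → 82000.0
```

Even with 82-billion-hour gates, a million-gate chip fails on the order of once per 82,000 hours, which is why per-gate reliability must be so extreme.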
===Fan-out===
Fan-out describes how many logic inputs can be controlled by a single logic output without exceeding the electrical current ratings of the gate outputs. The minimum practical fan-out is about five. Modern electronic logic gates using
CMOS transistors for switches have higher fan-outs.
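The fan-out calculation can be sketched from current ratings; the figures below are classic TTL-style values assumed for illustration (currents in microamps to keep the arithmetic exact):

```python
def fan_out(output_drive_ua, input_load_ua):
    """How many gate inputs one output can drive without exceeding
    its current rating (integer microamps, values assumed)."""
    return output_drive_ua // input_load_ua

# Assumed TTL-style ratings: 16 mA output drive, 1.6 mA per input load.
print(fan_out(output_drive_ua=16000, input_load_ua=1600))  # → 10
```

CMOS inputs draw almost no static current, which is why CMOS fan-out is limited by capacitive loading and speed rather than by this DC calculation.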
===Speed===
The switching speed describes how long it takes a logic output to change from true to false or vice versa. Faster logic can accomplish more operations in less time. Modern electronic digital logic routinely switches at gigahertz rates, and some laboratory systems switch at even higher rates.

==Logic families==