Module - 5
Memory Elements
Selection of memory elements depends on the following parameters
i. Area
ii. Power dissipation
iii. Volatility
Power dissipation
Static power dissipation is negligible, since current flows only when RD is high and a logic 1 is
stored at T2.
Hence the actual dissipation associated with each stored bit depends on the pull-up
transistor, the duration of the RD signal, and the switching frequency.
Volatility
The RAM cell is dynamic and will hold the data only as long as sufficient charge remains on the
gate capacitance Cg of T2.
If an external capacitor is excluded, the charge can instead be stored on the
diffusion-to-substrate capacitance formed by extending the source diffusion layer.
The capacitance formed by the diffusion-to-substrate junction is very small. To
increase the capacitance, a polysilicon plate is placed over the source and connected to
VDD as shown in fig.5.3.
Fig.5.3: Stick diagram of single transistor dynamic RAM by extending the source diffusion layer
Thus Cm can be realized as a three-plate structure as shown in fig.5.4.
Volatility
The cell is volatile: leakage current depletes the charge stored in Cm, so the data is
held for only about 1 ms or less.
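The 1 ms retention figure can be checked with a simple charge-budget estimate: the hold time is the stored charge divided by the leakage current. The capacitance, voltage droop, and leakage values below are illustrative assumptions, not figures from the text.

```python
# Estimate DRAM cell retention time as t = Cm * dV / I_leak,
# assuming a constant leakage current (all numbers are illustrative).

C_m = 50e-15      # storage capacitance Cm, 50 fF (assumed)
dV = 1.0          # tolerable voltage droop before data is lost, 1 V (assumed)
I_leak = 50e-12   # junction leakage current, 50 pA (assumed)

t_retain = C_m * dV / I_leak
print(f"retention time = {t_retain * 1e3:.2f} ms")  # on the order of 1 ms
```

With these assumed values the estimate lands at about 1 ms, consistent with the retention time quoted above; a larger Cm or smaller leakage lengthens the hold time proportionally.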
Pseudo-Static RAM
Since these storage elements are volatile, the data must be refreshed periodically.
To avoid this, static storage cells are designed which hold the data
indefinitely.
Fig.5.5 shows the pseudo-static RAM, where RD and WR are synchronized with Φ1.
The row select line is activated at the same time as column select and the bit line states
are written via T3 and T4 and stored at the gate capacitance Cg1 and Cg2 of T1 and T2.
T1 and T2 are interconnected in such a way that they will be in complementary states
when row select line is high.
Once the select lines are deactivated, the states of T1 and T2 are remembered until the
charges are discharged.
To perform a read operation, the bit and bit-bar buses are precharged to VDD, once again in
coincidence with Φ1, through transistors T5 and T6.
If a ‘1’ has been stored, then T2 will be on and T1 will be off. Thus the bit-bar line will be
discharged to logic 0 through T2 and the stored bit is read onto the bus.
The arrangement for six transistor static cell for storing one bit is as shown in fig.5.7 (b).
The transistors T1 and T2 of fig.5.7 (a) are replaced by an inverter.
The write and read operations for the six-transistor static cell are similar to those of the
four-transistor dynamic cell.
These cells are incapable of sinking large charges quickly, therefore RAM
arrays use sense amplifier circuits as shown in fig.5.7(c).
Transistors T1, T2, T3, and T4 form a flip-flop circuit.
If the sense line is inactive, then the state of the bit lines is reflected at the gate
capacitances of T1 and T3.
Current flowing from VDD through an on transistor helps to maintain the state of the bit
lines and predetermines the state which will be taken up by the sense flip-flop when the
sense line is activated.
The geometry (W and L) of the sense amplifier is such that it amplifies the current
sinking capability.
This data is stored at the gate capacitance of the second inverter; the correspondingly
complemented output of the second inverter is the true data D.
Introduction
Testing is an organized process to verify the behavior, performance, and reliability of a
device or system against designed specifications.
It ensures that a device or system is as defect-free as possible.
Testing a die (chip) can occur at the following levels
i. Wafer level
ii. Packaged chip level
iii. Board level
iv. System level
v. Field level
By detecting a malfunctioning chip early, the manufacturing cost can be kept low. For
instance, the approximate cost to a company of detecting a fault at the various levels is at
least
i. Wafer $0.01–$0.10
ii. Packaged chip $0.10–$1
iii. Board $1–$10
iv. System $10–$100
v. Field $100–$1000
Obviously, if faults can be detected at the wafer level, the cost of manufacturing is lower.
Logic Verification
In the design of integrated circuits, at all levels of abstraction, verification tools compare
the design at different levels to make sure that in the synthesis process, the designers or
optimization tools have not introduced errors, particularly logic errors.
Due to the high complexity of VLSI design and the complexity of synthesis tools, logic
verification has become increasingly important.
Logic verification detects any discrepancy in the function implemented by the two
compared logic designs.
• For a given specific input signals, the simulator solves for the signals inside
the circuit.
• Simulators come in a wide variety depending on the level of accuracy and the
simulation speed desired.
a. circuit simulation
b. switch-level simulation
c. logic simulation
d. functional simulation
The behavioral specification might be a verbal description, a plain language textual
specification, a description in some high level computer language such as C, or a
hardware description language such as VHDL or Verilog, or simply a table of inputs and
required outputs.
RTL synthesis converts the HDL into a set of registers and combinational logic. The combinational
logic is optimized using algebraic and/or Boolean techniques.
The structural specification converts the combinational logic into a switch-level description.
The physical specification converts the switch-level description into layer specifications.
During a design, it is common practice to run a regression test after design activities have
concluded, to check that changes have not introduced new bugs.
High-level language scripts are frequently used when running large testbenches,
especially for regression testing.
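A regression script of the kind described here essentially reruns every testbench and compares its output against a stored golden result. The sketch below models each testbench as a plain function; the test names and golden values are made up for illustration.

```python
# Minimal regression-test runner sketch (illustrative).
# Each "testbench" is modeled as a function returning an output value,
# which is compared against the golden (known-good) result.

def run_regression(testbenches, golden):
    failures = []
    for name, run in testbenches.items():
        if run() != golden[name]:
            failures.append(name)   # record any testbench whose output diverges
    return failures

# hypothetical testbenches and their golden outputs
testbenches = {
    "adder_tb":  lambda: 2 + 3,
    "parity_tb": lambda: bin(0b1011).count("1") % 2,
}
golden = {"adder_tb": 5, "parity_tb": 1}

print(run_regression(testbenches, golden))  # an empty list means all tests pass
```

In practice the "run" step would invoke a simulator on an HDL testbench and the golden data would be recorded waveforms or checksums, but the compare-against-golden structure is the same.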
Version Control
Combined with regression testing is the use of versioning, that is, the orderly
management of different design iterations. Unix/Linux tools such as CVS or Subversion
are useful for this.
Bug Tracking
Bug-tracking systems such as the Unix/Linux based GNATS allow the management of a
wide variety of bugs.
In these systems, each bug is entered and the location, nature, and severity of the bug
noted.
The bug discoverer is noted, along with the perceived person responsible for fixing the
bug.
The fault coverage of a set of test vectors is the percentage of the total nodes that can be
detected as faulty when the vectors are applied.
To achieve world-class quality levels, circuits are required to have in excess of 98.5%
fault coverage.
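The coverage figure is simple arithmetic: detected faults over total faults, expressed as a percentage. A quick sketch, with illustrative fault counts:

```python
# Fault coverage = (detected faults / total faults) * 100%
def fault_coverage(detected, total):
    return 100.0 * detected / total

# e.g. detecting 1970 of 2000 modeled faults gives exactly the 98.5%
# world-class threshold mentioned above (counts are illustrative)
print(fault_coverage(1970, 2000))
```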
Automatic Test Pattern Generation (ATPG)
Manufacturing test would ideally check every node in the circuit to prove it is not stuck.
A sequence of test vectors must be applied to the circuit to prove that each node is not stuck.
Manual generation of test vectors (test patterns) is tedious; hence, Automatic Test Pattern
Generation (ATPG) tools are used.
Automatic Test Pattern Generation, or ATPG, generates the vectors or input patterns
automatically which are required to check a device for faults.
The vectors are sequentially applied to the device under test and the device's response to
each set of inputs is compared with the expected response from a good circuit.
An 'error' in the response of the device means that it is faulty.
The effectiveness of the ATPG is measured primarily by the fault coverage achieved and
by the number of patterns generated.
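The compare-against-a-good-circuit loop can be made concrete with a toy stuck-at example. The sketch below is not a real ATPG algorithm, just an exhaustive search over a 2-input NAND gate: each input pattern is simulated against the fault-free gate, and any fault whose response differs is marked detected and dropped from the fault list.

```python
from itertools import product

# Toy stuck-at "ATPG" for a 2-input NAND gate (illustrative, not a real tool).
def nand(a, b, fault=None):
    # fault = (node, stuck_value), with node in {"a", "b", "y"}
    if fault and fault[0] == "a": a = fault[1]
    if fault and fault[0] == "b": b = fault[1]
    y = 1 - (a & b)
    if fault and fault[0] == "y": y = fault[1]
    return y

# fault list: every node stuck-at-0 and stuck-at-1
faults = [(n, v) for n in ("a", "b", "y") for v in (0, 1)]
tests = {}
for a, b in product((0, 1), repeat=2):          # try every input pattern
    good = nand(a, b)                           # expected (fault-free) response
    for f in list(faults):
        if nand(a, b, f) != good:               # faulty response differs -> detected
            tests.setdefault((a, b), []).append(f)
            faults.remove(f)
print(tests)    # each chosen pattern and the faults it detects
print(faults)   # an empty list here means 100% stuck-at fault coverage
```

Real ATPG tools work on netlists with millions of nodes and use path-sensitization algorithms rather than exhaustive enumeration, but the detect-and-drop bookkeeping is the same idea.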
Delay Fault Testing
A delay fault increases the input-to-output delay of one logic gate at a time, but the
logic function of the circuit is untouched.
For example, consider an inverter gate composed of paralleled nMOS and pMOS transistors as
shown in fig.5.13.
The fault now becomes sequential as the detection of the fault depends on the previous
state of the gate.
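Because detection depends on the previous state, a delay fault needs a two-vector test: the first vector initializes the gate, the second launches a transition, and the fault is exposed only if that transition fails to complete within the clock period. A behavioral sketch of this idea (function names and timing values are illustrative assumptions):

```python
# Two-vector delay-fault test sketch (illustrative).
# A delay fault is observable only if the vector pair causes an output
# transition AND the slowed gate cannot settle within the clock period.

def detect_delay_fault(v1, v2, gate, gate_delay, clock_period):
    launch, capture = gate(*v1), gate(*v2)
    transition = launch != capture          # did the vector pair toggle the output?
    too_slow = gate_delay > clock_period    # does the faulty delay exceed one cycle?
    return transition and too_slow

inv = lambda a: 1 - a
# vector pair (1 -> 0) launches a transition at the inverter output: detected
print(detect_delay_fault((1,), (0,), inv, gate_delay=1.5, clock_period=1.0))  # True
# the same fault with no transition (0 -> 0) goes undetected
print(detect_delay_fault((0,), (0,), inv, gate_delay=1.5, clock_period=1.0))  # False
```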
Ad Hoc Testing
Ad-hoc testing is useful only for small designs where scan, ATPG, and BIST are not
available.
Common techniques used for ad hoc testing are
1. Partitioning large sequential circuits
2. Adding test points
3. Adding multiplexers
4. Providing for easy state reset
Large circuits should be partitioned into smaller sub-circuits to reduce test costs. One of
the most important steps in designing a testable chip is to first partition the chip in an
appropriate way such that for each functional module there is an effective (DFT)
technique to test it. Partitioning must be done at every level of the design process, from
architecture to circuit, whether testing is considered or not. Partitioning can be functional
(according to functional module boundaries) or physical (based on circuit topology).
Test access points must be inserted to enhance controllability & observability of the
circuit.
Multiplexers can be used to provide alternative signal paths during testing.
Any design should always have a method of resetting the internal state of the chip within
a single cycle or at most a few cycles. A power-on reset mechanism controllable from
primary inputs is the most effective and widely used approach.
Scan Design
The scan-design strategy for testing provides observability and controllability at each
register.
The registers operate either in normal mode or scan mode.
In normal mode, registers behave as expected. In scan mode, registers are connected to
form a giant shift register called a scan chain spanning the whole chip.
Scan based testing is as shown in fig.5.14. The scan register is a D flip-flop preceded by a
multiplexer.
When the SCAN signal is deasserted, the register behaves as a conventional register,
storing data on the D input.
When SCAN is asserted, the data is loaded from the SI pin, which is connected in shift
register fashion to the previous register Q output in the scan chain.
Test generation for this type of test architecture can be highly automated. ATPG
techniques can be used for the combinational blocks.
The prime disadvantage is the area and delay impact of the extra multiplexer in the scan
register.
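The mux-plus-flip-flop behavior described above can be sketched behaviorally: in normal mode each register loads its D input, while in scan mode the whole chain shifts one bit from SI toward SO on every clock. A minimal Python model (class and method names are my own, for illustration):

```python
# Behavioral model of a scan chain (illustrative). Each register has a
# multiplexer selecting D (normal mode) or SI (scan mode, fed from the
# previous register's Q in the chain).

class ScanChain:
    def __init__(self, n):
        self.q = [0] * n                 # register outputs Q, head of chain first

    def clock(self, scan, d=None, si=0):
        if scan:                         # SCAN asserted: shift SI in, drop last Q
            self.q = [si] + self.q[:-1]
        else:                            # normal mode: load the parallel D inputs
            self.q = list(d)

    def shift_in(self, bits):            # serially load a test vector
        for b in bits:
            self.clock(scan=1, si=b)

chain = ScanChain(3)
chain.shift_in([1, 0, 1])                # scan a 3-bit pattern into the chain
print(chain.q)                           # the pattern now sits in the registers
```

Shifting a response back out works the same way in reverse: each scan clock exposes the next Q bit at the end of the chain while a new stimulus bit enters at SI.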
Instead of using separate circuits for these two functions (test pattern generation and output
response compaction), it is possible to design a single circuit that serves both purposes, known
as the built-in logic block observation (BILBO) register, as shown in fig.5.16.
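The two test roles a BILBO register takes on can be sketched in software: as an LFSR it generates pseudo-random test patterns, and as a MISR it folds circuit responses into a compact signature. The 4-bit width, tap positions, and seed below are illustrative assumptions.

```python
# Sketch of the two BILBO test modes (illustrative): a 4-bit LFSR as a
# pseudo-random pattern generator, and a MISR as a response compactor.

def lfsr_step(state, taps=(3, 2)):
    fb = 0
    for t in taps:                         # feedback = XOR of the tap bits
        fb ^= (state >> t) & 1
    return ((state << 1) | fb) & 0xF       # shift left, keep 4 bits

def misr_step(state, response, taps=(3, 2)):
    # MISR = same shift register, but each circuit response is XORed in
    return lfsr_step(state, taps) ^ response

# LFSR mode: generate a few pseudo-random test patterns from a seed
state, patterns = 0b0001, []
for _ in range(5):
    patterns.append(state)
    state = lfsr_step(state)
print(patterns)

# MISR mode: compact a stream of responses (dummy data) into one signature
sig = 0
for r in patterns:
    sig = misr_step(sig, r)
print(f"signature = {sig:04b}")
```

In BIST operation the final signature is compared against the signature of a known-good circuit; a mismatch flags the device as faulty.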
Verification versus Testing
i. Verification verifies the correctness of the design; testing verifies the correctness of
the manufactured hardware.
ii. Verification is performed by simulation, hardware emulation, or formal methods; testing
is a two-part process of test generation and test application.
iii. Verification is performed once, prior to manufacturing; testing is performed on every
manufactured device.
iv. Verification is responsible for the quality of the design; testing is responsible for
the quality of the devices.