
SRAM vs DRAM | Comparison, Basic Structures and Differences


Memory is a vital part of modern microprocessor systems such as computers, smartphones, networking devices, smart gadgets and even rockets and satellites. Since their introduction in the late 1960s, semiconductor memories have been the preferred choice of memory, overtaking magnetic (tapes and discs) and optical formats. There are two types of semiconductor memories: Volatile (RAM) and Non-Volatile (ROM). In this guide, we will take a look at two different types of volatile memories, i.e., SRAM and DRAM. We will understand their basic operation and find out the differences between SRAM vs DRAM.

Overview of Semiconductor Memory

The first important question is what exactly is Memory in microprocessor systems? Memory is a device or system that has the ability to store digital information (either temporarily or permanently).

In the early days of computing, engineers used magnetic tapes and discs to store information. The polarities of the magnetic field (north and south) on a circular substrate represent the digital logic values (0 and 1).

But the technological developments in semiconductor design and manufacturing gave rise to Semiconductor Memory. In this system, we store the information using integrated circuits (involving transistors, capacitors, fuses etc., on a single semiconductor substrate).

Unlike magnetic tapes or discs, semiconductor memory doesn’t have any mechanical or moving parts. Hence, we also call semiconductor memory Solid-State Memory (as transistors, which are the building blocks of many semiconductor devices, are known as Solid-State Devices).

Before proceeding further, we need to understand a couple of important terms associated with memory. We refer to the information stored in memory as Data, and the smallest possible unit of data is a Bit (a logic 1 or logic 0).

When we place data into memory, we call this operation Writing data. Similarly, when we retrieve data from memory, we call it Reading data.

Classification of Semiconductor Memory

Let us now see different ways in which we can classify semiconductor memories.

Volatile vs Non-Volatile Memory

We can classify semiconductor memory (as this is the main topic of discussion, we will stick to semiconductor memories and ignore other kinds) into two categories based on whether it retains data when we remove power: Volatile and Non-Volatile Memory.

The term Volatile Memory refers to memory devices that cannot hold data when power is removed (whether the system is shut down manually or due to a power failure).

Non-Volatile Memory means a memory device that can hold data even after we remove power. As volatile memory doesn’t have to store data permanently, it is faster than its non-volatile counterpart. Hence, we use volatile memory to hold critical system information while the system is running. Non-volatile memory is also important, despite being slower, as it holds the start-up commands, the OS (operating system) or firmware (in the case of embedded systems), and also applications.

Read Only vs Read/Write

Another way to classify semiconductor memory is based on how we access the data: Read Only and Read/Write. A Read Only Memory or ROM is a kind of memory that should not be altered while the system is running, as it holds critical system data and applications.

In contrast, we can read and write to a Read/Write Memory even when the system is running, as it holds temporary data.

Random Access vs Sequential Access

The final way to classify semiconductor memory is based on access location. They are Random Access and Sequential Access. With Random Access Memory or RAM, we can access any memory location at any time.

But this is not possible with Sequential Access Memory. As the name suggests, we can access data only sequentially and all memory locations are not accessible immediately. An example of sequential access memory is magnetic tapes. In order to access a location, the system has to spin the tape spool to that location by passing through all the previous locations.
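The difference between the two access patterns can be sketched in code. The following is an illustrative Python sketch (not a hardware model, and the function names are our own): random access jumps straight to an address in one step, while sequential access must pass through every earlier location first.

```python
# Sketch: random vs sequential access (illustrative only, not a hardware model).

data = ["A", "B", "C", "D", "E"]  # five memory locations

def random_access(memory, address):
    """Jump straight to the requested address: one step."""
    return memory[address]

def sequential_access(memory, address):
    """Walk the 'tape' from the start, passing every earlier location."""
    steps = 0
    for position in range(address + 1):
        value = memory[position]
        steps += 1
    return value, steps

print(random_access(data, 3))      # one step
print(sequential_access(data, 3))  # same value, but 4 locations traversed
```

For location 3, the sequential version has to step through 4 locations to reach the same value the random version reads directly, which is exactly the cost the tape spool analogy describes.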

An important point here is that almost all semiconductor memories are Random Access Memories. This is where confusion arises with respect to the terms RAM and ROM.

The term ROM (Read Only Memory) technically means a memory that cannot be modified while the system is running. We somehow associated it with Non-Volatile Memory.

Coming to RAM (Random Access Memory), it technically means we can access data from any memory location almost instantaneously (in contrast to sequential memory where we have to wait until we reach that location). Again, we somehow associated RAM with Volatile Memory.

For the rest of the discussion, we will ignore the true meaning of the terms and continue to associate RAM with Volatile Memory and ROM with Non-Volatile Memory.

Some common and popular non-volatile memory technologies are:

  • ROM (Read Only Memory)
  • MROM (Mask Read Only Memory)
  • PROM (Programmable Read Only Memory)
  • EPROM (Erasable Programmable Read Only Memory)
  • EEPROM (Electrically Erasable Programmable Read Only Memory)
  • Flash Memory

There are only two kinds of Volatile Memory technologies. They are:

  • SRAM (Static Random Access Memory)
  • DRAM (Dynamic Random Access Memory)

In the next couple of sections, we will learn briefly about SRAM and DRAM and also compare SRAM vs DRAM.

What is SRAM?

SRAM is short for Static Random Access Memory. It is a type of volatile memory that is primarily used as embedded memory (internal registers of a microprocessor and cache). An SRAM cell is similar to a bistable flip-flop: we store data by setting or clearing the state of the flip-flop.

The term Static in SRAM means that it doesn’t need a memory refresh: once the data is written, it stays as long as power is supplied. This is in contrast to DRAM, which needs a periodic refresh to retain data even when it is connected to power (as it stores the data as charge on a capacitor – we will learn more in the next section).

As SRAM is deeply embedded into the architecture of the microprocessor, the fabrication process is similar to that of a CPU core.

Structure of SRAM

The following shows a basic structure of a 1-bit SRAM Memory Cell. Here, a Memory Cell is the smallest group of components that can hold 1-bit of data.

[Image: basic 1-bit SRAM memory cell]

From the image, the circuit seems a little complex, but it is actually a pair of cross-coupled inverters with two pass transistors. The following image shows the break-up of the inverters into NMOS and PMOS transistors. This version of the circuit is slightly easier to understand.

[Image: SRAM cell with the inverters broken out into NMOS and PMOS transistors]

The two PMOS transistors in the inverters act as pull-ups and the two NMOS transistors act as pull-downs. While early SRAM designs used all 6 transistors (6T: four in the cross-coupled inverters and two pass transistors), another popular design used polysilicon load resistors as pull-ups instead of PMOS transistors.

This significantly reduced the size of the memory cell, as it essentially has only 4 transistors, while the load resistors (which have high resistance) are made of a polysilicon layer on top of the transistors.

[Image: 4T SRAM cell with polysilicon load resistors as pull-ups]

There are a couple more designs of the SRAM memory cell. One replaced the large poly-load resistors (which needed to be slightly larger to overcome leakage current) with polysilicon PMOS transistors. The advantages of this design are that it can still be fabricated with the poly-PMOS layer on top of the NMOS layer, and its overall characteristics are better than those of the poly-load resistor design.

[Image: SRAM cell with polysilicon PMOS loads]

As the size of the cache integrated within the CPU die increased, having a layer on top of the regular silicon became complex. The result was a 4-transistor (4T) design with no pull-up load. This significantly reduced the size of the memory cell, as it has only 4 transistors, but designers had to take extra care with respect to leakage currents.

[Image: 4T SRAM cell with no pull-up load]

Of all these designs, the 6T SRAM memory cell is still the popular choice from a fabrication point of view, due to the complexities of the poly-load layer. You might have heard of CPU Cache Memory: it is actually SRAM embedded on the die of the CPU.
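To see why the cross-coupled inverters at the heart of an SRAM cell hold a bit without any refresh, here is a minimal logic-level sketch in Python. This is purely illustrative (the function names are our own): a real cell is an analog circuit with pass transistors and sense amplifiers, but the bistable feedback loop is the same idea.

```python
# Sketch: the bistable core of an SRAM cell as two cross-coupled inverters.
# Logic-level model only; real cells are analog circuits with pass transistors.

def inverter(x):
    """Ideal logic inverter: 1 -> 0, 0 -> 1."""
    return 0 if x else 1

def settle(q, steps=4):
    """Feed each inverter's output into the other and let the pair settle."""
    q_bar = inverter(q)
    for _ in range(steps):
        q, q_bar = inverter(q_bar), inverter(q)
    return q, q_bar

# Both states are stable: whichever value is written, the feedback holds it.
print(settle(1))  # stores logic 1
print(settle(0))  # stores logic 0
```

Because each inverter reinforces the other, the loop has exactly two stable states (Q=1/Q̄=0 and Q=0/Q̄=1), which is why the cell keeps its value for as long as power is applied.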

What is DRAM?

While we compared the SRAM circuit to a bistable latching circuit, things are quite different when it comes to DRAM. For starters, in a DRAM cell (we will see the details later), we use the charge stored (or the lack of charge) on a capacitor to represent the binary data.

The reason this type of memory is known as “Dynamic” is that the charge on the capacitor slowly leaks away even when the circuit is connected to the power supply. To overcome this leakage, we have to recharge the capacitor periodically (an operation known as Memory Refresh).
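The leak-and-refresh behaviour can be sketched with a toy simulation. All the numbers below (leak rate, threshold, refresh interval, tick count) are illustrative assumptions, not real DRAM timings:

```python
# Sketch: why DRAM is "dynamic" -- a leaky capacitor needs periodic refresh.
# Leak rate, threshold and refresh interval are illustrative values only.

def simulate_cell(refresh_every=None, ticks=100, leak=0.05, threshold=0.5):
    """Charge decays each tick; an optional refresh restores it to full."""
    charge = 1.0  # logic 1 stored as full charge
    for t in range(1, ticks + 1):
        charge *= (1.0 - leak)          # charge leaks away every tick
        if refresh_every and t % refresh_every == 0:
            charge = 1.0                # refresh circuitry rewrites the bit
        if charge < threshold:
            return f"bit lost at tick {t}"
    return "bit retained"

print(simulate_cell())                  # without refresh, the bit decays away
print(simulate_cell(refresh_every=10))  # periodic refresh keeps the data
```

Without refresh, the stored charge soon drops below the point where the bit can be read back reliably; refreshing often enough (relative to the leak rate) keeps it readable indefinitely, at the cost of the extra refresh circuitry and power.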

Even with the additional complexity of the memory refresh circuit, the advantages of DRAM are its low cost per bit, high memory density and high capacity. As a result, DRAM became the “main memory” of modern microprocessor systems. The RAM sticks that we use in our desktops and laptops (DIMMs and SO-DIMMs) are actually DRAM.

Structure of DRAM

Earlier DRAM designs (in the 1970s) had four or three transistors with one or two parasitic storage capacitors. These cells were large, occupying significant chip area with a high cost per bit. Added to the basic circuit is additional circuitry to refresh the charge on the capacitor at periodic intervals.

[Image: early 3T/4T DRAM cell designs]

Engineers and scientists were able to reduce the size of the DRAM cell dramatically with a new one-transistor, one-capacitor (1T1C) design. The following image shows the popular 1T DRAM cell with a storage capacitor.

[Image: 1T1C DRAM cell with storage capacitor]

As a single DRAM cell needs just one transistor (and a capacitor) to store 1 bit, the area of a DRAM cell is much smaller than that of an SRAM cell (which requires 6 transistors to store 1 bit). Hence, the memory density of DRAM is significantly higher than that of SRAM, and the cost of fabrication is also lower.
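A quick back-of-the-envelope calculation shows how the per-bit transistor count drives density. This sketch uses only the 6T-vs-1T counts from the text; real layouts differ (the DRAM capacitor, wordlines and sense amplifiers all take area too):

```python
# Back-of-the-envelope sketch: transistors needed per amount of storage.
# Illustrative counts only (6T SRAM vs 1T1C DRAM); real layouts differ.

SRAM_TRANSISTORS_PER_BIT = 6
DRAM_TRANSISTORS_PER_BIT = 1   # plus one capacitor, not counted here

def transistors_for(byte_count, per_bit):
    """Total transistors for byte_count bytes at per_bit transistors/bit."""
    return byte_count * 8 * per_bit

one_mb = 1024 * 1024  # one megabyte in bytes
print(transistors_for(one_mb, SRAM_TRANSISTORS_PER_BIT))  # ~50.3 million
print(transistors_for(one_mb, DRAM_TRANSISTORS_PER_BIT))  # ~8.4 million
```

Even in this crude count, one megabyte of SRAM needs six times the transistors of the same megabyte of DRAM, which is why caches stay in the megabyte range while main memory reaches gigabytes.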

All the computer memory that we add to the motherboard using DIMM (Dual In-line Memory Module) and SO-DIMM (Small Outline DIMM) sticks is actually DRAM. Technically, it is DDR SDRAM (Double Data Rate Synchronous Dynamic Random Access Memory), with DDR4 and DDR5 being the latest iterations.

Comparison of SRAM vs DRAM

Let us now compare SRAM vs DRAM with respect to cost, performance, density and other important parameters.

SRAM: Static Random Access Memory (SRAM) is a type of volatile memory.
DRAM: Dynamic Random Access Memory (DRAM) is also a type of volatile memory.

SRAM: It is implemented using six transistors (6T) that form a bistable latching circuit (similar to a flip-flop) to hold the data.
DRAM: It is implemented using one transistor and one capacitor (1T1C), and the charge on the capacitor (or lack thereof) represents the binary data.

SRAM: As a volatile memory, it retains data as long as there is power to the circuit, without any additional circuitry. Hence, it is Static.
DRAM: It is also volatile and holds data as long as there is power. But since the charge on the capacitor slowly leaks away, additional circuitry periodically refreshes the charge to keep the data intact. Hence, it is Dynamic.

SRAM: It needs 6 transistors to store 1 bit of data.
DRAM: It needs only one transistor (and a capacitor) to store 1 bit of data.

SRAM: This makes the size of one SRAM cell quite large.
DRAM: With the 1T1C design, the size of one DRAM cell is relatively small.

SRAM: Large cell size means low memory density (number of memory cells per unit area).
DRAM: The memory density of DRAM is significantly higher due to its small cell size.

SRAM: The cost of fabricating SRAM is high due to the 6T design.
DRAM: Even though adding the capacitor requires a separate manufacturing step, the overall cost of fabricating DRAM is much lower.

SRAM: The speed of access (read or write) is very high. In the memory hierarchy, it sits at the top alongside the CPU registers, as CPU cache.
DRAM: Access times are slower than SRAM. It sits next in line to SRAM in the memory hierarchy.

SRAM: Power consumption of SRAM is generally lower, as it does not need a periodic refresh.
DRAM: The capacitor leakage and the need for a periodic charge refresh mean that the overall power consumption of DRAM is somewhat higher.

SRAM: Due to its low memory density and high fabrication cost, SRAM capacity is often limited to a few megabytes (MB).
DRAM: The memory capacity of DRAM is very large, often in gigabytes (GB).

SRAM: The common application is on-chip cache memory (L2, L3) in microprocessors, or SRAM that acts as temporary storage in microcontrollers.
DRAM: DRAM is often available as DIMM sticks for computers (that sit on the motherboard). Smaller devices (some laptops, phones, tablets, etc.) have DRAM modules soldered directly on the PCB.

NOTE: We did not delve deep into the workings of SRAM and DRAM as this guide focuses on the basics of SRAM vs DRAM. If you are interested, we can do in-depth guides on the working of both SRAM and DRAM.

Conclusion

SRAM and DRAM are two important types of semiconductor memory in modern computing systems. While both are volatile in nature, the ways they store data are completely different. In this guide, we saw the basics of semiconductor memories, SRAM and DRAM, along with their respective structures. We also saw a detailed SRAM vs DRAM comparison.


