
An intro to CAMM memory and how it compares with DIMMs

Compression attached memory modules, or CAMMs, are making news as a potential replacement for DRAM DIMMs. How big of a future do they have?

There's a new memory format in town. The compression attached memory module strives to provide improved signal integrity and faster performance.

CAMM memory is an alternative to the time-tested DIMM. But how does it compare to DIMMs, and what are its best uses?

A bit of background

It's somewhat surprising that DIMMs haven't already been displaced by a newer technology. The DIMM traces its roots to the single inline memory module (SIMM), which Wang Laboratories introduced back in the 1980s.

Wang's novel idea was to mount memory chips on small daughterboards that plugged into the motherboard, so a system could be built around a single main board and populated with varying amounts of memory to match the customer's needs.

When processor buses expanded to 64 bits with Intel's Pentium, a single 32-bit SIMM was too narrow to service the whole bus, so matched pairs of SIMMs became necessary. To eliminate the need for matching, developers widened the connector and put connections on both sides of the daughterboard -- thus the name dual -- creating the DIMM that has been in use ever since. That was in the early 1990s. It's time for an update.

Potential uses for CAMMs

As of late 2024, CAMM memory is not in wide use, although that should change quickly. Dell, the creator of the format, pioneered its use in notebook PCs.

The notebook PC will be the leading consumer of CAMMs for the next few years, as Dell's competitors start to follow the company's lead. JEDEC has created a CAMM standard called CAMM2 to foster the format's growth.

Leading DRAM makers have already introduced LPCAMM2 products featuring low-power double data rate (LPDDR) DRAM for mobile applications. In the past, mobile applications that used LPDDR for power savings exclusively used soldered-down memory instead of modules because the LPDDR DRAM couldn't drive the high capacitance that burdens DIMMs.

Over time, CAMM memory should migrate from lower-end applications, such as PCs, to higher-end applications, such as servers, thanks to the CAMM's superior signal integrity and other features. This could take a few years, though. For the nearer term, IT managers need to inventory two kinds of DRAM modules: DIMMs for servers and CAMMs for mobile computers.

How CAMMs compare with DIMMs

DIMMs weren't originally conceived with the idea of running faster than a few tens of megahertz, so signal strength and capacitive loading were not big considerations in their design. The connector type, too, was already standardized well before Wang created the SIMM.

As a relic of the 1980s or earlier, the SIMM form factor -- which the DIMM inherited -- was not designed with signal path length or connector capacitance in mind. CAMM memory addresses these issues.

CAMM PC board layouts minimize the signal path length between the DRAM chip and the connector. Shorter signal lines lead to better signal integrity than the longer lines on a DIMM.

The CAMM connector isn't a card-edge connector like DIMMs use -- it's a bed of nails onto which the CAMM is pressed by screws, thus the name compression. Bed-of-nails connectors provide lower capacitance and resistance than card-edge connectors, which improves their performance at high frequencies. This improvement translates to better signal integrity and potentially higher bus transfer frequencies between the CAMM and the CPU. The bed of nails also has a more reliable connection than a card-edge connector.

With heightened signal integrity and shorter path lengths, less power is required to drive the signal lines, providing CAMMs with a small power advantage over DIMMs. In addition, JEDEC's LPCAMM2 specification calls for LPDDR rather than standard DDR, and active power drops by 61%, according to Lenovo.
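A rough way to see why both lower connector capacitance and LPDDR's lower signaling voltage cut drive power is the standard first-order relation for the energy spent charging and discharging a signal line. This is a back-of-the-envelope illustration, not a figure drawn from the CAMM2 or LPCAMM2 specifications:

$$P_{\text{drive}} \approx \alpha \, C_{\text{line}} \, V^{2} \, f$$

Here, C_line is the capacitance the memory interface must drive, V is the signaling voltage, f is the toggle rate, and α is the fraction of cycles on which the line actually switches. Cutting line and connector capacitance reduces drive power roughly in proportion, while a lower voltage helps quadratically -- consistent with the direction of Lenovo's 61% figure, even if the exact number depends on many other factors.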

Since developers originally designed the CAMM to be mounted parallel to the motherboard, the combination of motherboard plus CAMM memory has a lower profile than side-mounted DIMMs. CAMMs save space in most applications. Lenovo claimed the space savings are 64% compared to two small outline DIMMs.

Who's making them now?

Dell is in the lead in this business, having invented the concept. CAMMs are in multiple Dell thin-and-light PCs. Lenovo has announced an LPCAMM2-based thin-and-light PC, and MSI has unveiled a CAMM2-based desktop PC.

The three leading DRAM makers -- Samsung, SK Hynix and Micron -- make CAMMs, as do certain leading module suppliers, such as Adata and Kingston. Expect other suppliers to announce CAMM memory products.

Samsung has announced an LPCAMM product.

Jim Handy is a semiconductor and SSD analyst at Objective Analysis in Los Gatos, Calif.
