At 64GB, the module offers the largest capacity yet for a DRAM module, which should improve application performance: more data can be kept in memory, so bits don't have to be shuttled as often between DRAM and other components such as storage.

Memory chips are currently placed horizontally on DIMMs (dual in-line memory modules), the memory modules that plug into motherboards. But as memory chips shrink, stacking them vertically makes better use of the space available on a DIMM, which is the approach Samsung is taking.

DDR4 will gradually start replacing existing DDR3 DRAM, first in servers and gaming PCs, beginning next quarter. DDR4 provides 50 percent more memory bandwidth than DDR3 while consuming about 35 percent less power. Intel is due to release a DDR4-compatible server platform code-named Grantley in early September, which Lenovo and Dell will use in servers.

Applications used in data centers will particularly benefit from the new Samsung memory, said Nathan Brookwood, principal analyst at Insight 64. Database and analytics applications, including those from Oracle and SAP, keep data in memory to boost application performance.

"It's getting harder and harder to increase the density of DRAM chips. Stacking is a good way to take existing DRAM chip technology and produce twice as many bits in a package," Brookwood said.

The memory chips in a stack are linked by through-silicon vias (TSVs), vertical connections that are emerging as a high-bandwidth interconnect for newer memory technologies. TSVs are already used in Micron's emerging HMC (Hybrid Memory Cube) technology, and Nvidia plans to use them in its graphics chips in the coming years.

Samsung's memory chips are made using an advanced 20-nanometer-class manufacturing process. Other memory makers, such as Crucial and Adata, are also shipping DDR4 DIMMs, but at lower capacities.