IBM Storage Breakthrough Paves Way for 330TB Tape Cartridges

By Tiffany Trader

August 3, 2017

IBM announced yesterday a new record for magnetic tape storage that it says will keep tape storage density on a Moore's law-like path far into the next decade. In collaboration with Sony, IBM scientists recorded 201 gigabits on one square inch of prototype magnetic tape, a 20x improvement over the areal density of current state-of-the-art enterprise tape drives. This works out to a potential standard cartridge capacity of 330 terabytes (TB) of uncompressed data.

The demonstration device combines Sony’s new magnetic tape technology with IBM Research’s newly developed write/read heads and its next-generation servo and signal processing technologies, detailed in the IBM announcement:

+ Innovative signal-processing algorithms for the data channel, based on noise-predictive detection principles, which enable reliable operation at a linear density of 818,000 bits per inch with an ultra-narrow 48nm wide tunneling magneto-resistive (TMR) reader.

+ A set of advanced servo control technologies that, when combined, enable head positioning with an accuracy of better than 7 nanometers. Together with the 48nm wide TMR hard disk drive read head, this enables a track density of 246,200 tracks per inch, a 13-fold increase over a state-of-the-art TS1155 drive.

+ A novel low friction tape head technology that permits the use of very smooth tape media.
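The headline figures above hang together arithmetically: areal density is simply linear density multiplied by track density. A minimal sanity check of the announced numbers (the capacity-ratio comparison against today's 15 TB cartridges is our own back-of-the-envelope addition, not from the announcement):

```python
# Figures from the IBM/Sony announcement.
linear_density_bpi = 818_000   # bits per inch along a track
track_density_tpi = 246_200    # tracks per inch across the tape

# Areal density (bits per square inch) = linear density x track density.
areal_density = linear_density_bpi * track_density_tpi
print(f"{areal_density / 1e9:.1f} Gbit/in^2")  # ~201.4, matching the 201 Gb figure

# The 20x areal-density gain is roughly consistent with the jump from
# 15 TB cartridges to a projected 330 TB (330 / 15 = 22x; the gap reflects
# format overheads and other drive-level factors).
print(f"capacity ratio: {330 / 15:.0f}x")
```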

It’s the first time that IBM is using sputtered magnetic tape instead of the traditional barium ferrite technology.

Sony’s new magnetic tape technology (Credit: Sony)

As explained by IBM scientist Mark Lantz (see video at end of article), "sputtered tape uses several layers of thin metal films that are coated onto the tape using vacuum sputtering, a technology similar to that used for integrated circuits."

In current-generation tape drives, a thin film of nanoscale barium ferrite particles is applied to the tape in liquid form, like a thin layer of paint.

"While sputtered tape is expected to cost a little more to manufacture than current commercial tape that uses barium ferrite (BaFe), the potential for very high capacity will make the cost per TB very attractive," said IBM Fellow Evangelos Eleftheriou.


IBM envisions tape becoming a viable storage medium for cloud, both for backup and as an archival tier for infrequently accessed data. Of course, spinning disk and flash storage have their advantages, but tape's economics for massive data archives are hard to argue with.

IBM believes that this latest achievement puts the company on track to continue scaling tape technology at near-historic rates, doubling the storage capacity every two years for at least the next ten years.
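Doubling capacity every two years compounds quickly. A back-of-the-envelope projection (our illustration, starting from today's 15 TB cartridges; not an IBM roadmap) shows where that pace would lead over the decade IBM cites:

```python
# Project cartridge capacity under the doubling-every-two-years pace IBM
# describes, starting from the current 15 TB cartridge as a baseline.
capacity_tb = 15
for year in range(2017, 2028, 2):
    print(f"{year}: ~{capacity_tb} TB")
    capacity_tb *= 2
# Five doublings take 15 TB to ~480 TB, comfortably above the 330 TB
# potential demonstrated here.
```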

IBM’s legacy of tape storage innovations stretches back more than 60 years. The company launched its first commercial tape product, the 726 Magnetic Tape Unit, in 1952, providing a speed and capacity advantage over existing cathode ray tube and drum storage devices. The 726 used an oxide-coated, non-metallic tape, approximately a half-inch wide with a density of 100 bits per linear inch.

IBM and Sony presented the results of their collaboration this week at the 28th Magnetic Recording Conference in Japan. Neither company indicated when we might actually see a 330 TB tape cartridge in the wild; today's leading tape technology tops out at 15 TB per data cartridge.
