The secret to our data storage woes could be an atom or, more precisely, a grid of them.
Atoms, the smallest building blocks in nature, have their appeal as a storage medium. We certainly need a new storage paradigm, one that takes up considerably less space than the current large-scale solution: data centers.
Dotted throughout the world, data centers are like highly organized versions of our basements, crammed full of stuff we only occasionally access, but still consuming vast amounts of space and energy. According to the environmental group Natural Resources Defense Council (NRDC), these data centers eat 70 billion kilowatt hours of electricity a year (as of 2014).
Companies like Facebook are at least trying to tackle the energy issue with new, eco-friendly, wind-powered data centers. But that won’t help with the space issue. Re-writable atomic memory could potentially tackle both challenges.
In a new paper published Monday in the science journal Nature Nanotechnology, Delft University researchers announced their atomic-level breakthrough in data storage.
Using atoms to store data puts these researchers a microscopic step ahead of the scientists who, earlier this month, figured out how to store an OK Go music video on DNA.
“The base pairs in a DNA molecule (A, T, C and G) each consist of tens of atoms. So no matter how you store data in there, it will be less dense than one bit per atom, as we have demonstrated,” lead researcher and associate professor Sander Otte told Mashable in an email. He added that, while DNA is linear, allowing for one dimension of storage, their breakthrough works in two dimensions.
According to the study, the scientists deposited chlorine atoms onto a copper plate and then used the natural grid-like structure the atoms form to mark out 8×8 atom blocks. They also made sure to only partially cover the plate with atoms, leaving vacancies scattered through each grid, but never two open sites side by side (when that happened, it was flagged as an error and those blocks weren’t used for data storage).
Otte explained that the chlorine atoms bond to the copper via an ionic bond. This holds them in place, but allows them to be moved. The study describes them being moved in much the same way as you move pieces in a traditional sliding puzzle, which usually has just one block missing.
Each pairing of a missing atom (known as a vacancy) and a filled-in atom encodes a bit (a 1 or a 0). The scientists were then able to slide an atom from its original spot into a vacancy using the tip of a scanning tunneling microscope (STM); moving the atoms was equivalent to writing bits. An atom and an open slot made one bit, an 8×8 grid of atoms could encode a letter, and multiple grids could hold stored text. Eight thousand atomic bits add up to 1 kilobyte.
Scientists found they could potentially store a whopping 500 terabits per square inch, making the density of this atomic storage 500 times greater than current hard drive technology.
As proof of concept, researchers stored a passage from scientist Richard Feynman’s classic lecture on miniaturization, “There’s Plenty of Room at the Bottom,” in a 1,016-byte block of atomic memory.
Read and write
The same STM tip, which is used to move the atoms in and out of vacancies and write the data, is also what’s used to read the data back into the system.
On read, the STM tip does not actually touch the atoms. “The STM tip is a super-sensitive height probe that can tell if there is an atom beneath it or a hole, and thus read out the data,” said Otte in an email.
Otte’s team of eight, which included two theoretical physicists and five experimental physicists like himself, generally did not manipulate the atoms by hand. “Once the memory was built, all read and write protocols were fully automated. Also the building itself was mostly automated. Only occasionally we had to intervene and fix something by hand,” he wrote.
“We are enthusiastic about new developments that can help reduce the energy consumption of America’s 3 million data centers, and the pollution associated with producing the electricity needed to run them,” said Pat Remick, senior energy communications strategist with the NRDC, who was made aware of the study.
Writing about the memory breakthrough in Nature Nanotechnology, Steven C. Erwin of the Center for Computational Materials at the Naval Research Lab called the work remarkable, but noted that kilobyte atomic memory is still “far from practical.”
Creating a single block of this atomic memory requires extremely low temperatures and considerable time: roughly 10 minutes to write a block and one to two minutes to read it back. As a result, Otte doesn’t see commercialization happening any time soon.
“In principle, I don’t foresee any physical limitation in speeding this up to the same level as, for example, hard disks (approximately 1 megabit per second), but there are many technological hurdles,” admitted Otte.
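A quick back-of-envelope calculation shows how large the gap is. Using only the figures quoted in the article (an 8,000-bit kilobyte written in about 10 minutes, versus roughly 1 megabit per second for hard disks):

```python
# Rough speed comparison using the article's own figures.
block_bits = 8_000        # one kilobyte = 8,000 atomic bits
write_seconds = 10 * 60   # about 10 minutes to write one block

atomic_rate = block_bits / write_seconds   # ~13.3 bits per second
hdd_rate = 1_000_000                       # ~1 megabit/s, Otte's hard disk figure

print(f"atomic write rate: {atomic_rate:.1f} bits/s")
print(f"hard disks are roughly {hdd_rate / atomic_rate:,.0f}x faster")
```

By this estimate the atomic memory writes around 13 bits per second, about 75,000 times slower than the hard disk rate Otte cites, which puts his caution about commercialization in perspective.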
However, Otte believes the true importance of the kilobyte re-writable atomic memory is not just data storage. It’s a “demonstration of how well we can now organize the world with atomic precision,” he wrote.
Otte also remains surprised by how well the project went.
“We manipulated the first vacancies in November last year. In a week we were able to write 64 bits,” he wrote. “I then joked to my team members that at this pace we would get to a kilobyte by the end of January. But to my amazement, this turned out not to be a joke at all.”
[UPDATED 7-18-2016 5:19 PM ET: Added new, more accurate data center energy usage number from the NRDC]