High Density Data Storage

The structure of DNA, for reference

Approximately two weeks ago you might have heard the story about the storage of an entire genetics textbook in DNA – quite a feat. Today, Brian Thomas writes Scientists Store 70 Billion Books on DNA. If 70 billion sounds like a bit much, note that numerous duplicates were needed to avoid errors – that, and the researchers were trying for a record.

The reason why Thomas chose to write about this subject can be summed up with this quote:

There is no material that has as much data storage density as DNA. It is better than blue-ray discs, hard drives, and even flash drives. The Science report shows that DNA is six powers of ten denser than flash drive technology.

The sheer superiority of DNA as a data storage medium is strong evidence for its supernatural creation.

I could discuss the flaws in this article – talking about what data storage actually means, and probably mentioning along the way how it does not logically follow that the best data storage system must be created by something supernatural. But that would be unnecessary and pointless, because it’s simply not true that DNA is the absolute best medium around when it comes to storage density.

The Science Daily article for this research, Writing the Book in DNA: Geneticist Encodes His Book in Life’s Language, includes the following two paragraphs:

The researchers used binary code to preserve the text, images and formatting of the book. While the scale is roughly what a 5 ¼-inch floppy disk once held, the density of the bits is nearly off the charts: 5.5 petabits, or 1 million gigabits, per cubic millimeter. “The information density and scale compare favorably with other experimental storage methods from biology and physics,” said Sri Kosuri, a senior scientist at the Wyss Institute and senior author on the paper. The team also included Yuan Gao, a former Wyss postdoc who is now an associate professor of biomedical engineering at Johns Hopkins University.

And where some experimental media — like quantum holography — require incredibly cold temperatures and tremendous energy, DNA is stable at room temperature. “You can drop it wherever you want, in the desert or your backyard, and it will be there 400,000 years later,” Church said.

That’s not quite the same degree of dominance that Thomas was spinning. And what’s this ‘quantum holography’ thing?

A 2009 press release from Stanford University talks about a breakthrough, saying:

The researchers encoded the letters “S” and “U” (as in Stanford University) within the interference patterns formed by quantum electron waves on the surface of a sliver of copper. The wave patterns even project a tiny hologram of the data, which can be viewed with a powerful microscope.

The density they then achieved is truly phenomenal:

For Manoharan, the true significance of the work lies in storing more information in less space. “How densely can you encode information on a computer chip? The assumption has been that basically the ultimate limit is when one atom represents one bit, and then there’s no more room—in other words, that it’s impossible to scale down below the level of atoms.

“But in this experiment we’ve stored some 35 bits per electron to encode each letter. And we write the letters so small that the bits that comprise them are subatomic in size. So one bit per atom is no longer the limit for information density. There’s a grand new horizon below that, in the subatomic regime. Indeed, there’s even more room at the bottom than we ever imagined.”
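To put that figure in perspective, here is a rough back-of-the-envelope comparison with DNA. The electron count per base pair below is my own approximation (a nucleotide pair plus its backbone comes to a few hundred electrons), not a number taken from either paper:

    # Rough back-of-the-envelope comparison -- the numbers here are
    # approximations of my own, not figures from either paper.
    bits_per_base_pair = 2          # the most DNA can manage, using all four bases
    electrons_per_base_pair = 340   # very roughly, for one base pair plus backbone

    dna_bits_per_electron = bits_per_base_pair / electrons_per_base_pair
    print(round(dna_bits_per_electron, 4))    # ~0.006 bits per electron for DNA
    print(round(35 / dna_bits_per_electron))  # holography leads by a factor of ~6,000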

35 bits per electron is far greater than what we’re talking about with DNA. Now yes, the quantum holography approach does have a few drawbacks (as does the DNA system), but there’s plenty else around. Going down a few orders of magnitude – but still arguably better than using entire base pairs – we have the xenon “IBM”, from 1990:

IBM in Xenon atoms

There will be plenty else if you’re prepared to look, which Brian clearly was not.

But there’s more – it turns out that you can modify DNA to make it even more data-dense. Back in July, Evolution News and Views was trying to spin this very fact, ironically enough, as evidence that DNA is designed. A study in PNAS added a third, synthetic base pair to the DNA alphabet, which naturally increases the amount of information it could hold (six possible letters rather than four works out to roughly 2.6 bits per base instead of 2). EN&V wrote:

What’s interesting for intelligent design theory is that this achievement demonstrates contingency: the natural DNA code is not predestined. Even though the natural genetic code is “conserved through all of life,” experiments such as these show that other codes are possible.

So the ICR says that DNA is the greatest, therefore God; and the Discovery Institute claims effectively the same thing but based on the opposite premise, that DNA need not be the best on offer. While I understand that, at least on the surface, the DI tries to have nothing to do with more traditional creationists, they really do need to have a meeting someday to sort out the consistency of their message.


There is at least one other, more minor error in Thomas’ article that I can spot – he claims:

A computer reassembled the data into its proper order because the scientists engineered ID tag sequences every 19 base pairs.

Contrast this with the Science Daily article:

In another departure, the team rejected so-called “shotgun sequencing,” which reassembles long DNA sequences by identifying overlaps in short strands. Instead, they took their cue from information technology, and encoded the book in 96-bit data blocks, each with a 19-bit address to guide reassembly. Including jpeg images and HTML formatting, the code for the book required 54,898 of these data blocks, each a unique DNA sequence.

They did not put tags every 19 base pairs – the addresses are 19 bits long, one per 96-bit data block.
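To make the distinction concrete, here is a minimal sketch of what one address per 96-bit data block looks like. This is an illustration only – the helper below and its details (padding, address placement) are my own, not the actual pipeline from the paper:

    def to_blocks(bits, block_size=96, address_bits=19):
        # Split a bitstring into fixed-size data blocks and give each one
        # its own binary address -- one address per block, not a tag
        # every 19 base pairs.
        blocks = []
        for index, start in enumerate(range(0, len(bits), block_size)):
            data = bits[start:start + block_size].ljust(block_size, "0")  # pad the last block
            address = format(index, "0{}b".format(address_bits))          # 19-bit block address
            blocks.append(address + data)
        return blocks

    # A (fake) 300-bit message becomes four addressed 115-bit blocks.
    message = "01" * 150
    for block in to_blocks(message):
        print(block[:19], block[19:])

Each block can then be synthesised as its own short DNA strand and sequenced in any order, with the addresses guiding reassembly – which is the point of the Science Daily paragraph above.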

Given that inaccuracy, I’m not sure whether I can trust this claim:

The scientists first built a digital coding and decoding scheme whereby either of two of DNA’s four chemical bases was designated “0,” and either of the other two bases was designated “1.”

I can’t find a source for it, but it could be true – the effect would be to reduce the data capacity, from two bits per base down to only one.
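If Thomas has the scheme right, it is easy to sketch. Which of the two candidate bases stands in for any given bit is arbitrary in the toy version below; in the real thing the choice would presumably be constrained by what is easy to synthesise and sequence (avoiding long runs of one base, extreme GC content, and so on):

    import random

    ZERO_BASES = ("A", "C")   # either base stands for 0, per Thomas' description
    ONE_BASES = ("G", "T")    # either base stands for 1

    def encode(bits):
        return "".join(random.choice(ONE_BASES if b == "1" else ZERO_BASES)
                       for b in bits)

    def decode(bases):
        return "".join("1" if base in ONE_BASES else "0" for base in bases)

    bits = "10110001"
    dna = encode(bits)
    print(dna)                    # e.g. 'GATGACAT' -- one base per bit
    assert decode(dna) == bits

With this mapping a strand of n bases carries n bits, rather than the 2n it could carry if all four bases were used as distinct two-bit symbols – hence the halving of capacity.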

Because of all of these problems I suspect this article may be at risk of modification to fix at least some of them – here’s a screenshot as I read it.


One thought on “High Density Data Storage”

  1. “Our method has at least five advantages over past DNA storage approaches. We encode one bit per base (A or C for zero, G or T for one), instead of two.”

    From the research. It also notes that this work can hold 2 more log10 bits per mm^3 (14 v 16). I don’t quite know what that means, but I think it’s a lot.
