Research team turns terabytes of image data into model of neural circuits

Sarah Zhang in The Harvard Gazette:

The brain of a mouse measures only 1 cubic centimeter in volume. But when neuroscientists at Harvard’s Center for Brain Science slice it thinly and take high-resolution micrographs of each slice, that tiny brain turns into an exabyte of image data. That’s 10¹⁸ bytes, equivalent to more than a billion CDs.
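For scale, a standard CD holds about 700 MB, so 10¹⁸ bytes ÷ (7 × 10⁸ bytes per CD) ≈ 1.4 × 10⁹, or roughly 1.4 billion CDs.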

What can you do with such a gigantic, unwieldy data set? That’s the latest challenge for Hanspeter Pfister, the Gordon McKay Professor of the Practice of Computer Science at Harvard’s School of Engineering and Applied Sciences (SEAS).

Pfister, an expert in high-performance computing and visualization, is part of an interdisciplinary team collaborating on the Connectome Project at the Center for Brain Science. The project aims to create a wiring diagram of all the neurons in the brain. Neuroscientists have developed innovative techniques for automatically imaging slices of mouse brain, yielding terabytes of data so far.

Pfister’s system for displaying and processing these images would be familiar to anyone who has used Google Maps. Because only a subsection of a very large image can be displayed on a screen, only that viewable subsection is loaded. Drag the image around, zoom in or out, and more of the image is displayed on the fly.
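The article doesn’t give implementation details, but the Google Maps analogy corresponds to a standard tiled image pyramid: the full image is pre-cut into fixed-size tiles at several zoom levels, and the viewer requests only the tiles that intersect the current viewport. Here is a minimal sketch in Python of that demand-driven pattern; the tile size, file layout, and function names are all hypothetical, not taken from Pfister’s system:

```python
from functools import lru_cache

TILE_SIZE = 256  # pixels per tile edge (hypothetical choice)


@lru_cache(maxsize=4096)  # keep recently viewed tiles in memory
def load_tile(level: int, row: int, col: int) -> bytes:
    """Fetch a single tile of the image pyramid on demand.

    Stand-in for a request to a (possibly distributed) tile store;
    only tiles the viewer actually looks at are ever loaded.
    """
    path = f"tiles/{level}/{row}_{col}.png"  # hypothetical layout
    with open(path, "rb") as f:
        return f.read()


def visible_tiles(x: int, y: int, width: int, height: int):
    """Yield (row, col) for every tile intersecting the viewport.

    (x, y) is the viewport's top-left corner in pixel coordinates
    at the current zoom level; dragging or zooming just changes
    these arguments, and new tiles are fetched on the fly.
    """
    first_col, first_row = x // TILE_SIZE, y // TILE_SIZE
    last_col = (x + width - 1) // TILE_SIZE
    last_row = (y + height - 1) // TILE_SIZE
    for row in range(first_row, last_row + 1):
        for col in range(first_col, last_col + 1):
            yield row, col


def render_viewport(x: int, y: int, width: int, height: int, level: int):
    """Load only the tiles the screen can show right now."""
    return {
        (row, col): load_tile(level, row, col)
        for row, col in visible_tiles(x, y, width, height)
    }
```

The cache is what makes panning feel instant: tiles just scrolled off-screen are usually scrolled back moments later, so they can be served from memory instead of being re-fetched.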

This “demand-driven distributed computation” is the central idea behind Pfister’s work, for which he recently won a Google Faculty Research Award.

More here. [Thanks to Sughra Raza.]