The brain project: 3D visualization and navigation
Patrick J. Kelly, MD, FACS (NYU) and Jean-Marc Gauthier (ITP, Tisch School of the Arts, NYU)

Collaborators for this project also include ITP student members of the Brain Group: Caroline Pino, Rocio Barcia, Jae In Lee, Sandra Villareal, and Chunxi Jian. The current version adds spatial computer-vision detection, developed in collaboration with Didier Bouchon.

MRI data from NYU Medical Center.

 
Links to online documentation used for this project: http://www.sph.sc.edu/comd/rorden/ and http://www.nottingham.ac.uk/radiology/paul/

"Kelly and Gauthier's new Internet-based Web browser allows 3-D navigation inside a brain using a cloud of voxels, or pixels, located in space. Since the images of slices of the brain are displayed in space they can be visited from many angles, including new angles that were not included in the original pictures. This is key to enabling freer navigation. The viewer can then navigate the virtual brain in any direction regardless of the orientation of the original slices of the MRI."
 

More about this project:

"Navigating Inside the Brain" by Richard Pierce

"Journey to the Center of the Brain" by Janelle Nanos

Download QuickTime Movie

 
 

The following steps show how a 3D visualization of the brain can be created from a sequence of MRI images (source: NYU Medical Center). Viewers can navigate inside the resulting volumetric model using a webcam.

1) Six MRI images of the brain are taken along the same axis.

The pictures of the brain depend on the orientation of the patient during the MRI scan. The images are organized as a sequence of frames that can be viewed like a film strip.
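The film strip of slices can be turned into a single volume by stacking the frames along their shared axis. A minimal sketch, assuming NumPy and that each MRI image has already been loaded as a 2D grayscale array (file loading, e.g. with Pillow, is omitted):

```python
import numpy as np

def stack_slices(slices):
    """Stack same-size 2D MRI slices into a 3D volume.

    The slice index becomes the z axis: volume[z, y, x] is the intensity
    of pixel (x, y) on slice z, preserving the film-strip ordering.
    """
    first = slices[0]
    if any(s.shape != first.shape for s in slices):
        raise ValueError("all slices must share the same resolution")
    return np.stack([np.asarray(s, dtype=np.float32) for s in slices], axis=0)
```

With the six slices of step (1), this yields a volume of shape (6, height, width) that the later steps can sample from any direction.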

 
 

 

2) Slices of the brain can be viewed independently of the original sequence of MRI images.

 

In this case the images can be visualized in real time from any point of view. The viewer can explore the brain with a virtual camera that moves around and inside the brain. The resulting image is a slice of the brain created in real time according to the viewer's point of view. This can help a surgeon plan a brain surgery.

For example, a surgeon can visualize slices of the brain along the best path to reach a tumor. The display can be used as an interactive 3D map during pre-surgical planning. In this example, a webcam tracking the surgeon's head movements controls the display of the slices of the brain.
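The core of the webcam control is a mapping from the tracked head position in the camera frame to a slice of the brain. A minimal sketch of that mapping, with the tracker itself abstracted away (the project uses spatial computer-vision detection; any head detector that reports an x coordinate in frame pixels could feed this function, so this is an illustration of the idea, not the project's actual code):

```python
def head_x_to_slice(face_x, frame_width, num_slices):
    """Map a horizontal head position (in pixels) to a slice index.

    Moving the head from the left edge to the right edge of the webcam
    frame sweeps the display through all slices of the MRI sequence.
    """
    if frame_width <= 0 or num_slices <= 0:
        raise ValueError("frame_width and num_slices must be positive")
    t = min(max(face_x / float(frame_width), 0.0), 1.0)  # normalize to [0, 1]
    return min(int(t * num_slices), num_slices - 1)      # clamp to a valid index
```

Head positions outside the frame are clamped, so the display never requests a slice that does not exist.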

 
    

Left: this illustration shows the sequence of MRI images displayed as a cloud of pixel-points inside a three-dimensional volume. The resulting volume is a 3D object with the following material attributes: texture blend = decal, material = transparent. A point-cloud shader is applied to the 3D object.

Right: a visualization plane is inserted inside the cloud of pixel-points. Real-time images of the brain can be viewed on the plane slicer while it is in motion.
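The cloud of pixel-points can be derived from the stacked slices. A minimal sketch, assuming NumPy and a volume indexed as volume[z, y, x] with grayscale intensities in 0..255; dropping dark voxels mimics the transparent material, so only bright tissue contributes visible points (the thresholding and spacing values here are illustrative assumptions, not the project's settings):

```python
import numpy as np

def volume_to_point_cloud(volume, threshold=10.0, slice_spacing=1.0):
    """Return (N, 3) point positions and (N,) intensities for a point shader.

    slice_spacing stretches the z axis so a handful of slices fills a
    plausible depth instead of sitting one unit apart.
    """
    z, y, x = np.nonzero(volume > threshold)          # bright voxels only
    points = np.column_stack([x, y, z * slice_spacing]).astype(np.float32)
    return points, volume[z, y, x]
```

The positions and intensities can then be handed to whatever point-cloud shader the renderer provides.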

 
    
Creating in-between frames that are not included in the original sequence of images:

The left view shows the visualization plane moving inside the point-cloud object. The plane slicer is aligned with the axis of the sequence of MRI images presented in (1). In the bottom view, the point cloud is turned off. Although the plane slicer moves between two frames of the sequence of images in space, an image is still visible on the plane slicer. This image is the result of interpolation between two frames of the sequence of MRI images presented in (1).
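When the plane slicer sits a fraction t of the way between slice i and slice i+1, the in-between image is a blend of the two. A minimal sketch of that linear blend, assuming NumPy and a volume indexed as volume[z, y, x] (the project's renderer may use a different filter, so this illustrates the idea rather than reproducing the actual implementation):

```python
import numpy as np

def in_between_frame(volume, z):
    """Interpolate the slice at fractional depth z from volume[z, y, x]."""
    i = int(np.floor(z))
    i = min(max(i, 0), volume.shape[0] - 2)  # clamp so slice i+1 exists
    t = z - i                                # fractional distance between slices
    return (1.0 - t) * volume[i] + t * volume[i + 1]
```

At z = 2.5, for example, the displayed frame is an equal mix of slices 2 and 3, even though no such MRI image was ever captured.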

Controlling new orientations of the camera and viewing frames that are not included in the original sequence of images:

The right view shows the visualization plane moving inside the point-cloud object at an angle; the plane sits at a 45-degree angle to the axis of the sequence of MRI images. The bottom view shows the plane slicer with the point cloud turned off. Note that although the plane is at an angle to the sequence of images in space, an image is still visible on the plane slicer. This image is the result of interpolation between two frames of the sequence of MRI images presented in (1).
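On an angled plane, each pixel of the displayed image falls between voxels of the volume, so its value can be interpolated from the eight surrounding voxels. A minimal sketch, assuming NumPy and a volume indexed as volume[z, y, x], of sampling a plane tilted 45 degrees about the y axis (the tilt axis and sampling step are assumptions for illustration):

```python
import numpy as np

def trilinear(volume, z, y, x):
    """Trilinearly interpolate volume[z, y, x] at fractional coordinates."""
    z0 = int(np.clip(np.floor(z), 0, volume.shape[0] - 2))
    y0 = int(np.clip(np.floor(y), 0, volume.shape[1] - 2))
    x0 = int(np.clip(np.floor(x), 0, volume.shape[2] - 2))
    dz, dy, dx = z - z0, y - y0, x - x0
    c = volume[z0:z0 + 2, y0:y0 + 2, x0:x0 + 2]
    # blend the 2x2x2 neighborhood along x, then y, then z
    c = c[:, :, 0] * (1 - dx) + c[:, :, 1] * dx
    c = c[:, 0] * (1 - dy) + c[:, 1] * dy
    return c[0] * (1 - dz) + c[1] * dz

def oblique_slice(volume, size):
    """Sample a size x size image on a plane tilted 45 degrees about the y axis."""
    step = 1.0 / np.sqrt(2.0)  # one plane pixel advances equally in x and z
    img = np.empty((size, size), dtype=np.float32)
    for row in range(size):          # row follows the y axis
        for col in range(size):      # col moves diagonally through x and z
            img[row, col] = trilinear(volume, col * step, row, col * step)
    return img
```

Because the interpolation is continuous, an image appears on the angled plane even where it crosses no original MRI slice, which is what makes the new camera orientations possible.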

 
 

3) Play with the point cloud and the visualization plane in the interactive demo.

This demo may require a video card that can handle real-time shaders.

 
 
Click on the image to view the online demo. You can orbit around the point clouds, turn the point clouds on and off, and reorient the visualization plane.