Events on February 27, 2019

Xuan Huang Presents:

Firefly: Virtual Illumination Drones for Interactive Visualization

February 27, 2019 at 12:00pm for 30min
Evans Conference Room, WEB 3780
Warnock Engineering Building, 3rd floor.

Abstract:

Light specification in three-dimensional scenes is a complex problem, and several approaches have been presented that aim to automate this process. However, there are many scenarios where a static light setup is insufficient, as the scene content and camera position may change. Simultaneous manual control over the camera and light position imposes a high cognitive load on the user. To address this challenge, we introduce a novel approach for automatic scene illumination with Fireflies. Fireflies are intelligent virtual light drones that illuminate the scene by traveling on a closed path. The Firefly path automatically adapts to changes in the scene based on an outcome-oriented energy function. To achieve interactive performance, we employ a parallel rendering pipeline for the light path evaluations. We provide a catalog of energy functions for various application scenarios and discuss the applicability of our method on several examples.
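
(For intuition only, not code from the talk: the minimal sketch below samples candidate light positions along a closed circular path and keeps the one that minimizes a toy, hypothetical energy term rewarding lights that face the surface. The actual Firefly system uses a catalog of outcome-oriented energies evaluated in a parallel rendering pipeline.)

```python
import math

# Toy sketch, not the authors' implementation: sample a closed circular light
# path and keep the position that minimizes a placeholder "energy".

def normalize(v):
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return tuple(x / n for x in v)

def energy(light_pos, point, normal):
    """Lower is better; here we simply reward lights that face the surface."""
    l_dir = normalize(tuple(l - p for l, p in zip(light_pos, point)))
    n_dot_l = sum(a * b for a, b in zip(l_dir, normal))
    return 1.0 - max(n_dot_l, 0.0)

def best_light_on_path(point, normal, center, radius, samples=64):
    """Evaluate evenly spaced positions on a closed circular path (XY plane)."""
    best_e, best_pos = float("inf"), None
    for i in range(samples):
        t = 2.0 * math.pi * i / samples
        pos = (center[0] + radius * math.cos(t),
               center[1] + radius * math.sin(t),
               center[2])
        e = energy(pos, point, normal)
        if e < best_e:
            best_e, best_pos = e, pos
    return best_e, best_pos

if __name__ == "__main__":
    # Surface point at the origin facing +X; the best light ends up on the +X side.
    print(best_light_on_path((0, 0, 0), (1, 0, 0), center=(0, 0, 2), radius=3.0))
```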

Posted by: Steve Petruzza

Yash Gangrade Presents:

Globe Browsing: Contextualized Spatio-Temporal Planetary Surface Visualization

February 27, 2019 at 12:30pm for 30min
Evans Conference Room, WEB 3780
Warnock Engineering Building, 3rd floor.

Abstract:

Results of planetary mapping are often shared openly for use in scientific research and mission planning. In its raw format, however, the data is not accessible to non-experts due to the difficulty of grasping the context and the intricate acquisition process. We present work on the tailoring and integration of multiple data processing and visualization methods to interactively contextualize geospatial surface data of celestial bodies for use in science communication. Because our approach handles dynamic data sources streamed from online repositories, we significantly shorten the time between the discovery and the dissemination of data and results. We describe the image acquisition pipeline, the pre-processing steps used to derive a 2.5D terrain, and a chunked, level-of-detail, out-of-core rendering approach that enables interactive exploration of global maps and high-resolution digital terrain models. The results are demonstrated for three different celestial bodies. The first case addresses high-resolution map data on the surface of Mars. The second case shows dynamic processes, such as concurrent weather conditions on Earth, that require temporal datasets. As a final example, we use data from the New Horizons spacecraft, which acquired images during a single flyby of Pluto. We visualize the acquisition process as well as the resulting surface data. Our work has been implemented in the OpenSpace software, which enables interactive presentations in a range of environments such as immersive dome theaters, interactive touch tables, and virtual reality headsets.
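
(Illustrative only, not OpenSpace code: the sketch below shows the basic chunked level-of-detail idea mentioned in the abstract, refining a quadtree tile only while its estimated screen-space error exceeds a pixel budget. The data class, refinement rule, and thresholds are hypothetical.)

```python
from dataclasses import dataclass

# Hypothetical sketch of chunked level-of-detail selection (not OpenSpace code):
# a tile is split into children only while its projected error is still larger
# than a pixel budget, so only nearby regions are requested at high resolution.

MAX_LEVEL = 5  # cap the toy quadtree depth

@dataclass
class Chunk:
    level: int               # quadtree depth; higher means finer terrain/imagery
    geometric_error: float   # world-space error of this chunk's representation
    distance: float          # distance from the camera to the chunk

def screen_space_error(chunk, screen_height_px=1080, fov_factor=1.0):
    """Approximate how many pixels the chunk's geometric error spans on screen."""
    if chunk.distance <= 0.0:
        return float("inf")
    return chunk.geometric_error * screen_height_px * fov_factor / chunk.distance

def children(chunk):
    """Toy refinement: four children with half the error, viewed a bit closer."""
    return [Chunk(chunk.level + 1, chunk.geometric_error / 2.0, chunk.distance * 0.8)
            for _ in range(4)]

def select_chunks(chunk, max_error_px):
    """Collect the chunks to render: split while the error budget is exceeded."""
    if chunk.level >= MAX_LEVEL or screen_space_error(chunk) <= max_error_px:
        return [chunk]
    selected = []
    for child in children(chunk):
        selected.extend(select_chunks(child, max_error_px))
    return selected

if __name__ == "__main__":
    root = Chunk(level=0, geometric_error=1000.0, distance=20000.0)
    print(len(select_chunks(root, max_error_px=2.0)), "chunks selected")
```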

Posted by: Steve Petruzza