
Events on May 22, 2024

Shikai Fang Presents:

Bayesian Tensor Learning for Dynamic High-Order Data

May 22, 2024 at 7:30am (1 hour)
Meeting ID: 322 296 5503 | Passcode: 132512


Tensor decomposition is an essential technique in high-dimensional data analysis and prediction, serving as a fundamental tool for uncovering the multi-faceted structures inherent in tensor data. Traditional methods like CANDECOMP/PARAFAC (CP) and Tucker decomposition pioneered this area. However, these methods struggle with the sparsity and noise common in tensor data, and they lack mechanisms to handle the dynamic nature of real-world data, including streaming updates, temporal variations, and functional tensors with continuous indices.
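For readers unfamiliar with the classical baseline the talk builds on, below is a minimal NumPy sketch of CP decomposition fitted by alternating least squares (ALS). This illustrates only the traditional, non-Bayesian method mentioned above, not the streaming or Bayesian models presented in the dissertation; the function and variable names are illustrative.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product: row (j, k) of the result is U[j] * V[k]."""
    return np.einsum('jr,kr->jkr', U, V).reshape(-1, U.shape[1])

def cp_als(X, rank, n_iter=200, seed=0):
    """Rank-R CP decomposition of a 3-way tensor X via alternating least squares.

    Returns factor matrices A, B, C such that
    X[i, j, k] ≈ sum_r A[i, r] * B[j, r] * C[k, r].
    """
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode-n unfoldings (C-order: last listed index varies fastest).
    X0 = X.reshape(I, J * K)
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)
    for _ in range(n_iter):
        # Each update solves a linear least-squares problem for one factor
        # while the other two are held fixed.
        A = X0 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X1 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X2 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

A usage example: build an exactly rank-2 tensor from random factors, then recover a low-error reconstruction with `cp_als(X, rank=2)` and `np.einsum('ir,jr,kr->ijk', A, B, C)`. The Bayesian approaches in the talk replace these point estimates with posterior distributions over the factors, which is what enables principled handling of noise, sparsity, and streaming updates.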

This dissertation presents a comprehensive suite of advancements in Bayesian tensor learning that address these challenges across various forms of dynamic tensor data: streaming tensors, temporal tensors, and functional tensors. For streaming tensor data, we introduce the Bayesian Streaming Sparse Tucker Decomposition (BASS) and the Streaming Bayesian Deep Tensor Factorization (SBDT), both of which provide efficient and scalable solutions for streaming tensor analysis with built-in sparsity mechanisms. For temporal tensors, we present a novel temporal Tucker model, Bayesian Continuous-Time Tucker Decomposition (BCTT), and Streaming Factor Trajectory Learning for Temporal Tensor Decomposition (SFTL), an efficient temporal tensor learning method that dynamically captures evolving temporal factors. Further extending the scope to functional tensors, the Functional Bayesian Tucker Decomposition for Continuous-indexed Tensors (FunBAT) adapts tensor decomposition to continuous domains, enabling seamless application to data with continuous indices.

Throughout this dissertation, we will explore how each contribution underpins a robust and scalable Bayesian framework, demonstrating significant real-world implications for handling the complexities of dynamic data across various applications.

Posted by: Nathan Galli