NERDS LAB

Archives for September 2021

ICIP 2021 – Multi-scale modeling of neural structure in X-ray imagery

September 22, 2021 by Eva Dyer

Check out our new paper at ICIP on multi-scale segmentation of brain structure from X-ray micro-CT image volumes! [Read the paper here!]

Abstract: Methods for resolving the brain’s microstructure are rapidly improving, allowing us to image large brain volumes at high resolution. As a result, the interrogation of samples spanning multiple diverse brain regions is becoming increasingly common. Understanding these samples often requires multiscale processing: segmentation of the detailed microstructure and large-scale modeling of the macrostructure. Current brain mapping algorithms often analyze data only at a single scale, and optimization for each scale occurs independently, potentially limiting consistency, performance, and interpretability. In this work, we introduce a deep learning framework for segmentation of brain structure at multiple scales. We leverage a modified U-Net architecture with a multi-task learning objective and unsupervised pre-training to simultaneously model both the micro- and macro-architecture of the brain. We successfully apply our methods to a heterogeneous, three-dimensional, X-ray micro-CT dataset spanning multiple regions in the mouse brain, and show that our approach consistently outperforms another multi-task architecture and is competitive with strong single-task baselines at both scales.
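
To make the multi-task objective concrete, here is a minimal PyTorch sketch of the general pattern: a shared trunk feeding two segmentation heads, one per scale, trained with a weighted sum of per-scale losses. This is an illustrative toy, not the paper's code; the layer sizes, head names, and loss weight are assumptions.

    import torch
    import torch.nn as nn

    class MultiScaleSegNet(nn.Module):
        """Toy model with a shared trunk and two heads: one for fine-grained
        microstructure labels, one for macro-scale region labels.
        (Illustrative sketch only, not the architecture from the paper.)"""

        def __init__(self, in_ch=1, n_micro=4, n_macro=6):
            super().__init__()
            # Shared feature extractor (a real U-Net would go here).
            self.trunk = nn.Sequential(
                nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            )
            # Task-specific 1x1 convolution heads share the trunk's features.
            self.micro_head = nn.Conv2d(64, n_micro, 1)
            self.macro_head = nn.Conv2d(64, n_macro, 1)

        def forward(self, x):
            feats = self.trunk(x)
            return self.micro_head(feats), self.macro_head(feats)

    model = MultiScaleSegNet()
    x = torch.randn(2, 1, 64, 64)                    # a batch of 2D patches
    micro_logits, macro_logits = model(x)

    # Multi-task objective: weighted sum of the per-scale losses, so both
    # scales are optimized jointly rather than independently.
    micro_y = torch.randint(0, 4, (2, 64, 64))
    macro_y = torch.randint(0, 6, (2, 64, 64))
    loss = nn.functional.cross_entropy(micro_logits, micro_y) \
         + 0.5 * nn.functional.cross_entropy(macro_logits, macro_y)
    loss.backward()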

Filed Under: Posts

UAI 2021 – Bayesian optimization for modular black-box systems with switching costs

September 22, 2021 by Eva Dyer

Henry presented his paper on Bayesian optimization at the Conference on Uncertainty in Artificial Intelligence (UAI)! [Check out the paper here!]

Abstract: Most existing black-box optimization methods assume that all variables in the system being optimized have equal cost and can change freely at each iteration. However, in many real-world systems, inputs are passed through a sequence of different operations or modules, making variables in earlier stages of processing more costly to update. Such structure induces a dynamic cost for switching variables in the early parts of a data processing pipeline. In this work, we propose a new algorithm for switch-cost-aware optimization called Lazy Modular Bayesian Optimization (LaMBO). This method efficiently identifies the global optimum while minimizing cost through a passive change of variables in early modules. The method is theoretically grounded and achieves vanishing switching-cost-regularized regret. We apply LaMBO to multiple synthetic functions and a three-stage image segmentation pipeline used in a neuroimaging task, where we obtain promising improvements over existing cost-aware Bayesian optimization algorithms. Our results demonstrate that LaMBO is an effective strategy for black-box optimization capable of minimizing switching costs.
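
To illustrate the switching-cost idea, here is a toy sketch of cost-aware Bayesian optimization on a two-module pipeline; it is not LaMBO itself, and the pipeline, cost model, and hyperparameters are all assumptions made for the example. A GP surrogate scores random candidates with a lower confidence bound, and a penalty is added whenever a candidate would change the expensive early-module variable, so the search stays "lazy" about costly switches.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(0)

    def pipeline(x):
        """Toy 2-module black box: x[0] feeds an expensive early stage,
        x[1] a cheap late stage. (Stand-in for a real pipeline.)"""
        return (x[0] - 0.3) ** 2 + 0.5 * (x[1] - 0.7) ** 2

    SWITCH_COST = 5.0  # extra cost whenever the early variable x[0] changes

    # Seed the surrogate with a few random evaluations.
    X = rng.uniform(0, 1, size=(5, 2))
    y = np.array([pipeline(x) for x in X])
    total_cost = float(len(y))  # one unit of base cost per evaluation

    gp = GaussianProcessRegressor(normalize_y=True)
    for _ in range(20):
        gp.fit(X, y)
        cand = rng.uniform(0, 1, size=(256, 2))
        mu, sd = gp.predict(cand, return_std=True)
        lcb = mu - 1.0 * sd  # optimistic estimate of each candidate's value
        # Acquisition regularized by the switching cost: moving x[0] away
        # from its current setting is penalized.
        moved = np.abs(cand[:, 0] - X[-1, 0]) > 1e-3
        x_next = cand[np.argmin(lcb + SWITCH_COST * moved)]
        total_cost += 1.0 + SWITCH_COST * (abs(x_next[0] - X[-1, 0]) > 1e-3)
        X = np.vstack([X, x_next])
        y = np.append(y, pipeline(x_next))

    print(f"best value: {y.min():.4f}, total cost incurred: {total_cost:.1f}")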

Filed Under: Posts
