====== Research ======

===== Assignment Flows: Dynamical Systems on Riemannian Manifolds for Data Analysis =====

This project is funded by the DFG within the [[https://www.foundationsofdl.de|Priority Programme on the Theoretical Foundations of Deep Learning]].
  
**Scope.** We study product spaces of elementary Riemannian manifolds for the context-sensitive analysis of data observed in any metric space. State spaces interact dynamically by geometric averaging, locally according to the adjacency structure of an underlying graph. The corresponding interaction parameters are learned from data. Geometric integration of the resulting continuous-time flow generates the layers of a neural network. Our approach makes it possible to study the dynamical relations of inference and learning in neural networks from a geometric viewpoint, along with a probabilistic interpretation of contextual decision making. From the numerical point of view, the approach copes with high dimensions and large problem sizes.
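
To make the scope description concrete, here is a minimal numerical sketch of one interaction step of an assignment-flow-type update. It is our own illustration, not code from the project: the function names, the softmax-based lifting and the simple multiplicative update are simplifying assumptions, and the learned interaction parameters appear here only as fixed weights.

<code python>
import numpy as np

def softmax(X):
    """Row-wise softmax; maps each row to the probability simplex."""
    X = X - X.max(axis=-1, keepdims=True)
    E = np.exp(X)
    return E / E.sum(axis=-1, keepdims=True)

def assignment_flow_step(W, D, neighbors, omega, rho=1.0, h=0.5):
    """One explicit step of a simplified assignment-flow update (illustrative).

    W         : (n, c) assignment matrix, rows in the interior of the simplex
    D         : (n, c) distances of node features to the c label prototypes
    neighbors : list of index arrays encoding the adjacency of the graph
    omega     : list of weight arrays (interaction parameters), one per node
    """
    # likelihood map: lift the scaled negative distances to the simplex at W
    L = softmax(np.log(W) - D / rho)
    # similarity map: weighted geometric averaging of neighboring likelihoods
    S = np.empty_like(W)
    for i, (nbh, w) in enumerate(zip(neighbors, omega)):
        S[i] = softmax(w @ np.log(L[nbh]))
    # multiplicative update with step size h; rows remain in the simplex
    return softmax(np.log(W) + h * np.log(S))
</code>

Iterating such a step drives every row of W towards a vertex of the simplex, i.e. towards a label decision; reading each iteration as one layer yields the network interpretation mentioned above.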
  
**Mathematical aspects.** Information geometry, coupled and regularized information transport, geometric mechanics on manifolds and variational principles, geometric numerical integration, and statistical performance characterization using PAC-Bayesian analysis.
  
**Recent work.** [[https://ipa.math.uni-heidelberg.de/publications|download]]
  * extension to generative assignment flows for discrete joint distributions (arXiv:2402.07846, 2024)
  * geometric embedding approach to multiple games and populations (arXiv:2401.05918, 2024)
  * quantum state assignment flows (Entropy, 2023)
  * geometric mechanics of assignment flows (Information Geometry, 2023)
  * novel PAC-Bayes bound for structured prediction (NeurIPS, 2023)
  * self-certifying classification by linearized deep assignment flows (PAMM, 2023)
  * non-local graph PDE for structured labeling based on assignment flows (SIAM SIIMS, 2023)
  * learning linearized assignment flows (JMIV, 2023)
  * convergence and stability of assignment flows (Information Geometry, 2022)
  * continuous-domain assignment flows (Europ. J. Appl. Math., 2021)
  * order-constrained 3D OCT segmentation using assignment flows (IJCV, 2021)
  * self-assignment flows (SIAM SIIMS, 2020)
  * unsupervised assignment flows (JMIV, 2020)
  * geometric integration of assignment flows (Inverse Problems, 2020)
  * assignment flows for labeling: introduction (Handbook Var. Meth. Nonl. Geom. Data, 2020)
  * assignment flows for metric data labeling (JMIV, 2017)

**Geometric numerical integration.** We conducted a comprehensive study of //geometric integration// techniques, including automatic step size adaptation, for computing the assignment flow numerically in a stable, efficient and parameter-free way.
  * [[https://iopscience.iop.org/article/10.1088/1361-6420/ab2772|Geometric Numerical Integration of the Assignment Flow, Inverse Problems, 2019]]
  * [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Zeilmann2018aa.pdf|preprint: arXiv:1810.06970]]
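
For illustration only, a driver loop in the spirit of a parameter-free scheme can monitor how close the assignments are to integral and stop automatically. It reuses the hypothetical ''assignment_flow_step'' from the sketch above; the entropy-based stopping rule and the tolerance are our own simplifications, not the adaptive step-size control of the paper.

<code python>
import numpy as np

def integrate_assignment_flow(W, D, neighbors, omega,
                              rho=1.0, h=0.5, tol=1e-3, max_iter=500):
    """Iterate the sketched update until the assignments are nearly integral.

    Stops when the average row entropy of W drops below tol, a simple
    stand-in for the convergence criteria studied in the paper.
    """
    for it in range(1, max_iter + 1):
        W = assignment_flow_step(W, D, neighbors, omega, rho=rho, h=h)
        entropy = -(W * np.log(np.clip(W, 1e-12, 1.0))).sum(axis=1).mean()
        if entropy < tol:
            break
    labels = W.argmax(axis=1)   # rounded labeling
    return W, labels, it
</code>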

**Evaluation of discrete graphical models.** We applied our approach to solve the //MAP labeling problem// for a given graphical model in a novel way, by smoothly combining a geometric reformulation of the local polytope relaxation with rounding to an integral solution. Key ingredients are the local //Wasserstein messages// that couple the local assignment measures along edges.

  * [[https://epubs.siam.org/doi/abs/10.1137/17M1150669|Image Labeling Based on Graphical Models Using Wasserstein Messages and Geometric Assignment, SIAM J. Imaging Sciences 11/2 (2018), 1317--1362]]
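
As a hedged illustration of what coupling local assignment measures along an edge can mean computationally, the sketch below computes an entropy-regularized optimal transport plan between the assignment vectors of two adjacent nodes with a plain Sinkhorn iteration. The function name and the unmodified Sinkhorn scheme are our own simplifications, not the exact message construction of the paper.

<code python>
import numpy as np

def local_coupling(wu, wv, theta, eps=0.1, iters=200):
    """Entropy-regularized transport plan between two local assignment vectors.

    wu, wv : (c,) assignment vectors at the two nodes of an edge
    theta  : (c, c) pairwise label costs attached to the edge
    Returns the coupling matrix and the regularized transport cost.
    """
    K = np.exp(-theta / eps)          # Gibbs kernel of the edge costs
    a = np.ones_like(wu)
    b = np.ones_like(wv)
    for _ in range(iters):            # Sinkhorn scaling iterations
        b = wv / (K.T @ a)
        a = wu / (K @ b)
    P = a[:, None] * K * b[None, :]   # coupling with marginals close to (wu, wv)
    cost = float((P * theta).sum())
    return P, cost
</code>

Roughly speaking, derivatives of such smoothed edge costs with respect to the two marginals play the role of the messages mentioned above.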

**Kick-off paper** that introduces the basic approach:

  * [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Astroem2017.pdf|Image Labeling by Assignment, J. Math. Imag. Vision 58/2 (2017), 211--238]]
  * [[http://www-rech.telecom-lille.fr/diff-cv2016/|Proceedings DIFF-CVML'16; Grenander best paper award]]
  * [[https://ipa.iwr.uni-heidelberg.de/dokuwiki/Papers/Astroem2016d.pdf|Proceedings ECCV'16]]
===== Estimating Vehicle Ego-Motion and Piecewise Planar Scene Structure from Optical Flow in a Continuous Framework =====

We propose a variational approach for estimating the ego-motion and the structure of a static scene from a pair of images recorded by a single moving camera. In our approach, the scene structure is described by a set of 3D planar surfaces, which are linked to a SLIC superpixel decomposition of the image domain. The continuously parametrized planes are determined, along with the extrinsic camera parameters, by jointly minimizing a non-convex smooth objective function that comprises a data term based on the precomputed optical flow between the input images and suitable priors on the scene variables.
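
A hedged sketch of the kind of data term described above: for one superpixel assigned to a 3D plane, the camera motion and the plane parameters induce a homography, and the predicted displacement is compared with the precomputed optical flow. The names, the calibrated pinhole parametrization and the plain squared penalty are assumptions made for this illustration, not the exact objective of the project.

<code python>
import numpy as np

def plane_flow_residual(pts, flow, R, t, n, d, K):
    """Squared residual between observed flow and plane-induced flow.

    pts  : (m, 2) pixel coordinates inside one superpixel
    flow : (m, 2) precomputed optical flow at those pixels
    R, t : camera rotation (3, 3) and translation (3,) between the two frames
    n, d : unit normal (3,) and offset of the plane n^T X = d (first frame)
    K    : (3, 3) camera intrinsics
    """
    # homography induced by the plane under the motion (R, t)
    H = K @ (R + np.outer(t, n) / d) @ np.linalg.inv(K)
    # warp the pixels homogeneously and dehomogenize
    ph = np.hstack([pts, np.ones((pts.shape[0], 1))])   # (m, 3)
    q = ph @ H.T
    pred = q[:, :2] / q[:, 2:3] - pts                   # predicted displacement
    return float(((pred - flow) ** 2).sum())
</code>

In the full model, such residuals over all superpixels are combined with the priors on the scene variables and minimized jointly over the plane and camera parameters.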

**Researchers**: Andreas Neufeld, Johannes Berger, Florian Becker, Frank Lenzen, Christoph Schnörr

[[research:hflow:start|Details]]

===== Minimum Energy Filtering on Lie Groups with Application to Structure and Motion Estimation from Monocular Videos =====

We investigate Minimum Energy Filters on Lie Groups in order to reliably estimate camera motion relative to a static scene from noisy data. In addition to properly taking into account the geometry of the state space, we also deal with nonlinearities of the observation equation. A long-term objective concerns the estimation of accelerated camera motion in connection with scene depth from monocular videos.
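
As a rough illustration of the geometric ingredient referred to above (not of the minimum energy filter itself, which additionally propagates a Riccati-type gain), the following sketch shows how a rotation estimate is updated multiplicatively via the exponential map so that it stays exactly on the group; the function names are ours.

<code python>
import numpy as np
from scipy.linalg import expm

def hat(w):
    """Map a 3-vector to the corresponding skew-symmetric matrix in so(3)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def update_rotation(R, delta):
    """Multiplicative update of a rotation estimate along the manifold.

    R     : (3, 3) current rotation estimate in SO(3)
    delta : (3,) correction in the tangent space (body frame)
    The exponential map keeps the estimate on SO(3), in contrast to an
    additive update followed by re-orthonormalization.
    """
    return R @ expm(hat(delta))
</code>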

**Researchers**: Johannes Berger, Andreas Neufeld, Florian Becker, Frank Lenzen, Christoph Schnörr

Details: [[https://ipa.iwr.uni-heidelberg.de/dokuwiki/Papers/Berger2015a.pdf|Paper]]

===== Partial Optimality in MAP-MRF =====

We consider the energy minimization problem for undirected graphical models, also known as the MAP-inference problem for Markov random fields, which is NP-hard in general. We propose a novel polynomial-time algorithm to obtain a part of its optimal non-relaxed integral solution. For this task, we devise a novel pruning strategy that utilizes standard MAP solvers as a subroutine. We show that our pruning strategy is, in a certain sense, theoretically optimal. Empirically, our method also outperforms previous approaches in terms of the number of persistently labeled variables. The method is very general, as it is applicable to models with arbitrary factors of arbitrary order and can employ any solver for the considered relaxed problem. Our method's runtime is determined by the runtime of the convex relaxation solver for the MAP-inference problem.
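
A heavily hedged sketch of the overall shape of such a pruning scheme: it repeatedly calls a relaxed MAP solver on the currently active subproblem and discards variables that fail a persistency test, keeping the rest as a partial labeling. Both ''solve_relaxed_map'' and ''persistency_test'' are placeholders to be supplied by the caller; the actual pruning criterion of the paper is not reproduced here.

<code python>
def partial_optimal_labeling(model, variables, solve_relaxed_map, persistency_test):
    """Generic pruning loop around a relaxed MAP solver (illustrative only).

    model             : the graphical model
    variables         : iterable of variable identifiers of the model
    solve_relaxed_map : callable returning a labeling for the model restricted
                        to a set of active variables
    persistency_test  : callable deciding whether a variable's label can be
                        kept as provably optimal in the restricted solution
    """
    active = set(variables)
    while active:
        labeling = solve_relaxed_map(model, active)
        # drop every variable whose label cannot be certified
        rejected = {v for v in active
                    if not persistency_test(model, active, labeling, v)}
        if not rejected:
            break                     # fixed point: all remaining labels persist
        active -= rejected
    # labels of the surviving variables form the partial optimal solution
    return {v: labeling[v] for v in active}
</code>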

**Researchers**: Paul Swoboda, Bogdan Savchynskyy, Alexander Shekhovtsov, Jörg Hendrik Kappes, Christoph Schnörr\\
Details: [[https://ipa.iwr.uni-heidelberg.de/dokuwiki/Papers/Swoboda2016.pdf|Paper]]