Probabilistic watershed: sampling all spanning forests for seeded segmentation and semi-supervised learning
| Main Authors: | Enrique Fita Sanmartin, Sebastian Damrich, Fred A. Hamprecht |
|---|---|
| Format: | Article (Journal), Chapter/Article |
| Language: | English |
| Published: | 6 Nov 2019 |
| In: | arXiv, Year: 2019, Pages: 1-19 |
| DOI: | 10.48550/arXiv.1911.02921 |
| Online Access: | Publisher, licensed, full text: https://doi.org/10.48550/arXiv.1911.02921; Publisher, licensed, full text: http://arxiv.org/abs/1911.02921 |
| Author Notes: | Enrique Fita Sanmartin, Sebastian Damrich, Fred A. Hamprecht |
| Summary: | The seeded Watershed algorithm / minimax semi-supervised learning on a graph computes a minimum spanning forest which connects every pixel / unlabeled node to a seed / labeled node. We propose instead to consider all possible spanning forests and calculate, for every node, the probability of sampling a forest connecting a certain seed with that node. We dub this approach "Probabilistic Watershed". Leo Grady (2006) already noted its equivalence to the Random Walker / Harmonic energy minimization. We here give a simpler proof of this equivalence and establish the computational feasibility of the Probabilistic Watershed with Kirchhoff's matrix tree theorem. Furthermore, we show a new connection between the Random Walker probabilities and the triangle inequality of the effective resistance. Finally, we derive a new and intuitive interpretation of the Power Watershed. |
| Item Description: | Viewed on 13.07.2022 |
| Physical Description: | Online Resource |
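The summary above states that the Probabilistic Watershed probabilities coincide with the Random Walker / harmonic energy minimization of Grady (2006), and that Kirchhoff's matrix tree theorem makes them tractable to compute. The sketch below illustrates only the Random Walker side of that equivalence on a toy graph; the 4-node adjacency matrix, its edge weights, and the seed placement are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical weighted adjacency matrix of a small 4-node example graph.
W = np.array([
    [0.0, 2.0, 1.0, 0.0],
    [2.0, 0.0, 1.0, 3.0],
    [1.0, 1.0, 0.0, 1.0],
    [0.0, 3.0, 1.0, 0.0],
])

L = np.diag(W.sum(axis=1)) - W          # combinatorial graph Laplacian

seeds = [0, 3]                          # node 0 carries label 1, node 3 carries label 0
unlabeled = [1, 2]

# Partition the Laplacian into the unlabeled block and the unlabeled-to-seed block.
L_U = L[np.ix_(unlabeled, unlabeled)]
B = L[np.ix_(unlabeled, seeds)]

m = np.array([1.0, 0.0])                # indicator of the seed carrying label 1

# Harmonic potentials: for each unlabeled node, the probability of being
# assigned to the label-1 seed, obtained from one Laplacian linear system.
x = np.linalg.solve(L_U, -B @ m)
print(dict(zip(unlabeled, x)))
```

Solving the single linear system `L_U x = -B m` gives each unlabeled node its harmonic potential with respect to the label-1 seed; by the equivalence discussed in the summary, this equals the probability of sampling a spanning forest that connects the node to that seed.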