Deep learning for ultra-large-scale semantic segmentation of geographic 3D point clouds with missing labels
| Main authors: | Alberto M. Esmorís, Miguel Yermo, Silvia R. Alcaraz, Samuel Soutullo, Francisco F. Rivera |
|---|---|
| Document type: | Article (Journal) |
| Language: | English |
| Published: | 2026 |
| In: | IEEE Access, 2026, Volume 14, Pages 485-501 |
| ISSN: | 2169-3536 |
| DOI: | 10.1109/ACCESS.2025.3647154 |
| Online access: | Publisher, open access, full text: https://doi.org/10.1109/ACCESS.2025.3647154 ; Publisher, open access, full text: https://ieeexplore.ieee.org/document/11311458 |
| Abstract: | Semantic segmentation of 3D point clouds is a critical task for research and industry across a wide variety of domains. Most works to date use datasets where the notion of "large-scale" ranges from 1.17 to 9,261 million points and from 0.31 to 250 square kilometers. In this research, we push the limits of 3D deep learning by considering approximately 36,369 million points and 29,557 square kilometers in our case study. To the best of our knowledge, this is the largest geographic region ever classified with neural networks for 3D point clouds in the scientific literature. The main contributions of our research are: 1) the first published results on semantic segmentation of ultra-large-scale, low-resolution airborne laser scanning point clouds; 2) neural networks adapted to work under missing labels and high measurement error; 3) the introduction of class ambiguity as an uncertainty measure more robust than entropy; and 4) an open-source deep learning framework for 3D point clouds that enables the reproduction of our experiments and further studies. |
| Description: | Viewed on 13 January 2026; published online: 22 December 2025 |
| Description: | Online resource |
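The abstract contrasts class ambiguity with entropy as a measure of prediction uncertainty. The paper's exact definition of class ambiguity is not given in this record; the sketch below uses an illustrative assumption (one minus the margin between the two most probable classes) next to the standard Shannon entropy of a softmax output, purely to show how such a margin-based measure differs from entropy:

```python
import numpy as np

def entropy(p):
    # Shannon entropy of a probability vector (higher = more uncertain).
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def class_ambiguity(p):
    # Hypothetical ambiguity score: one minus the margin between the two
    # most probable classes (higher = more ambiguous). This definition is
    # an illustrative assumption, not the formulation from the paper.
    top2 = np.sort(np.asarray(p, dtype=float))[-2:]
    return float(1.0 - (top2[1] - top2[0]))

# A confident prediction vs. one torn between two classes.
confident = [0.90, 0.05, 0.05]
ambiguous = [0.45, 0.45, 0.10]
```

A margin-based measure like this reacts only to the competition between the top two classes, whereas entropy also rises when probability mass is spread over many low-probability classes, which is one plausible sense in which an ambiguity measure could be more robust.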