Learning Instance Occlusion for Panoptic Segmentation

Justin Lazarow, Kwonjoon Lee, Kunyu Shi, Zhuowen Tu

Panoptic segmentation requires segmenting both "things" (countable object instances) and "stuff" (uncountable, amorphous regions) within a single output. A common approach fuses instance segmentation (for "things") and semantic segmentation (for "stuff") into a single non-overlapping placement of segments, resolving occlusions (or overlaps) in the process. However, ordering instances by detection confidence does not correlate well with natural occlusion relationships. To resolve this issue, we propose a branch that models how two instance masks should overlap one another as a binary relation. Our method, named OCFusion, is lightweight but particularly effective on the "things" portion of the standard panoptic segmentation benchmarks, bringing significant gains (up to +3.2 PQ^Th and +2.0 overall PQ) on the COCO dataset while requiring only a short fine-tuning stage. OCFusion is trained with ground-truth occlusion relations derived automatically from existing dataset annotations. We obtain state-of-the-art results on COCO and show competitive results on the Cityscapes panoptic segmentation benchmark.
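
The sketch below illustrates the general idea of occlusion-aware fusion for the "things" portion only; it is not the authors' implementation. The function names `fuse_instances` and `occlusion_head`, the threshold `overlap_thresh`, and the assumption that masks arrive sorted by detection score are all illustrative choices: `occlusion_head` stands in for the learned binary-relation branch described in the abstract.

```python
# Minimal sketch of occlusion-aware instance fusion (hypothetical names, not the
# authors' code). `occlusion_head(mask_a, mask_b)` is assumed to return the
# probability that mask_a should be placed on top of mask_b.
import numpy as np

def fuse_instances(masks, occlusion_head, overlap_thresh=0.2):
    """Fuse binary instance masks into a single non-overlapping canvas.

    masks: list of (H, W) boolean arrays, sorted by descending detection score.
    occlusion_head: callable(mask_a, mask_b) -> float in [0, 1].
    """
    H, W = masks[0].shape
    canvas = np.full((H, W), -1, dtype=np.int32)  # -1 marks unassigned pixels

    for i, mask in enumerate(masks):
        free = mask & (canvas == -1)          # pixels not yet claimed
        occupied = mask & (canvas != -1)      # pixels claimed by earlier instances

        # If the new mask heavily overlaps already-placed instances, consult the
        # occlusion head instead of trusting detection-score order alone.
        claim = free.copy()
        if occupied.sum() > overlap_thresh * mask.sum():
            for j in np.unique(canvas[occupied]):
                contested = mask & (canvas == j)
                if occlusion_head(mask, masks[j]) > 0.5:
                    claim |= contested        # current instance wins the overlap
        canvas[claim] = i

    return canvas
```

In this sketch, the binary relation only intervenes when two masks overlap substantially, so the common heuristic of confidence-ordered placement is kept for the unambiguous cases.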
