COCO-format noisy-label benchmarks for instance segmentation (VIPER-N / COCO-N) #46

@mkimhi

Description

Hi! I’m sharing a small Hugging Face collection of datasets/benchmarks for instance segmentation under noisy labels, all in COCO format.

It contains:

  • VIPER (clean): images + clean COCO instances_*.json
  • VIPER-N: noisy COCO instance annotations (annotations-only)
  • COCO-N: noisy COCO instance annotations (annotations-only)

All annotations are standard COCO instances JSON, so you can evaluate under label noise simply by swapping in the corresponding instances_{train,val}2017.json path.
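Since the noisy sets are annotations-only, a quick sanity check before swapping is that the noisy JSON covers the same image ids and category ids as the clean one. A minimal sketch of such a check (the `check_coco_swap` helper and the file paths are hypothetical, not part of the collection):

```python
import json


def check_coco_swap(clean: dict, noisy: dict) -> bool:
    """Sanity-check that an annotations-only noisy COCO JSON can stand in
    for the clean one: same top-level keys, image ids, and category ids."""
    for key in ("images", "annotations", "categories"):
        if key not in clean or key not in noisy:
            return False
    same_images = {im["id"] for im in clean["images"]} == {im["id"] for im in noisy["images"]}
    same_cats = {c["id"] for c in clean["categories"]} == {c["id"] for c in noisy["categories"]}
    return same_images and same_cats


if __name__ == "__main__":
    # Hypothetical paths: the clean COCO file and its noisy replacement.
    with open("annotations/instances_val2017.json") as f:
        clean = json.load(f)
    with open("coco-n/instances_val2017.json") as f:
        noisy = json.load(f)
    assert check_coco_swap(clean, noisy), "noisy JSON does not match clean image/category ids"
```

After that, any COCO-compatible evaluator (e.g. pycocotools' `COCOeval` with `iouType="segm"`) can be pointed at the noisy JSON path unchanged.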

If this is relevant to your project, I’d love any feedback—and if you maintain a benchmark list / eval suite, feel free to include the collection.

Thanks!
