Cross-dataset Learning for Generalizable Land Use Scene Classification

Dimitri Gominski, Valerie Gouet-Brunet, Liming Chen

    Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

    4 Citations (Scopus)
    10 Downloads (Pure)

    Abstract

    Few-shot and cross-domain land use scene classification methods propose solutions to classify unseen classes or unseen visual distributions, but are hardly applicable to real-world situations due to restrictive assumptions. Few-shot methods involve episodic training on restricted training subsets with small feature extractors, while cross-domain methods are only applied to common classes. The underlying challenge remains open: can we accurately classify new scenes on new datasets? In this paper, we propose a new framework for few-shot, cross-domain classification. Our retrieval-inspired approach exploits the interrelations in both the training and testing data to output class labels using compact descriptors. Results show that our method can accurately produce land-use predictions on unseen datasets and unseen classes, going beyond the traditional few-shot or cross-domain formulation, and allowing cross-dataset training.
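    The abstract only sketches the approach at a high level. As a rough, purely illustrative example of the general retrieval-based classification idea it alludes to (not the authors' actual method), the snippet below classifies a query scene by majority vote over its nearest neighbours in a gallery of compact, L2-normalised descriptors. All names, dimensions, and parameters here are hypothetical stand-ins.

    ```python
    import numpy as np

    def l2_normalize(x, axis=-1, eps=1e-12):
        # Normalise descriptors so that dot products equal cosine similarity.
        return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

    def retrieve_and_vote(query_desc, gallery_desc, gallery_labels, k=5):
        """Classify one query descriptor by majority vote over its k nearest
        gallery descriptors (cosine similarity). Illustrative only."""
        sims = gallery_desc @ query_desc          # (N,) similarities to the query
        topk = np.argsort(-sims)[:k]              # indices of the k best matches
        votes = gallery_labels[topk]
        values, counts = np.unique(votes, return_counts=True)
        return values[np.argmax(counts)]          # most frequent label wins

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Hypothetical stand-ins: 128-D compact descriptors for a labelled
        # gallery (e.g. training datasets) and one unlabelled query scene.
        gallery = l2_normalize(rng.normal(size=(1000, 128)))
        labels = rng.integers(0, 10, size=1000)   # 10 land-use classes
        query = l2_normalize(rng.normal(size=128))
        print("predicted class:", retrieve_and_vote(query, gallery, labels, k=5))
    ```

    Because prediction reduces to nearest-neighbour retrieval over descriptors rather than a fixed classification head, new datasets and new classes can be handled by simply adding labelled descriptors to the gallery, which is the appeal of retrieval-inspired formulations in this setting.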

    Original language: English
    Title of host publication: Proceedings - 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2022
    Number of pages: 10
    Publisher: IEEE Computer Society Press
    Publication date: 2022
    Pages: 1381-1390
    ISBN (Electronic): 9781665487399
    DOIs
    Publication status: Published - 2022
    Event: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2022 - New Orleans, United States
    Duration: 19 Jun 2022 – 20 Jun 2022

    Conference

    Conference: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2022
    Country/Territory: United States
    City: New Orleans
    Period: 19/06/2022 – 20/06/2022
    Series: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
    Volume: 2022-June
    ISSN: 2160-7508

    Bibliographical note

    Publisher Copyright:
    © 2022 IEEE.