Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11851/7469
Full metadata record
DC Field | Value | Language
dc.contributor.author | Demirci, Muhammed Fatih | -
dc.contributor.author | Shokoufandeh, Ali | -
dc.contributor.author | Dickinson, Sven J. | -
dc.date.accessioned | 2021-09-11T15:57:13Z | -
dc.date.available | 2021-09-11T15:57:13Z | -
dc.date.issued | 2009 | en_US
dc.identifier.issn | 0162-8828 | -
dc.identifier.issn | 1939-3539 | -
dc.identifier.uri | https://doi.org/10.1109/TPAMI.2008.267 | -
dc.identifier.uri | https://hdl.handle.net/20.500.11851/7469 | -
dc.description.abstract | Learning a class prototype from a set of exemplars is an important challenge facing researchers in object categorization. Although the problem is receiving growing interest, most approaches assume a one-to-one correspondence among local features, restricting their ability to learn true abstractions of a shape. In this paper, we present a new technique for learning an abstract shape prototype from a set of exemplars whose features are in many-to-many correspondence. Focusing on the domain of 2D shape, we represent a silhouette as a medial axis graph whose nodes correspond to "parts" defined by medial branches and whose edges connect adjacent parts. Given a pair of medial axis graphs, we establish a many-to-many correspondence between their nodes to find correspondences among articulating parts. Based on these correspondences, we recover the abstracted medial axis graph along with the positional and radial attributes associated with its nodes. We evaluate the abstracted prototypes in the context of a recognition task. | en_US
dc.description.sponsorship | Turkiye Bilimsel ve Teknolojik Arastirma Kurumu (TUBITAK) [107E208]; US Office of Naval Research; US National Science Foundation (NSF); Natural Sciences and Engineering Research Council of Canada (NSERC); PREA; NSF; CITO | en_US
dc.description.sponsorship | F. Demirci gratefully acknowledges the support of TUBITAK grant no. 107E208, A. Shokoufandeh the support of the US Office of Naval Research and US National Science Foundation (NSF), and S. Dickinson the support of NSERC, PREA, NSF, and CITO. | en_US
dc.language.iso | en | en_US
dc.publisher | IEEE Computer Soc | en_US
dc.relation.ispartof | IEEE Transactions on Pattern Analysis and Machine Intelligence | en_US
dc.rights | info:eu-repo/semantics/openAccess | en_US
dc.subject | Shape abstraction | en_US
dc.subject | medial axis graphs | en_US
dc.subject | prototype learning | en_US
dc.subject | many-to-many graph matching | en_US
dc.title | Skeletal Shape Abstraction from Examples | en_US
dc.type | Article | en_US
dc.department | Faculties, Faculty of Engineering, Department of Computer Engineering | en_US
dc.department | Fakülteler, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü | tr_TR
dc.identifier.volume | 31 | en_US
dc.identifier.issue | 5 | en_US
dc.identifier.startpage | 944 | en_US
dc.identifier.endpage | 952 | en_US
dc.identifier.wos | WOS:000264144500014 | en_US
dc.identifier.scopus | 2-s2.0-64849114188 | en_US
dc.institutionauthor | Demirci, Muhammed Fatih | -
dc.identifier.pmid | 19299866 | en_US
dc.identifier.doi | 10.1109/TPAMI.2008.267 | -
dc.relation.publicationcategory | Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı (Article - International Refereed Journal - Institutional Faculty Member) | en_US
dc.identifier.scopusquality | Q1 | -
item.fulltext | No Fulltext | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
item.openairetype | Article | -
item.grantfulltext | none | -
crisitem.author.dept | 02.3. Department of Computer Engineering | -
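
Note: The abstract above describes silhouettes represented as medial axis graphs whose nodes are "parts" carrying positional and radial attributes and whose edges connect adjacent parts. The following is a minimal illustrative sketch of such a data structure, not code from the paper; all class, field, and variable names here are hypothetical.

```python
# Illustrative sketch of a medial axis graph as described in the abstract:
# nodes are parts with positional and radial attributes, edges link adjacent
# parts. Names and layout are assumptions, not the authors' implementation.
from dataclasses import dataclass, field
from typing import List, Set, Tuple


@dataclass
class MedialNode:
    """A 'part' defined by a medial branch: a 2D position and a radius."""
    x: float
    y: float
    radius: float


@dataclass
class MedialAxisGraph:
    """Undirected graph whose nodes are parts and whose edges join adjacent parts."""
    nodes: List[MedialNode] = field(default_factory=list)
    edges: Set[Tuple[int, int]] = field(default_factory=set)

    def add_node(self, node: MedialNode) -> int:
        # Return the index of the newly added part.
        self.nodes.append(node)
        return len(self.nodes) - 1

    def add_edge(self, i: int, j: int) -> None:
        # Store each adjacency once, smaller index first.
        self.edges.add((min(i, j), max(i, j)))


# Example: a three-part silhouette (a torso with two attached limbs).
g = MedialAxisGraph()
torso = g.add_node(MedialNode(0.0, 0.0, 2.0))
arm = g.add_node(MedialNode(3.0, 1.0, 0.5))
leg = g.add_node(MedialNode(1.0, -3.0, 0.8))
g.add_edge(torso, arm)
g.add_edge(torso, leg)
```

In the paper's approach, pairs of such graphs would be placed in many-to-many node correspondence before the abstracted prototype graph is recovered; that matching step is not sketched here.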
Appears in Collections:Bilgisayar Mühendisliği Bölümü / Department of Computer Engineering
PubMed İndeksli Yayınlar Koleksiyonu / PubMed Indexed Publications Collection
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Scopus citations: 46 (checked on Apr 20, 2024)
Web of Science citations: 31 (checked on Apr 20, 2024)
Page views: 56 (checked on Apr 22, 2024)

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.