
Author:

Fan, Haoqi | Zhang, Yuanshi | Zuo, Guoyu

Indexed by:

EI Scopus

Abstract:

In this paper, we propose a segmentation approach that not only segments an object of interest but also labels the object's different semantic parts. A discriminative model describes an object in real-world images as multiple, disparate, and correlated parts. We propose a multi-stage segmentation approach to infer the segments of an object, and train it under the latent structural SVM learning framework. We show that our method yields an average improvement of about 5% on the ETHZ Shape Classes dataset and 4% on the INRIA Horses dataset. Finally, extensive experiments with intricate occlusion on the INRIA Horses dataset show that the approach achieves state-of-the-art performance under occlusion and deformation. Copyright © 2014 SCITEPRESS - Science and Technology Publications. All rights reserved.
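The training scheme the abstract names — latent structural SVM learning, where the part assignment is a latent variable inferred during training — can be sketched with a toy alternating loop. This is a hypothetical minimal setup, not the paper's actual features or inference: each example carries a gold label and a pool of (label, feature-vector) candidates spanning label × latent-part choices, latent imputation is a max over the gold label's candidates, and the update is a simple perceptron-style step rather than the margin-rescaled cutting-plane solver a full latent structural SVM would use.

```python
# Toy sketch of alternating latent-structural-SVM-style training.
# Assumed setup (not from the paper): candidates enumerate every
# (label, latent part configuration) pair as a precomputed feature vector.

def dot(w, f):
    return sum(wi * fi for wi, fi in zip(w, f))

def infer(w, candidates):
    """Pick the highest-scoring (label, feature) pair among candidates."""
    return max(candidates, key=lambda c: dot(w, c[1]))

def train(examples, dim, epochs=10, lr=0.1):
    """examples: list of (gold_label, candidates), where candidates is a
    list of (label, feature_vector) pairs over label x latent choices."""
    w = [0.0] * dim
    for _ in range(epochs):
        for gold, candidates in examples:
            # Step 1: impute the latent variable for the gold label
            gold_cands = [c for c in candidates if c[0] == gold]
            _, f_gold = infer(w, gold_cands)
            # Step 2: current best prediction over all labels and latents
            pred, f_pred = infer(w, candidates)
            # Step 3: perceptron-style step toward gold, away from prediction
            if pred != gold:
                w = [wi + lr * (g - p)
                     for wi, g, p in zip(w, f_gold, f_pred)]
    return w

# Usage on two hand-made examples (hypothetical feature values):
ex1 = ("horse", [("horse", [1.0, 0.0]), ("horse", [0.7, 0.2]),
                 ("bg", [0.0, 0.5])])
ex2 = ("bg", [("horse", [0.1, 0.8]), ("bg", [0.0, 1.0])])
w = train([ex1, ex2], dim=2)
# After training, infer recovers both gold labels on this toy data.
```

The alternation mirrors the standard latent structural SVM recipe: fix the weights to impute latent variables, then update the weights against the imputed "complete" examples, and repeat.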

Keyword:

Support vector machines; Computer vision; Semantics; Image segmentation

Author Community:

  • [ 1 ] [Fan, Haoqi]Department of Computer Science, Beijing University of Technology, Beijing, China
  • [ 2 ] [Zhang, Yuanshi]Department of Statistics, Columbia University, NY, United States
  • [ 3 ] [Zuo, Guoyu]School of Electronics Information and Control Engineering, Beijing University of Technology, Beijing, China

Reprint Author's Address:

Email:


Source:

Year: 2014

Volume: 2

Page: 486-493

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:
