Low-rank robust online distance/similarity learning based on the rescaled hinge loss
Hakim Sabzevari University, Sabzevar, Iran.
2022 (English). In: Applied intelligence (Boston), ISSN 0924-669X, E-ISSN 1573-7497, Vol. 53, no. 1, p. 634-657. Article in journal (Refereed). Published.
Abstract [en]

An important challenge in metric learning is scalability to both the size and the dimension of the input data. Online metric learning algorithms have been proposed to address this challenge. Existing methods are commonly based on the Passive/Aggressive (PA) approach and can therefore rapidly process large volumes of data with an adaptive learning rate. However, these algorithms rely on the hinge loss and so are not robust against outliers and label noise. We address this challenge by formulating the online distance/similarity learning problem with the robust rescaled hinge loss function. The proposed model is quite general and can be applied to any PA-based online distance/similarity algorithm. To achieve scalability with respect to the data dimension, we propose low-rank online distance/similarity methods that learn a rectangular projection matrix instead of a full Mahalanobis matrix. The low-rank approaches not only reduce the computational cost but also preserve the discriminative power of the learned metrics. In addition, current online methods usually assume that training triplets or pairwise constraints are available in advance. This assumption often does not hold in practice, and generating triplets with existing batch sampling methods is both time- and space-consuming. We address this issue by developing an efficient yet effective robust one-pass triplet construction algorithm. We conduct several experiments on datasets from various applications. The results confirm that, in the presence of label noise and outliers, the proposed methods outperform state-of-the-art online metric learning methods by a large margin.
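For readers unfamiliar with the two building blocks named in the abstract, the sketch below illustrates them in isolation: a bounded rescaled hinge loss applied to a triplet margin, and a Mahalanobis distance parameterized by a rectangular projection matrix L (so that M = L^T L has rank at most r). This is a minimal illustrative sketch, not the authors' implementation or algorithm; the particular rescaled-hinge formulation (scale parameter eta, normalizer beta) and all names in the code are assumptions made for illustration.

```python
# Illustrative sketch only (not the paper's code): a bounded "rescaled" hinge
# loss and a low-rank Mahalanobis distance via a rectangular projection L.
import numpy as np

def hinge(margin):
    """Standard hinge loss max(0, 1 - margin)."""
    return np.maximum(0.0, 1.0 - margin)

def rescaled_hinge(margin, eta=0.5):
    """A commonly used rescaled hinge loss: bounded above, so a single
    outlier or mislabeled triplet cannot dominate the objective.
    beta normalizes the loss so that rescaled_hinge(0) == 1 (assumed form)."""
    beta = 1.0 / (1.0 - np.exp(-eta))
    return beta * (1.0 - np.exp(-eta * hinge(margin)))

def lowrank_mahalanobis(x, y, L):
    """Squared distance ||L(x - y)||^2, i.e. Mahalanobis with M = L^T L,
    where L is a rectangular (r x d) projection with r << d."""
    diff = L @ (x - y)
    return float(diff @ diff)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, r = 100, 10                        # input dimension and target rank
    L = rng.normal(size=(r, d)) / np.sqrt(d)
    x, y, z = rng.normal(size=(3, d))
    # For a triplet where (x, y) is the similar pair and (x, z) the
    # dissimilar pair, the margin is d(x, z) - d(x, y).
    margin = lowrank_mahalanobis(x, z, L) - lowrank_mahalanobis(x, y, L)
    print("hinge:", hinge(margin), "rescaled hinge:", rescaled_hinge(margin))
```

Because the rescaled hinge saturates for large violations, a noisy triplet contributes at most a bounded loss, whereas the plain hinge grows linearly with the violation; this is the property the abstract appeals to for robustness against outliers and label noise.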

Place, publisher, year, edition, pages
Springer Nature, 2022. Vol. 53, no. 1, p. 634-657
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:miun:diva-51063
DOI: 10.1007/s10489-022-03419-1
Scopus ID: 2-s2.0-85128473307
OAI: oai:DiVA.org:miun-51063
DiVA id: diva2:1849197
Available from: 2024-04-05. Created: 2024-04-05. Last updated: 2024-04-11. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Seyed Jalaleddin, Mousavirad

Search in DiVA

By author/editor
Zabihzadeh, Davood; Seyed Jalaleddin, Mousavirad
In the same journal
Applied intelligence (Boston)
