Relaxed group low rank regression model for multi-class classification

Multimedia Tools and Applications

Abstract

Least squares regression is an effective method for multi-class classification; in practical applications, however, many models based on least squares regression are significantly affected by noise and outliers. Effectively reducing the adverse effects of noise therefore leads to better classification performance. In addition, preserving the intrinsic characteristics of the samples as far as possible improves the discriminative ability of the model. Based on this analysis, we propose a relaxed group low-rank regression model for multi-class classification. The model captures the hidden structural information of the samples by exploiting a group low-rank constraint; together with a graph embedding constraint, this makes the proposed method more tolerant to noise and outliers. The L2,1-norm on the feature matrix and the graph embedding constraint complement each other in capturing the intrinsic characteristics of the samples, and a sparsity error term with the L2,1-norm relaxes the strict binary target label matrix. Together, these components map the original samples into a more compact and discriminative feature space. Finally, we compare the proposed model with a range of popular algorithms on several benchmark datasets; the experimental results demonstrate that it outperforms state-of-the-art methods.
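
The components named above (a relaxed label target, a group low-rank constraint, an L2,1-norm on the projection, a graph embedding term, and an L2,1 error term) are not given in closed form on this page. As a rough orientation only, and under assumed notation that may differ from the paper's, objectives in this family are typically written along the lines of

$$
\min_{W,\,E}\ \lVert XW - (Y + E)\rVert_F^2
+ \lambda_1 \sum_{g} \lVert W_g \rVert_*
+ \lambda_2 \lVert W \rVert_{2,1}
+ \lambda_3 \operatorname{tr}\!\left(W^{\top} X^{\top} L X W\right)
+ \lambda_4 \lVert E \rVert_{2,1},
$$

where X holds the training samples, Y is the binary label matrix, W is the projection with per-group blocks W_g penalized by the nuclear norm, L is a graph Laplacian encoding the embedding constraint, E is the row-sparse relaxation of the labels, and λ1..λ4 are trade-off weights. This is a sketch of the model family, not the paper's exact formulation.

Iterative solvers for such objectives (ALM/ADMM-style updates) reduce to two standard proximal steps: singular value thresholding for the nuclear-norm terms and row-wise shrinkage for the L2,1 terms. The NumPy sketch below implements only these two generic building blocks; it is illustrative, not the authors' algorithm.

```python
import numpy as np

def l21_norm(M):
    """L2,1 norm: the sum of the Euclidean norms of the rows of M."""
    return np.sum(np.linalg.norm(M, axis=1))

def prox_l21(M, tau):
    """Proximal operator of tau * ||.||_{2,1}: shrink each row of M
    toward zero by tau in Euclidean norm, zeroing weak rows entirely."""
    out = np.zeros_like(M)
    row_norms = np.linalg.norm(M, axis=1)
    keep = row_norms > tau
    out[keep] = ((row_norms[keep] - tau) / row_norms[keep])[:, None] * M[keep]
    return out

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * ||.||_*.
    Soft-thresholds the singular values of M, lowering its effective rank."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

# Toy usage on random data (shapes are arbitrary assumptions).
rng = np.random.default_rng(0)
W = rng.standard_normal((50, 10))
W_low_rank = svt(W, tau=2.0)         # low-rank update step
W_row_sparse = prox_l21(W, tau=1.5)  # L2,1 (row-sparsity) update step
```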




Notes

  1. http://www2.ece.ohio-state.edu/~aleix/ARdatabase.html

  2. http://www.anefian.com/research/face_reco.htm

  3. http://fei.edu.br/cet/facedatabase.html

  4. http://www.flintbox.com/public/project/4742/

  5. http://vis-www.cs.umass.edu/lfw/


Acknowledgments

This work was supported by the Graduate Innovation Foundation of Jiangsu Province under Grant No. KYLX16_0781, the Natural Science Foundation of Jiangsu Province under Grant No. BK20181340, the 111 Project under Grant No. B12018, and the PAPD of Jiangsu Higher Education Institutions.

Author information


Corresponding author

Correspondence to Hongwei Ge.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Wang, S., Ge, H., Yang, J. et al. Relaxed group low rank regression model for multi-class classification. Multimed Tools Appl 80, 9459–9477 (2021). https://doi.org/10.1007/s11042-020-10080-8


