
Query:

Scholar name: Zhang Haibin

A TIGHT BOUND OF MODIFIED ITERATIVE HARD THRESHOLDING ALGORITHM FOR COMPRESSED SENSING SCIE
Journal article | 2023, 68 (5), 623-642 | APPLICATIONS OF MATHEMATICS

Abstract :

We provide a theoretical study of the iterative hard thresholding with partially known support set (IHT-PKS) algorithm when used to solve the compressed sensing recovery problem. Recent work has shown that IHT-PKS performs better than traditional IHT in reconstructing sparse or compressible signals, but less work has been done on analyzing its performance guarantees. In this paper, we improve the current RIP-based bound of the IHT-PKS algorithm from δ_{3s-2k} < 1/√32 ≈ 0.1768 to δ_{3s-2k} < (√5-1)/4 ≈ 0.309, where δ_{3s-2k} is the restricted isometry constant of the measurement matrix. We also present the conditions for stable reconstruction using the IHT_μ-PKS algorithm, a general form of IHT-PKS. We further apply the algorithm to Least Squares Support Vector Machines (LS-SVM), one of the most popular tools for regression and classification learning, which suffers from loss of sparsity. After presenting a sparse representation of LS-SVM via compressed sensing, we exploit the support of the bias term in the LS-SVM model with the IHT_μ-PKS algorithm. Experimental results on classification problems show that IHT_μ-PKS outperforms other approaches to computing the sparse LS-SVM classifier.
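The hard-thresholding update with a partially known support can be sketched as a generic IHT-PKS-style loop (an illustrative sketch, not the paper's exact algorithm; function name, step size, and defaults are assumptions):

```python
import numpy as np

def iht_pks(y, A, s, known_support, mu=1.0, iters=200):
    """Hard-thresholding iteration that always retains a partially known
    support set T0 and keeps the s - |T0| largest remaining entries."""
    n = A.shape[1]
    x = np.zeros(n)
    t0 = np.asarray(known_support)
    for _ in range(iters):
        g = x + mu * A.T @ (y - A @ x)       # gradient step on 0.5*||y - Ax||^2
        keep = np.zeros(n, dtype=bool)
        keep[t0] = True                       # indices assumed in the support
        free = np.flatnonzero(~keep)
        extra = s - t0.size                   # remaining sparsity budget
        if extra > 0:
            top = free[np.argsort(np.abs(g[free]))[-extra:]]
            keep[top] = True
        x = np.where(keep, g, 0.0)            # hard thresholding
    return x
```

The only change from plain IHT is that the indices in `known_support` are exempt from thresholding, which is what yields the improved RIP condition analyzed in the paper.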

Keyword :

signal reconstruction; iterative hard thresholding; least squares support vector machine; classification problem

Cite:

GB/T 7714 Ma, Jinyao, Zhang, Haibin, Yang, Shanshan et al. A TIGHT BOUND OF MODIFIED ITERATIVE HARD THRESHOLDING ALGORITHM FOR COMPRESSED SENSING [J]. APPLICATIONS OF MATHEMATICS, 2023, 68 (5): 623-642.
MLA Ma, Jinyao et al. "A TIGHT BOUND OF MODIFIED ITERATIVE HARD THRESHOLDING ALGORITHM FOR COMPRESSED SENSING". APPLICATIONS OF MATHEMATICS 68.5 (2023): 623-642.
APA Ma, Jinyao, Zhang, Haibin, Yang, Shanshan, Jiang, Jiaojiao. A TIGHT BOUND OF MODIFIED ITERATIVE HARD THRESHOLDING ALGORITHM FOR COMPRESSED SENSING. APPLICATIONS OF MATHEMATICS, 2023, 68 (5), 623-642.
Efficient dual ADMMs for sparse compressive sensing MRI reconstruction SCIE
Journal article | 2023, 97 (2), 207-231 | MATHEMATICAL METHODS OF OPERATIONS RESEARCH

Abstract :

Magnetic Resonance Imaging (MRI) is a medical imaging technology used for diagnostic imaging of diseases, but its image quality may suffer from the long acquisition time. Compressive sensing (CS) based strategies can decrease the reconstruction time greatly, but they need efficient reconstruction algorithms to produce high-quality and reliable images. This paper focuses on algorithmic improvements for the sparse reconstruction of CS-MRI, in particular a non-smooth convex minimization problem composed of the sum of a total variation regularization term and an ℓ1-norm term of the wavelet transformation. Part of the motivation for targeting the dual problem is that the dual variables live in a relatively low-dimensional subspace. Instead of solving the primal model as usual, we turn our attention to its associated dual model, composed of three variable blocks and two separable non-smooth function blocks. However, the directly extended alternating direction method of multipliers (ADMM) must be avoided because it may diverge, although it usually performs well numerically. To solve the problem, we employ an ADMM based on a symmetric Gauss-Seidel (sGS) technique. Compared with the directly extended ADMM, this method needs only one additional iteration, but its convergence can be guaranteed theoretically. We also propose a generalized variant of ADMM, since this variant has been shown to be efficient for solving semidefinite programming in recent years. Finally, we conduct extensive experiments on MRI reconstruction using simulated and real MRI images under different sampling patterns and ratios. The numerical results demonstrate that the proposed algorithms achieve high reconstruction accuracy with fast computational speed.
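The paper's dual sGS-ADMM is not reproduced here, but the basic two-block ADMM splitting pattern it builds on can be sketched on a simpler ℓ1-regularized model (all names, parameters, and defaults below are illustrative):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1, applied entrywise."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_l1(A, b, lam=0.1, rho=1.0, iters=300):
    """Two-block ADMM for min_x 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x = z."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    M = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(M, Atb + rho * (z - u))   # smooth subproblem
        z = soft_threshold(x + u, lam / rho)          # l1 subproblem
        u += x - z                                    # dual (multiplier) update
    return z
```

With three blocks instead of two, this direct extension loses its convergence guarantee; the sGS trick recovers it at the cost of one extra sweep per iteration, which is the point the abstract makes.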

Keyword :

Symmetric Gauss-Seidel iteration; Alternating direction method of multipliers; Non-smooth convex minimization; Magnetic resonance imaging; Compressive sensing

Cite:

GB/T 7714 Ding, Yanyun, Li, Peili, Xiao, Yunhai et al. Efficient dual ADMMs for sparse compressive sensing MRI reconstruction [J]. MATHEMATICAL METHODS OF OPERATIONS RESEARCH, 2023, 97 (2): 207-231.
MLA Ding, Yanyun et al. "Efficient dual ADMMs for sparse compressive sensing MRI reconstruction". MATHEMATICAL METHODS OF OPERATIONS RESEARCH 97.2 (2023): 207-231.
APA Ding, Yanyun, Li, Peili, Xiao, Yunhai, Zhang, Haibin. Efficient dual ADMMs for sparse compressive sensing MRI reconstruction. MATHEMATICAL METHODS OF OPERATIONS RESEARCH, 2023, 97 (2), 207-231.
A bi-criteria algorithm for online non-monotone maximization problems: DR-submodular plus concave SCIE
Journal article | 2023, 979 | THEORETICAL COMPUTER SCIENCE

Abstract :

In this paper, we study a class of online non-monotone maximization problems over a general constraint set. In each round, the function fed back by the environment has a composite structure: the sum of a DR-submodular function and a concave function. This setting covers a wide range of applications. We propose a Frank-Wolfe type online algorithm for solving the considered problem; the algorithm does not require exact gradients of the revealed functions. Adopting the Lyapunov function approach and a variance reduction technique, the algorithm is shown to achieve the bi-criteria competitive ratio (1/4, 3/8) with sub-linear regret under suitable parameter choices.
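The Frank-Wolfe step the abstract refers to replaces projection with a linear maximization over the constraint set. A minimal offline version over the probability simplex (an illustrative sketch, not the paper's online meta-algorithm; the function name and step schedule are assumptions) looks like:

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, iters=500):
    """Frank-Wolfe for maximizing a concave function over the simplex.
    The linear maximization oracle over the simplex simply picks the
    coordinate with the largest partial derivative."""
    x = x0.copy()
    for t in range(1, iters + 1):
        v = np.zeros_like(x)
        v[np.argmax(grad(x))] = 1.0       # LMO: best simplex vertex
        gamma = 2.0 / (t + 2)             # standard step-size schedule
        x = (1 - gamma) * x + gamma * v   # convex combination stays feasible
    return x
```

Because every iterate is a convex combination of vertices, the method never leaves the constraint set, which is why it suits general constraint sets where projection is expensive.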

Keyword :

Bi-criteria competitive ratio; DR-submodular; Concave; Regret; Online optimization

Cite:

GB/T 7714 Feng, Junkai, Yang, Ruiqi, Zhang, Haibin et al. A bi-criteria algorithm for online non-monotone maximization problems: DR-submodular plus concave [J]. THEORETICAL COMPUTER SCIENCE, 2023, 979.
MLA Feng, Junkai et al. "A bi-criteria algorithm for online non-monotone maximization problems: DR-submodular plus concave". THEORETICAL COMPUTER SCIENCE 979 (2023).
APA Feng, Junkai, Yang, Ruiqi, Zhang, Haibin, Zhang, Zhenning. A bi-criteria algorithm for online non-monotone maximization problems: DR-submodular plus concave. THEORETICAL COMPUTER SCIENCE, 2023, 979.
Online Non-monotone DR-Submodular Maximization: 1/4 Approximation Ratio and Sublinear Regret CPCI-S
Journal article | 2022, 13595, 118-125 | COMPUTING AND COMBINATORICS, COCOON 2022
WoS CC Cited Count: 2

Abstract :

In an era of data explosion and uncertain information, online optimization has become a more and more powerful framework, and online DR-submodular maximization is an important subclass because of its wide applications in machine learning, statistics, etc., and its significance for exploring general non-convex problems. In this paper, we focus on online non-monotone DR-submodular maximization over a general constraint set and propose a meta-Frank-Wolfe online algorithm with appropriately chosen parameters. Based on the Lyapunov function approach in [8] and the variance reduction technique in [16], we show that the proposed online algorithm attains sublinear regret against a 1/4 approximation ratio to the best fixed action in hindsight.
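The α-regret notion being bounded can be stated concretely with a small helper (a hypothetical illustration, not from the paper): the algorithm's cumulative reward is compared against α times the best fixed action's cumulative reward.

```python
def alpha_regret(alg_rewards, best_fixed_rewards, alpha=0.25):
    """alpha-regret over T rounds: alpha * (best fixed action's total reward)
    minus the algorithm's total reward. Sublinear regret means this quantity
    grows slower than T."""
    return alpha * sum(best_fixed_rewards) - sum(alg_rewards)
```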

Keyword :

Variance reduction; Regret; Approximation ratio; Online optimization; DR-submodularity

Cite:

GB/T 7714 Feng, Junkai, Yang, Ruiqi, Zhang, Haibin et al. Online Non-monotone DR-Submodular Maximization: 1/4 Approximation Ratio and Sublinear Regret [J]. COMPUTING AND COMBINATORICS, COCOON 2022, 2022, 13595: 118-125.
MLA Feng, Junkai et al. "Online Non-monotone DR-Submodular Maximization: 1/4 Approximation Ratio and Sublinear Regret". COMPUTING AND COMBINATORICS, COCOON 2022 13595 (2022): 118-125.
APA Feng, Junkai, Yang, Ruiqi, Zhang, Haibin, Zhang, Zhenning. Online Non-monotone DR-Submodular Maximization: 1/4 Approximation Ratio and Sublinear Regret. COMPUTING AND COMBINATORICS, COCOON 2022, 2022, 13595, 118-125.
An inertial Douglas-Rachford splitting algorithm for nonconvex and nonsmooth problems SCIE
Journal article | 2021, 35 (17) | CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE
WoS CC Cited Count: 1

Abstract :

In the fields of wireless communication and data processing, there are many mathematical optimization problems, especially nonconvex and nonsmooth ones, and one of the biggest difficulties is how to solve them quickly. To this end, we focus on a minimization model that is nonconvex and nonsmooth. First, an inertial Douglas-Rachford splitting (IDRS) algorithm is established, which incorporates the inertial technique into the framework of the Douglas-Rachford splitting algorithm. Then, with the aid of the Kurdyka-Lojasiewicz property, we show that the iteration sequence generated by the proposed IDRS algorithm converges to a stationary point of the nonconvex nonsmooth optimization problem. Finally, a series of numerical experiments on signal recovery demonstrates the effectiveness of the proposed algorithm. The results indicate that the proposed IDRS algorithm outperforms a competing algorithm.
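The structure of an inertial Douglas-Rachford iteration can be sketched as follows (an illustrative sketch assuming unit-step proximal maps supplied by the caller; the function name and inertial weight are assumptions, and the paper's nonconvex analysis is not reproduced):

```python
import numpy as np

def inertial_drs(prox_f, prox_g, z0, alpha=0.1, iters=500):
    """Inertial Douglas-Rachford splitting for min f(x) + g(x): the
    governing sequence z is extrapolated by an inertial term before the
    usual reflected proximal steps."""
    z_prev = z0.copy()
    z = z0.copy()
    for _ in range(iters):
        w = z + alpha * (z - z_prev)   # inertial extrapolation
        x = prox_f(w)
        y = prox_g(2 * x - w)          # reflected proximal step
        z_prev, z = z, w + (y - x)     # update the governing sequence
    return prox_f(z)
```

Setting `alpha = 0` recovers the classical Douglas-Rachford scheme; the extrapolation term is what the abstract calls the inertial technique.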

Keyword :

nonconvex and nonsmooth optimization; Kurdyka-Lojasiewicz property; Douglas-Rachford splitting; inertial

Cite:

GB/T 7714 Feng, Junkai, Zhang, Haibin, Zhang, Kaili et al. An inertial Douglas-Rachford splitting algorithm for nonconvex and nonsmooth problems [J]. CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2021, 35 (17).
MLA Feng, Junkai et al. "An inertial Douglas-Rachford splitting algorithm for nonconvex and nonsmooth problems". CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE 35.17 (2021).
APA Feng, Junkai, Zhang, Haibin, Zhang, Kaili, Zhao, Pengfei. An inertial Douglas-Rachford splitting algorithm for nonconvex and nonsmooth problems. CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2021, 35 (17).
A Novel Adaptive Differential Privacy Algorithm for Empirical Risk Minimization SCIE
Journal article | 2021, 38 (05) | ASIA-PACIFIC JOURNAL OF OPERATIONAL RESEARCH
WoS CC Cited Count: 1

Abstract :

Privacy-preserving empirical risk minimization is crucial for the increasingly frequent setting of analyzing personal data, such as medical or financial records. Thanks to its rigorous mathematical definition, differential privacy has been widely used in privacy protection and has received much attention in recent years. Given the advantages of iterative algorithms in solving a variety of problems, such as empirical risk minimization, various works in the literature target differentially private iterative algorithms, especially adaptive ones. However, the final model parameters are imprecise because of the vast privacy budget spent on the step-size search. In this paper, we first propose a novel adaptive differential privacy algorithm that does not spend privacy budget on step-size determination. Then, through theoretical analysis, we prove that our proposed algorithm satisfies differential privacy and that its solution achieves sufficient accuracy as the number of steps grows. Furthermore, numerical analysis is performed on real-world databases. The results indicate that our proposed algorithm outperforms existing algorithms for model fitting in terms of accuracy.
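The paper's adaptive scheme is not reproduced here, but the generic building block of differentially private iterative ERM solvers — perturbing each gradient with calibrated Gaussian noise before stepping — can be sketched as follows (function name, defaults, and calibration are assumptions):

```python
import numpy as np

def noisy_gd(grad, x0, lr=0.1, sigma=0.0, iters=100, seed=0):
    """Gradient descent with Gaussian-perturbed gradients. In a real
    DP deployment, `sigma` would be calibrated to the gradient's
    sensitivity and the privacy budget (epsilon, delta)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        noise = rng.normal(0.0, sigma, size=x.shape)
        x -= lr * (grad(x) + noise)
    return x
```

The point the abstract makes is about budget allocation: any budget spent choosing `lr` adaptively is budget not available for the gradient noise, which is what their algorithm avoids.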

Keyword :

iteration algorithm; empirical risk minimization; Differential privacy

Cite:

GB/T 7714 Zhang, Kaili, Zhang, Haibin, Zhao, Pengfei et al. A Novel Adaptive Differential Privacy Algorithm for Empirical Risk Minimization [J]. ASIA-PACIFIC JOURNAL OF OPERATIONAL RESEARCH, 2021, 38 (05).
MLA Zhang, Kaili et al. "A Novel Adaptive Differential Privacy Algorithm for Empirical Risk Minimization". ASIA-PACIFIC JOURNAL OF OPERATIONAL RESEARCH 38.05 (2021).
APA Zhang, Kaili, Zhang, Haibin, Zhao, Pengfei, Chen, Haibin. A Novel Adaptive Differential Privacy Algorithm for Empirical Risk Minimization. ASIA-PACIFIC JOURNAL OF OPERATIONAL RESEARCH, 2021, 38 (05).
Selected Papers from the 1st International Symposium on Thermal-Fluid Dynamics (ISTFD2019) SCIE
Journal article | 2021, 43 (8-10), 655-657 | HEAT TRANSFER ENGINEERING
WoS CC Cited Count: 1

Cite:

GB/T 7714 Bai, Bofeng, Zhang, Haibin, Cheng, Lixin et al. Selected Papers from the 1st International Symposium on Thermal-Fluid Dynamics (ISTFD2019) [J]. HEAT TRANSFER ENGINEERING, 2021, 43 (8-10): 655-657.
MLA Bai, Bofeng et al. "Selected Papers from the 1st International Symposium on Thermal-Fluid Dynamics (ISTFD2019)". HEAT TRANSFER ENGINEERING 43.8-10 (2021): 655-657.
APA Bai, Bofeng, Zhang, Haibin, Cheng, Lixin, Ghajar, Afshin J. Selected Papers from the 1st International Symposium on Thermal-Fluid Dynamics (ISTFD2019). HEAT TRANSFER ENGINEERING, 2021, 43 (8-10), 655-657.
Sparse multiple instance learning with non-convex penalty SCIE
Journal article | 2020, 391, 142-156 | NEUROCOMPUTING
WoS CC Cited Count: 6

Cite:

GB/T 7714 Zhang, Yuqi, Zhang, Haibin, Tian, Yingjie. Sparse multiple instance learning with non-convex penalty [J]. NEUROCOMPUTING, 2020, 391: 142-156.
MLA Zhang, Yuqi et al. "Sparse multiple instance learning with non-convex penalty". NEUROCOMPUTING 391 (2020): 142-156.
APA Zhang, Yuqi, Zhang, Haibin, Tian, Yingjie. Sparse multiple instance learning with non-convex penalty. NEUROCOMPUTING, 2020, 391, 142-156.
A Modified Nonlinear Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization SCIE
Journal article | 2020, 185 (1), 223-238 | JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS
WoS CC Cited Count: 7

Abstract :

Nonlinear conjugate gradient methods are among the most convenient and effective methods for solving smooth optimization problems. Thanks to their simplicity and low memory requirements, they are particularly desirable for large-scale smooth problems: they use only the gradient and the previous direction to determine the next search direction, and they require no numerical linear algebra. However, nonlinear conjugate gradient methods have not been widely employed in solving nonsmooth optimization problems. In this paper, a modified nonlinear conjugate gradient method, which achieves global convergence and numerical efficiency, is proposed to solve large-scale nonsmooth convex problems. The new method's search direction satisfies the sufficient descent property and lies in a trust region. Under suitable conditions, the global convergence of the proposed algorithm is established for nonsmooth convex problems. The numerical efficiency of the proposed algorithm is tested and compared with some existing methods on large-scale nonsmooth academic test problems. The numerical results show that the new algorithm performs very well on large-scale nonsmooth problems.
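A minimal nonlinear conjugate gradient loop (PRP+ direction with Armijo backtracking, shown on a smooth function) illustrates the "gradient plus previous direction" structure the abstract describes; the paper's treatment of nonsmooth f via Moreau-Yosida regularization is not reproduced, and the names and constants below are assumptions:

```python
import numpy as np

def cg_prp(f, grad, x0, iters=100, tol=1e-10):
    """Nonlinear conjugate gradient with the PRP+ formula and a simple
    Armijo backtracking line search."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5                                      # Armijo backtracking
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)  # PRP+ coefficient
        d = -g_new + beta * d                             # new search direction
        x, g = x_new, g_new
    return x
```

Note that the update needs only two vectors (gradient and previous direction), which is the low-memory property that makes these methods attractive at large scale.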

Keyword :

Conjugate gradient method; Global convergence; Moreau-Yosida regularization; Nonsmooth large-scale problems

Cite:

GB/T 7714 Woldu, Tsegay Giday, Zhang, Haibin, Zhang, Xin et al. A Modified Nonlinear Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization [J]. JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2020, 185 (1): 223-238.
MLA Woldu, Tsegay Giday et al. "A Modified Nonlinear Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization". JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS 185.1 (2020): 223-238.
APA Woldu, Tsegay Giday, Zhang, Haibin, Zhang, Xin, Fissuh, Yemane Hailu. A Modified Nonlinear Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization. JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2020, 185 (1), 223-238.
The Analysis of Alternating Minimization Method for Double Sparsity Constrained Optimization Problem SCIE
Journal article | 2020, 37 (4) | ASIA-PACIFIC JOURNAL OF OPERATIONAL RESEARCH
WoS CC Cited Count: 2

Abstract :

This work analyzes the alternating minimization (AM) method for solving a double sparsity constrained minimization problem, in which the decision variable vector is split into two blocks and the objective function is separable and smooth in the two blocks. We analyze the convergence of the method for non-convex objective functions and prove a rate of convergence for the norms of the partial gradient mappings. We then establish a non-asymptotic sub-linear rate of convergence under the assumptions of convexity and Lipschitz continuity of the gradient of the objective function. To solve the sub-problems of the AM method, we adopt the so-called iterative thresholding method and study its analytical properties. Finally, future work is discussed.
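The two-block alternating scheme can be sketched generically (block minimizers supplied by the caller; the sparsity projections and thresholding sub-solvers from the paper are not reproduced, and all names are illustrative):

```python
def alternating_min(argmin_block1, argmin_block2, x0, y0, iters=50):
    """Two-block alternating minimization: repeatedly minimize the
    objective exactly over one block while the other is held fixed."""
    x, y = x0, y0
    for _ in range(iters):
        x = argmin_block1(y)   # x <- argmin_x f(x, y)
        y = argmin_block2(x)   # y <- argmin_y f(x, y)
    return x, y
```

Each sweep can only decrease the objective, which is the monotonicity the convergence analysis builds on.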

Keyword :

convergence rate; double sparsity constrained problem; Alternating minimization; partial gradient mappings; smooth function

Cite:

GB/T 7714 Gao, Huan, Li, Yingyi, Zhang, Haibin. The Analysis of Alternating Minimization Method for Double Sparsity Constrained Optimization Problem [J]. ASIA-PACIFIC JOURNAL OF OPERATIONAL RESEARCH, 2020, 37 (4).
MLA Gao, Huan et al. "The Analysis of Alternating Minimization Method for Double Sparsity Constrained Optimization Problem". ASIA-PACIFIC JOURNAL OF OPERATIONAL RESEARCH 37.4 (2020).
APA Gao, Huan, Li, Yingyi, Zhang, Haibin. The Analysis of Alternating Minimization Method for Double Sparsity Constrained Optimization Problem. ASIA-PACIFIC JOURNAL OF OPERATIONAL RESEARCH, 2020, 37 (4).