## SEARCH

#### Institution

##### (see all 662)

- Wayne State University (18)
- Chinese Academy of Sciences (13)
- Dalian University of Technology (13)
- Nanjing Normal University (13)
- Shanghai Normal University (13)

#### Author

##### (see all 1517)

- Mordukhovich, Boris S. (16)
- Zhu, Detong (10)
- Curtis, Frank E. (9)
- Robinson, Daniel P. (9)
- Fang, Liang (8)

#### Subject

##### (see all 83)

- Mathematics (674)
- Mathematics of Computing (276)
- Calculus of Variations and Optimal Control; Optimization (211)
- Optimization (199)
- Numerical Analysis (192)

Showing 1 to 10 of 674 matching Articles
## On the Convergence Properties of a Second-Order Augmented Lagrangian Method for Nonlinear Programming Problems with Inequality Constraints

### Journal of Optimization Theory and Applications (November 18, 2015): 1-18

The objective of this paper is to study the convergence properties of a second-order augmented Lagrangian method for solving nonlinear programming problems with both equality and inequality constraints. Specifically, we use a specially designed generalized Newton method to furnish the second-order iteration of the multipliers and show that, when the linear independence constraint qualification and the strong second-order sufficient condition hold, the method is locally convergent with a superlinear rate of convergence, even when the penalty parameter is fixed and/or strict complementarity fails.
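The abstract states the result but not the iteration itself. As a minimal sketch of the augmented Lagrangian mechanics it builds on, here is a textbook first-order multiplier update on a hypothetical one-dimensional problem (min (x-2)^2 subject to x-1 <= 0); the paper's second-order generalized Newton update for the multipliers is not reproduced here.

```python
def f(x): return (x - 2.0) ** 2      # objective: unconstrained minimizer at x = 2
def g(x): return x - 1.0             # inequality constraint g(x) <= 0

rho, lam, x = 10.0, 0.0, 0.0         # penalty parameter kept fixed throughout
for _ in range(20):                  # outer multiplier iterations
    for _ in range(300):             # inner minimization of the augmented Lagrangian
        # grad_x L_rho(x, lam) = f'(x) + max(0, lam + rho*g(x)) * g'(x)
        dL = 2.0 * (x - 2.0) + max(0.0, lam + rho * g(x))
        x -= 0.05 * dL
    lam = max(0.0, lam + rho * g(x)) # first-order multiplier update

print(round(x, 4), round(lam, 4))    # -> 1.0 2.0  (KKT pair: x* = 1, lambda* = 2)
```

Even with the penalty parameter fixed, the multiplier sequence converges linearly here; the paper's point is that a second-order update accelerates this to superlinear.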

## Automatic Differentiation: Point and Interval Taylor Operators

### Encyclopedia of Optimization (January 01, 2001): 113-118

## An SQP algorithm for mathematical programs with nonlinear complementarity constraints

### Applied Mathematics and Mechanics 30 (May 01, 2009): 659-668

In this paper, we describe a successive approximation and smooth sequential quadratic programming (SQP) method for mathematical programs with nonlinear complementarity constraints (MPCC). We introduce a class of smooth programs to approximate the MPCC. Using an *l*_{1} penalty function, the line search ensures global convergence, while the superlinear convergence rate is established under the strict complementarity and second-order sufficiency conditions. Moreover, we prove that the final iterate is an exact stationary point of the mathematical program with equilibrium constraints (MPEC) when the algorithm terminates finitely.
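The abstract does not specify the smooth approximation, so as an illustrative stand-in here is the standard smoothed Fischer-Burmeister function often used to smooth a complementarity constraint (a >= 0, b >= 0, a*b = 0):

```python
import math

def phi(a, b, mu):
    # Smoothed Fischer-Burmeister NCP function (a common MPCC smoothing,
    # assumed here for illustration; the paper's exact smooth program is
    # not given in the abstract).
    return a + b - math.sqrt(a * a + b * b + 2.0 * mu * mu)

# For mu = 0, phi(a, b, 0) = 0 exactly when a >= 0, b >= 0 and a*b = 0.
# For mu > 0, phi(a, b, mu) = 0 exactly when a > 0, b > 0 and a*b = mu**2,
# so the kink at the origin is smoothed out and the roots drift back to
# the complementarity set as mu -> 0:
a = 0.3
for mu in (1e-1, 1e-2, 1e-3):
    b = mu ** 2 / a                   # the root of phi(a, ., mu) = 0
    print(mu, b, abs(phi(a, b, mu)))  # residual is ~0 at each smoothing level
```

Solving the smoothed programs for a decreasing sequence of mu is the "successive approximation" idea the abstract refers to.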

## Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity

### Mathematical Programming 130 (December 01, 2011): 295-319

An Adaptive Regularisation framework using Cubics (ARC) was proposed for unconstrained optimization and analysed in Cartis, Gould and Toint (Part I, Math Program, doi:10.1007/s10107-009-0286-5, 2009), generalizing at the same time an unpublished method due to Griewank (Technical Report NA/12, 1981, DAMTP, University of Cambridge), an algorithm by Nesterov and Polyak (Math Program 108(1):177–205, 2006) and a proposal by Weiser, Deuflhard and Erdmann (Optim Methods Softw 22(3):413–431, 2007). In this companion paper, we further the analysis by providing worst-case global iteration complexity bounds for ARC and a second-order variant to achieve approximate first-order, and for the latter second-order, criticality of the iterates. In particular, the second-order ARC algorithm requires at most $${\mathcal{O}(\epsilon^{-3/2})}$$ iterations, or equivalently, function- and gradient-evaluations, to drive the norm of the gradient of the objective below the desired accuracy $${\epsilon}$$, and $${\mathcal{O}(\epsilon^{-3})}$$ iterations to reach approximate nonnegative curvature in a subspace. The orders of these bounds match those proved for Algorithm 3.3 of Nesterov and Polyak, which minimizes the cubic model globally on each iteration. Our approach is more general in that it allows the cubic model to be solved only approximately and may employ approximate Hessians.
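To make the cubic-regularisation mechanics concrete, here is a minimal one-dimensional ARC-style loop: a sketch under simplifying assumptions, not the algorithm analysed in the paper. In 1-D with positive curvature the cubic model minimiser has a closed form, so no inner solver is needed.

```python
import math

def f(x):   return math.exp(x) - 2.0 * x   # toy objective, minimizer x* = ln 2
def df(x):  return math.exp(x) - 2.0
def d2f(x): return math.exp(x)

# Minimize the cubic model m(s) = f + g*s + 0.5*H*s^2 + (sigma/3)*|s|^3
# exactly (1-D closed form for H > 0), then adapt sigma via the usual
# actual-vs-predicted decrease ratio.
x, sigma = 0.0, 1.0
for _ in range(50):
    g, H = df(x), d2f(x)
    if abs(g) < 1e-10:
        break
    u = (-H + math.sqrt(H * H + 4.0 * sigma * abs(g))) / (2.0 * sigma)
    s = -math.copysign(u, g)                     # global minimizer of the model
    model_dec = -(g * s + 0.5 * H * s * s + sigma * u ** 3 / 3.0)
    rho = (f(x) - f(x + s)) / model_dec          # actual vs predicted decrease
    if rho > 0.1:                                # successful step
        x, sigma = x + s, max(sigma / 2.0, 1e-8)
    else:                                        # shrink the step by growing sigma
        sigma *= 2.0

print(round(x, 6))  # -> 0.693147 (= ln 2)
```

The adaptive sigma plays the role of an inverse trust-region radius; the paper's complexity bounds count exactly these successful and unsuccessful iterations.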

## Semidefinite representation of convex sets

### Mathematical Programming 122 (March 01, 2010): 21-64

Let $${S =\{x\in \mathbb{R}^n: g_1(x)\geq 0, \ldots, g_m(x)\geq 0\}}$$ be a semialgebraic set defined by multivariate polynomials *g*_{i}(*x*). Assume *S* is convex, compact and has nonempty interior. Let $${S_i =\{x\in \mathbb{R}^n:\, g_i(x)\geq 0\}}$$, and let ∂*S* (resp. ∂*S*_{i}) be the boundary of *S* (resp. *S*_{i}). This paper, as does the subject of semidefinite programming (SDP), concerns linear matrix inequalities (LMIs). The set *S* is said to have an LMI representation if it equals the set of solutions to some LMI; it is known that some convex *S* may not be LMI representable (Helton and Vinnikov in Commun Pure Appl Math 60(5):654–674, 2007). A question arising from Nesterov and Nemirovski (SIAM Studies in Applied Mathematics, SIAM, Philadelphia, 1994), see Helton and Vinnikov (2007) and Nemirovski (Plenary lecture, International Congress of Mathematicians (ICM), Madrid, Spain, 2006), is: given a subset *S* of $${\mathbb{R}^n}$$, does there exist an LMI representable set Ŝ in some higher-dimensional space $${\mathbb{R}^{n+N}}$$ whose projection down onto $${\mathbb{R}^n}$$ equals *S*? Such an *S* is called semidefinite representable or SDP representable. This paper addresses the SDP representability problem. The main contributions are: (i) assume the *g*_{i}(*x*) are all concave on *S*; if the positive definite Lagrange Hessian condition holds, i.e., the Hessian of the Lagrange function for the problem of minimizing any nonzero linear function *ℓ*^{T}*x* on *S* is positive definite at the minimizer, then *S* is SDP representable. (ii) If each *g*_{i}(*x*) is either sos-concave ( − ∇^{2}*g*_{i}(*x*) = *W*(*x*)^{T}*W*(*x*) for some possibly nonsquare matrix polynomial *W*(*x*)) or strictly quasi-concave on *S*, then *S* is SDP representable. (iii) If each *S*_{i} is either sos-convex or poscurv-convex (*S*_{i} is compact convex, whose boundary has positive curvature and is nonsingular, i.e., ∇*g*_{i}(*x*) ≠ 0 on ∂*S*_{i} ∩ *S*), then *S* is SDP representable; this also holds for *S*_{i} for which ∂*S*_{i} ∩ *S* extends smoothly to the boundary of a poscurv-convex set containing *S*. (iv) We give the complexity of Schmüdgen's and Putinar's matrix Positivstellensätze, which is critical to the proofs of (i)–(iii).

## Entropy Function-Based Algorithms for Solving a Class of Nonconvex Minimization Problems

### Journal of the Operations Research Society of China 3 (December 01, 2015): 441-458

Recently, the $$l_p$$ minimization problem ( $$p\in (0,\,1)$$ ) for sparse signal recovery has been studied extensively because of its efficiency. In this paper, we propose a general smoothing algorithmic framework based on the entropy function for solving a class of $$l_p$$ minimization problems, which includes the well-known unconstrained $$l_2$$–$$l_p$$ problem as a special case. We show that any accumulation point of the sequence generated by the proposed algorithm is a stationary point of the $$l_p$$ minimization problem, and derive a lower bound for the nonzero entries of the stationary point of the smoothing problem. We implement a specific version of the proposed algorithm; numerical results indicate that the entropy function-based algorithm is effective.
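The entropy function itself is not given in the abstract, so the following sketch uses a generic stand-in smoothing (not the paper's): it replaces the nonsmooth, nonconvex term $$|x|^p$$ by $$(x^2+\varepsilon^2)^{p/2}$$ and runs plain gradient descent on a hypothetical one-dimensional $$l_2$$–$$l_p$$ instance.

```python
# Toy instance: min_x (x - 1)^2 + lam * |x|^p with p = 0.5, lam = 0.5,
# smoothed so that gradient methods apply.
p, lam, eps = 0.5, 0.5, 1e-2

def grad(x):
    # d/dx [ (x - 1)^2 + lam * (x^2 + eps^2)^(p/2) ]
    return 2.0 * (x - 1.0) + lam * p * x * (x * x + eps * eps) ** (p / 2.0 - 1.0)

x = 1.0
for _ in range(2000):
    x -= 0.1 * grad(x)

# x settles at a stationary point of the smoothed problem, pulled below the
# unpenalized minimizer 1 by the l_p term.
print(x, abs(grad(x)))
```

Driving eps toward zero along the iterations, as smoothing frameworks do, recovers stationary points of the original $$l_p$$ problem; that limit argument is what the paper's analysis supplies.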

## Optimality conditions and finite convergence of Lasserre’s hierarchy

### Mathematical Programming 146 (August 01, 2014): 97-121

Lasserre’s hierarchy is a sequence of semidefinite relaxations for solving polynomial optimization problems globally. This paper studies the relationship between optimality conditions in nonlinear programming theory and finite convergence of Lasserre’s hierarchy. Our main results are: (i) Lasserre’s hierarchy has finite convergence when the constraint qualification, strict complementarity and second-order sufficiency conditions hold at every global minimizer, under the standard archimedean condition; the proof uses a result of Marshall on boundary Hessian conditions. (ii) These optimality conditions are all satisfied at every local minimizer if a finite set of polynomials, in the coefficients of the input polynomials, do not vanish at the input data (i.e., they hold in a Zariski open set). This implies that, under archimedeanness, Lasserre’s hierarchy has finite convergence generically.

## Deriving robust counterparts of nonlinear uncertain inequalities

### Mathematical Programming 149 (February 01, 2015): 265-299

In this paper we provide a systematic way to construct the robust counterpart of a nonlinear uncertain inequality that is concave in the uncertain parameters. We use convex analysis (support functions, conjugate functions, Fenchel duality) and conic duality in order to convert the robust counterpart into an explicit and computationally tractable set of constraints. It turns out that to do so one has to calculate the support function of the uncertainty set and the concave conjugate of the nonlinear constraint function. Conveniently, these two computations are completely independent. This approach has several advantages. First, it provides an easy structured way to construct the robust counterpart both for linear and nonlinear inequalities. Second, it shows that for new classes of uncertainty regions and for new classes of nonlinear optimization problems tractable counterparts can be derived. We also study some cases where the inequality is nonconcave in the uncertain parameters.
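A worked special case of this recipe: for a linear constraint (a + Pu)^T x <= b under ellipsoidal uncertainty ||u||_2 <= 1, the support function of the unit ball is the Euclidean norm, so the robust counterpart is the well-known a^T x + ||P^T x||_2 <= b. The sketch below checks that identity by sampling; all data are made up for illustration.

```python
import math, random

a = [1.0, 2.0]
P = [[0.5, 0.0],
     [0.1, 0.3]]
x = [1.0, 1.0]

# Closed-form worst case: a^T x + ||P^T x||_2
Ptx = [P[0][0]*x[0] + P[1][0]*x[1], P[0][1]*x[0] + P[1][1]*x[1]]
closed_form = sum(ai * xi for ai, xi in zip(a, x)) + math.hypot(*Ptx)

# Monte-Carlo check: worst case of (a + P u)^T x over sampled u on the sphere
random.seed(0)
worst = -float("inf")
for _ in range(20000):
    u = [random.gauss(0, 1), random.gauss(0, 1)]
    n = math.hypot(*u)
    u = [ui / n for ui in u]
    val = sum((a[i] + P[i][0]*u[0] + P[i][1]*u[1]) * x[i] for i in range(2))
    worst = max(worst, val)

print(closed_form, worst)  # the sampled worst case approaches the closed form from below
```

The point of the paper is that this two-step computation (support function of the uncertainty set, concave conjugate of the constraint) works far beyond the linear-ellipsoidal case, with the two pieces computed independently.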

## A globally and quadratically convergent algorithm for general nonlinear programming problems

### Computing 26 (June 01, 1981): 141-153

This paper presents an algorithm for the minimization of a nonlinear objective function subject to nonlinear inequality and equality constraints. The proposed method has two distinguishing properties: under weak assumptions it converges to a Kuhn-Tucker point of the problem, and under somewhat stronger assumptions the rate of convergence is quadratic. The method is similar to a recent method proposed by Rosen in that it begins by using a penalty function approach to generate a point in a neighborhood of the optimum and then switches to Robinson's method. The new method has two features not shared by Rosen's method. First, a correct choice of penalty function parameters is constructed automatically, thus guaranteeing global convergence to a stationary point. Second, the linearly constrained subproblems solved by Robinson's method normally contain linear inequality constraints, while for the method presented here only linear equality constraints are required. That is, in a certain sense, the new method “knows” which of the linear inequality constraints will be active in the subproblems. The subproblems may thus be solved in an especially efficient manner.

Preliminary computational results are presented.
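As a toy illustration of why the penalty phase is only a first stage (this is a hypothetical one-dimensional example, not the paper's scheme): for min x subject to x >= 1, the quadratic-penalty minimizer only creeps toward the solution as the parameter grows, which is why one eventually switches to a locally quadratic (Robinson-type) method.

```python
# Quadratic penalty for: min x  s.t.  g(x) = 1 - x <= 0 (solution x* = 1).
# Penalized objective: x + r * max(0, 1 - x)^2, whose stationarity condition
# 1 - 2*r*(1 - x) = 0 gives the unconstrained minimizer x_r = 1 - 1/(2r).
penalty_minimizers = []
for r in (1.0, 10.0, 100.0, 1000.0):
    x_r = 1.0 - 1.0 / (2.0 * r)
    penalty_minimizers.append(x_r)
    print(r, x_r)   # approaches 1 only as r -> infinity
```

A penalty run with moderate r lands near enough to the optimum (and identifies the active constraint) for the fast local phase to take over, which is exactly the structure of the algorithm described above.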

## A new derivative-free SCG-type projection method for nonlinear monotone equations with convex constraints

### Journal of Applied Mathematics and Computing (November 04, 2016): 1-22

Based on a modified line search scheme, this paper presents a new derivative-free projection method for solving nonlinear monotone equations with convex constraints, which can be regarded as an extension of the scaled conjugate gradient (SCG) method and the projection method. Under appropriate conditions, the global convergence and linear convergence rate of the proposed method are proven. Preliminary numerical results are also reported to show that this method is promising.
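The SCG direction is not specified in the abstract, so the sketch below substitutes the plain direction d = -F(x) to expose just the derivative-free line search and hyperplane-projection steps (Solodov-Svaiter style), on a made-up monotone linear system over the nonnegative orthant.

```python
import math

# F(x) = A x + c with A positive definite (hence F monotone);
# c is chosen so the constrained solution is x* = (1, 0.5) in C = R^2_+.
A = [[2.0, 1.0], [1.0, 2.0]]
c = [-2.5, -2.0]

def F(x):
    return [A[0][0]*x[0] + A[0][1]*x[1] + c[0],
            A[1][0]*x[0] + A[1][1]*x[1] + c[1]]

def dot(u, v): return u[0]*v[0] + u[1]*v[1]
def proj(x):   return [max(0.0, xi) for xi in x]   # projection onto C

x, sigma = [0.0, 0.0], 0.01
for _ in range(200):
    Fx = F(x)
    if math.hypot(*Fx) < 1e-10:
        break
    d = [-Fx[0], -Fx[1]]                 # stand-in for the SCG direction
    t = 1.0
    for _ in range(40):                  # derivative-free backtracking line search
        z = [x[0] + t*d[0], x[1] + t*d[1]]
        if -dot(F(z), d) >= sigma * t * dot(d, d):
            break
        t *= 0.5
    # project x onto the hyperplane separating it from the solution set, then onto C
    Fz = F(z)
    beta = dot(Fz, [x[0] - z[0], x[1] - z[1]]) / dot(Fz, Fz)
    x = proj([x[0] - beta*Fz[0], x[1] - beta*Fz[1]])

print([round(v, 6) for v in x])  # -> [1.0, 0.5]
```

Only values of F are used (no Jacobian), which is what "derivative-free" means here; the hyperplane projection is what yields global convergence for merely monotone F.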