## SEARCH

#### Institution

##### (see all 904)

- Dalian University of Technology (34)
- Qufu Normal University (23)
- The Hong Kong Polytechnic University (23)
- Shanghai University (20)
- Chongqing Normal University (17)

#### Author

##### (see all 1538)

- Sun, Min (13)
- Qi, Liqun (11)
- Zhu, Detong (11)
- Jeyakumar, V. (10)
- Zalmai, G. J. (10)

#### Subject

##### (see all 80)

- Mathematics [selected] (1031)
- Optimization (348)
- Calculus of Variations and Optimal Control; Optimization (345)
- Mathematics of Computing (311)
- Applications of Mathematics (301)

Showing 1 to 10 of 1031 matching articles
## Study of a One-Dimensional Optimal Control Problem with a Purely State-Dependent Cost

### Differential Equations and Dynamical Systems (2016-06-25): 1-19

A one-dimensional optimal control problem with a state-dependent cost and a unimodular integrand is considered. It is shown that, under some standard assumptions, this problem can be solved without the Pontryagin maximum principle, by simple methods of classical analysis based on the Tchyaplygin comparison theorem. For some modifications of the problem, however, using Pontryagin's maximum principle remains preferable. The optimal synthesis is obtained for the problem and for its modifications.
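The Tchyaplygin (Chaplygin) comparison theorem invoked here can be stated, in one standard form, as a comparison result for differential inequalities; the precise hypotheses used in the paper may differ:

$$
y'(t) \le f(t, y(t)), \quad z'(t) = f(t, z(t)), \quad y(t_0) \le z(t_0)
\;\Longrightarrow\; y(t) \le z(t) \quad \text{for all } t \in [t_0, T],
$$

provided, for example, that $f$ is continuous and Lipschitz in its second argument. Bounding admissible trajectories above and below by such comparison solutions is what allows the problem to be handled by classical analysis rather than the maximum principle.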

## On the Convergence Properties of a Second-Order Augmented Lagrangian Method for Nonlinear Programming Problems with Inequality Constraints

### Journal of Optimization Theory and Applications (2015-11-18): 1-18

The objective of this paper is to conduct a theoretical study of the convergence properties of a second-order augmented Lagrangian method for solving nonlinear programming problems with both equality and inequality constraints. Specifically, we utilize a specially designed generalized Newton method to furnish the second-order update of the multipliers, and show that when the linear independence constraint qualification and the strong second-order sufficient condition hold, the method is locally convergent with a superlinear rate, even though the penalty parameter is fixed and/or strict complementarity fails.
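For orientation, the classical first-order augmented Lagrangian iteration (not the paper's second-order generalized-Newton multiplier update) can be sketched on a toy inequality-constrained problem; the penalty parameter, tolerances, and test problem below are illustrative choices, not from the paper.

```python
# Classical (first-order) augmented Lagrangian sketch for
#     min x^2  subject to  g(x) = 1 - x <= 0   (solution x* = 1, lambda* = 2).
# The paper replaces the first-order multiplier update below with a
# second-order generalized-Newton update; this is only the textbook scheme.

def inner_min(lam, rho, lo=0.0, hi=5.0, iters=200):
    """Ternary search on the (convex) augmented Lagrangian in x."""
    def L(x):
        return x * x + (max(0.0, lam + rho * (1.0 - x)) ** 2 - lam ** 2) / (2 * rho)
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if L(m1) < L(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

lam, rho = 0.0, 10.0
for _ in range(30):
    x = inner_min(lam, rho)                 # approximate inner minimization
    lam = max(0.0, lam + rho * (1.0 - x))   # first-order multiplier update

print(x, lam)   # both approach the KKT pair x* = 1, lambda* = 2
```

Note that the penalty parameter stays fixed throughout; with a first-order update this still converges, but only linearly, which is exactly the gap the paper's second-order multiplier iteration is designed to close.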

## Automatic Differentiation: Point and Interval Taylor Operators

### Encyclopedia of Optimization (2001-01-01): 113-118

## Modified projection method for solving a system of monotone equations with convex constraints

### Journal of Applied Mathematics and Computing (2010-12-01) 34: 47-56

In this paper, we propose a modified projection method for solving a system of monotone equations with convex constraints. At each iteration, we first solve a system of linear equations approximately, and then project the initial point onto the intersection of the feasible set and two half-spaces containing the current iterate to obtain the next one. The iterate sequence generated by the algorithm possesses an expansive property with respect to the initial point. Under mild conditions, we show that the algorithm is globally convergent. Preliminary numerical experiments are also reported.
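A much simpler projected iteration conveys the basic idea of combining a monotone residual map with a projection onto the feasible set; this is not the authors' method (which also projects onto two half-spaces), and the map, step size, and stopping rule are assumptions for illustration.

```python
# Simplified projection iteration for a monotone system F(x) = 0 over a
# convex set C (here the nonnegative orthant):
#     x_{k+1} = P_C(x_k - alpha * F(x_k)).
# The paper's method additionally intersects C with two half-spaces
# containing the current iterate; that refinement is omitted here.

def F(x):
    # A strongly monotone affine map; its zero x = (1, 2) lies inside C.
    return [x[0] - 1.0, 2.0 * x[1] - 4.0]

def project_C(x):
    # Projection onto the nonnegative orthant (componentwise clamp at 0).
    return [max(0.0, xi) for xi in x]

x, alpha = [0.0, 0.0], 0.3
for _ in range(200):
    Fx = F(x)
    x = project_C([xi - alpha * fi for xi, fi in zip(x, Fx)])

print(x)   # approaches the constrained solution (1, 2)
```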

## Bilevel Programming: Optimality Conditions and Duality

### Encyclopedia of Optimization (2001-01-01): 180-185

## Jacobian consistency of a one-parametric class of smoothing Fischer–Burmeister functions for SOCCP

### Computational and Applied Mathematics (2016-05-28): 1-17

The Jacobian consistency of smoothing functions plays an important role in achieving rapid convergence of Newton or Newton-like methods with an appropriate parameter control. In this paper, we study a one-parametric class of smoothing Fischer–Burmeister functions for second-order cone complementarity problems (SOCCPs) proposed by Tang et al. (Comput Appl Math 33:655–669, 2014): we examine its properties, derive a computable formula for its Jacobian matrix, and prove its Jacobian consistency. We then apply this Jacobian consistency in a smoothing Newton method with appropriate parameter control presented by Chen et al. (Math Comput 67:519–540, 1998), and show global convergence and local quadratic convergence of the algorithm for solving the SOCCP under rather weak assumptions.
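The scalar version of the Fischer–Burmeister function, with one common smoothing from the literature, illustrates the object being smoothed; the paper works with the second-order-cone version and a one-parametric class, so the specific smoothing term below is an assumption, not the paper's formula.

```python
import math

# Scalar Fischer-Burmeister function:
#     phi(a, b) = a + b - sqrt(a^2 + b^2),
# which satisfies phi(a, b) = 0  iff  a >= 0, b >= 0, a*b = 0.
def fb(a, b):
    return a + b - math.sqrt(a * a + b * b)

# One common smoothing (an illustrative choice among several in the
# literature): the perturbation 2*mu inside the square root makes phi_mu
# differentiable everywhere, and phi_mu -> phi as mu -> 0+.
def fb_smooth(a, b, mu):
    return a + b - math.sqrt(a * a + b * b + 2.0 * mu)

print(fb(3.0, 0.0))                 # 0.0: (3, 0) is a complementary pair
print(fb_smooth(0.0, 0.0, 1e-8))    # near 0, but smooth at the origin
```

Jacobian consistency then asks that the Jacobian of the smoothed function converge to an element of the generalized Jacobian of `fb` as the smoothing parameter is driven to zero, which is what licenses the parameter-control strategy in the smoothing Newton method.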

## An SQP algorithm for mathematical programs with nonlinear complementarity constraints

### Applied Mathematics and Mechanics (2009-05-01) 30: 659-668

In this paper, we describe a successive approximation and smooth sequential quadratic programming (SQP) method for mathematical programs with nonlinear complementarity constraints (MPCC). We introduce a class of smooth programs to approximate the MPCC. Using an *l*_{1} penalty function, the line search ensures global convergence, while a superlinear convergence rate is established under the strict complementarity and second-order sufficient conditions. Moreover, we prove that the iterate at termination is an exact stationary point of the mathematical program with equilibrium constraints (MPEC) whenever the algorithm terminates finitely.
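The exactness property that makes an *l*_{1} penalty useful for globalization can be checked on a toy equality-constrained problem (not the paper's MPCC setting): once the penalty weight exceeds the optimal multiplier, the unconstrained minimizer of the penalty function coincides with the constrained solution. The problem and weight below are illustrative assumptions.

```python
# Exactness of the l1 penalty: for  min x^2  s.t.  x = 1  (multiplier
# lambda* = 2), any penalty weight c > 2 makes the unconstrained minimizer
# of  P(x) = x^2 + c*|x - 1|  coincide with the constrained solution x* = 1.
# (Toy equality-constrained problem, not the paper's MPCC setting.)

def P(x, c=3.0):
    return x * x + c * abs(x - 1.0)

# Minimize by a fine grid scan, since P is nonsmooth at x = 1.
grid = [i / 1000.0 for i in range(-2000, 2001)]
x_best = min(grid, key=P)
print(x_best)   # 1.0 -- the constrained minimizer, reached without constraints
```

This is why the line search on the penalized merit function can certify progress toward feasibility and optimality simultaneously.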

## Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity

### Mathematical Programming (2011-12-01) 130: 295-319

An Adaptive Regularisation framework using Cubics (ARC) was proposed for unconstrained optimization and analysed in Cartis, Gould and Toint (Part I, Math Program, doi:10.1007/s10107-009-0286-5, 2009), generalizing at the same time an unpublished method due to Griewank (Technical Report NA/12, 1981, DAMTP, University of Cambridge), an algorithm by Nesterov and Polyak (Math Program 108(1):177–205, 2006) and a proposal by Weiser, Deuflhard and Erdmann (Optim Methods Softw 22(3):413–431, 2007). In this companion paper, we further the analysis by providing worst-case global iteration complexity bounds for ARC and a second-order variant to achieve approximate first-order, and for the latter second-order, criticality of the iterates. In particular, the second-order ARC algorithm requires at most $\mathcal{O}(\epsilon^{-3/2})$ iterations, or equivalently function- and gradient-evaluations, to drive the norm of the gradient of the objective below the desired accuracy $\epsilon$, and $\mathcal{O}(\epsilon^{-3})$ iterations to reach approximate nonnegative curvature in a subspace. The orders of these bounds match those proved for Algorithm 3.3 of Nesterov and Polyak, which minimizes the cubic model globally on each iteration. Our approach is more general in that it allows the cubic model to be solved only approximately and may employ approximate Hessians.
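A one-dimensional caricature of an ARC-style iteration shows the two moving parts: approximate minimization of the cubic model, and adaptation of the regularisation weight from the achieved-versus-predicted reduction ratio. The test function, thresholds, and update factors below are illustrative assumptions, not the paper's algorithm.

```python
# 1-D caricature of an adaptive cubic regularisation (ARC) iteration on
# f(x) = x^4/4 - x  (minimizer x* = 1).  The cubic model is minimized only
# approximately (crude grid scan), in the spirit of the paper's relaxation
# of exact model minimization; all constants are illustrative choices.

def f(x):  return x ** 4 / 4.0 - x
def g(x):  return x ** 3 - 1.0          # gradient
def h(x):  return 3.0 * x ** 2          # Hessian

x, sigma = 0.0, 1.0
for _ in range(50):
    if abs(g(x)) < 1e-8:
        break
    gx, hx = g(x), h(x)
    model = lambda s: f(x) + gx * s + 0.5 * hx * s * s + sigma / 3.0 * abs(s) ** 3
    s = min((i / 1000.0 for i in range(-3000, 3001)), key=model)
    rho = (f(x) - f(x + s)) / (f(x) - model(s) + 1e-16)  # achieved / predicted
    if rho > 0.1:          # successful step: accept it, relax regularisation
        x += s
        sigma = max(sigma / 2.0, 1e-8)
    else:                  # unsuccessful: keep x, regularise more strongly
        sigma *= 2.0

print(x)   # approaches the minimizer x* = 1
```

The $\mathcal{O}(\epsilon^{-3/2})$ bound in the paper is a worst-case count of exactly such iterations (each costing one function and gradient evaluation) until the gradient norm falls below $\epsilon$.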

## Parallelized hybrid optimization methods for nonsmooth problems using NOMAD and linesearch

### Computational and Applied Mathematics (2017-09-08): 1-36

Two parallelized hybrid methods are presented for single-function optimization problems with side constraints. The problems are difficult not only because local minima may exist and the functions may be nonsmooth, but also because objective and constraint values for a solution vector can only be obtained by querying a black box whose execution requires considerable computational effort. Examples are engineering optimization problems where objective and constraint values are computed via complex simulation programs, where local minima exist, and where smoothness of the functions is not assured. The hybrid methods combine the well-known method NOMAD with two new methods, DENCON and DENPAR, which are based on the linesearch scheme CS-DFN. For each query, the hybrid methods compute a set of solution vectors that are evaluated in parallel. The hybrid methods have been tested on a set of difficult optimization problems produced by a certain seeding scheme for multiobjective optimization. We compare computational results with those of NOMAD, DENCON, and DENPAR run as stand-alone methods. Among the stand-alone methods, NOMAD is significantly better than DENCON and DENPAR; the hybrid methods, however, are clearly better than NOMAD.
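The batch-parallel black-box evaluation described above can be sketched with a thread pool; the black box, the candidate batch, and the selection rule below are stand-ins, since in the paper each evaluation is a costly simulation.

```python
# Sketch of evaluating one query's batch of solution vectors in parallel,
# as in the hybrid methods' parallel evaluation step.  The "black box" is a
# cheap stand-in; in the paper it is an expensive simulation returning
# objective and constraint values.
from concurrent.futures import ThreadPoolExecutor

def black_box(x):
    # Stand-in simulation: objective value and one side-constraint value.
    obj = sum(xi * xi for xi in x)
    con = 1.0 - sum(x)            # feasible when con <= 0
    return obj, con

candidates = [[0.2, 0.9], [1.0, 1.0], [0.1, 0.1]]   # one query's batch

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(black_box, candidates))

# Keep the best feasible candidate from the batch.
feasible = [(o, x) for (o, c), x in zip(results, candidates) if c <= 0.0]
best_obj, best_x = min(feasible)
print(best_x, best_obj)
```

For truly compute-bound simulations one would typically launch separate processes (or remote jobs) rather than threads; the structure of the batch step is the same.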