Pattern Search Methods for Linearly Constrained Minimization in the Presence of Degeneracy

Total Pages : 19
ISBN-10 : OCLC:227894472

Synopsis Pattern Search Methods for Linearly Constrained Minimization in the Presence of Degeneracy by :

This paper deals with generalized pattern search (GPS) algorithms for linearly constrained optimization. At each iteration, the GPS algorithm generates a set of directions that conforms to the geometry of any nearby linear constraints, and this set is used to define the POLL set for that iteration. The contribution of this paper is a detailed algorithm for constructing the set of directions at the current iterate whether or not the constraints are degenerate. The main difficulty in the degenerate case is classifying constraints as redundant or nonredundant. We give a short survey of the main definitions and methods concerning redundancy and propose an approach, which may be useful for other active-set algorithms, for identifying the nonredundant constraints.
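As a concrete illustration of the redundancy question, one standard test for whether a single linear inequality is redundant solves a small linear program over the remaining constraints: the constraint a_i^T x <= b_i is implied by the others exactly when maximizing a_i^T x subject to them cannot exceed b_i. The Python sketch below (using SciPy's linprog) shows only this generic test; it is not the construction developed in the paper, and the function name, tolerance, and example data are our own.

```python
# A minimal, illustrative redundancy test for the linear system A x <= b:
# constraint i is redundant if maximizing a_i^T x over the remaining
# constraints cannot exceed b_i. Generic sketch only; SciPy is assumed.
import numpy as np
from scipy.optimize import linprog

def is_redundant(A, b, i, tol=1e-9):
    """Return True if constraint i of A x <= b is implied by the others."""
    mask = np.arange(A.shape[0]) != i
    # linprog minimizes, so maximize a_i^T x by minimizing -a_i^T x.
    res = linprog(-A[i], A_ub=A[mask], b_ub=b[mask],
                  bounds=[(None, None)] * A.shape[1], method="highs")
    # An unbounded or failed subproblem means the constraint is not proven redundant.
    return bool(res.success) and -res.fun <= b[i] + tol

# Example: the third constraint x + y <= 3 is implied by x <= 1 and y <= 1.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 1.0, 3.0])
print([is_redundant(A, b, i) for i in range(3)])  # [False, False, True]
```

In an active-set setting such a test would typically be applied only to constraints that are active or nearly active at the current iterate.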

Second Order Behavior of Pattern Search

Total Pages : 17
ISBN-10 : OCLC:227896872

Synopsis Second Order Behavior of Pattern Search by :

Previous analyses of pattern search algorithms for unconstrained and linearly constrained minimization have focused on proving convergence of a subsequence of iterates to a limit point satisfying either directional or first-order necessary conditions for optimality, depending on the smoothness of the objective function in a neighborhood of the limit point. Even though pattern search methods require no derivative information, we are able to prove some limited directional second-order results. Although not as strong as classical second-order necessary conditions, these results are stronger than the first-order conditions that many gradient-based methods satisfy. Under fairly mild conditions, we can eliminate from consideration all strict local maximizers and an entire class of saddle points.
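For orientation, the classical unconstrained optimality conditions that the abstract compares against can be written as below; the directional flavor restricts the curvature test to a finite set of directions D, such as the poll directions. This is a schematic statement for context only, not the paper's precise result.

```latex
% Classical necessary conditions at an unconstrained local minimizer x* of a
% twice continuously differentiable f:
%   first order:   \nabla f(x^*) = 0
%   second order:  \nabla^2 f(x^*) \succeq 0   (positive semidefinite)
% A directional second-order condition weakens the curvature requirement to a
% finite set of directions D, which is already enough to exclude strict local
% maximizers:
\[
   \nabla f(x^*) = 0
   \qquad\text{and}\qquad
   d^{\top} \nabla^2 f(x^*)\, d \;\ge\; 0
   \quad\text{for all } d \in D .
\]
```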

Introduction to Optimization Methods

Publisher : Springer Science & Business Media
Total Pages : 214
ISBN-10 : 940095705X
ISBN-13 : 9789400957053

Synopsis Introduction to Optimization Methods by : P. Adby

During the last decade the techniques of non-linear optimization have emerged as an important subject for study and research. The increasingly widespread application of optimization has been stimulated by the availability of digital computers and the necessity of using them in the investigation of large systems. This book is an introduction to non-linear methods of optimization and is suitable for undergraduate and postgraduate courses in mathematics, the physical and social sciences, and engineering. The first half of the book covers the basic optimization techniques, including linear search methods, steepest descent, least squares, and the Newton-Raphson method. These are described in detail, with worked numerical examples, since they form the basis from which advanced methods are derived. Since 1965 advanced methods of unconstrained and constrained optimization have been developed to utilise the computational power of the digital computer. The second half of the book describes fully important algorithms in current use, such as variable metric methods for unconstrained problems and penalty function methods for constrained problems. Recent work, much of which has not yet been widely applied, is reviewed and compared with currently popular techniques under a few generic main headings. Chapter 1 describes the optimization problem in mathematical form and defines the terminology used in the remainder of the book. Chapter 2 is concerned with single-variable optimization. The main algorithms of both search and approximation methods are developed in detail since they are an essential part of many multi-variable methods.
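As a small illustration of one of the basic techniques listed above, here is a minimal Newton-Raphson iteration for single-variable minimization, written in Python rather than worked by hand as in the book; the test function, starting point, and tolerances are our own.

```python
# A minimal illustration of the Newton-Raphson iteration for single-variable
# minimization: solve f'(x) = 0 by Newton's method on the first derivative.
# The example function and tolerances here are our own, not the book's.
def newton_raphson(df, d2f, x0, tol=1e-10, max_iter=50):
    """Find a stationary point of f given its first and second derivatives."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)   # Newton step on f'
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: f(x) = x**4 - 3*x**3 + 2 has a local minimizer at x = 9/4.
df = lambda x: 4 * x**3 - 9 * x**2
d2f = lambda x: 12 * x**2 - 18 * x
print(newton_raphson(df, d2f, x0=3.0))  # approximately 2.25
```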

Implicit Filtering

Publisher : SIAM
Total Pages : 171
ISBN-10 : 1611971896
ISBN-13 : 9781611971897

Synopsis Implicit Filtering by : C. T. Kelley

A description of the implicit filtering algorithm, its convergence theory, and a new MATLAB® implementation.
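For readers unfamiliar with the method, the sketch below gives a heavily simplified view of the implicit-filtering idea: difference gradients computed on a shrinking stencil drive a simple line-search step, and the stencil is refined when no progress is made. This is our own illustration of the general scheme in Python, not the book's MATLAB implementation, which adds quasi-Newton models, bound constraints, and other refinements.

```python
# A heavily simplified sketch of implicit filtering: central-difference
# gradients on a shrinking stencil drive a backtracking line-search step.
# Illustration only; names, scales, and the test function are our own.
import numpy as np

def fd_grad(f, x, h):
    """Central-difference gradient of f at x with stencil size h."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def implicit_filtering(f, x0, scales=(1.0, 0.5, 0.25, 0.125, 0.0625)):
    x = np.asarray(x0, dtype=float)
    for h in scales:                      # outer loop over stencil sizes
        while True:
            g = fd_grad(f, x, h)
            if np.linalg.norm(g) <= h:    # gradient small relative to h: shrink h
                break
            step, fx = 1.0, f(x)
            while f(x - step * g) >= fx and step > 1e-8:
                step *= 0.5               # simple backtracking line search
            if step <= 1e-8:              # line-search failure: shrink h
                break
            x = x - step * g
    return x

# Example on a smooth quadratic; noisy objectives are the intended use case.
print(implicit_filtering(lambda z: np.sum((z - 1.0) ** 2), [4.0, -2.0]))
```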

Analysis of Generalized Pattern Searches

Total Pages : 16
ISBN-10 : OCLC:74287985

Synopsis Analysis of Generalized Pattern Searches by :

This paper contains a new convergence analysis for the Lewis and Torczon GPS class of pattern search methods for linearly constrained optimization. The analysis is motivated by the desire to understand the behavior of the algorithm under hypotheses more consistent with the properties satisfied in practice by a class of problems, discussed at various points in the paper, for which these methods are successful. Specifically, even if the objective function is discontinuous or extended-valued, the methods find a limit point with some minimizing properties. Simple examples show that the strength of the optimality conditions at a limit point depends not only on the algorithm, but also on the directions it uses and on the smoothness of the objective at the limit point in question. The contribution of this paper is to provide a simple convergence analysis that details how the optimality conditions relate to the smoothness of the objective and to the directions that define the algorithm, and that recovers older results as easy corollaries.
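To fix ideas, a bare-bones member of the GPS class for unconstrained minimization can be written as a poll loop over a fixed set of directions, with a mesh parameter that is refined after unsuccessful polls. The Python sketch below uses the coordinate directions ±e_i as the poll set; it illustrates the class of methods analyzed in the paper, not the specific linearly constrained algorithm, and all names and the test function are our own.

```python
# A bare-bones pattern search (GPS-style) poll loop for unconstrained
# minimization with poll set D = {+e_i, -e_i}. Illustration only.
import numpy as np

def pattern_search(f, x0, delta=1.0, delta_min=1e-6, max_iter=10_000):
    x = np.asarray(x0, dtype=float)
    n = x.size
    directions = np.vstack([np.eye(n), -np.eye(n)])   # poll directions
    fx = f(x)
    for _ in range(max_iter):
        if delta < delta_min:
            break
        improved = False
        for d in directions:                          # POLL step
            trial = x + delta * d
            ft = f(trial)
            if ft < fx:                               # successful poll
                x, fx = trial, ft
                improved = True
                break
        if not improved:
            delta *= 0.5                              # refine the mesh
    return x, fx

# Example run on a smooth convex test function (our own choice).
x_best, f_best = pattern_search(lambda z: (z[0] - 1.0) ** 2 + 10.0 * (z[1] + 2.0) ** 2,
                                [0.0, 0.0])
print(x_best, f_best)
```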

Handbook of Parallel Computing and Statistics

Publisher : CRC Press
Total Pages : 560
ISBN-10 : 1420028685
ISBN-13 : 9781420028683

Synopsis Handbook of Parallel Computing and Statistics by : Erricos John Kontoghiorghes

Technological improvements continue to push back the frontier of processor speed in modern computers. Unfortunately, the computational intensity demanded by modern research problems grows even faster. Parallel computing has emerged as the most successful bridge across this computational gap, and many popular solutions based on its concepts have emerged.

Introduction to Derivative-Free Optimization

Publisher : SIAM
Total Pages : 276
ISBN-10 : 0898716683
ISBN-13 : 9780898716689

Synopsis Introduction to Derivative-Free Optimization by : Andrew R. Conn

The first contemporary comprehensive treatment of optimization without derivatives. This text explains how sampling and model techniques are used in derivative-free methods and how these methods are designed to solve optimization problems. It is intended to be readily accessible to both researchers and those with a modest background in computational mathematics.