Convex Analysis and Optimization in Hadamard Spaces

Author :
Publisher : Walter de Gruyter GmbH & Co KG
Total Pages : 194
Release :
ISBN-10 : 3110361620
ISBN-13 : 9783110361629
Rating : 4/5 (29 Downloads)

Synopsis Convex Analysis and Optimization in Hadamard Spaces by : Miroslav Bacak

In the past two decades, convex analysis and optimization have been developed in Hadamard spaces. This book represents a first attempt to give a systematic account of the subject. Hadamard spaces are complete geodesic spaces of nonpositive curvature. They include Hilbert spaces, Hadamard manifolds, Euclidean buildings and many other important spaces. While the role of Hadamard spaces in geometry and geometric group theory has been studied for a long time, the first analytical results appeared only in the 1990s. Remarkably, it turns out that Hadamard spaces are an appropriate setting for the theory of convex sets and convex functions outside of linear spaces. Since convexity underpins a large number of results in the geometry of Hadamard spaces, we believe that its systematic study is of substantial interest. Optimization methods then address various computational issues and provide us with approximation algorithms which may be useful in science and engineering. We present a detailed description of such an application to computational phylogenetics. The book is aimed primarily at graduate students and researchers in analysis and optimization, but it is accessible to advanced undergraduate students as well.
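For orientation only (a standard characterization, not quoted from the book): a complete geodesic metric space (X, d) is a Hadamard space when every pair of points x, y with geodesic midpoint m satisfies, for all z,

\[ d(z,m)^2 \;\le\; \tfrac{1}{2}\, d(z,x)^2 + \tfrac{1}{2}\, d(z,y)^2 - \tfrac{1}{4}\, d(x,y)^2 . \]

In a Hilbert space this inequality holds with equality, which is one way to see why Hilbert spaces appear among the examples listed above.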

Convex Analysis and Optimization

Author :
Publisher : Athena Scientific
Total Pages : 560
Release :
ISBN-10 : 1886529450
ISBN-13 : 9781886529458
Rating : 4/5 (58 Downloads)

Synopsis Convex Analysis and Optimization by : Dimitri Bertsekas

A uniquely pedagogical, insightful, and rigorous treatment of the analytical/geometrical foundations of optimization. The book provides a comprehensive development of convexity theory and its rich applications in optimization, including duality, minimax/saddle point theory, Lagrange multipliers, and Lagrangian relaxation/nondifferentiable optimization. It is an excellent supplement to several of our books: Convex Optimization Theory (Athena Scientific, 2009), Convex Optimization Algorithms (Athena Scientific, 2015), Nonlinear Programming (Athena Scientific, 2016), Network Optimization (Athena Scientific, 1998), and Introduction to Linear Optimization (Athena Scientific, 1997). Aside from a thorough account of convex analysis and optimization, the book aims to restructure the theory of the subject by introducing several novel unifying lines of analysis, including:
1) A unified development of minimax theory and constrained optimization duality as special cases of duality between two simple geometrical problems.
2) A unified development of conditions for existence of solutions of convex optimization problems, conditions for the minimax equality to hold, and conditions for the absence of a duality gap in constrained optimization.
3) A unification of the major constraint qualifications allowing the use of Lagrange multipliers for nonconvex constrained optimization, using the notion of constraint pseudonormality and an enhanced form of the Fritz John necessary optimality conditions.
Among its features, the book:
a) Develops rigorously and comprehensively the theory of convex sets and functions, in the classical tradition of Fenchel and Rockafellar
b) Provides a geometric, highly visual treatment of convex and nonconvex optimization problems, including existence of solutions, optimality conditions, Lagrange multipliers, and duality
c) Includes an insightful and comprehensive presentation of minimax theory and zero-sum games, and their connection with duality
d) Describes dual optimization, the associated computational methods, including the novel incremental subgradient methods, and applications in linear, quadratic, and integer programming
e) Contains many examples, illustrations, and exercises with complete solutions (about 200 pages) posted at the publisher's web site http://www.athenasc.com/convexity.html
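As a reminder of the standard notation behind the duality and minimax themes (textbook-standard, not excerpted from this book): for minimizing f(x) subject to g(x) <= 0, with Lagrangian L(x, \mu) = f(x) + \mu^{T} g(x), weak duality is the inequality

\[ \sup_{\mu \ge 0}\, \inf_{x}\, L(x, \mu) \;\le\; \inf_{x :\, g(x) \le 0} f(x), \]

and much of the book's geometric apparatus concerns conditions under which it holds with equality (no duality gap).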

Introduction to Optimization and Hadamard Semidifferential Calculus, Second Edition

Author :
Publisher : SIAM
Total Pages : 446
Release :
ISBN-10 : 1611975964
ISBN-13 : 9781611975963
Rating : 4/5 (63 Downloads)

Synopsis Introduction to Optimization and Hadamard Semidifferential Calculus, Second Edition by : Michel C. Delfour

This second edition provides an enhanced exposition of the long-overlooked Hadamard semidifferential calculus, first introduced in the 1920s by the mathematicians Jacques Hadamard and Maurice René Fréchet. Hadamard semidifferential calculus applies to possibly the largest family of nondifferentiable functions that retains all the features of classical differential calculus, including the chain rule, making it a natural framework for initiating a large audience of undergraduates and non-mathematicians into the world of nondifferentiable optimization. Introduction to Optimization and Hadamard Semidifferential Calculus, Second Edition builds upon its prior edition's foundations in Hadamard semidifferential calculus, showcasing new material linked to convex analysis and nonsmooth optimization. It presents a modern treatment of optimization and Hadamard semidifferential calculus while remaining accessible to undergraduate students, and it challenges students with exercises related to problems in fields such as engineering, mechanics, medicine, physics, and economics. Answers are supplied in Appendix B. Students of mathematics, physics, engineering, economics, and other disciplines that demand a basic knowledge of mathematical analysis and linear algebra will find this a fitting primary or companion resource for their studies. This textbook has been designed and tested for a one-term course at the undergraduate level. In its full version, it is appropriate for a first-year graduate course and as a reference.
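For readers new to the notion, the Hadamard semidifferential of f at x in the direction v is commonly defined (a standard definition, not quoted from this edition) as

\[ d_H f(x; v) \;=\; \lim_{t \downarrow 0,\; w \to v} \frac{f(x + t w) - f(x)}{t}, \]

where the limit must exist for every choice of sequences t_n \downarrow 0 and w_n \to v; this joint limit is exactly what allows the chain rule to survive where the ordinary one-sided directional derivative fails.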

Convex Analysis and Nonlinear Optimization

Author :
Publisher : Springer Science & Business Media
Total Pages : 316
Release :
ISBN-10 : 0387312560
ISBN-13 : 9780387312569
Rating : 4/5 (69 Downloads)

Synopsis Convex Analysis and Nonlinear Optimization by : Jonathan Borwein

Optimization is a rich and thriving mathematical discipline, and the underlying theory of current computational optimization techniques grows ever more sophisticated. This book aims to provide a concise, accessible account of convex analysis and its applications and extensions for a broad audience. Each section concludes with an often extensive set of optional exercises. This new edition adds material on semismooth optimization, as well as several new proofs.

Mathematical Programming and Game Theory

Author :
Publisher : Springer
Total Pages : 234
Release :
ISBN-10 : 981133059X
ISBN-13 : 9789811330599
Rating : 4/5 (99 Downloads)

Synopsis Mathematical Programming and Game Theory by : S.K. Neogy

This book discusses recent developments in mathematical programming and game theory, and the application of several mathematical models to problems in finance, games, economics and graph theory. All contributing authors are eminent researchers in their respective fields, from across the world. The book contains a collection of selected papers presented at the 2017 Symposium on Mathematical Programming and Game Theory, held in New Delhi during 9–11 January 2017. The symposium provided a forum for new developments and applications of mathematical programming and game theory, as well as an excellent opportunity to disseminate the latest major achievements and to explore new directions and perspectives. Researchers, professionals and graduate students will find the book an essential resource for current work in mathematical programming, game theory and their applications in finance, economics and graph theory.

Processing, Analyzing and Learning of Images, Shapes, and Forms: Part 2

Author :
Publisher : Elsevier
Total Pages : 706
Release :
ISBN-10 : 0444641416
ISBN-13 : 9780444641410
Rating : 4/5 (10 Downloads)

Synopsis Processing, Analyzing and Learning of Images, Shapes, and Forms: Part 2 by :

Processing, Analyzing and Learning of Images, Shapes, and Forms: Part 2, Volume 20, surveys contemporary developments relating to the analysis and learning of images, shapes and forms, covering mathematical models and quick computational techniques. Chapters cover Alternating Diffusion: A Geometric Approach for Sensor Fusion; Generating Structured TV-based Priors and Associated Primal-dual Methods; Graph-based Optimization Approaches for Machine Learning, Uncertainty Quantification and Networks; Extrinsic Shape Analysis from Boundary Representations; Efficient Numerical Methods for Gradient Flows and Phase-field Models; Recent Advances in Denoising of Manifold-Valued Images; Optimal Registration of Images, Surfaces and Shapes; and much more.
- Covers contemporary developments relating to the analysis and learning of images, shapes and forms
- Presents mathematical models and quick computational techniques relating to the topic
- Provides broad coverage, with sample chapters presenting content on Alternating Diffusion and Generating Structured TV-based Priors and Associated Primal-dual Methods

Convex and Set-Valued Analysis

Author :
Publisher : Walter de Gruyter GmbH & Co KG
Total Pages : 209
Release :
ISBN-10 : 3110460300
ISBN-13 : 9783110460308
Rating : 4/5 (08 Downloads)

Synopsis Convex and Set-Valued Analysis by : Aram V. Arutyunov

This textbook is devoted to a compressed and self-contained exposition of two important parts of contemporary mathematics: convex and set-valued analysis. In the first part, properties of convex sets, the theory of separation, convex functions and their differentiability, and properties of convex cones in finite- and infinite-dimensional spaces are discussed. The second part covers some important topics of set-valued analysis. There the properties of the Hausdorff metric and various continuity concepts of set-valued maps are considered. Great attention is also paid to measurable set-valued functions; continuous, Lipschitz and some special types of selections; fixed point and coincidence theorems; covering set-valued maps; topological degree theory; and differential inclusions. Contents:
Preface
Part I: Convex analysis
- Convex sets and their properties
- The convex hull of a set. The interior of convex sets
- The affine hull of sets. The relative interior of convex sets
- Separation theorems for convex sets
- Convex functions
- Closedness, boundedness, continuity, and Lipschitz property of convex functions
- Conjugate functions
- Support functions
- Differentiability of convex functions and the subdifferential
- Convex cones
- A little more about convex cones in infinite-dimensional spaces
- A problem of linear programming
- More about convex sets and convex hulls
Part II: Set-valued analysis
- Introduction to the theory of topological and metric spaces
- The Hausdorff metric and the distance between sets
- Some fine properties of the Hausdorff metric
- Set-valued maps. Upper semicontinuous and lower semicontinuous set-valued maps
- A base of topology of the space Hc(X)
- Measurable set-valued maps. Measurable selections and measurable choice theorems
- The superposition set-valued operator
- The Michael theorem and continuous selections. Lipschitz selections. Single-valued approximations
- Special selections of set-valued maps
- Differential inclusions
- Fixed points and coincidences of maps in metric spaces
- Stability of coincidence points and properties of covering maps
- Topological degree and fixed points of set-valued maps in Banach spaces
- Existence results for differential inclusions via the fixed point method
Notation
Bibliography
Index
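As a pointer to the central tool of Part II (a standard definition, not quoted from the book), the Hausdorff distance between nonempty closed bounded subsets A and B of a metric space (X, d) is

\[ d_H(A, B) \;=\; \max\Big\{ \sup_{a \in A} \inf_{b \in B} d(a, b), \;\; \sup_{b \in B} \inf_{a \in A} d(a, b) \Big\}, \]

and the continuity concepts for set-valued maps treated in the book are formulated with respect to this metric.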

Convex Optimization & Euclidean Distance Geometry

Author :
Publisher : Meboo Publishing USA
Total Pages : 776
Release :
ISBN-10 : 0976401304
ISBN-13 : 9780976401308
Rating : 4/5 (08 Downloads)

Synopsis Convex Optimization & Euclidean Distance Geometry by : Jon Dattorro

The study of Euclidean distance matrices (EDMs) fundamentally asks what can be known geometrically given only distance information between points in Euclidean space. Each point may represent simply location or, abstractly, any entity expressible as a vector in finite-dimensional Euclidean space. The answer to the question posed is that very much can be known about the points; the mathematics of this combined study of geometry and optimization is rich and deep. Throughout we cite beacons of historical accomplishment. The application of EDMs has already proven invaluable in discerning biological molecular conformation. The emerging practice of localization in wireless sensor networks, the global positioning system (GPS), and distance-based pattern recognition will certainly simplify and benefit from this theory. We study the pervasive convex Euclidean bodies and their various representations. In particular, we make convex polyhedra, cones, and dual cones more visceral through illustration, and we study the geometric relation of polyhedral cones to nonorthogonal bases (biorthogonal expansion). We explain conversion between halfspace- and vertex-descriptions of convex cones, we provide formulae for determining dual cones, and we show how classic alternative systems of linear inequalities or linear matrix inequalities and optimality conditions can be explained by generalized inequalities in terms of convex cones and their duals. The conic analogue to linear independence, called conic independence, is introduced as a new tool in the study of classical cone theory; the logical next step in the progression: linear, affine, conic. Any convex optimization problem has geometric interpretation. This is a powerful attraction: the ability to visualize the geometry of an optimization problem. We provide tools to make visualization easier. The concept of faces, extreme points, and extreme directions of convex Euclidean bodies is explained here, crucial to understanding convex optimization. The convex cone of positive semidefinite matrices, in particular, is studied in depth. We mathematically interpret, for example, its inverse image under affine transformation, and we explain how higher-rank subsets of its boundary united with its interior are convex. The chapter on "Geometry of convex functions" observes analogies between convex sets and functions: the set of all vector-valued convex functions is a closed convex cone. Included among the examples in this chapter, we show how the real affine function relates to convex functions as the hyperplane relates to convex sets. Here, also, pertinent results for multidimensional convex functions are presented that are largely ignored in the literature: tricks and tips for determining their convexity and discerning their geometry, particularly with regard to matrix calculus, which remains largely unsystematized when compared with the traditional practice of ordinary calculus. Consequently, we collect some results of matrix differentiation in the appendices. The Euclidean distance matrix (EDM) is studied: its properties and its relationship to both positive semidefinite and Gram matrices. We relate the EDM to the four classical axioms of the Euclidean metric, thereby observing the existence of an infinity of axioms of the Euclidean metric beyond the triangle inequality.
We proceed by deriving the fifth Euclidean axiom and then explain why furthering this endeavor is inefficient, because the ensuing criteria (while describing polyhedra) grow linearly in complexity and number. Some geometrical problems solvable via EDMs, EDM problems posed as convex optimization, and methods of solution are presented; e.g., we generate a recognizable isotonic map of the United States using only comparative distance information (no distance information, only distance inequalities). We offer a new proof of the classic Schoenberg criterion, which determines whether a candidate matrix is an EDM. Our proof relies on fundamental geometry, assuming any EDM must correspond to a list of points contained in some polyhedron (possibly at its vertices) and vice versa. It is not widely known that the Schoenberg criterion implies nonnegativity of the EDM entries; this is proved here. We characterize the eigenvalues of an EDM and then devise a polyhedral cone required for determining membership of a candidate matrix (in Cayley-Menger form) to the convex cone of Euclidean distance matrices (the EDM cone); i.e., a candidate is an EDM if and only if its eigenspectrum belongs to a spectral cone for EDM^N. We will see that spectral cones are not unique. In the chapter "EDM cone", we explain the geometric relationship between the EDM cone, two positive semidefinite cones, and the elliptope. We illustrate geometric requirements, in particular, for projection of a candidate matrix on a positive semidefinite cone that establish its membership to the EDM cone. The faces of the EDM cone are described, but still open is the question whether all its faces are exposed, as they are for the positive semidefinite cone. The classic Schoenberg criterion, relating EDM and positive semidefinite cones, is revealed to be a discretized membership relation (a generalized inequality, a new Farkas-like lemma) between the EDM cone and its ordinary dual. A matrix criterion for membership to the dual EDM cone is derived that is simpler than the Schoenberg criterion. We derive a new concise expression for the EDM cone and its dual involving two subspaces and a positive semidefinite cone. "Semidefinite programming" is reviewed with particular attention to optimality conditions of prototypical primal and dual conic programs, their interplay, and the perturbation method of rank reduction of optimal solutions (extant but not well known). We show how to solve a ubiquitous platonic combinatorial optimization problem from linear algebra (the optimal Boolean solution x to Ax = b) via semidefinite program relaxation. A three-dimensional polyhedral analogue for the positive semidefinite cone of 3x3 symmetric matrices is introduced; a tool for visualizing in 6 dimensions. In "EDM proximity" we explore methods of solution to a few fundamental and prevalent Euclidean distance matrix proximity problems: the problem of finding that Euclidean distance matrix closest to a given matrix in the Euclidean sense. We pay particular attention to the problem when compounded with rank minimization. We offer a new geometrical proof of a famous result discovered by Eckart & Young in 1936 regarding Euclidean projection of a point on a subset of the positive semidefinite cone comprising all positive semidefinite matrices having rank not exceeding a prescribed limit rho. We explain how this problem is transformed to a convex optimization for any rank rho.
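The Schoenberg criterion mentioned above has a compact classical statement that is easy to test numerically. The following Python sketch (illustrative only; the function name, tolerance, and test matrix are not taken from the book) checks whether a symmetric hollow matrix of squared interpoint distances is an EDM by verifying positive semidefiniteness of the centered Gram matrix:

import numpy as np

def is_edm(D, tol=1e-9):
    # Classic Schoenberg criterion: a symmetric matrix D with zero diagonal,
    # holding squared interpoint distances, is a Euclidean distance matrix
    # iff G = -1/2 * J D J is positive semidefinite, where
    # J = I - (1/N) * 1 1^T is the geometric centering matrix.
    D = np.asarray(D, dtype=float)
    N = D.shape[0]
    if not np.allclose(D, D.T, atol=tol):
        return False                      # must be symmetric
    if not np.allclose(np.diag(D), 0.0, atol=tol):
        return False                      # must be hollow (zero diagonal)
    J = np.eye(N) - np.ones((N, N)) / N
    G = -0.5 * J @ D @ J                  # Gram matrix of the centered points
    return bool(np.linalg.eigvalsh(G).min() >= -tol)

# Squared distances of three collinear points at coordinates 0, 1, 2:
D = np.array([[0., 1., 4.],
              [1., 0., 1.],
              [4., 1., 0.]])
print(is_edm(D))  # True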

Algorithms for Solving Common Fixed Point Problems

Author :
Publisher : Springer
Total Pages : 320
Release :
ISBN-10 : 3319774379
ISBN-13 : 9783319774374
Rating : 4/5 (74 Downloads)

Synopsis Algorithms for Solving Common Fixed Point Problems by : Alexander J. Zaslavski

This book details approximate solutions to common fixed point problems and convex feasibility problems in the presence of perturbations. Convex feasibility problems search for a common point of a finite collection of subsets in a Hilbert space; common fixed point problems pursue a common fixed point of a finite collection of self-mappings in a Hilbert space. A variety of algorithms are considered in this book for solving both types of problems, the study of which has fueled a rapidly growing area of research. This monograph is timely and highlights the numerous applications to engineering, computed tomography, and radiation therapy planning. Totaling eight chapters, this book begins with an introduction to foundational material and moves on to examine iterative methods in metric spaces. The dynamic string-averaging methods for common fixed point problems in normed spaces are analyzed in Chapter 3. Dynamic string methods for common fixed point problems in a metric space are introduced and discussed in Chapter 4. Chapter 5 is devoted to the convergence of an abstract version of the algorithm known as component-averaged row projections (CARP). Chapter 6 studies a proximal algorithm for finding a common zero of a family of maximal monotone operators. Chapter 7 extends the results of Chapter 6 to a dynamic string-averaging version of the proximal algorithm. In Chapter 8, subgradient projection algorithms for convex feasibility problems are examined in infinite-dimensional Hilbert spaces.
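As a minimal illustration of what a convex feasibility problem asks for (a sketch under simple assumptions, not one of the book's algorithms), the following Python code cycles Euclidean projections onto two convex sets in the plane, a ball and a halfspace, and converges to a point of their intersection when that intersection is nonempty:

import numpy as np

def project_ball(x, center, radius):
    # Euclidean projection onto the closed ball {y : ||y - center|| <= radius}
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def project_halfspace(x, a, b):
    # Euclidean projection onto the halfspace {y : a^T y <= b}
    violation = a @ x - b
    return x if violation <= 0 else x - violation * a / (a @ a)

def cyclic_projections(x0, steps=200):
    # Alternate the two exact projections (the simplest projection method);
    # in finite dimensions this converges to a common point when one exists.
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = project_ball(x, center=np.zeros(2), radius=1.0)
        x = project_halfspace(x, a=np.array([1.0, 1.0]), b=0.5)
    return x

print(cyclic_projections([3.0, -2.0]))

The subgradient projection and string-averaging methods studied in the book replace such exact projections with cheaper surrogates and combine them along strings, but the feasibility goal is the same.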